We have a server running Windows Server 2012 on AWS. We use Composer to manage packages and run the composer install/update commands under the main Administrator account. Our scripts run as a web user.
Everything was working fine, but a new script I am working on cannot find any classes. It turns out that the Composer package folders only have permissions set for Admin users; the web user does not have permission to read them.
This has never been a problem before; should Composer be applying permissions like this? Is there a way to configure Composer so that our web user has R/W permissions on the packages?
It just seems like a hack / workaround to have to set the permissions manually each time.
Thanks.
With permission management you generally have two options: either you run the command as the right user (web) so the files are owned by it and you have no problems, or you fix the file permissions afterwards if they are not readable by default. I don't really see a way around this.
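For example, on Windows Server the second option could be a one-liner after each install; a minimal sketch with icacls, assuming the web user is named webuser and the project lives in C:\app (both names are assumptions, adjust to your setup):
rem grant the web user inheritable read/execute on the whole vendor tree
icacls C:\app\vendor /grant webuser:(OI)(CI)RX /T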
If anyone is still interested, the issue was fixed here: https://github.com/composer/composer/issues/1714#issuecomment-19693312; running composer self-update will resolve it.
I have a Laravel 8 application running on a CentOS 8.4 VM. I'm using the jwt-auth package by Tymondesigns to implement auth on my project; it's being used as an API.
When I set up the project on the server and deploy it, the storage folder in Laravel is moved to the shared directory as part of the Deployer setup. For some reason, despite setting permissions on the storage folder, not every folder ends up with the right permissions; in particular the ee cache folder has the wrong ones, and the JWT auth package throws a permission denied error:
file_put_contents(/var/www/project-beacon-api/releases/37/storage/framework/cache/data/ee/67/ee673b1cd21b0cd9eca15c240d66269df17f9b3a): failed to open stream: No such file or directory
I can't understand why I'm getting this error. For as long as I've worked with Laravel, setting the storage folder's permissions to 755 / 775 has never worked for me, and trying to open the website always throws a permission denied error.
What am I missing in the permissions configuration, what do I need to run to solve this once and for all?
It always seems to be the ee folder!
The error is happening because of the Laravel cache.
Go to the .env file in your project and you will find:
CACHE_DRIVER=file
Change the cache driver to:
CACHE_DRIVER=array
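After changing the driver, clear the cached configuration and the old cache entries so Laravel picks up the new value (both commands ship with Laravel):
php artisan config:clear
php artisan cache:clear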
When it comes to using cache in Laravel you have 3 possible "families" that you should consider:
Temporary/Debug:
array

Always available:
file
database
APC (I would not trust this one since PHP 7)

Dedicated:
Redis
Memcached
Since you can easily swap the cache drivers, you don't need to pick one based on your use case so much as on your server's needs, load, and possibilities.
For example, on your development machine I suggest using file: you won't need any extra software clogging your PC, plus you gain the ability to quickly clear the cache even if you break something badly, like the artisan command itself. All you need to do is delete the storage/framework folder and you have a fresh instance again (make sure to regenerate the .gitignore files from your repository after that).
For your main server, you have to think about your options. If you are on one of those free hosting websites, you almost certainly won't be able to install any new software, so you can consider file or database. Even though the database will probably be faster than a file, it is in most cases the weakest point of your website, and pushing even more data through that bottleneck is not a good idea; that is why I would suggest against it and would stick to files.
If you have a dedicated server then you should definitely pick Memcached or Redis. Which of the two? It depends on many factors, and you can find a lot of comparisons online. I personally prefer Redis because of its ability to persist data, but either one is a good solution.
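If you do end up on Redis, the switch is just a driver change plus connection details in .env; the host and port below are Laravel's defaults:
CACHE_DRIVER=redis
REDIS_HOST=127.0.0.1
REDIS_PORT=6379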
You need to have ownership of your entire project (user:group); use ls -la in your project root folder to check it.
If it's not right, use chown:
sudo chown -R yourUserName:www-data /path/to/project
If ownership is OK, just set permissions on the storage folder so you have the right to write:
sudo chmod -R 775 storage/
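For the Deployer layout in the question, where storage lives under a shared directory, that would look something like this (the deployer user name is an assumption; use whichever account runs your deploys):
sudo chown -R deployer:www-data /var/www/project-beacon-api/shared/storage
sudo chmod -R 775 /var/www/project-beacon-api/shared/storage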
I would like to make a .sh file to automatically deploy web pages from GitHub to production. I need to run composer install in it, but when I run it, it throws a warning:
"Do not run composer install as root super user!"
I found out this is for security reasons. But I also need to run other commands which need to, e.g., delete some files and directories.
The solution I found to fix this is:
composer install --no-scripts --no-interaction
The question is: is that enough? Is --no-scripts the solution or not? What is the best practice regarding running composer as root?
Best practice is not to use sudo for composer commands at all. If you need sudo for composer, it usually points at your project's file permissions not being set up correctly.
E.g. you should have a non-root user owning the project directory, and you should run the needed commands as that user, without requiring sudo. If you need to run as root, it probably means that you did so in one of your previous runs and already messed up your file permissions.
(Best practice is also not to run composer install on the production machine at all, building the release elsewhere and deploying it instead, but at least you are not running update.)
In the rarer cases where you need to run composer as a superuser, and you are not in a very constrained environment (say, building a Docker image), you should pay attention to the official guidance and use not only --no-scripts but also --no-plugins, so you are only copying files and not executing other code.
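For example, a root-run invocation in a Dockerfile or deploy script might look like this; the exact flag set is a sketch, not a prescription (COMPOSER_ALLOW_SUPERUSER suppresses the root warning):
export COMPOSER_ALLOW_SUPERUSER=1
composer install --no-scripts --no-plugins --no-interaction --no-dev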
Run as a user who has privileges to delete the "files and folders" you're talking about.
If such a user does not exist, create one, assign ownership/privileges and then run composer under that user.
Simply running it as root just to delete a handful of known folders is a weak argument.
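For example (the user name deploy and the project path are placeholders):
sudo useradd -m deploy
sudo chown -R deploy:deploy /var/www/your-project
sudo -u deploy composer install --working-dir=/var/www/your-project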
I do not know the security implications of the following code, but it seems to stop the issue; at least it removes the notice. It could be dangerous, and if so, please note it in the comments or edit this answer:
export COMPOSER_ALLOW_SUPERUSER=1; composer show;
I've been running a project written in Laravel which has been fun to use.
The setup I use is the Homestead vagrant box, so I do the majority of my work on my local machine and push up to the development server once it's ready to go. During the installation I had to push up the logs & vendor folders for it to work properly, but now I'm at the stage where every commit I make via the command line includes storage/logs/laravel.log, and when I then pull down on the server it asks me to stash/commit because the files differ.
I've added it to the .gitignore file in the root directory of the project and it looks like this:
/node_modules
/public/storage
/.idea
Homestead.json
Homestead.yaml
/storage/logs/
/vendor/
Vendor doesn't cause me any problems unless I make changes to it, so it's not much of a bother; it's just the logs that will not stop going up. If I use a GUI tool I can manually tell it not to push them, but I want to get to the point where I can push from the terminal and not worry about logs needing stashing on the server.
I believe the same applies to the .env file, so I imagine one solution will work for both. I have also noticed that PhpStorm reports the files as ignored but tracked with git, if that helps.
If you take a look at the Laravel repo on GitHub, you'll find the following .gitignore file in the storage directory:
https://github.com/laravel/laravel/blob/master/storage/logs/.gitignore
This comes with the default installation to mark the logs directory as ignored. If you've deleted this by mistake, you should be able to reinstate it and resolve the issue you're having.
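One caveat (standard git behaviour, not specific to Laravel): .gitignore only affects untracked files, and PhpStorm's "ignored but tracked" message suggests laravel.log is already in the index. Untrack it once and the ignore rule will take over (the file stays on disk):
git rm --cached storage/logs/laravel.log
git commit -m "Stop tracking laravel.log"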
Just as importantly though, your workflow isn't following best practice. With respect to "Vendor doesn't cause me any problems unless I make changes to it": you should never make changes to your vendor directory. This folder is home to third-party packages and plugins; modifying them directly causes multiple issues, chief amongst them:
You can no longer update a modified package without breaking your application.
Other developers won't be able to replicate your installation easily.
In fact, the vendor directory shouldn't be versioned at all. The best way to handle the files within it is using a package manager, like Composer, to do it all for you. This means you can easily switch between different versions of your packages and, by versioning only the composer files, other developers can run composer install or composer update to synchronise their development environment to yours quickly and accurately.
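If vendor is currently tracked, the cleanup is the same as for the logs: untrack it once (the files stay on disk, and /vendor/ is already in your .gitignore):
git rm -r --cached vendor
git commit -m "Stop tracking the vendor directory"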
I am working on a Laravel project where I have to generate an Nginx configuration file and store it in the /etc/nginx/sites-available directory, which is only writable by the admin user. I have admin rights on the server; I just want to know if there is a way to do this using the Process component of the Symfony stack.
Thanks a lot and best regards ;)
I would recommend using Linux ACLs, giving the PHP process the right to write into the directory. That way you don't need sudo.
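A minimal sketch with setfacl, assuming the PHP-FPM process runs as www-data (check your pool configuration for the actual user):
sudo setfacl -m u:www-data:rwx /etc/nginx/sites-available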
You will also need rights to reload the nginx process. IMHO, having a cronjob under the root user that reloads the configuration if it has changed and is valid is a much better option.
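A rough sketch of such a root crontab entry; the stamp file is a hypothetical marker (create it once with touch), and nginx -t validates the config before any reload:
*/5 * * * * [ /etc/nginx/sites-available -nt /run/nginx-reload.stamp ] && nginx -t -q && systemctl reload nginx && touch /run/nginx-reload.stamp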
You should read the relevant answers that suggest not giving PHP the rights to make sudo calls, for example:
https://stackoverflow.com/a/35340819/602899
https://stackoverflow.com/a/29520712/602899
Just don't do it. Find a workaround that doesn't require sudo permissions.
I just need to create a web application that can one-click install packages on my Ubuntu server. I really have no idea where to start. I thought of doing it with PHP, but due to security issues that didn't seem like a good idea.
Sorry, I'm new to this.
Thanks for any help.
You should not do this.
To answer your question: yes, it is possible. It can be done by calling "sudo apt-get ..." within shell_exec(). This would require that the webserver has passwordless access to root powers.
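For completeness, "passwordless access to root powers" would mean a sudoers rule like the one below; this line is exactly the security hole being warned about, shown only so you recognise it (www-data is the typical web server user, an assumption here):
# /etc/sudoers.d/webserver -- DO NOT actually deploy this
www-data ALL=(root) NOPASSWD: /usr/bin/apt-get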
Did I mention that you should not do this?
If you are trying to remotely manage a computer, you should use SSH to log in to it and avoid the unnecessary gymnastics of using PHP, or use a web-based interface like Webmin that can properly do these things for you.
You are on the right track using system()/shell_exec().
I think it "does not work" on your side because your Apache process owner does not have root permission (and you need root to be able to install packages). By the way, its the same for any other programming language implementation you use: you need root to be able to install packages.
You can set your Apache process owner to 'root', but then you'll get this when you try to restart it:
Error: Apache has not been designed to serve pages while
running as root. There are known race conditions that
will allow any local user to read any file on the system.
If you still desire to serve pages as root then
add -DBIG_SECURITY_HOLE to the CFLAGS env variable
and then rebuild the server.
It is strongly suggested that you instead modify the User
directive in your httpd.conf file to list a non-root
user.
You can compile Apache yourself to allow running it as root, as indicated above.
To summarize, what you're trying to do is possible BUT you are opening a really big security hole.
YOU SHOULD NOT DO THIS.
Use SSH instead.