We have an in-house Ubuntu Linux server that we use for development of PHP sites. The /var/www folder is shared via Samba on our network with settings that force anything created through this share to be www-data:www-data. Not the best, but it works.
We've recently started using Laravel, which requires you to run commands through artisan on the command line. I SSH into the Linux server to do this, but as a different username. Any files then created on the command line via artisan have my username and group attached to them, and therefore cannot be edited remotely through the Samba share; they don't have the correct permissions unless I then chmod/chown them as well.
I would prefer this to happen automatically or, even better, to not be needed at all. My colleagues will start to build Laravel sites soon and it's not going to look good if they have to do this.
I have a feeling I'm doing something fundamentally wrong with the way we've got all of this set up (aside from the www-data thing) - I'd appreciate any pointers.
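One way to get this behaviour automatically is the setgid bit on the shared tree: files and folders created underneath then inherit the directory's group (www-data here) no matter which user creates them, so artisan-created files stay editable over the Samba share. A minimal sketch, shown on a scratch directory; on the real server the equivalent would be something like sudo chgrp -R www-data /var/www plus setting mode 2775 on the directories (paths and group taken from the question):

```shell
# Demonstration on a scratch directory; on the server this would be run
# with sudo against /var/www after chgrp'ing it to www-data.
dir=$(mktemp -d)
chmod 2775 "$dir"                      # the leading 2 is the setgid bit
mkdir "$dir/made-by-artisan"           # simulate artisan creating a folder
touch "$dir/made-by-artisan/file.php"
# The setgid bit shows as an 's' in the group position, and it propagates
# to subdirectories created afterwards, so they keep the shared group too:
stat -c '%A' "$dir"
stat -c '%A' "$dir/made-by-artisan"
```

A group-writable umask (002) for the SSH users completes the picture, so group members can edit each other's files without any manual chmod.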
Related
I have a Laravel 8 application, running on a CentOS 8.4 VM. I'm using the jwt-auth package by tymondesigns to implement auth on my project; it's being used as an API.
When I set up the project on the server and deploy it, the storage folder in Laravel is moved to the shared directory as part of the Deployer project. For some reason, despite setting permissions on the storage folder, not every folder gets them; in particular the ee cache folder has the wrong permissions, and I'm getting a permission denied error thrown by the JWT auth package:
file_put_contents(/var/www/project-beacon-api/releases/37/storage/framework/cache/data/ee/67/ee673b1cd21b0cd9eca15c240d66269df17f9b3a): failed to open stream: No such file or directory
I can't understand why I'm getting this error, and for as long as I've worked with Laravel, setting permissions of the storage folder to 755 / 775 has never worked, and trying to open the website always throws a permission denied.
What am I missing in the permissions configuration, what do I need to run to solve this once and for all?
It always seems to be the ee folder!
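A note on that specific error: "failed to open stream: No such file or directory" usually points at a missing directory rather than wrong mode bits. The file cache driver shards keys into two levels of subfolders (the ee/67 here) and has to be able to create them under storage/framework/cache/data; if that tree is missing from the release, or not writable by the PHP user, you get exactly this failure. A sketch of rebuilding the skeleton, on a scratch path standing in for the release directory from the question:

```shell
# Recreate the writable skeleton Laravel's file cache expects.
# $app stands in for /var/www/project-beacon-api/releases/37.
app=$(mktemp -d)
mkdir -p "$app/storage/framework/cache/data" \
         "$app/storage/framework/sessions" \
         "$app/storage/framework/views"
chmod -R 775 "$app/storage"
# The cache driver can now create its sharded subfolders, e.g. ee/67:
mkdir -p "$app/storage/framework/cache/data/ee/67"
```

With Deployer specifically, the shared storage folder must contain this whole subtree before the first request hits the new release.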
The error is happening because of the Laravel cache.
Go to the .env file in your project and you will find:
CACHE_DRIVER=file
Change the cache driver to:
CACHE_DRIVER=array
(If your configuration is cached, run php artisan config:clear afterwards so the change is picked up.)
When it comes to using cache in Laravel you have 3 possible "families" that you should consider:
Temporary/Debug
array
Always available
file
database
APC (I would not trust this one since PHP7)
Dedicated
Redis
Memcached
Since you can easily replace the cache drivers you don't need to pick one based on your use case, but more based on your server needs/load and possibilities.
For example, on your development machine I suggest using the file driver: you won't need any extra software clogging your PC, and you gain the ability to quickly clear the cache even if you do something really bad like breaking the artisan command. All you need to do is delete the storage/framework folder and you have a fresh instance again (make sure to regenerate the .gitignore files from your repository afterwards).
For your main server, you have to think about your possibilities. If you have one of those free hosting websites you almost certainly won't be able to install any new software, so you can consider the file or database driver. Even though the database will probably be faster than files, it is in most cases the weakest point of your website, and trying to push even more data through that bottleneck is not a good idea; that is why I would advise against it and suggest sticking to files.
If you have a dedicated server then you should definitely pick Memcached or Redis. Which one of the two? It depends on many factors, and you can find a lot of comparisons online; just look for one. I personally prefer Redis because of its ability to persist data, but either one is a good solution.
You need to have ownership of your entire project (user:group). [Use ls -la to see ownership in your project root folder.]
If not right, use chown:
sudo chown -R yourUserName:www-data /path/to/project
If ownership is OK, just set permissions on the storage folder like this: sudo chmod -R 775 storage/ so you have the right to write.
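Put together, the sequence above can be sketched as follows, demonstrated on a scratch directory (on the real server you would prefix the chown with sudo and use your actual names, e.g. yourUserName:www-data; bootstrap/cache typically needs the same treatment as storage):

```shell
project=$(mktemp -d)                 # stands in for /path/to/project
mkdir -p "$project/storage" "$project/bootstrap/cache"
# On the real box (needs sudo):
#   sudo chown -R yourUserName:www-data "$project"
chmod -R 775 "$project/storage" "$project/bootstrap/cache"
ls -la "$project"                    # verify ownership, as suggested above
```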
Good Afternoon,
I am currently working on a PHP project which requires a php script to mount a windows shared drive. Currently building using OSX with XAMPP.
exec('mount -t smbfs //user:pass@192.168.1.1/Share /Volumes/Share 2> temp/error.txt');
Now I understand why this does not work: it's due to permissions. Apache is running as user daemon. I could change the user Apache runs as to fix this "challenge", but I want to avoid any changes to the server if possible.
I would like to reach out and see if there is a better way to go about this.
Any ideas?
OK, so I got it working.
I just needed the web server (user daemon) to own the folder in which the share is mounted.
E.g. I created a folder called "tempshare" that user daemon owns, in the same folder as the PHP script (don't worry, it will be placed outside the web root when complete).
exec('mount -t smbfs //user:pass@192.168.1.1/Share /path/to/tempshare 2> temp/error.txt');
Seemed to work. Any advice on security using this method?
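On the security question: two things worth knowing about this approach are that the exec'd command line (including user:pass) is visible to every local user via ps while the command runs, and that a loose mode on the mount-point folder lets other local users poke at the share. Reading the credentials from a file outside the web root, and locking the mount point down to the daemon user, mitigates both. A sketch of the mount-point half, using a scratch directory in place of tempshare:

```shell
share=$(mktemp -d)      # stands in for /path/to/tempshare
chmod 700 "$share"      # only the owning user (daemon, on the server) can enter it
stat -c '%a' "$share"
# The credentials would then be read from a root-readable file rather than
# hard-coded into the exec() string, e.g. pass=$(cat /etc/smb-cred)
# (hypothetical path, shown only to illustrate the idea).
```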
I just need to create a web application that can one-click install packages on my Ubuntu server, and I really have no idea where to start. I thought of doing it in PHP, but due to the security issues it didn't seem like a good idea.
Sorry, I'm new to this.
Thanks for any help.
You should not do this.
To answer your question: yes, it is possible. It can be done by running "sudo apt-get ..." within shell_exec(). This would require the webserver to have passwordless access to root powers.
Did I mention that you should not do this?
If you are trying to remotely manage a computer, you should use SSH to log in to it and avoid the unnecessary gymnastics of using PHP, or use a web-based interface like Webmin that can properly do these things for you.
You are on the right track using system()/shell_exec().
I think it "does not work" on your side because your Apache process owner does not have root permission (and you need root to be able to install packages). By the way, it's the same for any other programming language you use: you need root to be able to install packages.
You can set your Apache process owner to 'root', but then you'll get this when you try to restart it:
Error: Apache has not been designed to serve pages while
running as root. There are known race conditions that
will allow any local user to read any file on the system.
If you still desire to serve pages as root then
add -DBIG_SECURITY_HOLE to the CFLAGS env variable
and then rebuild the server.
It is strongly suggested that you instead modify the User
directive in your httpd.conf file to list a non-root
user.
You can compile Apache yourself to allow running Apache with root user as indicated above.
To summarize, what you're trying to do is possible BUT you are opening a really big security hole.
YOU SHOULD NOT DO THIS.
Use SSH instead.
I am not much of a web developer, so apologies in advance for this dumb question.
I have a test server (CentOS 6.3) with LAMP set up for me to play around with. From what I understand, the server executes whatever is in the /var/www/html directory. How do you edit source files in that directory? Do you do a sudo vim "foo.php" each time you want to fix something (or add something)? I'd imagine that would be a pain when you are building a complex application with many files and directories.
This is what worked for me. For the record, this is a CentOS 6.3 server running LAMP (on Rackspace).
First, I found out that apache runs as user "apache" and group "apache" on centos systems. In other distros, I believe it runs as "www-data" in group "www-data". You can verify this by looking at /etc/httpd/conf/httpd.conf. We need to change ownership of /var/www to this user. Replace "apache" below with "www-data" if that is the case for you.
chown -hR apache:apache /var/www
Now let's make it writable by the group:
chmod -R g+rw /var/www
Add yourself to the apache group:
usermod -aG apache yourusername
Replace apache with www-data in the above if that's the case for you.
Now log out and log back in; you can now edit the files, FTP them to this directory, or do whatever you want to do.
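Before blaming permissions again, it's worth verifying after the re-login that the session actually picked up the new group, since group changes don't apply to sessions that were already open. A quick check (the group name is hedged to whatever your distro uses, apache or www-data):

```shell
# List the groups of the current session; 'apache' (or 'www-data') should
# appear here only after you have logged out and back in.
id -nG
# Sanity check: the primary group is always present in that list.
id -nG | grep -qw "$(id -gn)" && echo group-list-ok
```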
Comments welcome. TNX!
There are many approaches to modifying and deploying websites/web apps.
CentOS 6 is, by default, accessible with SSH2 on port 22. If you are on Windows you can use a combination of PuTTY and WinSCP (to manage your server, and its files, respectively). If you're on Linux or Mac OS X, SSH is already built into your system and can be accessed with Terminal. I find that using SSH is favourable over other methods because of how widely supported, secure and lightweight it is.
You'll also need a decent text editor or IDE to edit the files if you want proper syntax detection. There are tons of choices; my favourites are Notepad++ and Sublime Text 2. Not to say that I haven't edited my PHP files from time to time using the nano text editor package directly in PuTTY (yum install nano).
If you're using an edit-save-upload approach, just remember to back up your files regularly; you will find out the hard way if you fail to do so. Also, never use root unless you need to. Creating a user solely to modify your websites is good practice (adduser <username>, then give that user write access to /var/www/html).
To answer your second question:
Once you get into heavier web development you will probably want to use something like Git. Deploying with git is beyond the scope of this question so I won't get into that. In brief, you can set it up so your development environment sits locally and you can use a combination of git commit and git push to deploy.
I use an FTP client (FileZilla) to download files, edit them and then re-upload them. If you're a one-(wo)man show on a test setup, just playing around to learn, this is probably sufficient. With more than one person, or moving to a (test and) production setup, you should look at some more control with svn, as @Markus mentioned in another answer.
You should change the permissions of that directory (with chmod) so you have write permissions, and can then read and write to that directory. Then, you don't need sudo.
Dude, read up on version control and source code control systems like Subversion and Git. The idea is to develop on your machine, revision-control the result, then deploy a known working version to the production server.
I have installed a xampp (apache + mysql + php) stack on a portable volume (truecrypt volume on a usb drive) that I use on different linux machines, for webapp development.
This volume is formatted in ext3, so I run into permissions problems as I can't be the same user on each machine. For example I tried to check the git status of a project started on the other machine and access is denied.
Last time I ran into a similar problem I did a chmod 777 (or was it chown?) on the whole directory, so I guess the permission status of my PHP files is messy. This time I am also worried about special cases like the git files and the Symfony web framework I installed lately.
How should I deal with this situation?
I would love to set this permission issue properly.
Thanks a lot for your attention.
Edit :
I may need to write a script to locally "sudo chown lampp:lamppusers" each time I use a different machine. Or some git voodoo trick with local-only repositories export? Any idea?
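The per-machine fixup described here can be a tiny script that derives the names from the current session instead of hard-coding them; you'd run it once after plugging the volume into a new machine. A sketch, demonstrated on a scratch directory with the current user and primary group standing in for lampp:lamppusers (note that chown-ing to yourself and to a group you belong to does not need sudo; crossing to a different user does):

```shell
webdir=$(mktemp -d)              # stands in for the web directory on the volume
me=$(id -un)
grp=$(id -gn)
# On a real second machine (different owner on disk) this needs sudo:
#   sudo chown -R "$me:$grp" "$webdir"
chown -R "$me:$grp" "$webdir"    # no sudo needed when you already own the files
stat -c '%U:%G' "$webdir"
```

Because it uses id rather than literal names, the same script works even on a machine where the username differs.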
I doubt there is a way round it like that - you could try using SELinux ACLs instead of the usual permission, but I think that'd be overkill for you.
I would advise just setting the whole directory to 777 with chmod (chown changes ownership); as the user and group will be different on each machine, you can't really do better. Your only problem here is security, but as it's on your local box for your own purposes, I don't see a problem. Not until you copy the files to the server, whereupon you'd best set them back correctly.
You could try authenticating against an LDAP server: you'd retrieve the uid for the username you enter, and that would be mapped to the same uid on each PC, so then you could set group or owner once. (I think; you'd have to ask someone more experienced with LDAP uid mappings.)
This seems to be the best solution I have found:
sudo chown -R polypheme:lamppusers webdir
This way I can own again the whole directory with my local username and my local group.
polypheme is a username
lamppusers is a groupname
webdir is the directory containing all my web development files
Note that on computer 1 my username is polypheme and I belong to the group lamppusers, and on computer 2 my username is polypheme and I belong to the group lamppusers.
The usernames and groupnames are exactly the same.
I am not 100% sure this would work perfectly if names happened to be different.
I have not modified any permissions with chmod, which is good, as the problem is much more complex than simply flattening every file permission with a global
chmod -R 744 webdir
That would be the wrong thing to do with the protected files in the Symfony framework. And the .git/ files too, I guess. (gbjbaanb was right about this.)
At this point, I have recovered the same configuration I used to have on the other machine, which is what I was looking for. I can work again.
Knowing this, my question suddenly seems stupid.
Well, the multi-computer context makes it tricky enough to be valuable. (To me at least)
(This answer will be validated as soon as it has been properly tested)