Linux PHP create file permissions denied - php

I am working on Ubuntu and trying to get a PHP script working that will let the user input a YouTube video URL; the script will then download the FLV and convert it using youtube2mp3 (which I found here: http://hubpages.com/hub/Youtube-to-MP3-on-Ubuntu-Linux ). I have been getting errors which I'm sure are permissions-based, and I would like to know the best and most secure way to correct them. Right now I'm calling
echo system("youtube-dl --output=testfile.flv --format=18 $url");
just to try and get the downloading portion working. What shows up on the following page is
[youtube] Setting language
[youtube] xOMEi2g_oEU: Downloading video webpage
[youtube] xOMEi2g_oEU: Downloading video info webpage
[youtube] xOMEi2g_oEU: Extracting video information
[youtube] xOMEi2g_oEU: Extracting video information
before showing the rest of my (irrelevant) output. In the Apache error log, I'm getting
ERROR: unable to open for writing: [Errno 13]
Permission denied: u'testfile.flv.part'
which is obviously indicative of a permissions issue. Do I have to chown the directory in question to www-user? Is that secure? Or should I chmod the directory instead? Eventually I will be putting this on a public-facing server and I don't want any vulnerabilities in my implementation. Any and all advice and answers are greatly appreciated!

This runs as the user that owns the PHP process, so two things:
Make sure that user has write access to the directory you are writing your test file to. I would specify a path that is isolated and not part of the web server's document root, which is where it appears to be writing now.
Is $url coming from user input? If it is, I would run escapeshellcmd on the entire string to ensure there isn't a stray rm -rf * command hidden in there.
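As a rough illustration of that second point, here is a minimal sketch, assuming the URL arrives as $_POST['url'] (a hypothetical parameter name). It validates the value and escapes just the URL with escapeshellarg(), which is one way to achieve the same goal as running escapeshellcmd over the whole command:
<?php
// Hypothetical input name; the URL is assumed to arrive from a form POST.
$url = isset($_POST['url']) ? $_POST['url'] : '';

// Reject anything that does not look like an HTTP(S) URL before it gets near a shell.
if (!filter_var($url, FILTER_VALIDATE_URL) || !preg_match('#^https?://#i', $url)) {
    die('Invalid URL');
}

// escapeshellarg() quotes the value so any shell metacharacters in it are inert.
$cmd = 'youtube-dl --output=testfile.flv --format=18 ' . escapeshellarg($url);
echo system($cmd);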

chown can only be used by the superuser, so you can use it if that's convenient, but servers don't normally run as the superuser, so I would go for chmod.

Both of @Wes's suggestions are worth following; you don't want some goofball to supply a URL like ||nc -l 8888 | sh & and log in to your system a second later.
I strongly recommend confining your configuration with a tool such as AppArmor, SELinux, TOMOYO, or SMACK. Any of these mandatory access control tools can prevent an application from writing to specific locations, executing arbitrary commands, reading private data, etc.
As I've worked on the AppArmor system for a decade, it's the one I'm most familiar with; I believe you could have a profile for your deployment put together in half a day or so. (It'd take me about ten or fifteen minutes, but like I said, I've been working on AppArmor for a decade. :)

Related

XAMPP SQLite DB using PHP, Can't get to insert [duplicate]

I have a SQLite database that I am using for a website. The problem is that when I try to INSERT INTO it, I get a PDOException
SQLSTATE[HY000]: General error: 8 attempt to write a readonly database
I SSH'd into the server and checked permissions, and the database has the permissions
-rw-rw-r--
I'm not that familiar with *nix permissions, but I'm pretty sure this means
Not a directory
Owner has read/write permissions (that's me, according to ls -l)
Group has read/write permissions
Everyone else only has read permissions
I also looked everywhere I knew to look using the sqlite3 program, and found nothing relevant.
Because I didn't know with what permissions PDO is trying to open the database, I did
chmod o+w supplies.db
Now, I get another PDOException:
SQLSTATE[HY000]: General error: 14 unable to open database file
But it ONLY occurs when I try to execute an INSERT query after the database is open.
Any ideas on what is going on?
The problem, as it turns out, is that the PDO SQLite driver requires that if you are going to do a write operation (INSERT, UPDATE, DELETE, DROP, etc.), then the folder the database resides in must have write permissions, as well as the actual database file.
I found this information in a comment at the very bottom of the PDO SQLite driver manual page.
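A quick way to confirm this from PHP is to check both the file and its containing directory before opening the connection. A minimal sketch follows; the path is a placeholder, and only the supplies.db file name comes from the question:
<?php
$db = '/path/to/supplies.db';   // placeholder path; point this at the real file

// SQLite needs write access to the file AND to the directory that contains it,
// because journal/temporary files are created next to the database.
echo is_writable($db) ? "db file writable\n" : "db file NOT writable\n";
echo is_writable(dirname($db)) ? "directory writable\n" : "directory NOT writable\n";

$pdo = new PDO('sqlite:' . $db);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);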
This can happen when the owner of the SQLite file itself is not the same as the user running the script. Similar errors can occur if the entire directory path (meaning each directory along the way) can't be written to.
Who owns the SQLite file? You?
Who is the script running as? Apache or Nobody?
For me the issue was SELinux enforcement rather than permissions. The "read only database" error went away once I disabled enforcement, following the suggestion made by Steve V. in a comment on the accepted answer.
echo 0 >/selinux/enforce
Upon running this command, everything worked as intended (CentOS 6.3).
The specific issue I had encountered was during setup of Graphite. I had triple-checked that the apache user owned and could write to both my graphite.db and its parent directory. But until I "fixed" SELinux, all I got was a stack trace to the effect of: DatabaseError: attempt to write a readonly database
This can be caused by SELinux. If you don't want to disable SELinux completely, you need to set the db directory fcontext to httpd_sys_rw_content_t.
semanage fcontext -a -t httpd_sys_rw_content_t "/var/www/railsapp/db(/.*)?"
restorecon -v /var/www/railsapp/db
I got this error when I tried to write to a database on an Android system.
Apparently sqlite3 not only needs write permissions to the database file and the containing directory (as @austin-hyde already said in his answer), but the environment variable TMPDIR also has to point to a writable directory.
On my Android system I set it to TMPDIR="/data/local/tmp" and now my script runs as expected :)
Edit:
If you can't set environment variables you can use one of the other methods listed here: https://www.sqlite.org/tempfiles.html#temporary_file_storage_locations
like PRAGMA temp_store_directory = 'directory-name';
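For PDO users, a minimal sketch of issuing that pragma right after connecting (both paths are placeholders):
<?php
$pdo = new PDO('sqlite:/path/to/app.db');   // placeholder database path
// Point SQLite's temporary files at a directory the PHP user can write to.
// SQLite documents temp_store_directory as deprecated, but it is handy when
// the TMPDIR environment variable cannot be changed.
$pdo->exec("PRAGMA temp_store_directory = '/path/to/writable/tmp';");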
In summary, I fixed the problem by putting the database file (*.db) in a subfolder.
The subfolder and the database file within it must belong to the www-data group, and the www-data group must have write permission on both the subfolder and the database file.
Additional notes for a similar problem:
I gave other users and groups write permission on my SQLite database file, but it still didn't work.
File is in my web root directory for my .NET Core WebApi.
It looked like this:
-rw-rw-rw- 1 root root 24576 Jan 28 16:03 librestore.db
Even if I ran the service as root, I kept getting the error:
Error: SQLite Error 8: 'attempt to write a readonly database'.
I also did a chown to www-data on the librestore.db and I still received the same error.
Finally, I moved up above my web root and also gave others write access to that directory (LibreStore, the root of my WebApi), and then it worked.
I'm not sure why I had to give the directory write access when the specific file already had it, but this was the only thing that worked.
But once I made that change, the www-data user could access the .db file and inserts succeeded.
I got the same error from IIS under Windows 7. To fix it, I had to give the IUSR account full control permissions on the SQLite database file. You don't need to change permissions if you use SQLite under WebMatrix instead of IIS.
I used:
echo exec('whoami');
to find out who is running the script (say username), and then gave the user permissions to the entire application directory, like:
sudo chown -R :username /var/www/html/myapp
(For followers looking for an answer to a similar question)
I'm building a C# .NET Core 6.0 WPF app. I put the Sqlite.db3 on the C:\ drive for convenience while developing. To write to the database I must open Visual Studio 2019 as Administrator.
@Charles pointed out the solution to this in a comment (or at least a botch solution); this is merely me spelling it out more clearly. Put file_put_contents('./nameofyourdb.sqlite', null); (or .db, whichever you fancy) in a .php file in the root directory of your app (or wherever you want the db to be created), then load that page so the PHP code runs. Now you have an SQLite db created by whichever user runs your PHP code, meaning your PHP code can write to it. Just don't forget to use sudo when interacting with this db in the console.
A cleaner solution would be to allow the file owned by your main user account to be written to by (in my case) the http user, but this worked for me and it's simple.
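A minimal sketch of that approach (the file name echoes the placeholder above, and the table is purely hypothetical); load it once through the web server so the file ends up owned by the PHP user:
<?php
$db = __DIR__ . '/nameofyourdb.sqlite';   // placeholder name, as in the comment above

// Creating the empty file from PHP means the web server user owns it
// and can therefore write to it later.
if (!file_exists($db)) {
    file_put_contents($db, '');
}

$pdo = new PDO('sqlite:' . $db);
// Hypothetical table, only here to prove that writes now succeed.
$pdo->exec('CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)');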
None of these solutions worked for me, and I suppose I had a very rare case that can still happen. There was a power outage, so even with 777 permissions on the folder and db file, and without SELinux, I would get this error.
It turns out there was a jellyfin.pid file (not sure if it's named after the service or the user, as they have the same name) locking it after the outage. I deleted it, restarted the service, and everything worked.
I got this in my browser when I changed from using http://localhost to http://145.900.50.20 (where 145.900.50.20 is my local IP address) and then changed back to localhost. It was necessary to stay with the IP address once I had switched to it.

Bash/Shell Script for automatic backup of website

I'm brand new to shell scripting and have been searching for examples of how to create a backup script for my website, but I'm unable to find anything, or at least anything I understand.
I have a Synology Diskstation server that I'd like to use to automatically (through its scheduler) take backups of my website.
I currently am doing this via Automator on my Mac in conjunction with the Transmit FTP program, but making this a command line process is where I struggle.
This is what I'm looking to do in a script:
1) Open a URL without a browser (this URL creates a mysql dump of the databases on the server to be downloaded later). An example URL would be http://mywebsite.com/dump.php
2) Use FTP to download all files from the server. (Currently Transmit FTP handles this as a sync function and only downloads files where the remote file date is newer than the local file. It also will remove any local files that don't exist on the remote server).
3) Create a compressed archive of the files from step 2, named as website_CURRENT-DATE
4) Move archive from step 3 to a specific folder and delete any file in this specific folder that's older than 120 Days.
Right now I don't know how to do step 1, or the synchronization in step 2 (I see how I can use wget to download the whole site, but it seems as though that would download everything each time it runs, even if it hasn't changed).
Steps 3 and 4 are probably easy to find via searching, but I haven't searched for that yet since I can't get past step 1.
Thanks!
Also, FYI, my web host doesn't do these types of backups, which is why I'd like to do my own.
Answering each of your questions in order, then:
Several options, the most common of which would be one of wget http://mywebsite.com/dump.php or curl http://mywebsite.com/dump.php.
Since you have ssh access to the server, you can very easily use rsync to grab a snapshot of the files on disk with e.g. rsync -essh --delete --stats -zav username@mywebsite.com:/path/to/files/ /path/to/local/backup.
Once you have the snapshot from rsync, you can make a compressed, dated copy with cd /path/to/local/backup; tar czvf /path/to/archives/website-$(date +%Y-%m-%d).tgz *
find /path/to/archives -mtime +120 -type f -exec rm -f '{}' \; will remove all backups older than 120 days.

Run a .php script on my website from command line

I have been searching for an answer to this for a couple of hours with no clear answer.
I normally write .php scripts which do helpful administrative tasks on my website. I upload them to an ftp folder, and run them from my browser when I need them.
Unlike what I am used to, I am trying to run a script (someone else wrote it) and have been told that I cannot do so from the browser and need to do it from the command line. Basically everything is set up, but I cannot just push the go button and run the script.
Any ideas? I have php installed on my local computer and can run scripts locally from browser and command line, but I do not know how to do the same for the scripts on my website.
I don't know if this helps, but my server is Apache and runs PHP version 5.3.3.
Download PuTTY from http://www.putty.org/
Run it.
In "Host", write your domain, and click Open.
When the black window opens, it will ask you for your credentials:
Enter your SSH credentials, if you have them. If you don't, try the FTP user and password.
If that doesn't work, get into your host's control panel and find out how to create an SFTP or SSH user. If you can't find anything, contact support and ask how to create that kind of user.
Once you have it and log in, you are inside your server and can move around as you would in Linux. If you don't know the basics, find a good tutorial, or just rely on:
ls : list the files and directories
pwd : know in what directory you are
cd DIRNAME : change to another directory inside the current one
cd .. : change to the parent directory
When you are in the directory where your script lives, just execute:
php yourscript.php
if it's a PHP script
php path_to_script.php
else
/path/to/script
If it's the second option, you will need to chmod +x /path/to/script first.
These should all be run from an SSH session (or any other way of accessing a command line on the machine running the website).
To SSH to a server, use PuTTY if you are on Windows. Your host will be able to give extra details on how to get access.

Folder permissions when telling PHP to save a file to that folder?

I'm trying to use this Dagon Design PHP form to help a local non-profit publication enable their readers to submit photos. I've got the "mailer" part working -- the notifications work fine -- but the "saving a file to a folder" part isn't functioning.
On the form page, the author says "the directory must have write permissions," but I'm not sure "who" is writing to that folder -- is this PHP script considered "Owner" when it saves something on my site? Or do I need to allow save permissions for Owner, Group and Others?
I'm not sure why the script isn't saving the photos, but this seems like a good place to start. I've tried looking around on Stack for answers, but most questions seem to have to do with folder creation/permissions.
The page I'm clumsily trying to build is here, if that helps.
As Jon has said already, you don't want to allow write access to everyone.
It's also possible (depending on the hosting) that something like suEXEC is being employed - which will cause your PHP script to run as a user other than the webserver's (as reported by Dunhamzzz).
Probably your best approach, in my opinion, is a script calling whoami:
passthru('whoami');
Or alternatively you could try:
var_dump(posix_getpwuid(posix_geteuid()));
Bear in mind, this does give system information away to the world - so delete the script once you've used it!
Then, as you've correctly asserted in your question, it'll likely be the file permissions.
If you do have CLI access, you can update the permissions safely like so (the first command gets the group):
id -n -g <username>
chmod 770 <directory>
chown <username>:<group> <directory>
(You may have to prepend "sudo" to the "chown" command above, or find other means to run it as "root"; reply back if you get stuck.)
If you haven't got access to run things from the command line, you'll presumably be doing this via an (S)FTP client or the like. I'm afraid the options get a little too broad at that point; you'll have to figure it out (or reply back with the client you're using!).
As always, YMMV.
Finally, bear in mind if this is your own code, people will at some point try uploading PHP scripts (or worse). If that directory is accessible via a public URL ... you're opening the hugest of security holes! (.htaccess, or non-document root locations are your friend.)
If you are not sure how your server is configured (and this would influence who the final file owner is), then add write permission for everyone (chmod a+w folder), upload one file, and ls -l to see the owner. Then you can adjust permissions to allow write access to certain users only.
The PHP script that saves the files is running with the privileges of some user account on the server; the specific account depends on your OS and the web server configuration. On Linux and when PHP is running as an Apache module this user is the same user that Apache runs as.
Solving your problem reduces to determining which user account we are talking about and then ensuring that this user has permission to write to the save directory (either as owner or as a member of the group; giving write access to everyone is not the best idea).
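Putting both of those steps into one throwaway script, a small sketch (the uploads path is a placeholder; delete the script afterwards, for the same reason as the whoami snippet above):
<?php
$saveDir = __DIR__ . '/uploads';   // placeholder; use the form's real save directory

// Which account is PHP actually running as? (requires the posix extension)
$user = posix_getpwuid(posix_geteuid());
echo 'Running as: ' . $user['name'] . "\n";

// Can that account create files in the save directory?
echo is_writable($saveDir) ? "Save directory is writable\n" : "Save directory is NOT writable\n";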
You'll need to set the permissions of the directory to that of the webserver (probably Apache, nginx or similar), as that's what is executing the PHP.
You can quickly find out the Apache user with ps aux | grep apache, then you want to set the permissions of the upload directory to that user, something like this:
chown -R www-data:www-data images/uploads

How to PHP copy across SMB mount

I have a simple script which copies a file from one SMB mount to another. The source file system is the same, but the web server is different. I'm using PHP to process the file by copying it to a temp directory, then performing additional tasks on it. This setup was working at one point in time but it seems that it's no longer working correctly. Can someone point me in the right direction?
fstab mounts:
//192.168.0.x/share /media/folder smbfs username=user,password=mypass
//192.168.0.x/share2 /media/folder2 smbfs username=user,password=mypass
php code:
copy('/media/folder/filename.txt','/media/folder2/temp/filename.txt');
Error:
Warning: copy(/media/folder2/temp/filename.txt): failed to open stream: Permission denied in /www/myphp.php on line xx
Folder permissions (not the mount, but the source folder on the fileserver):
/media/folder = 777
/media/folder2/temp = 777
system("cp /media/folder/filename.txt /media/folder2/temp/filename.txt");
Might work for you.
Sounds like a question that is specific to permissions and the OS rather than PHP. What web server? What user is the server running as? nobody:nobody? Can nobody:nobody or www-root:www-root read/write data in the directories you are trying to access?
sudo su - nobody
probably won't work, as that account will most likely have a /bin/false shell.
nobody may not be the right account. Run ps auxw | grep apache | awk {'print $1'} and see which user it is running as, then try changing over to that account with sudo.
Before PHP can write the files, you need to ensure that the user the web server is running as has read/write access to the directory you are trying to copy into.
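To see exactly which side fails from PHP's point of view, a small diagnostic sketch using the paths from the question:
<?php
$src = '/media/folder/filename.txt';
$dst = '/media/folder2/temp/filename.txt';

// What can the PHP user actually do with each side of the copy?
var_dump(is_readable($src));           // can we read the source file?
var_dump(is_writable(dirname($dst)));  // can we create files in the destination directory?

if (!@copy($src, $dst)) {
    $err = error_get_last();
    echo 'copy failed: ' . ($err ? $err['message'] : 'unknown error') . "\n";
}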
I changed the command to:
copy('/media/folder/filename.txt','/tmp/filename.txt');
Apparently it's more difficult to process files on an SMB share than I thought. The file should be removed when the computer's rebooted, or possibly at regular intervals, depending on the system setup.
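If you take that route, one possible sketch is to let PHP pick the temporary name and clean up as soon as the processing is done (the processing step itself is a placeholder):
<?php
$src = '/media/folder/filename.txt';

// Create a unique file in the system temp directory (usually /tmp).
$tmp = tempnam(sys_get_temp_dir(), 'smb_');

if ($tmp !== false && copy($src, $tmp)) {
    // ... process the local copy here (placeholder for the real work) ...

    // Remove the temporary copy rather than waiting for a reboot to clear /tmp.
    unlink($tmp);
}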
