File access difference for php file running with cron vs browser - php

A php file (say example.php) should write something into another file (say alma.txt). If example.php is run via a cron job, everything is ok, but when I call it through the browser, it does not write to alma.txt. Why?

A cron job runs as the user who owns the crontab file in which the job is configured.
When you run your PHP script from the cron daemon, your user is the one executing the command.
When you run your PHP script through the browser, it is executed on the web server by the user that started the Apache process; on Ubuntu, for example, that is www-data.
This looks like a file permission issue. Just to be sure, grant full permissions to the file alma.txt:
chmod 777 alma.txt
and try again. If everything is OK, then it was a permission issue, and I would suggest setting the right permissions on the file instead, for example adding the www-data group to the file and granting it 775.
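If you want to see this for yourself, here is a small diagnostic sketch (assuming the POSIX extension is available and that alma.txt sits next to the script); run it once from cron and once through the browser and compare the output:
<?php
// Print the effective user and whether alma.txt is writable in this context.
$file = __DIR__ . '/alma.txt';   // adjust if alma.txt lives elsewhere
$user = function_exists('posix_geteuid')
    ? posix_getpwuid(posix_geteuid())['name']
    : get_current_user();        // fallback: reports the script file's owner
echo "Running as: $user\n";
echo is_writable($file) ? "alma.txt is writable\n" : "alma.txt is NOT writable\n";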

Related

Running a PHP script that runs a Python script that runs a bash script, hangs on bash

I have a Python script which is encoding a video and then calling a shell script which uploads the new video to dropbox. It works fine from the command line but I needed to make it so others could execute it so I have a PHP script calling the python script.
I don't want the PHP script to run forever (it takes 15-30 mins for it to complete), I just want it to kick off the python script and be done. I figured out what I need to make that happen and like I said it works on the command line. But when it is called via PHP, the video encodes but the file never uploads. I can see the dropbox script was kicked off and is listed as a process using some percent of CPU, that percent never changes, it seems stuck/dead.
The command looks like this, and is run using cmd():
script.py -options &>/logs/phptopython.log &
The shell script is kicked off using Popen
Any suggestions?
thanks
It sounds like this could be a permissions issue. Double check the permissions on the directory to which you are trying to upload the video. If you are on Linux you can modify the permissions on that directory like this:
chmod 755 /path/to/dir
This gives the file owner read, write and execute permissions (7). The group and other users get read and execute permissions (5).
Apache is likely running as a different user than when you run the command yourself in bash. A quick test to see if it's a permission issue would be to grant 777 on that directory. I wouldn't leave it that way though – it'd just be a way to quickly identify if permissions are the issue.
If the script works with 777 permissions, you could either change the owner of the directory to the user Apache runs as, or add the Apache user to the directory's group and grant the group write permissions.
Edit:
I just noticed you said you use cmd(), so I'm guessing you are on Windows. My comments might still be relevant but the chmod command won't work on Windows.
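If the host is actually Linux, one other thing worth checking is how the command is launched: PHP's exec() and shell_exec() run the command through /bin/sh, where the bash-only &> redirection does not behave the same way. Here is a minimal sketch, assuming a Linux host where exec() is allowed (the interpreter and paths are placeholders), of kicking the script off in the background so the PHP request can return immediately:
// Redirect stdout and stderr portably and detach the process.
$cmd = 'nohup /usr/bin/python3 /path/to/script.py -options'
     . ' > /logs/phptopython.log 2>&1 &';
exec($cmd);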

Apache runs php scripts as user nobody, cron runs php scripts as user fred

My default cPanel setup runs Apache as user "nobody". So when I run a php script via a browser that outputs a file, that file has ownership nobody:nobody. When I run the script from a cron job logged in as user "fred", the output files are owned by fred:fred.
I need both browser and cron to overwrite the same file. The issue I have is that if one "user" creates the file, the other one can't overwrite it.
Please can you let me know where the fundamental problem is and a possible solution. Permissions on the files are 0775.
Do I need to set up groups - adding the user to the same group as nobody? If so how?
How do I get the cron job to run as user nobody?
Many thanks,
Lloyd
Try your code after changing the file's permissions to 0777.
But that creates a security issue, as anyone can then edit your file.
Here is an approach you could implement instead:
Create a shell script that copies the contents of a temp file to your actual file.
From PHP, update only the temp file; the shell script can read it, since the temp file has read permission for everyone.
Use ssh2_exec to execute the shell script with your Linux username and password (a sketch follows below).
For the ssh2_exec manual, see: http://php.net/manual/en/function.ssh2-exec.php
Hope this solves your problem.
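A rough sketch of that idea (the hostname, credentials and script path are placeholders, and the PECL ssh2 extension must be installed):
<?php
// Connect over SSH as your own Linux user and run the shell script that
// copies the temp file into place, so the resulting file is owned by you.
$conn = ssh2_connect('localhost', 22);
if ($conn !== false && ssh2_auth_password($conn, 'fred', 'your-password')) {
    $stream = ssh2_exec($conn, '/home/fred/copy_temp.sh');
    if ($stream !== false) {
        stream_set_blocking($stream, true);   // wait for the script to finish
        echo stream_get_contents($stream);    // print any output it produced
        fclose($stream);
    }
}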
Ok, my solution to this was to create a crontab for user "nobody".
I've got a standard cPanel installation, so I went to /var/spool/cron, created an entry for nobody, and ran crontab -e to edit and install it.
Now the PHP runs as nobody in the cron job, exactly the same as through a browser. All files written belong to nobody, with rw permissions for nobody only.

How do I get a PHP file to automatically run itself?

I have a PHP file, x.php, that outputs b.xml every time it is run. The way I do this is by using crontab to run the x.php file. The problem is that due to the server's settings, the new file has permissions of 400. So I also have another crontab line to change file b.xml permissions to 777 so that x.php can run over it next time.
I feel like I am making this too complicated. Is there any way to make this a bit simpler?
Quick Answer
You'll need to chmod the file to be 777 in the x.php script.
After b.xml has been created, run this line:
chmod('path/b.xml', 0777);
Note you should always specify octals when using chmod.
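To illustrate why (reusing the example path from above):
// Decimal 777 is octal 1411, which is not rwxrwxrwx at all.
chmod('path/b.xml', 777);   // wrong: sets mode 1411 (sticky bit, r----x--x)
chmod('path/b.xml', 0777);  // correct: sets mode 0777 (rwxrwxrwx)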
A better way?
When you run a cron job, you should take special note of the user that is running the cronjob.
Generally on a shared server you will have your own login, and thus the cron job runs as that user. My question to you: is that user the same as your web server's? Often PHP runs as "apache" while cron might be running as "tanner". In that case, b.xml being owned by tanner with permissions of 400 means that only tanner can change the file.
To solve this, if you don't have access to umask, one way would be to change your cron job to run as the webserver:
su -c "php /home/jonathan/public_html/b.php" apache
This may or may not work, depending on whether you are allowed to switch to the apache user. Do not forget to replace apache with your actual web server's username.
Now, if that doesn't work, then the alternative is to go for the 777 permissions. Keep in mind on a shared server this means anyone on that server could potentially get to that file if they knew the path.
Another way as suggested by OP:
0,10,20,30,40,50 * * * * /usr/bin/wget http://example.com/user/x.php
This way the script will always run as the user Apache runs as, ensuring that the next time the file is accessed, it will be usable.
Ask the server admin to create a new user who owns the folder where the script writes the xml file.
Run your PHP script through your cron job as that user. If you run your script as the folder's owner, you can change the permissions from within your PHP script.
This should work:
// set permission
chmod('path/to/b.xml', 0777);
// do other stuff
To solve this issue, I ended up just creating a cronjob such as this:
0,10,20,30,40,50 * * * * /usr/bin/wget http://example.com/user/x.php
This executed the file which created b.xml and since the user who executed the script was public, the permissions remained public as well.

Cron job and folders permissions - permission denied

I have a folder above the webroot that is used to temporarily store user files generated by a php web application. The files may, for example, be PDF's that are going to be attached to emails.
The folder permissions are set to rwxr-xr-x (0755). When executing a procedure from the web application, the files get written to this folder without any issues.
I have now also set up a cron job that calls the php script to execute that exact same procedure as above. However, the PDF cannot be saved into the above folder due to failed permissions - the cron job reports back a permission denied error.
I have tried setting the folder permissions to 0775 and still get a permission denied. However, when the permissions are 0777, then the cron job then works fine.
This seems very strange to me - why does the cron get a permission denied at 0755 but it works fine through the web app?
The probable answer is that the cron job executes under your user - and the directory is owned by apache (or www-data or nobody or whatever user your web server runs as).
To get it to work, you could set up the cron job to run as the web server user.
Something like this:
su -l www-data -c 'crontab -e'
Alternatively, you could change the permissions to 775 (read-write-execute for the owner and group, and read-execute for others) and set the group ownership of the folder to the user running the cron job.
However, bear in mind that if you're deleting something in, or descending into, a folder created by apache, you could still run into problems (apache would create a file that it owns itself, and your user then cannot delete it, regardless of the directory permissions).
You could also look at something like suPHP, or whatever is current these days, where the web server processes are run under your username, depending on your system architecture.
It depends on which user the cron job is defined for.
If you're root (not recommended) it should work. If you're the web user (e.g. www-data on Ubuntu) it should work as well.
sudo su - www-data
crontab -e
Permissions are given for user, group, and everybody; that's what the three digits denote.
Your PHP script runs as a different user and group than the cron job, so they are subject to different permissions.
Check chown and chgrp, or try to run the cron job with the same user.
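For example, here is a small diagnostic sketch (the folder path is a placeholder, and it assumes the POSIX extension is available) that prints the folder's owner and group next to the user actually running the script:
<?php
// Compare the folder's owner/group with the user executing this script;
// run it from cron and from the web app to see the difference.
$dir   = '/path/to/upload/folder';
$owner = posix_getpwuid(fileowner($dir))['name'];
$group = posix_getgrgid(filegroup($dir))['name'];
$me    = posix_getpwuid(posix_geteuid())['name'];
echo "Folder owned by $owner:$group, script running as $me\n";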
If you are using cPanel to run a PHP script, you can try something like this:
php /home/algo/public_html/testcron.php
That is, just write php followed by the path to the script: php /path/to/yourscript.php
