I am running a demo of a CMS on my server. In this demo, potential clients can try out the back-end of the CMS. For that reason I created a PHP script that deletes the whole CMS folder and copies a backup back into it, so each time the script runs, the demo site is restored.
The thing is, I am trying to figure out how to do this via a cron job.
The command I use is the following (I am running CentOS):
0 * * * * php /home/USER/public_html/replaceCMS.php
This replaces all the files in the folder, but it also causes a 500 Internal Server Error.
When I run the script from my browser, the problem does not appear.
I also tried unzipping a .zip archive with overwrite into the demo folder. Doing this with cPanel's file manager, all went well; doing it with the unzip -o command causes the same error.
Does anyone know why this happens?
When the job runs as the root user, the files it creates are likely going to be owned by root, which is not the same user as your web server. When you call the script via your browser, on the other hand, it runs in the user context of the web server, not as root.
You can verify this by running ls -l on the command line and comparing who owns the files after the cron job runs versus after you access the page in your browser.
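A minimal sketch of that check, run in a scratch directory so it works anywhere (the real path would be under /home/USER/public_html/; the web server user on CentOS is typically apache, but verify on your own system):

```shell
# Create a scratch file standing in for a CMS file (illustrative only)
demo=$(mktemp -d)
touch "$demo/index.php"

# ls -l shows the owner and group of each file
ls -l "$demo"

# stat prints just the owning user, handy for a quick comparison
owner=$(stat -c %U "$demo/index.php")
echo "owner: $owner"

# If the cron job left files owned by root, hand them back (run as root):
#   chown -R apache:apache /home/USER/public_html/demo
```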
Related
I'm coding a PHP script using the Instagram Private PHP API.
It works fine via SSH as the root user, but when I try to run it via the browser or via cron, I get the error: Warning: chmod(): Operation not permitted in .....
I guess something is wrong with the permissions, but I am not very good at server administration and can't figure out what to do =(
Please help: how can I fix this problem?
Because Apache (or whichever web server you're using) executes PHP as a different Linux user (usually www-data), which obviously has different permissions than the user account you use for access via SSH.
To tackle the problem, you first have to find out who owns the folder or file you're going to chmod(). If it belongs to root, then it's not advisable to chmod it via any script that is accessible to the public, due to security concerns.
If it belongs to your user, say foo, you can change the ownership of the folder or file so that it is accessible by the www-data group, using chown in an SSH console; then your chmod() call can be executed without a problem.
The user that PHP runs as must have permission to chmod the given file or directory. If you're running this script via cron, you set the user that PHP runs as right in the cron job. If you're visiting the script in a browser, PHP is likely running as php, php-fpm, or the web server user.
Simply ensure that the given file or folder is owned by the user that PHP runs as.
Note: It is not recommended that you run this script as root in CRON.
If you are editing /etc/crontab, make sure the user parameter (the field after the day of week) is root.
If you are editing the crontab via crontab -e, add the user parameter: crontab -eu root.
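For reference, the two crontab formats differ only by that user field (schedule and path here are illustrative, borrowed from the question above):

```
# /etc/crontab — system crontab, the sixth field names the user:
0 * * * * root /usr/local/bin/php /home/USER/public_html/replaceCMS.php

# Per-user crontab (edited with crontab -eu root) — no user field:
0 * * * * /usr/local/bin/php /home/USER/public_html/replaceCMS.php
```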
I have some PHP code that was working on one server, but we migrated that code to another server and now it is not working. Specifically, we want to run an .exe file from PHP using exec().
I debugged the script and it seems to be working properly: the IUSR user and the IIS_IUSRS group have the correct permissions, and the .exe file actually runs. But when it runs, it needs to generate some files, and that is the part that causes the issues. The program tries to create files in the AppPool directory, for example like this:
C:\MyPath\somewebsite.com\8áª\MyProgram\
where C:\MyPath\somewebsite.com\ is the AppPool root directory and MyProgram\ is the directory that the app is creating,
and \8᪠is generated randomly, changing every time we run the program. Debugging with Process Monitor, I get the errors PATH NOT FOUND and/or NAME INVALID. On the previous server we modified the user in IIS that runs the script (that was IIS 6; now we are on IIS 8.5), and the files were created in that user's home directory, without the random directory, for example:
C:\Users\MyUser\MyProgram\
where MyUser is the user we assigned. But on the new server the files end up in the AppPool directory no matter which user we set.
I think we could solve this if we were able to define a path for the IUSR user and set it as its "home" path, but I cannot find where to modify the IUSR user. I know it is a built-in user that IIS creates, but I'm not sure whether I can edit that user's settings.
I already mentioned that we use IIS, but as extra information, we are running this on Windows Server 2012 R2.
Any suggestions?
Your topic / question is:
IIS does not allow running an .exe file using the exec command in PHP
which is the correct behavior! You really don't want to run .exe files through PHP. You would have to give IUSR execute permissions on cmd.exe first, meaning you might as well give all your virtual users administrator permissions.
A long time later: the PHP side was fine; the problem was in the .exe file itself. The .exe creates some files that the same program then uses, but it referred to them with relative paths, so when the process ran it could not find the files it had generated, and that caused the errors.
I have a script write_get.php that I would like users to execute remotely by loading a web page. This script in turn runs
exec("sudo php save_file.php ".$arg1)
to do some file writing that requires sudo permissions. This works fine when I run write_get.php from the command line on my web server as a non-privileged user, but it doesn't work when I invoke the script by loading it in a web browser. The browser presents the same message, making it appear as though there is no error, but the file that save_file.php should create is never created. Everything else that needs to happen (another temp-file creation that doesn't require sudo, plus a database insert) works fine, but everything else is in write_get.php rather than in the sudo-requiring save_file.php.
I assume the server somehow blocks this call to exec("sudo... when it's made remotely? If not, what is happening here? Most importantly, how can I work around this?
p.s. I understand there are probably major security concerns here, but please know there is no sensitive data on this server, and the files created in the sudo-requiring script don't even contain user input, so for the moment I am more concerned with doing the above than with creating a safer file structure or an alternative approach.
What you're trying to do is a bad idea, because you would need to give passwordless root access to the Apache user, which is essentially like making the Apache user equal to root. All it would take to gain root access to your server would be to upload a malicious PHP script and have that script executed by the Apache user. Instead, just make the files you are writing to writable by the Apache user by executing:
chown -R www-data:www-data /var/www/html
Then, instead of calling exec(), just include the other PHP file in your main script.
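As a sanity check of the user:group syntax, here is a minimal sketch on a scratch file; it chowns to your own user and group, which needs no root privileges (the real command above must run as root and targets the web root):

```shell
# Make a scratch file standing in for a web-root file (illustrative only)
d=$(mktemp -d)
touch "$d/index.php"

# chown takes user:group separated by a colon; chowning a file you own
# to yourself is always permitted, so this is safe to try anywhere
chown "$(id -un):$(id -gn)" "$d/index.php"

# Confirm the resulting owner and group
stat -c '%U:%G' "$d/index.php"
```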
I have a PHP script that mails me the contents of some log files.
If I run php maillogs.php on the command line over SSH, it mails me the logs perfectly fine.
When I run the same script as a cron job, I still get the mail (so the script runs), but it seems it no longer has access to the HTTP logs.
Can I change my command in DirectAdmin so that the PHP script is run as root and gets access to this folder?
My current command in the DirectAdmin input field for cronjobs is:
/usr/local/bin/php /home/davine/cronjobs/maillogs.php
I think cron is not working correctly on your server. Please check your server's cron logs, and try enabling SSH access for your user.
Also, if you have root access, you can set up this cron job under the root user if you want.
I have a PHP script involving exec() that runs fine from the command line but not in a web context. The script is simply this:
<?php exec('echo "wee" > /home/jason/wee.txt');
If I call this script wee.php and run php wee.php, it works fine and wee.txt gets written.
If I go to http://mysite.com/wee.php, the script pretends to run fine but wee.txt doesn't actually get written.
Any idea why this is happening?
The web server runs as a different user, and that user does not have permission to write to your home directory.
The other posters are correct to suggest that the web server user doesn't have the rights to write to your home directory. To see if they are right, try modifying the code to write to /tmp/wee.txt; that path should be world-writable.
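The same check can be done straight from the shell, since the point is only whether the target path is writable by the current user:

```shell
# /tmp is world-writable, so this should succeed no matter which user
# (you, www-data, php-fpm) executes it — unlike a write into /home/jason
echo "wee" > /tmp/wee.txt
cat /tmp/wee.txt
```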
Another possibility is that PHP has been configured to disable calling exec(). See http://www.cyberciti.biz/faq/linux-unix-apache-lighttpd-phpini-disable-functions/
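You can spot that configuration by looking for the disable_functions directive; the sketch below greps a simulated php.ini (path and contents are illustrative — on a real server, inspect the file reported by php --ini, or the output of phpinfo()):

```shell
# Simulate a php.ini that blacklists exec() and friends
ini=$(mktemp)
printf 'disable_functions = exec,passthru,shell_exec,system\n' > "$ini"

# If this directive lists exec, calls to exec() are silently refused
grep '^disable_functions' "$ini"
```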
Your web server probably (correctly) doesn't have the appropriate permissions to write to a home directory.
Notice that you are writing to /home/jason, and that Apache will be the one running this command (i.e. the www-data user on Ubuntu or Debian). Does that process have the correct rights to write to that folder?