Cron jobs keep erroring - PHP

We have several cron jobs running on our Apache/WHM server, using PHP. The scripts all work completely fine when run from the browser.
Cron will throw back errors like being unable to include files (even when given the absolute path).
Results will also vary, corrupting output files etc. I am really baffled, as sometimes the crons work fine as well. It seems really intermittent, and they work perfectly every time when executed from the browser.
Any help would be appreciated, cheers.

As everyone has pointed out, PHP CLI and the PHP Apache Module are separate software and they have separate configuration files.
Rather than setting the jobs up in root's crontab, make sure all your permissions are correct, and debug as the user the jobs will run as from cron. Assuming you are running Linux, you can use
sudo -i -u username
for this.
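Besides the config-file difference, cron also gives jobs a nearly empty environment, which is a common source of exactly this kind of intermittent "works in the browser, fails from cron" behaviour. A minimal sketch (the variable name is just an illustration) showing how `env -i` reproduces that stripped-down environment:

```shell
# cron runs jobs with almost none of the variables your login shell sets:
# PATH additions, custom exports, etc. do not survive. env -i simulates
# that, which is a quick way to reproduce cron-only failures by hand.
MYVAR="set in my login shell"
export MYVAR
echo "in my shell:        MYVAR=$MYVAR"
env -i /bin/sh -c 'echo "in a cron-like env: MYVAR=$MYVAR"'
```

You can also compare `php --ini` (which config files the CLI loads) against the "Loaded Configuration File" that `phpinfo()` reports in the browser, to confirm the two PHPs really are reading different settings.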

Related

cPanel, user cron jobs not appearing in cron log

We recently did a large server migration, and for some reason cron isn't working as expected on the new system. Some cron scripts are running as normal but others don't seem to be executing at all.
If I look at the old server and tail the cron log /var/log/cron, I can see cron scripts executed as they should.
On the new server, none of the user cron executions show up in the log. I can see general cron messages for user accounts, and I can see root crons running, but no information on user crons other than general messages such as LIST, REPLACE, etc.
I've checked permissions on the files being executed. Some of the scripts log to files or send emails, so I know that some are running and some aren't. I've also verified the cron jobs are located in /var/spool/cron.
Most of these are PHP scripts, and the only change we made on this server vs. the other one is running Apache with mpm-itk. This shouldn't be the problem, though, since file permissions weren't changed and the shell is running PHP CLI.
It would help a lot if I could verify the scripts are being executed. There are no errors in Apache or other logs that I can find. Any suggestions would be appreciated.
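One way to verify execution independently of /var/log/cron is to point each crontab entry at a small wrapper that logs its own start and finish. A sketch (the script name, log path, and schedule are placeholders):

```shell
#!/bin/sh
# Hypothetical wrapper (run-job.sh): have cron run this instead of the PHP
# script directly, so every invocation leaves a trace even if PHP dies
# silently. Example crontab line:
#   */5 * * * * /home/user/run-job.sh
LOG=/tmp/cron-trace.log
echo "$(date '+%F %T') started" >> "$LOG"
# php /home2/user/public_html/script.php >> "$LOG" 2>&1   # the real job
echo "$(date '+%F %T') finished" >> "$LOG"
```

If "started" entries appear but "finished" ones don't, the job is firing and the PHP script itself is dying; if neither appears, cron never ran the entry at all.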

How to run a command in .bashrc file without affecting SFTP connection?

I don't know why, but every time I try to make .bashrc run a command, it affects my SFTP connections from other software...
This time I'm just trying to start my Apache server automatically, because the server restarts every day at midnight, so it's kind of annoying to go to the command line and restart Apache by hand. I guess I can't do much about my school's policies or whatever they do with their servers.
All I was adding at the end of .bashrc was this:
~/apache/bin/apachectl start
but this simple command immediately conflicts with my SFTP connections from other software. So I'm not sure what the proper way to do this is.
You really, really do not want to put this in your .bashrc. The .bashrc gets executed for EVERY shell you start; you could potentially be running this command hundreds of times per login session.
Read the bash man page and figure out which of the various startup files you want this in; my guess would be .bash_profile. Even then it seems very odd to me...
http://www.linuxfromscratch.org/blfs/view/6.3/postlfs/profile.html
However, to avoid the SFTP problem you need to make sure that your .bashrc does not write anything to STDOUT. If you run a command inside it, redirect the output to /dev/null or a logfile. In general it's a very bad idea to run any commands in .bashrc; you should mostly be setting configuration variables.
Running commands in .bash_profile is sort of OK, but in the long run it almost always causes more problems than it solves.
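If something really must live in .bashrc, a common pattern is to guard it so it only runs in interactive shells. A sketch (the apachectl path is the one from the question, left commented out):

```shell
# Sketch for .bashrc: $- contains "i" only in interactive shells, so sftp
# and scp (non-interactive) skip the block entirely. Anything that does run
# should also be silenced, because stray stdout corrupts the SFTP protocol
# stream before the file transfer even starts.
case $- in
    *i*)
        # interactive login only -- path from the question:
        # ~/apache/bin/apachectl start > /dev/null 2>&1
        ;;
esac
```

The SFTP client expects the protocol handshake as the very first bytes on the connection, which is why even one echoed line from a startup file breaks it.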

Shutdown from php - giving apache permission

I am working on an embedded Linux system with a web interface (Apache). Basically I need to add shutdown and restart functionality to the web interface. However, I am running into permission issues when running:
exec("shutdown now") etc. when calling through the webpage (i.e. Apache).
How the heck do I allow these commands to be called from apache?
Would prefer not to have to give apache full root permissions, but system security is not a huge deal in my case, so if that is the only way, how can I do that?
Making Apache a sudoer is a dangerous move and I'd avoid it. I think QID is close on this... the easiest solution is to set up a cron job under root that runs every minute (cron's finest granularity) and checks for a file in a directory that Apache can write to. Have Apache create that file when you want to shut down, and the cron script should have a trigger that (a) removes the file and (b) restarts the machine.
Just be careful that it removes the file correctly and give yourself a pretty long cron delay when you're testing, or the server will just reboot continuously and that would be a mess.
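A sketch of that watcher (the flag path and script name are placeholders; note the flag is removed before the shutdown is attempted, for exactly the reboot-loop reason above):

```shell
#!/bin/sh
# Hypothetical watcher script, run from root's crontab every minute:
#   * * * * * /usr/local/sbin/check-shutdown.sh
# FLAG lives somewhere the apache user can write; the PHP side only
# needs: touch('/var/tmp/shutdown-requested');
FLAG=/var/tmp/shutdown-requested
if [ -f "$FLAG" ]; then
    rm -f "$FLAG"      # remove the flag FIRST, so a failed shutdown can't loop
    shutdown now
fi
```

Because the flag is deleted before `shutdown` runs, a shutdown that fails or gets interrupted leaves no flag behind to trigger another reboot on the next cron tick.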
Not knowing a good way to do this, I can offer an ugly hack solution: write a tiny daemon that runs as root and accepts commands to shut the system down, and have your PHP script communicate with the daemon through a reasonably-secured channel (for your definition of reasonable; maybe send a signal, maybe write to a file that the daemon watches, maybe just a network socket, whatever).
Be sure you know what you are doing:
exec("sudo ...
apache ALL=(ALL) NOPASSWD: ALL
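If you do go the sudo route anyway, a much narrower grant than the ALL line above limits the damage. A sketch (the binary path is an assumption; check yours with `which shutdown`, and always edit sudoers with visudo):

```shell
# /etc/sudoers.d/apache-shutdown -- hypothetical drop-in, edited via visudo.
# Grants the apache user exactly one command instead of everything:
#   apache ALL=(root) NOPASSWD: /sbin/shutdown
# The PHP side would then call:
#   exec('sudo /sbin/shutdown now');
```

This way a compromised PHP script can power the box off, but it cannot run arbitrary commands as root.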

php exec multiple commands, apache restart

I have to run 2 commands through exec();
the first command is a wrapper calling the (Plesk panel) subscription,
the second is also a Plesk command, for DNS.
Note: after I execute an add-subscription, Apache WILL RESTART!
So my question is:
Can I call exec somehow to execute both commands on the Linux side without losing the second command?
Ex:
exec("(/wrapper2 3 --create ... && /wrapper2 4 --update-soa example.com ...) > /dev/null 2>&1");
Will PHP send both commands to Linux to execute, or will it restart Apache after the first command, leaving me unable to execute the second one?
Thanks
Um... I'm thinking bad deal. Generally it is a bad idea for a process to tell its parent to restart while the process needs to keep running. But even if it were a good idea: Apache is the parent process of PHP in that context (do ps -A and you'll not see PHP), and I can't imagine that it would let you restart it and keep running at the same time.
I'd approach it this way: if you can bridge a delay, have a cron job look for whether a specific file exists; if it does, execute the two commands that you need. In a worst-case scenario, make PHP output a file containing the two commands you want run and then have cron run that file.
Well, from my understanding the issue lies in the fact that Apache is going to be the parent of the running script; when Apache gets shut down, so does the script.
Barring that you can deal with a sort of derp-y setup, you can set up a cron job that looks for when it needs to restart the server (either a file you created via touch or something from PHP), which can handle everything outside of the context of Apache's process.
A sort-of-dirty idea. :(
Put the commands in a shell script and execute that script. It's less complicated, and it means you can also call it from other tools, like on Apache restart or via cron.
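A sketch of that script (the `step` function is a placeholder standing in for the question's `/wrapper2` commands, which would not run outside a Plesk box):

```shell
#!/bin/sh
# Hypothetical wrapper (provision.sh): run both steps sequentially outside
# Apache's request lifetime, so the restart triggered by the first step
# can't lose the second. The real commands from the question would be:
#   /wrapper2 3 --create ...
#   /wrapper2 4 --update-soa example.com ...
step() { echo "running: $*"; }   # placeholder for /wrapper2
step 3 --create && step 4 --update-soa example.com
```

From PHP, launch it detached so the Apache restart can't kill it mid-run, e.g. `exec('nohup setsid /path/to/provision.sh > /dev/null 2>&1 &');` (the script path is a placeholder; `nohup` and `setsid` are standard Linux tools that detach the child from Apache's process group and controlling terminal).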
I think the reason Apache restarts is that your command executes for too long or costs too much system resource, which makes the Apache subprocess exit.
Try using FastCGI mode instead of mod_php.
You can also put the two commands in a shell script and execute that.

filesize function fails when run from cron

I get the below error when I run my script from cron:
Warning: filesize() [function.filesize]: stat failed for /home2/sharingi/public_html/scrape/zip/dailydose/April_14_2011.zip in /home2/sharingi/public_html/scrape/zip/zip.php
However, if I run the script from my browser it works fine. Some kind of permissions problem?
It's probably an issue related to the user that your cron process runs under. Make sure that whatever cron runs as has permissions, since it's probably not the same user as your ssh account or the webserver account. You can probably figure out which user cron runs as by configuring cron to run the command whoami and email you the output.
If you can't figure out how to make that work, you might try configuring cron to wget the public url that you know works. Don't forget to turn off the file saving, and set it to quiet mode, otherwise you'll get a lot of garbage from each run.
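A sketch of the whoami trick (the log path is a placeholder; remove the entry once you've read the output):

```shell
# Hypothetical one-off crontab entry: have cron record who it runs as and
# what environment it gets, so you can compare with the web server's user:
#   * * * * * (whoami; id; env) >> /tmp/cron-identity.log 2>&1
# The same commands, run from your own shell for comparison:
whoami
id -un
```

Comparing the log against your shell usually settles immediately whether the cron user can read the file that `filesize()` is choking on.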
If you're in a shared hosting environment, your cron job is probably running as your own user, so unless you yourself don't have read permissions for the file in question, I imagine that's probably not the issue.
As a probable workaround, in case you can't get to the bottom of it easily, here's a function which should allow you to get the info you need without using the PHP built-in.
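One sketch of that idea, getting the size from `stat(1)` instead of PHP's `filesize()` (not the answerer's exact function; the flag fallback covers both GNU and BSD stat, and the file path is a demo placeholder):

```shell
#!/bin/sh
# Sketch: measure a file's size by shelling out to stat(1) instead of
# PHP's filesize(). From PHP this could be wrapped in exec(), e.g.
#   exec('stat -c %s ' . escapeshellarg($path), $out);
# GNU stat uses -c %s; BSD/macOS stat uses -f %z, hence the fallback.
f=/tmp/size-demo.bin
printf '12345' > "$f"                                    # 5-byte test file
size=$(stat -c %s "$f" 2>/dev/null || stat -f %z "$f")
echo "size: $size bytes"
rm -f "$f"
```

Since the cron job and the shell command run as the same user, this also doubles as a check: if `stat` fails too, it really is a permissions problem rather than a PHP configuration difference.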
