I get the below error when I run my script from cron:
Warning: filesize() [function.filesize]: stat failed for /home2/sharingi/public_html/scrape/zip/dailydose/April_14_2011.zip in /home2/sharingi/public_html/scrape/zip/zip.php
However, if I run the script from my browser it works fine. Is this some kind of permissions problem?
It's probably an issue related to the user your cron process runs under. Make sure that whatever user cron runs as has permission to read the file, since it's probably not the same user as your SSH account or the webserver account. You can figure out which user cron runs as by configuring cron to run the command whoami and email you the output.
If you can't figure out how to make that work, you might try configuring cron to wget the public URL that you know works. Don't forget to turn off the file saving, and set it to quiet mode, otherwise you'll get a lot of garbage from each run.
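For illustration only, a rough crontab sketch of both ideas (the schedule, address and URL are placeholders, not from the original setup):

MAILTO=you@example.com
# one-off diagnostic: mail yourself the user cron runs as
* * * * * whoami
# quiet wget against the known-good public URL, discarding the download
0 6 * * * wget -q -O /dev/null 'http://example.com/scrape/zip/zip.php'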
If you're in a shared hosting environment, your cron job is probably running as your own user, so unless you yourself don't have read permissions for the file in question, I imagine that's probably not the issue.
As a probable workaround, in case you can't get to the bottom of it easily, here's a function which should allow you to get the info you need without using the PHP built-in.
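The function itself isn't reproduced above, but a minimal sketch in that spirit, assuming the ZIP is also reachable over HTTP (it lives under public_html) and that the curl extension is available, might look like this:

<?php
// Sketch only: ask the web server for the file's size via a HEAD request
// and read the reported Content-Length, instead of calling filesize().
function remote_filesize($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request, skip the body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't print the response
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects if any
    if (curl_exec($ch) === false) {
        curl_close($ch);
        return false;
    }
    $length = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
    curl_close($ch);
    return $length >= 0 ? (int) $length : false;
}

// Hypothetical usage; the URL below is a placeholder for your public zip path.
$size = remote_filesize('http://example.com/scrape/zip/dailydose/April_14_2011.zip');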
Related
I faced a very strange problem on my hosting.
I have a script that can be triggered using a URL like this:
https://mywebsite.com/script.php
I need this script to be executed once every two days.
So I created a cron job just as my hosting provider advised:
wget -O /dev/null -q 'https://mywebsite.com/script.php'
It uses wget because the script requires some extra scripts, so my hosting provider said I needed to create the task like this.
It worked fine for about a month, but for the past few weeks I have had a problem.
For some reason that neither I nor my hosting provider can understand, when I run the script by opening the URL in a browser it executes fine (I know this because of the emails that are sent at 4 different steps of the execution). But when cron executes the script it runs endlessly, so I keep receiving the emails over and over until I rename or delete the script.
The script's execution time is about 2-3 minutes. So when I run it from the URL and wait for it to finish, I get an error on screen that the request time (60 sec) is over, but I know the script executes fine through to the last step.
What is the problem?
Wget
I had the same problem at some point with a PHP-based cron job. The problem was that wget itself can have a timeout. If this timeout is reached, wget will try again and again.
Try to use some wget options to make sure it runs as you want it to run.
Example:
wget -O /dev/null --tries=1 --timeout=600 'https://mywebsite.com/script.php'
--tries tells wget how many times to retry if a timeout occurs.
--timeout specifies the maximum time in seconds that wget will wait for the request.
Those options can be specified directly in the cron job's command line as well, as shown below.
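For example, a full crontab entry might look something like this (the schedule, every two days at 03:00, is only a guess at the requirement above):

0 3 */2 * * wget -O /dev/null --tries=1 --timeout=600 -q 'https://mywebsite.com/script.php'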
PHP Cronjobs
If possible, it may be a better choice to let PHP run your cron job directly. If you know the server's PHP binary path, you could create a cron job like
/usr/bin/php /srv/www/yousite/html/script.php
In this case you have no third-party program like wget to rely on. Whether this helps depends on how the cron job is built. If your cron job uses $_SERVER variables, for example, this would not work, as the sketch below illustrates.
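As a rough illustration of that limitation (this is not the original script, and the fallback value is made up), a script can check which SAPI it is running under before touching web-only $_SERVER entries:

<?php
// Sketch: under cron the script runs via the CLI SAPI, so request-specific
// $_SERVER entries such as HTTP_HOST are simply not set.
if (php_sapi_name() === 'cli') {
    $host = 'mywebsite.com';           // hypothetical fallback for CLI runs
} else {
    $host = $_SERVER['HTTP_HOST'];     // only available when served through the web server
}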
There are some settings you want to check before you use any PHP file as a cron job.
Keep in mind that the PHP configuration set in php.ini can also lead to unwanted errors in PHP cron jobs in general. In php.ini there is a value called "max_execution_time", which defines the maximum number of seconds allowed to process a PHP request.
Another setting you might want to keep an eye on is "memory_limit", which is also defined in the php.ini configuration. It defines the maximum memory a PHP request can use. As your cron job seems to run for 2-3 minutes, that could mean a lot of data is held in memory while it runs.
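If the host allows it, a long-running cron script can try to loosen those limits for itself at runtime; a minimal sketch (the concrete values are arbitrary):

<?php
// Sketch: attempt to raise the limits for this one long-running script.
// Whether these calls take effect depends on the hosting configuration.
set_time_limit(600);                // allow up to 10 minutes of execution
ini_set('memory_limit', '256M');    // allow up to 256 MB for this request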
Be aware that every request is subject to those limits. If you set them too high, it may cause problems with CPU load on your server, or with too many spawned PHP processes.
If you have a shared hosting service or something similar, you may not be able to change any of those settings.
We recently did a large server migration, and for some reason cron isn't working as expected on the new system. Some cron scripts are running as normal but others don't seem to be executing at all.
If I look at the old server and tail the cron log /var/log/cron, I can see cron scripts executed as they should.
On the new server, none of the user cron execution shows up in the log. I can see general cron messages for user accounts, and I can see ROOT crons running, but no information on user crons other than general messages such as LIST, REPLACE, etc.
I've checked permissions on the files being executed. Some of the scripts log to files or send emails, so I know that some are running and some aren't. I've also verified the cron jobs are located in /var/spool/cron.
Most of these are PHP scripts, and the only change we made on this server vs. the other one is running Apache using mpm-itk. This shouldn't be the problem, though, since file permissions weren't changed and the shell is running the PHP CLI.
It would be a lot of help if I could verify whether the scripts are being executed. There are no errors in the Apache or other logs that I can find. Any suggestions would be appreciated.
I don't know why every time I try to make .bashrc run a command it affects my SFTP connection from other software...
This time I'm just trying to start my Apache server automatically, because the servers restart every day at midnight, so it's kind of annoying to go to the command line and restart my Apache server. It's a big fail, but I guess I can't do much about my school's policies or whatever they do with their servers.
All I added at the end of my .bashrc was this:
~/apache/bin/apachectl start
but this simple command immediately creates a conflict with my SFTP connection from other software, so I'm not sure what the proper way to do this is.
You really really do not want to put this in your .bashrc. The bashrc gets executed for EVERY shell you start. You could potentially be running this command hundreds of times per login session.
Read the bash man page and figure out which of the various start-up files you want this in; my guess would be the .bash_profile script. Even then it seems very odd to me...
http://www.linuxfromscratch.org/blfs/view/6.3/postlfs/profile.html
However, to avoid the SFTP problem you need to make sure that your .bashrc does not write anything to STDOUT. If you run a command inside it, redirect the output to /dev/null or a logfile, for example as shown below. In general it's a very bad idea to run any commands in .bashrc; you should mostly be setting configuration variables.
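For example, if you really must keep the command from the question in a start-up file, redirecting both of its output streams would look roughly like this:

~/apache/bin/apachectl start > /dev/null 2>&1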
Running commands in .bash_profile is sort of OK, but in the long run it almost always causes more problems than it solves.
We have several cron jobs running on our Apache/WHM server. We are using PHP. The scripts all work completely fine when run from the browser.
The cron jobs will throw back errors like: unable to include files (even when given an absolute path).
Results will also vary, corrupting output files, etc. I am really baffled, as sometimes the cron jobs work fine as well. It seems really intermittent, and they work perfectly every time when executed from the browser.
Any help would be appreciated, cheers.
As everyone has pointed out, PHP CLI and the PHP Apache Module are separate software and they have separate configuration files.
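A quick way to see that for yourself (not part of the original answer) is to print which php.ini the CLI loads and compare it with the path shown by phpinfo() in the browser:

php -r 'echo php_sapi_name(), " -> ", php_ini_loaded_file(), PHP_EOL;'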
Rather than setting the jobs up in the root crontab, make sure all your permissions are correct and debug as the user the jobs will run as from cron. Assuming you are running Linux, you can use
sudo -i -u username
for this.
I have a PHP script which runs fine when executed by SSHing into my server and running it.
However, the PHP script doesn't seem to run even though a cron job is set to run it every 10 minutes.
How do I view the errors produced by the cron job?
Remember that cron runs your script in a different working directory than what you're probably used to. Try something like the following on the command line:
cd /
./path/to/your/script.php
If it fails, fix all the paths in your script until it runs; then it should run under cron as well.
Sometimes the error messages are emailed to you by cron. Otherwise, they probably get sent to a hapless system administrator who scrupulously saves the messages in /dev/null.
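If that mail never reaches you, one common approach (a sketch; the paths are placeholders) is to capture the script's output and errors in a logfile straight from the crontab:

*/10 * * * * /usr/bin/php /path/to/your/script.php >> /tmp/script-cron.log 2>&1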
When things work from the command line and do not work from cron, the 'environment' is almost always at fault. Something that is set in the normal command line environment is not set in the cron environment. Remember, cron does not run your profile for you - you have a bare minimal set of environment variables with bare minimum values for things like PATH.
Generally, I assume that I should run a shell script from cron, and that shell script is responsible for ensuring the correct environment is in place - and that errors are trapped and logged appropriately - before running the main part of the software.
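A minimal sketch of such a wrapper, with all paths and values as placeholders rather than anything from the original setup:

#!/bin/sh
# Wrapper run by cron: set up a sane environment, then run the real work
# and log anything it prints, including errors.
PATH=/usr/local/bin:/usr/bin:/bin
export PATH
cd /path/to/your/app || exit 1
/usr/bin/php script.php >> /var/log/your-script.log 2>&1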
I'm not sure, but it's probably credentials-related. Schedule your task to run with your user ID and see if that doesn't clear it up. If so, you'll need to create a set of "batch" credentials that you can use to schedule tasks to run under.