I'm facing a very strange problem on my hosting.
I have a script that can be triggered using a URL like this:
https://mywebsite.com/script.php
I need this script to be executed once every two days.
So I created a cron job just as my hosting provider advised:
wget -O /dev/null -q 'https://mywebsite.com/script.php'
It uses wget because the script requires some extra scripts, so my hosting provider said I needed to set up the task like this.
It worked fine for about a month, but for the past few weeks I have had a problem.
For some reason that neither I nor my hosting provider can understand, when I run the script by opening the URL in a browser, it executes fine (I know this because of the emails that are sent at 4 different steps of execution). But when cron executes the script, it runs infinitely, so I keep receiving emails over and over until I rename or delete the script.
The script's execution time is about 2-3 minutes. So when I run it from the URL and wait until it finishes, I get an error on screen saying the request time (60 sec) is up. But I know the script executes fine through the last step.
What is the problem?
Wget
I had the same problem at some point with a PHP-based cron job. The problem was that wget itself has a timeout; if it is reached, wget will try again and again.
Use wget's options to make sure it runs the way you want it to.
Example:
wget -O /dev/null --tries=1 --timeout=600 'https://mywebsite.com/script.php'
--tries sets how many times wget retries if a timeout occurs.
--timeout specifies the maximum request time in seconds.
Both options can be set directly in the cron job command, as in the sketch below.
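For instance, a complete crontab entry for the "once every two days" schedule might look like this (a sketch; the 3 a.m. slot is an assumption, and */2 in the day-of-month field resets at each month boundary, so it is only approximately every second day):
0 3 */2 * * wget -O /dev/null -q --tries=1 --timeout=600 'https://mywebsite.com/script.php'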
PHP Cronjobs
If possible, it may be a better choice to let PHP run your cron job directly. If you know the server's PHP binary path, you could create a cron job like:
/usr/bin/php /srv/www/yousite/html/script.php
In this case you have no third-party program like wget to rely on. Whether this helps depends on how the cron job is built: if your script uses $_SERVER variables, for example, it would not work unchanged; see the sketch below.
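A minimal sketch of working around that caveat (the HTTP_HOST value is only an assumed example of what such a script might expect):
<?php
// When run from the CLI, web-only superglobals are absent.
// Populate whatever the script expects before the rest runs.
if (php_sapi_name() === 'cli') {
    $_SERVER['HTTP_HOST'] = 'mywebsite.com'; // assumed value for illustration
}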
There are some settings you want to check before you use any PHP file as a cron job.
Keep in mind that the PHP configuration in php.ini can also cause unwanted errors with PHP cron jobs in general. In php.ini there is a value called "max_execution_time" that defines the maximum number of seconds allowed for processing a PHP request.
Another setting you might want to keep an eye on is "memory_limit", which is also defined in php.ini. It sets the maximum memory a PHP request can use. As your cron job seems to run for 2-3 minutes, it may be holding a lot of data in memory.
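If your host allows runtime overrides, a minimal sketch of raising both limits at the top of the script (this only works where the host does not lock these directives):
<?php
// Assumption: the host permits overriding these settings at runtime.
set_time_limit(0);               // remove the execution time cap
ini_set('memory_limit', '256M'); // raise the memory cap for this request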
Be aware that every request is subject to those limits. If you set them too high, you may run into CPU load problems on your server, or too many spawned PHP processes.
If you are on a shared hosting service or something similar, you may not be able to change any of those settings.
Related
I wrote some TV guide scrapers in PHP. I run them from a script that is executed by a cron job. This script runs every minute and checks whether a scraper needs to start; this way I can alter and manage the scraping jobs without having to modify the cron itself.
These scraping scripts vary in runtime: some take no more than 1 minute, others can take up to 4 hours. When I run them one after another there is no problem, but when I try to run two scripts simultaneously, one or both of them hang, resulting in an email from cron:
sh: line 1: 700865 Hangup /usr/local/bin/php /home/id789533/domains/erdesigns.eu/public_html/tvg_schedules/scraper.php --country=dk --provider=1 --scraper=tv2 2>&1
Here /usr/local/..... is the command for the script, which is called from the scheduler script.
I just can't find anything related to this message, and I have no idea how to fix it. I can send the script itself if needed.
Any advice and help would be appreciated.
[Edit] I also took a look at the resource usage: the load never gets higher than 150 MB and 15% load, and I have a limit of 400% and 1 GB.
I execute the scripts from the PHP script like so:
shell_exec(sprintf("/usr/local/bin/php %s 2>&1", $scraper));
where $scraper is the filename. It executes the script as it should, but after a while I get the message sh: line 1: 000000 Hangup.
I know for sure that it is not allocating too much memory. Can someone point me in the right direction? I don't know where to look right now.
PHP is a language intended for the web, with features such as a cap on maximum execution time to make sure scripts do not run indefinitely and thereby block resources. Therefore, PHP is not the best choice for this task.
If it is only a short script, I would advise you to convert it into a Bash or Python script. However, if you want to stick with PHP, check your php.ini for settings restricting execution time.
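A quick way to see which limits actually apply in CLI mode (a sketch, assuming the php binary is on your PATH):
php -r 'var_dump(ini_get("max_execution_time"), ini_get("memory_limit"));'
Note that on the CLI, max_execution_time typically defaults to 0 (unlimited).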
Gist
I am trying to run a cron job in PHP, but the exec and system functions don't work (the server administrator doesn't allow me to use these two functions for security reasons). However, I still need to use PHP to run it, because the running time is decided by a formula and is uncertain. So how can I run the cron job without using the exec or system function in PHP?
My thought
The only way I have come up with is adding a cron job via cPanel, set to "once per minute." That means setting up a cron job that checks the target PHP page every minute to see whether it has anything to do at that moment.
Issue
There's a possible issue: if it checks the page every minute, all day, won't that strain the host? (Maybe it will load the CPU or occupy memory, etc.) If so, how can I improve it?
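For reference, the per-minute check can be made very cheap if the page exits immediately when nothing is due; a minimal sketch (next_run_time() and run_job() are hypothetical stand-ins for the formula-based schedule and the actual task):
<?php
// Bail out fast when nothing is scheduled yet.
$due = next_run_time();     // hypothetical: reads the computed run time
if (time() < $due) {
    exit;                   // nothing to do; costs almost nothing
}
run_job();                  // hypothetical: the actual long task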
You can call a bash script containing a call to the PHP CLI followed by the file:
#!/bin/bash
/usr/bin/php /path/to/my/php/file.php
Go to the command line of the server and type "which php". Replace /usr/bin/php above with whatever is returned. Then call this bash script in your cron.
As you say you have a cPanel server, I'm guessing it's a Linux box. You run a cron job on a Linux box by typing crontab -e and then adding an entry to run a PHP script, e.g. */5 * * * * /usr/bin/php /path/to/file.php to run it every 5 minutes.
I would like to ask about the differences between wp-cron and server cron. Basically, I have a PHP script that takes a few minutes to complete. If I run it through the browser, the browser shows a timeout error.
So I put it into the server cron (paid hosting) and it is working.
I tried set_time_limit... when accessing it through the browser, but it did not work.
Anyway, I want to execute this script, let's say, every 2 weeks. I know that WordPress has wp-cron, but my question is: is it as good as server cron? Will it work for long-running scripts?
From my point of view, wp-cron is initiated by the user (browser), so the limits in php.ini should apply, right?
Because it is paid hosting, I cannot modify many options.
So what is the difference between these crons?
Thanks
Yes, wp-cron is initiated by the browser, so by default it will only execute for 30 seconds. It is therefore not meant for running scripts that may take longer than a few seconds.
Server cron runs your script from the command line, which does not have the same time constraint.
From everything you have said, your script is a good candidate to run as a server cron.
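Cron cannot express "every 14 days" exactly, but a common approximation is to run on two fixed days of the month; a sketch (the path and 3 a.m. slot are assumptions):
0 3 1,15 * * /usr/bin/php /path/to/script.php
This runs on the 1st and 15th of each month, which is close enough to every two weeks for most maintenance tasks.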
I have to run a pretty heavy task in PHP once a week (a script that curls to various locations (websites, APIs), gathers and sorts data, and inserts it into a DB). The whole script takes about 10 to 15 minutes to run on my Mac (localhost); I'm guessing it'll run a bit faster on a server. Nevertheless, I'm currently looping through it with AJAX, so when each task is finished, the next one is launched. Now I need it to run weekly and automatically, so I don't think I can do it with AJAX anymore.
Do I just set php.ini to let a script run for 30 minutes, or is there a better way to do it?
The maximum execution time of a PHP script is determined by the amount of time in which no output has been generated, so writing data to STDOUT (e.g. to a logfile) will keep the script running.
However, if you're running the script from the command line, max_execution_time defaults to zero anyway, and as already suggested, I'd start the script with a cron job instead of an AJAX request or similar methods. I actually do that for most of my PHP scripts performing administrative tasks, such as synchronizing data across several databases.
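For example, a weekly crontab entry might look like this (a sketch; the Sunday 4 a.m. slot and paths are assumptions):
0 4 * * 0 /usr/bin/php /path/to/script.php >> /path/to/script.log 2>&1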
php.ini has nothing to do with scheduling jobs; it simply defines PHP's startup settings. What you want is a cron job, as your title says.
For OSX cron setup, see http://hintsforums.macworld.com/showthread.php?s=&threadid=39005
I am working on cron jobs for my PHP app and planning to run them via cron with wget/curl.
Some of my PHP cron jobs can take 2-3 hours. How do I let PHP work for 2-3 hours from the crontab? Is it good practice to run such long jobs via cron with wget/curl? Does anyone have experience with this? I also have an email queue that needs to be run every 10 seconds, but crontab is minute-level. Any suggestions for this case?
Thanks for reading.
When you use wget/curl, you are requesting a page from a web server. Every web server has a timeout period, so the request might time out.
Also, some hosting providers may stop processes running beyond a certain number of minutes (basically done to control rogue threads).
So it is not advisable to schedule jobs using wget/curl if they take more than a few minutes.
Try scheduling them with an actual scheduler instead. You can run PHP from the command line:
php [options] [-f] <file> [--] [args...]
The php command should be on the path.
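For example, a crontab entry that runs the job nightly from the CLI (the 2 a.m. slot and paths are assumptions):
0 2 * * * /usr/bin/php -f /path/to/job.php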
You can use the following at the start of your script to tell PHP to effectively never time out:
set_time_limit(0);
What may be wiser, if crontab is going to run the script every 24 hours, is to set the timeout to 24 hours so that you don't get two copies running at the same time:
set_time_limit(24*60*60);
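A complementary safeguard is a non-blocking lock file, so a second copy exits immediately instead of piling up; a minimal sketch (the /tmp path is an assumption):
<?php
// Take an exclusive, non-blocking lock so overlapping cron runs
// exit at once instead of running twice.
$fp = fopen('/tmp/myjob.lock', 'c');
if ($fp === false || !flock($fp, LOCK_EX | LOCK_NB)) {
    exit(0); // a previous run is still active
}
// ... long-running work here ...
flock($fp, LOCK_UN);
fclose($fp);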
Crontab only allows minute-level execution because that's the most often you should really be launching a script; there are startup/shutdown costs that make more rapid scheduling inefficient.
If your application requires a queue to be checked every 10 seconds, a better strategy might be a single long-running script that does that checking and uses sleep() occasionally to stop itself from hogging system resources, as in the sketch below.
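A minimal sketch of that pattern (process_email_queue() is a hypothetical function that drains whatever mail is pending):
<?php
// Long-running worker: check the queue every 10 seconds.
set_time_limit(0);         // explicit, though the CLI default is already 0
while (true) {
    process_email_queue(); // hypothetical queue-draining function
    sleep(10);             // yield the CPU between checks
}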
On a UNIX system such a script should really run as a daemon; have a look at the PEAR System_Daemon package to see how that can be accomplished simply.
I've just run into this problem with a long-running PHP script executed via cron and wget: the script terminated after 15 minutes due to wget's default read timeout.
However, I believe this method does in fact work if you set the correct timeouts. The PHP script needs an unlimited run time; it's best to set this at the start of your long-running script:
set_time_limit(0);
When using wget you also need to remove its timeout by passing -T 0, e.g.
wget -q -O /dev/null -T 0 http://yourhost/job.php
Be very careful not to overload your server with long-running scripts.
If using wget to call the long-running cron job, consider some of its defaults that aren't desirable in this situation. By default it will retry (--tries) 20 times if no data is read within the --read-timeout of 900 seconds. So set --tries to 1 to ensure the cron job is only called once; you certainly don't want a long-running script called again while it's still running.
wget -t 1 https://url/
If you are on cPanel and don't care about the output emails after each cron job run, also use the -O and -q flags to hide output, and redirect errors with 2>&1:
wget -O /dev/null -q -t 1 https://url/ 2>&1