I have a Debian Linux server that runs a large number of cron jobs via a third-party scheduler. Many of these cron jobs coincide exactly on the hour (xx:00:00) and half hour (xx:30:00), which causes large spikes in the load.
While I am not observing any abnormal behaviour of the server, the more jobs that are added, the higher the load becomes.
The jobs are small PHP CLI commands run locally on the server that take <30s to run.
I was thinking of adding a random sleep xx; prefix of up to around 120s to the beginning of each PHP CLI command so that the commands are more spread out.
The current command launched is
/usr/bin/php /var/www/html/test.php
which I was thinking of changing to, for example,
sleep 45; /usr/bin/php /var/www/html/test.php
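or, to make the offset genuinely random per run (a sketch - this assumes the cron line runs under bash, e.g. via SHELL=/bin/bash in the crontab, since plain sh has no $RANDOM):
sleep $((RANDOM % 120)); /usr/bin/php /var/www/html/test.php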
My question is: is this just 'false economy'? Once the command is launched with the sleep xx; prefix, is it just wasting resources sitting there counting down and waiting to execute, and should I take the approach of "if it ain't broke, don't try to fix it"? Is it better to just let the server queue up the commands and process them in due course?
Thank you.
Related
I faced a very strange problem on my hosting.
I have a script that can be triggered using a URL like this:
https://mywebsite.com/script.php
I need this script to be executed once every two days.
So I created a cron job just as my hosting provider advised:
wget -O /dev/null -q 'https://mywebsite.com/script.php'
It uses wget because the script requires some extra scripts - my hosting provider said I needed to create the task like this.
It worked fine for about a month, but for the last few weeks I have had a problem.
For some reason that neither I nor my hosting provider can understand, when I run the script by opening the URL in a browser, it executes fine (I know this because of the emails that are sent at 4 different steps of execution). But when cron executes the script, it runs endlessly, so I keep receiving the emails over and over until I rename or delete the script.
The script's execution time is about 2-3 minutes. So when I run it from the URL and wait until it finishes, I get an error on the screen that the request time (60 sec) is over, but I know the script executes fine up to the last step.
What is the problem?
Wget
I had the same problem at some point with a PHP-based cron job. The problem was that wget itself can have a timeout; if this timeout is reached, wget will try again and again.
Try using some of wget's options to make sure it runs the way you want it to.
Example:
wget -O /dev/null --tries=1 --timeout=600 'https://mywebsite.com/script.php'
--tries sets how many times wget will retry if a timeout occurs.
--timeout sets the network timeout in seconds (it covers DNS lookup, connect, and read).
Those options can be given on the crontab line as well.
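For example, a full crontab entry for "once every two days" with those options might look like this (a sketch; the 03:00 run time is just a placeholder):
0 3 */2 * * wget -O /dev/null --tries=1 --timeout=600 'https://mywebsite.com/script.php'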
PHP Cronjobs
If possible, it may be a better choice to let PHP run your cron job directly. If you know the server's PHP binary path, you could create a cron job like
/usr/bin/php /srv/www/yousite/html/script.php
In this case you have no third-party program like wget to rely on. Whether this helps depends on how the cron job is built: if your cron job uses $_SERVER variables, for example, this would not work.
There are some settings you will want to check before you use any PHP file as a cron job.
Keep in mind that the PHP configuration in php.ini can also cause unwanted errors in PHP cron jobs in general. In php.ini there is a value called "max_execution_time", which defines the maximum number of seconds allowed to process a PHP request.
Another setting you might want to keep an eye on is "memory_limit", which is also defined in php.ini. It defines the maximum memory a PHP request can use. Since your cron job seems to run for 2-3 minutes, it may be holding a lot of data in memory while it runs.
Be aware that every request is subject to those limits. If you set them too high, you may cause problems with CPU load on your server, or with too many spawned PHP processes.
If you have a shared hosting service or something similar, you may not be able to change any of those settings.
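To see which values are actually in effect, here is a quick sketch you could run the same way cron would (note that on the CLI, max_execution_time defaults to 0, i.e. unlimited):
<?php
// check-limits.php (hypothetical name): print the two limits discussed above
echo 'max_execution_time: ', ini_get('max_execution_time'), "\n";
echo 'memory_limit: ', ini_get('memory_limit'), "\n";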
I have a Windows Server 2012 machine that runs IIS and SQL Server 2012.
I am trying to run a PHP script from the command prompt. The script takes around 1 hour to run. If I run it straight from the command line like this:
c:\PHP>php.exe "D:\Web\phpScript.php"
it runs fine. It takes around 1 hour, but it completes.
The thing is, I need to run it from another PHP page. So this PHP code runs the script:
exec('start c:\php\php.exe "D:\Web\phpScript.php"');
When I run it from PHP like that, it runs fine for around 30 minutes or so, but for some reason Windows ends up killing the process after that.
I have been watching Task Manager and cannot see any difference between the process started from the command prompt and the one started from PHP. Both show up as background processes and look exactly the same, but for some reason Windows kills the one started from PHP and not the one started straight from the command prompt.
I have even tried running the PHP-launched one at Realtime priority, thinking that with a higher priority it would not get killed, but that did not help.
I am really stuck with this.
Any help would be great.
Thanks!
If it has to do with your PHP configuration, you can force a longer execution time with this at the beginning of the script:
set_time_limit(60*60*2); // allows 2 hours execution time
Then to execute the external file just use include('D:\Web\phpScript.php'); in the script.
http://php.net/manual/en/function.set-time-limit.php
Otherwise, if it's a server problem, beats me. You could, of course, run it in your web browser instead of the command prompt, if PHP is installed on the machine.
I have a PHP file which pulls some data from external APIs, and I want to schedule it to run every few hours (or every few days). Some googling led me to "scheduled tasks", but it seems I need to be running my own server to do that?
So far, all the PHP and MySQL I've done has been very simple form filling, so I'm a little lost. Do I need to turn a computer into a server to do this, or should I look into hosts that allow you to run scripts? I'm not exactly sure what I'm looking for.
Side question: how would I prevent someone else from running the PHP script (and thereby making tons of API calls)?
How are you running the script now? Windows or Linux? Linux is a no-brainer with cron: on a PHP-enabled server simply drop the PHP script somewhere, edit the crontab and away you go!
For example, for every 2 hours:
0 */2 * * * /usr/local/bin/php /path/to/script.php
Edit Re: Mac
launchd is apparently the preferred method for running scheduled tasks, but I understand that OS X, being a UNIX derivative, has cron capabilities as well.
If you have a reasonably busy web server, you can simply check every time how long it has been since the last time you ran the script. If more than two hours, run it.
Just make sure to update the time and run the script atomically so you don't launch several copies of the script. You can do this with a file that contains the last time the script was run, which you lock while you check and update it.
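A minimal sketch of that idea in PHP, using flock() and a timestamp file (all paths and names here are hypothetical):
<?php
// run-if-stale.php: launch the job at most once per two hours, atomically
$lock = fopen('/tmp/myjob.lock', 'c');      // lock file (hypothetical path)
if (flock($lock, LOCK_EX | LOCK_NB)) {      // only one copy may proceed
    $stampFile = '/tmp/myjob.last-run';
    $last = is_file($stampFile) ? (int) file_get_contents($stampFile) : 0;
    if (time() - $last >= 2 * 60 * 60) {    // more than two hours ago?
        file_put_contents($stampFile, (string) time()); // record this run
        // ... do the actual work here ...
    }
    flock($lock, LOCK_UN);
}
fclose($lock);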
Cron jobs are made for this... You can check the Cron Jobs section in cPanel.
I am assuming your website is hosted in a Linux environment.
http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/
http://man.cx/cron
You can find a much fuller explanation of background processes here:
http://www.fijiwebdesign.com/blog/create-a-background-process-on-the-server-with-php.html
I have some files on my server; how do I open them programmatically once a day?
Say they are
http://site.com/scripts/video.php
http://site.com/scripts/music.php
Without my intervention - scheduled, automatic.
Even if I am asleep, as long as the server is running, they should open at the given time.
And additionally, how do I open them once every 10 seconds (for tests)?
Thanks.
The solution is very clear when you are using a Linux server: CRON JOBS.
One can easily set up a cron job from the terminal. I see everyone has provided the solution, but my answer is for people who are new to Linux servers and don't know much about cron jobs. Go to the terminal and type the commands below.
root>which php
The above line will give you the path to the PHP binary on your Linux system.
Now,
root>crontab -e
The above line will open the Cron file in edit mode.
Enter how often you want a particular PHP file to run and at what time of day, month, week, etc.
I am providing the syntax for running a particular file every 15 minutes.
So here you go,
(write this in the cron file in edit mode)
*/15 * * * * path/to/your/php path/to/the/file/you/want/to/run
Now, path/to/your/php has to be replaced by the path you got when you typed
root>which php
And you are done: just save the file and close it. You will see a message in your terminal that a new cron job is installed.
That's it.
If you're on a Linux/Unix host using a cron job is generally the best approach, as you can simply call the command line version of PHP as a part of the cron job. (You may need to tweak your script if it relies on $_SERVER variables, that said.)
Administration middleware (such as Plesk) often offers the ability to add cron tasks as well, although you may need to check the user/group rights that such tasks are executed with.
Finally, if you use a cron task, you can simply enter the required command via the command line during the testing phase. (I.e.: rather than force a 10-second update, which would be tricky unless you had cron execute a shell script, you can execute the script as required; see the one-liner below.)
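For a quick 10-second test loop, a throwaway shell one-liner also works (the path is a placeholder):
while true; do /usr/bin/php /path/to/script.php; sleep 10; done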
It's not possible with pure PHP. You'll need a cron job for this - ask your provider or administrator whether they are available.
Cron has a resolution of 1 minute, though: calling a script once every 10 seconds would have to be done, e.g., using a PHP script that gets called every minute and then makes six requests, one every ten seconds.
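A sketch of such a minute-level dispatcher (the URL is a placeholder):
<?php
// dispatcher.php: cron calls this once a minute; it fires the real
// script six times, roughly ten seconds apart
for ($i = 0; $i < 6; $i++) {
    file_get_contents('https://example.com/script.php'); // assumes allow_url_fopen
    if ($i < 5) {
        sleep(10); // wait before the next of the six calls
    }
}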
Running them once a day requires a separate program to run them.
On Linux servers the usual choice is a cron job; on Windows the Task Scheduler works fine, too.
I am working on cron jobs for my PHP app and planning to run them via cron using wget/curl.
Some of my PHP cron jobs can take 2-3 hours. How do I let PHP work for 2-3 hours from the crontab? Is it good practice to run such long jobs via cron and wget/curl? Does anyone have experience with this? I also have an email queue that needs to run every 10 seconds, but crontab works at minute granularity. Any suggestions for this case?
Thanks for reading.
When you use wget/curl, you are requesting a page from a webserver. Every webserver has a timeout period, so the request might time out.
Also, some hosting providers may stop processes that run beyond a certain number of minutes (basically done to control rogue processes).
So it is not advisable to schedule jobs using wget/curl if they take more than a few minutes.
Try scheduling it with an actual scheduler instead. You can run PHP from the command line:
php [options] [-f] <file> [--] [args...]
The php command should be on the PATH.
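A matching crontab entry might then be (a sketch; the schedule and path are placeholders):
0 */3 * * * /usr/bin/php -f /path/to/longjob.php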
You can use the following at the start of your script to tell PHP to effectively never time out:
set_time_limit(0);
What may be wiser, if crontab is going to run the script every 24 hours, is to set the timeout to 24 hours so that you don't get two copies running at the same time:
set_time_limit(24*60*60);
Crontab only allows minute-level execution because that's the most often you should really be launching a script - there are startup/shutdown costs that make more rapid scheduling inefficient.
If your application requires a queue to be checked every 10 seconds, a better strategy might be to have a single long-running script that does the checking and uses sleep() occasionally to stop itself from hogging system resources.
On a UNIX system such a script should really run as a daemon - have a look at the PEAR System_Daemon package to see how that can be accomplished simply.
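A minimal sketch of that long-running checker, before any daemonizing (the filename is hypothetical):
<?php
// queue-worker.php: started once (e.g. from an @reboot cron entry),
// then checks the queue every 10 seconds
while (true) {
    // ... check the email queue here and send anything pending ...
    sleep(10); // yield the CPU between checks instead of busy-waiting
}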
I've just run into this problem with a long-running PHP script executed via cron and wget. The script terminated after 15 minutes due to the default timeout in wget.
However, I believe this method does in fact work if you set the correct timeouts. The PHP script you run needs an unlimited run time; it is best to set this at the start of your long-running script:
set_time_limit(0);
When using wget you also need to remove its timeout by passing -T 0, e.g.
wget -q -T 0 http://yourhost/job.php
Be very careful not to overload your server with long running scripts.
If using wget to call the long-running cron job, consider some of its defaults that aren't desirable in this situation. By default it will retry (--tries) 20 times if no data is read within the --read-timeout of 900 seconds. So set --tries to 1 to ensure the cron job is only called once; you of course wouldn't want a long-running script called multiple times while it's still running.
wget -t 1 https://url/
If you are on cPanel and don't care for the output emails after each cron job run, also use the -O and -q flags to hide output, and redirect errors with 2>&1:
wget -O /dev/null -q -t 1 https://url/ 2>&1