I have a question about PHP's execution time limit. I need to run a script for many hours sending HTTP requests. These requests have to be apart a certain time, so that's why the whole thing is supposed to take hours. Does someone have experience setting this kind of time limit for PHP using the line below? For example:
ini_set('max_execution_time', 28800); // 8 hours
Strange question, I know, but let me know if this would work or not. TIA!
Update: I was going to try it from the browser. I'm not familiar with running PHP scripts from the command line, so I should look into that. I did find an alternative way to get the information the HTTP requests would have retrieved: it turns out we have a database with some of it already accumulated locally over a long period of time.
Are you running this from the browser or from the CLI? If from the CLI (as you should with a script like this), there is no execution time limit: max_execution_time is hardcoded to 0.
set_time_limit(28800);
Some (shared) hosts do not allow this.
What I would suggest is to maintain a log of when your last attempt was (as a Unix timestamp), and use cron to run a script that checks whether it is time to make the next HTTP request; if it is, make the request and update the timestamp in the file to the current time.
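A minimal sketch of that approach, assuming the log file location, the interval, and the request target (adapt all of them to your setup):

```php
<?php
// run_request.php -- invoked by cron, e.g. every minute:
// * * * * * php /path/to/run_request.php

// Returns true when enough time has passed since the last attempt.
function is_due(int $last, int $interval, int $now): bool
{
    return ($now - $last) >= $interval;
}

$logFile  = __DIR__ . '/last_attempt.txt'; // assumed location
$interval = 300;                           // assumed gap between requests, in seconds

$last = is_file($logFile) ? (int) file_get_contents($logFile) : 0;

if (is_due($last, $interval, time())) {
    // make the HTTP request here, e.g. with file_get_contents() or cURL
    file_put_contents($logFile, time()); // record this attempt
}
```

Because cron restarts the script for every check, each run stays well under any execution time limit.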
Hope that helps
First, I know this looks like a duplicate, but I have tried a lot of the things suggested in nearly identical questions.
I have a website that I don't host myself; it sits with a hosting provider. I need to update my data regularly. My approach is to let a cron job call a PHP script on my webspace, which then updates the data; that currently takes around 5-6 minutes.
When I open the PHP script myself in my browser, it works completely fine (the page stays blank for the duration of the processing, of course, but it runs to completion and updates all the data).
But when I let the cron job do it (via cron-job.org), it stops after around 2 minutes 30 seconds without any error (I log the progress via echo and by writing to a file).
I don't have direct access to php.ini to modify the settings there, so I tried the following:
ignore_user_abort(true);
set_time_limit(0);
ini_set('max_execution_time', '600');
ini_set('max_input_time', '600');
I know that cron-job.org times out after 30 seconds, but I understood that ignore_user_abort() takes care of that part.
I'm absolutely clueless about what to do to make this work, and I don't see another way to automate my update process.
Thanks in advance
Edit: I'm not updating anything in a database, so "update the first 100 rows" doesn't really work for me; I'm updating a JSON file.
I've got a script that does heavy database manipulation.
It all worked fine until I reached a higher number of databases against which to run the manipulations; now I collide with the max_execution_time timeout.
Is it possible to handle the databases one by one by redirecting to the same script with a parameter for the next database via a header('Location: ...') call?
Or would this not reset the execution-time limit, and time out anyway?
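Each redirect starts a new request, and max_execution_time applies per request, so the per-database redirect idea can work. A sketch, with the script name, parameter name, and database list all assumptions:

```php
<?php
// Sketch: process one database per request, then redirect to the same
// script with the index of the next one. Each request gets a fresh
// max_execution_time budget.

// Compute the Location target for the next request, or null when done.
function next_batch_url(string $script, int $index, int $total): ?string
{
    return ($index + 1 < $total) ? $script . '?db=' . ($index + 1) : null;
}

// In batch.php itself you would do something like:
//
//   $dbs   = ['db_one', 'db_two', 'db_three'];
//   $index = isset($_GET['db']) ? (int) $_GET['db'] : 0;
//   manipulate_database($dbs[$index]);   // your existing per-DB routine
//   $next  = next_batch_url('batch.php', $index, count($dbs));
//   if ($next !== null) { header('Location: ' . $next); exit; }
```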
Try this: add the following line at the top of your PHP script.
ini_set('max_execution_time', 0);
But remember one thing: this is not a good trick to rely on. You should review your script to minimize its execution time.
I'm pretty new to PHP. This is my first time actually, so I apologize in advance if the question sounds dumb.
I have a php script which fetches data from an external API and updates my database regularly. I know that the best way to do this is to use a cron job. However, I am using an infinite loop which sleeps for a particular time between each update.
Now, I need to allow the user (admin) to start / stop the script and also allow them to change the time interval between each update. The admin does this through a UI. What is the best way to do this? How do I execute the script in the background when the user clicks start and how do I stop the script?
Thanks.
I think the ideal solution would be the following:
Have the background job run as a cron job every minute (instead of an infinite loop, which can cause memory leaks); PHP was not designed to run as a daemon.
Keep an on/off flag in the database; every time the cron job runs, it checks the flag: if it is off, the job exits; if it is on, it continues.
In the UI, you turn that flag on or off depending on what the admin needs.
I think that is the best way to go (and easiest).
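The flag-and-interval check the cron job performs could be sketched like this (the settings-table layout is an assumption):

```php
<?php
// Decide whether this cron invocation should do work, given the on/off
// flag and the interval the admin set in the UI.
function should_run(bool $enabled, int $lastRun, int $interval, int $now): bool
{
    return $enabled && ($now - $lastRun) >= $interval;
}

// In the cron script you would load $enabled, $lastRun and $interval
// from the database (e.g. a one-row job_settings table), and when
// should_run(...) returns true, call the API, update your data, and
// write $now back as the new last-run timestamp.
```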
You might want to take a look at threading in PHP (the pthreads extension, CLI only) -> http://www.php.net/manual/en/class.thread.php
Another option is to either set a session value (which is a little tricky, since you won't pick it up in the same call) or to generate a file whose existence tells the script to stop/exit/fail/whatever.
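The file-based variant could look like this (the sentinel file name is an assumption):

```php
<?php
// Returns true when the sentinel file exists, and removes it so the
// next run starts fresh.
function stop_requested(string $stopFile): bool
{
    if (file_exists($stopFile)) {
        unlink($stopFile);
        return true;
    }
    return false;
}

// In the long-running script:
//
//   while (!stop_requested(__DIR__ . '/stop.flag')) {
//       do_update();       // the actual work
//       sleep($interval);
//   }
//
// The UI's "stop" button then simply creates stop.flag, e.g. with
// touch() or file_put_contents().
```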
I have some code that I know will reach max timeout at some point.
I tried using try-catch to handle this error, but I have just been informed timeouts can't be caught this way.
Is there a way I can catch the error, or count processing time and handle the error just before it reaches timeout?
As iTom rightly mentioned, first you need to figure out why your code is taking 30 seconds.
If that is expected and you are doing some sort of database update, then Drupal has a provision for handling this: you need to use the Drupal Batch API.
Check here for more details: https://api.drupal.org/api/drupal/includes!form.inc/group/batch/7
Functions allowing forms processing to be spread out over several page
requests, thus ensuring that the processing does not get interrupted
because of a PHP timeout, while allowing the user to receive feedback
on the progress of the ongoing operations.
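In Drupal 7, a batch definition looks roughly like this; the module and callback names are hypothetical, and each operation callback processes one chunk per request so no single request hits the timeout:

```php
<?php
// Inside a form submit handler (or similar) in a Drupal 7 module.
$batch = array(
  'title' => t('Updating records'),
  'operations' => array(
    // Each entry: array(callback, array of arguments).
    array('mymodule_update_chunk', array(0, 100)),    // hypothetical
    array('mymodule_update_chunk', array(100, 100)),  // hypothetical
  ),
  'finished' => 'mymodule_update_finished',           // hypothetical
);
batch_set($batch);
// On form submission Drupal calls batch_process() automatically;
// outside the Form API you call batch_process() yourself.
```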
Exceeding the maximum execution time raises a PHP fatal error, not a PHP exception, so your error-handling code is unable to catch an exception that never exists. The execution time limit is really a last resort for PHP to kill a script that has basically gone out of control.
You really need to look into why your database code is taking 30~ seconds to execute and resolve the code/database issue. Another (not recommended) option would be to increase the Maximum execution time in PHP to something suitable for your code.
You can record time() before and after the function to measure how long the processing takes.
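Building on that, you can also watch the clock inside the loop and stop just before the limit would kill the script; the safety margin below is an assumption:

```php
<?php
// True when elapsed time is close enough to the limit that the script
// should save its progress and stop (margin in seconds is assumed).
function nearly_out_of_time(float $start, float $now, int $limit, int $margin = 5): bool
{
    return ($now - $start) > ($limit - $margin);
}

// Typical use inside the loop doing the database work:
//
//   $start = microtime(true);
//   $limit = (int) ini_get('max_execution_time') ?: 30;
//   foreach ($rows as $row) {
//       if (nearly_out_of_time($start, microtime(true), $limit)) {
//           // persist progress here, then break so you can resume later
//           break;
//       }
//       process($row);
//   }
```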
Say I want a PHP script to run the function itstime() every hour. Other than setting up a cron job on the server, how can I do this?
Or is this not possible, because PHP scripts need to be requested in order to run?
Thanks.
EDIT: I want to write some information to a log file and, depending on some variables, send it in an email.
Well, I definitely recommend the cron job for the purpose. But theoretically, you could have a PHP script run itstime(), sleep for an hour, and loop. This is pretty crazy, though.
The script would look like:
<?php
include('whatever.php');

while (true) {
    itstime();
    sleep(3600);
}
?>
One would probably run it with nohup php mycrazyscript.php & to detach it from your terminal and keep it running, and use pkill -f mycrazyscript to terminate it.
The reason this is crazy is that now you've got a sleeping PHP process hanging around doing nothing for 59 minutes out of every hour, and you have to make sure it's both running and running at the right time, both of which cron would take care of for you.
Only appending to chaos' answer, so give him credit.
mrlanrat - see this
If you don't have access to cron, but do have access to a moderately busy webserver, you can use it to trigger your function every hour. This is how WordPress gets around needing cron. Here's how you might do it:
<?php
$lastrun = get_last_run_time(); // get value from database or filesystem

if (time() - $lastrun > 60*60) {
    itstime();
    set_last_run_time(time()); // write value back to database or filesystem
}

// The rest of your page
?>
Of course, there's a pretty serious race condition in this code as is. Solving that is left as an exercise to the reader.
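One common way to close that race is an exclusive, non-blocking file lock around the check-and-run; a sketch, with the lock-file path assumed:

```php
<?php
// Run $job only if we can grab an exclusive lock, so two concurrent
// page loads cannot both decide it is time to run itstime().
function run_exclusively(string $lockFile, callable $job): bool
{
    $fp = fopen($lockFile, 'c');
    if ($fp === false || !flock($fp, LOCK_EX | LOCK_NB)) {
        return false; // another request holds the lock right now
    }
    $job();           // the check-and-run from the snippet above
    flock($fp, LOCK_UN);
    fclose($fp);
    return true;
}

// Usage in the page:
//
//   run_exclusively(__DIR__ . '/itstime.lock', function () {
//       $lastrun = get_last_run_time();
//       if (time() - $lastrun > 60*60) {
//           itstime();
//           set_last_run_time(time());
//       }
//   });
```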
https://www.setcronjob.com/ and various other sites provide a web-based solution for this. You can set them up to call a PHP script containing your code every hour, or at whatever interval you want:
<?php
itstime();
?>