Repeat PHP script after timeout - php

My PHP script creates thumbnails of images. Sometimes, when it handles a lot of images, the script has to run for a long time and is killed after 60 seconds because of the time limit on my server.
Can I tell the script to time out after 59 seconds and then restart itself?
I need some ideas.
Edit:
I don't think my web hosting allows me to change max_execution_time

I can't believe this is my answer..
loopMe.php:
<meta http-equiv="refresh" content="65;url=loopMe.php">
<?php
/* code that is timing out here */
?>
Your best bet though may be to look at set_time_limit() like others have suggested.
Important note: this loop will never end on its own. You could, however, set a session variable, say $_SESSION['stopLoop'], as a flag whenever the script successfully finishes, and check its value before printing the meta-refresh line.
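A sketch of that check, with $finished standing in for whatever condition your own code uses to know the work is done:

```php
<?php
session_start();

// Emit the meta refresh only while the stop flag is unset, so the
// page stops reloading itself once the work has finished.
if (empty($_SESSION['stopLoop'])) {
    echo '<meta http-equiv="refresh" content="65;url=loopMe.php">';
}

/* ... thumbnail code that may time out here ... */

// $finished is a stand-in for your own completion check.
$finished = false;
if ($finished) {
    $_SESSION['stopLoop'] = true;
}
```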

If you don't want to modify your code, use set_time_limit(0), which removes the time limit entirely:
set_time_limit(0)
http://php.net/manual/en/function.set-time-limit.php
I wouldn't recommend it, though: as your system grows, processing could take a very long time. It is better to restructure the code so that the heavy work runs only when your server is under low traffic, and to control how much time it spends processing data.
For instance, you could store the data you need to process in a queue and process as much of it as you can in a time window once per night.
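A sketch of that queue idea, assuming a one-item-per-line queue file and a nightly cron run; all names here are made up for illustration:

```php
<?php
// Hypothetical nightly worker: process queued image paths for at
// most $budget seconds, leaving the rest for the next run.
function processQueue(string $queueFile, int $budget, callable $work): int
{
    $queue = file_exists($queueFile)
        ? file($queueFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
        : [];
    $start = time();
    $done  = 0;

    while ($queue && (time() - $start) < $budget) {
        $item = array_shift($queue);
        $work($item);   // e.g. create the thumbnail
        $done++;
    }

    // Write the unfinished items back for the next run.
    file_put_contents($queueFile, implode("\n", $queue));
    return $done;
}
```

A crontab line such as `0 3 * * * php /path/to/worker.php` (an assumption about your setup) would then give it a window every night.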

set_time_limit(0);

Yeah, as suggested above, use set_time_limit. It will increase the script execution timeout.
If you are looping over multiple files, call set_time_limit each time before you process a file, setting it to the expected duration. I.e., if a file is assumed to take at most 60 seconds, call set_time_limit(60) before processing it.
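That per-file reset could look like this (the function and callback names are just for illustration):

```php
<?php
// Sketch: give every file its own fresh budget. Each call to
// set_time_limit() restarts the timer from zero, so no single slow
// file can consume another file's time.
function processWithBudget(array $files, int $perFileSeconds, callable $work): void
{
    foreach ($files as $file) {
        set_time_limit($perFileSeconds);
        $work($file);
    }
}
```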

If your server is not running PHP in safe mode, you can use set_time_limit($time);, where $time is the time you need to execute the script. For example, set_time_limit(240); sets the timeout to 4 minutes.
Or you can measure in your script how much time has passed and, before the timeout expires, add a few more seconds (set_time_limit(10);) until the script finishes. (Each call to set_time_limit() restarts the timeout counter from the moment of the call, so repeated calls extend the total time.)
Using this approach you can calculate the elapsed time and, after each image, compare it against the server timeout.
You can also modify your script to log the images that have already been processed and, in case of a timeout, carry on with the task from the same point.
You can run your script from a cron job on your server (if it allows you to) to make it run periodically. Take a look at this article
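A sketch combining the elapsed-time check with a progress log, so a rerun (via cron or a page refresh) carries on where the previous run stopped; the file name and callback are placeholders:

```php
<?php
// Skip images already listed in a progress log, stop safely before
// the server's limit, and let the next run pick up where this one
// left off.
function resumeBatch(array $images, string $logFile, int $limit, callable $work): array
{
    $done  = file_exists($logFile)
        ? file($logFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
        : [];
    $start = time();

    foreach ($images as $image) {
        if (in_array($image, $done, true)) {
            continue;                  // already processed in an earlier run
        }
        if ((time() - $start) >= $limit) {
            break;                     // stop before the timeout hits
        }
        $work($image);
        $done[] = $image;
        file_put_contents($logFile, $image . "\n", FILE_APPEND);
    }
    return $done;
}
```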

Related

Should I avoid a one hour sleep()?

I'm using the DigitalOcean API to create droplets when my web application needs the extra resources. Because of the way DigitalOcean charges (min. one hour increments), I'd like to keep a created server on for an hour so it's available for new tasks after the initial task is completed.
I'm thinking about formatting my script this way:
<?php
createDroplet($dropletid);
$time = time();
// run resource heavy task
sleep($time + 3599);
deleteDroplet($dropletid);
Is this the best way to achieve this?
It doesn't look like a good idea, but the code is so simple, nothing can compete with that. You would need to make sure your script can run, at least, for that long.
Note that sleep() should not have $time as an argument. It sleeps for the given number of seconds. $time contains many, many seconds.
I am worried that the script might get interrupted, and you will never delete the droplet. Not much you can do about that, given this script.
Also, the sleep() itself might get interrupted, causing it to sleep much shorter than you want. A better sleep would be:
$remainingTime = 3590;
do {
    $remainingTime = sleep($remainingTime);
} while ($remainingTime > 0);
This will catch an interrupt of sleep(). (It assumes that a FALSE error return also ends the loop, since FALSE is not greater than 0.) See the manual on sleep(): http://php.net/manual/en/function.sleep.php
Then there's the problem that you want to sleep for exactly 3599 seconds, so that you're only charged one hour. I wouldn't make it so close to one hour. You have to leave some time for DigitalOcean to execute stuff and log the time. I would start with 3590 seconds and make sure that always works.
Finally: What are the alternatives? Clearly this could be a cron job. How would that work? Suppose you execute a PHP script every minute, and you have a database entry that tells you which resource to allocate at a certain start time and deallocate at a certain expiry time. Then that script could do this for you with an accuracy of about a minute, which should be enough. Even if the server crashes and restarts, as long as the database is intact and the script runs again, everything should go as planned. I know this is far more work to implement, but it is the better way to do it.
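A minimal sketch of that selection step. The array shape and names are assumptions; the real version would read a database table and call the DigitalOcean API for each returned id:

```php
<?php
// Cron worker's decision: which droplets are past their expiry?
// $droplets maps a droplet id to its expiry timestamp as stored
// in the database; the minute-cron then deletes every id returned.
function dropletsToDelete(array $droplets, int $now): array
{
    return array_keys(array_filter(
        $droplets,
        fn (int $expires): bool => $expires <= $now
    ));
}
```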

PHP while loop issue

I have a PHP script containing a long loop that lists about 60,000 email addresses from MySQL, but after 30,000 the script stops working and breaks; sometimes the whole PHP output is rendered on a white page (my page has an image background). I increased the PHP memory limit to unlimited, but it didn't help.
What's the problem?
The default execution time limit is 30 seconds, or the *max_execution_time* value from php.ini. Is your script taking longer than that? If so, you can increase the limit with set_time_limit.
The better question is, what are you doing with 60,000 email addresses that is taking so long? Chances are, there is a much better way to do whatever is taking too long.
There are two things you can try here. Firstly, set_time_limit(0). Setting it to zero means there is essentially no time limit.
Secondly, you might want to use ignore_user_abort(true). This sets whether a client disconnect should abort script execution; passing true keeps the script running after a disconnect.
Both will ensure that your script keeps running for as long as it needs... or until your server packs out :)
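Put together, the two calls look like this; in practice they belong at the very top of the script:

```php
<?php
set_time_limit(0);        // 0 = no execution time limit
ignore_user_abort(true);  // keep running even if the client disconnects
```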
The problem is not in memory, I think; it's the execution time.
set_time_limit(VERY_BIG_NUMBER);
Update:
add
ini_set("display_errors","on");
error_reporting(E_ALL);
and inspect errors if any
check "max_execution_time"
How long does the script run before breaking? 30 seconds?
Use set_time_limit() and increase the max execution time for the script.

php - how to determine execution time?

I have to process many large images (about 10 MB per image).
How do I determine the execution time needed to process all of the images in the queue?
Determine the time based on the data that we have.
Set the time limit.
Run the process.
And what does the execution time depend on? (Server, internet speed, type of data...?)
#All:
I have changed my approach: I now send 1 request per image, so with 40 images we will have 40 requests, and there is no need to care about execution time :D
Thanks
You can test your setup with this code:
// first image
$start = time();
// ... work on image ...
$elapsed = time() - $start;
if (ini_get('max_execution_time') < $elapsed * $imageCount)
    trigger_error("may not be able to finish in time", E_USER_WARNING);
Please note two things: in the CLI version of PHP, max_execution_time is hardcoded to 0 / infinity (according to this comment). Also, you may reset the timer by calling set_time_limit() again, like so:
foreach ($imagelist as $image) {
    // ... do image work ...

    // reset the timer to 30 seconds
    set_time_limit(30);
}
That way, you can let your script run forever, or at least until you're finished with your image processing. (If you raise the limit through .htaccess instead, you must enable the appropriate override rules in the Apache configuration via AllowOverride All.)
I would suggest (given your limited info in the question) that you try using the trial and error method - run your process and see how long it takes - increase the time limit until it completes - you might be able to shorten your process.
Be aware that server processing time can vary a LOT depending on the current load from other processes. If it's a shared server, some other user may be running a script at that exact moment, making your script perform only half as well.
I think it's going to be hard to determine the execution time BEFORE the script is run.
I would upload batches (small groups) of images. The number of images would depend on some testing.
For example, run your script several times simultaneously from different pages to see if they all still complete without breaking. If it works with 5 images in the queue, write your script to process 5 images. After the first five images have been processed, store them (write to a database or whatever you need), wait a little, then take the next 5 images.
If it works when you run three scripts with 5 images each at the same time, you should be safe doing it once with whatever some other user on the server is doing.
You can change the execution time limit in php.ini, or if you don't have access to that file you can set it on the fly with set_time_limit(600) for 600 seconds. I would, however, write smarter code rather than relying on the time limit.
My five cents. Good luck!
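The batch idea above can start as simply as array_chunk(): process one chunk per run and persist results in between (the chunk size of five is just the figure used above):

```php
<?php
// Split the full image list into fixed-size batches; each run (or
// request) then handles one batch and stores its results.
function batchImages(array $images, int $size = 5): array
{
    return array_chunk($images, $size);
}
```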

How to exit from function if it takes more than expected time

I have to call this function
$rep_id=$this->getit($domain);
but some domains take 2-3 minutes. I want to move on to the next one if it takes too long. I have set set_time_limit(3000); at the beginning of the PHP page.
set_time_limit() won't work, as that sets the time for the script as a whole. I'm not sure if this is really possible in PHP, but you might be able to pull it off with forking. I'm thinking something along the lines of starting a timer (using time() for a timestamp) and looping until it reaches X, meanwhile forking your $this->getit() as a child process. Then, when the timer runs out, kill the child process. Might work, but dunno.
I guess an alternative is to make it a separate script with a specified timeout using set_time_limit(), and then call it from the main script using exec().
Assuming the call to getit() is in the main thread (i.e. not spawned in a new thread), its execution will likely block the rest of the script. However, if getit() is self-aware with its own timer, you could design it to bail out when that execution limit is reached, returning an error code indicating the problem. If you decide to embark on this change, consider adding an optional parameter to specify the time limit per call.
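The exec()-with-timeout idea above can be sketched with proc_open(), which lets the parent script keep its own clock and kill the child when the budget runs out (worker.php is a hypothetical wrapper around getit()):

```php
<?php
// Run $cmd as a child process and give up after $seconds, returning
// its stdout on success or null on timeout/failure.
function runWithTimeout(string $cmd, int $seconds): ?string
{
    $proc = proc_open($cmd, [1 => ['pipe', 'w']], $pipes);
    if (!is_resource($proc)) {
        return null;
    }
    stream_set_blocking($pipes[1], false);

    $deadline = time() + $seconds;
    $output   = '';
    while (time() < $deadline) {
        $output .= (string) stream_get_contents($pipes[1]);
        if (!proc_get_status($proc)['running']) {
            // Finished in time: collect any remaining output.
            $output .= (string) stream_get_contents($pipes[1]);
            fclose($pipes[1]);
            proc_close($proc);
            return $output;
        }
        usleep(100000); // poll every 0.1 s
    }

    proc_terminate($proc); // took too long: kill it and move on
    fclose($pipes[1]);
    proc_close($proc);
    return null;
}
```

You would then call something like $rep = runWithTimeout('php worker.php ' . escapeshellarg($domain), 60); and treat a null return as "took too long".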

Avoid PHP execution time limit

I need to create a script in PHP which performs permutations of numbers. But PHP has an execution time limit set to 60 seconds. How can I run the script so that, if it needs more than 60 seconds, it is not interrupted by the server? I know I can change the maximum execution time limit in PHP, but I want to hear another approach that does not require knowing the script's execution time in advance.
A friend suggested that I sign in and log out frequently from the server, but I have no idea how to do this.
Any advice is welcome. An example code would be useful.
Thanks.
First I need to enter a number, let's say 25. After this the script is launched and it needs to do the following: for every number <= 25 it will create a file with the numbers generated at the current stage; for the next number it will open the previously created file, and will create another file based on the lines of the opened file, and so on. Because this takes too long, I need to avoid the script being interrupted by the server.
#emanuel:
I guess when your friend suggested that you "sign in and log out frequently from the server", he/she must have meant "split your script's computation into x pieces of work and run them separately".
For example, with this script you can let it redirect to itself 150 times to compute 150! (a factorial) and show the result:
// script name: calc.php
<?php
session_start();
if (!isset($_SESSION['times'])) {
    $_SESSION['times']  = 1;
    $_SESSION['result'] = 1;   // must start at 1, not 0, or the product stays 0
    header('Location: calc.php');
} elseif ($_SESSION['times'] < 150) {
    $_SESSION['times']++;
    $_SESSION['result'] = $_SESSION['result'] * $_SESSION['times'];
    header('Location: calc.php');
} elseif ($_SESSION['times'] == 150) {
    echo "The result is: " . $_SESSION['result'];
    die();
}
?>
BTW (#Davmuz), you can only use set_time_limit() function on Apache servers, it's not a valid function on Microsoft IIS servers.
set_time_limit(0)
You could try putting the calls you want to make in a queue, which you serialize to a file (or a memory cache?) when an operation is done. Then you could use a cron daemon to execute this queue every sixty seconds, so it continues the work and finishes the task.
The drawbacks of this approach are problems with adding to the queue, with file locking and such; and if you need the results immediately, this can prove troublesome. If you are adding stuff to a DB, it might work out. Also, it is not very efficient.
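The file-locking concern can be handled with flock(). Here is a sketch of popping one job off a serialize()d queue under an exclusive lock, so two overlapping cron runs can't grab the same job (the file layout is an assumption):

```php
<?php
// Take one job off the front of a file-backed queue, holding an
// exclusive lock for the whole read-modify-write. Returns null when
// the queue is empty or the lock cannot be taken.
function popJob(string $queueFile)
{
    $fp = fopen($queueFile, 'c+');
    if (!$fp || !flock($fp, LOCK_EX)) {
        return null;
    }
    $raw   = stream_get_contents($fp);
    $queue = $raw === '' ? [] : unserialize($raw);
    $job   = array_shift($queue);

    // Write the shortened queue back in place.
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, serialize($queue));

    flock($fp, LOCK_UN);
    fclose($fp);
    return $job;
}
```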
Use set_time_limit(0), but you have to disable safe_mode:
http://php.net/manual/en/function.set-time-limit.php
I suggest using a fixed time (set_time_limit(300)), because if there is a problem in the script (an endless loop or a memory leak) it cannot become a source of further problems.
The web server, such as Apache, also has a maximum time limit (300 seconds), so you have to change that as well. If you want to build a Comet application, it may be better to choose a web server other than Apache that can handle long-running requests.
If you need a long execution time for a heavy algorithm, you can also implement parallel processing: http://www.google.com/#q=php+parallel+processing
Or store the input data and compute it with another external script, via cron or whatever else.
