PHP - how to determine execution time?

I have to process a large number of big images (about 10 MB per image).
How do I determine the execution time needed to process all of the images in the queue? The idea is to:
Estimate the time based on the data we have.
Set the time limit.
Run the process.
And what does the execution time depend on? (Server, internet speed, type of data...?)
#All:
I have changed my approach: I now send one request per image, so with 40 images there will be 40 requests and there is no need to worry about the execution time. :D
Thanks

You can test your setup with the following code:
// time the first image
$start = time();
// ... work on image ...
$elapsed = time() - $start;

// warn if the projected total time exceeds the configured limit
if (ini_get('max_execution_time') < $elapsed * $imageCount) {
    trigger_error("may not be able to finish in time", E_USER_WARNING);
}
Please note two things: in the CLI version of PHP, max_execution_time is hardcoded to 0 / infinity (according to this comment). Also, you may reset the timer by calling set_time_limit() again, like so:
foreach ($imagelist as $image) {
    // ... do image work ...

    // reset the timer to 30 seconds
    set_time_limit(30);
}
That way, you can let your script run forever, or at least until you're finished with your image processing. You must enable the appropriate override rules in the Apache configuration (AllowOverride All) to allow this.

I would suggest (given the limited information in your question) that you use trial and error: run your process and see how long it takes, then increase the time limit until it completes. You may also be able to shorten the process itself.

Be aware that server processing time can vary a LOT depending on the current load from other processes. If it's a shared server, another user may be running a script at that exact moment, making your script perform only half as well.
I think it's going to be hard to determine the execution time BEFORE the script is run.
I would upload batches (small groups) of images. The number of images would depend on some testing.
For example, run your script several times simultaneously from different pages to see if they all still complete without breaking. If it works with 5 images in the queue, write your script to process 5 images. After the first five images have been processed, store them (write to the database or whatever you need), wait a little bit, then take the next 5 images.
If it works when you run three scripts with 5 images each at the same time, you should be safe running it once alongside whatever some other user on the server is doing.
You can change the execution time limit in php.ini, or if you don't have access to that file, you can set it on the fly with set_time_limit(600) for 600 seconds. I would, however, write smarter code rather than relying on the time limit.
My five cents. Good luck!

Related

How do I observe a rate limit in PHP?

So, I'm requesting data from an API.
So far, my API key is limited to:
10 requests every 10 seconds
500 requests every 10 minutes
Basically, I want to request a specific value from every game the user has played.
That is, for example, about 300 games.
So I have to make 300 requests with my PHP. How can I slow them down to observe the rate limit?
(It can take time, site does not have to be fast)
I tried sleep(), which resulted in my script crashing. Are there any other ways to do this?
I suggest setting up a cron job that executes every minute, or even better using Laravel scheduling, rather than using sleep or usleep to imitate a cron job.
Here is some information on both:
https://laravel.com/docs/5.1/scheduling
http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/
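For illustration only, here is a minimal sketch of the cron approach under some assumptions: the batch size matches the 10-requests-per-10-seconds limit, and load_pending_game_ids(), save_result(), and the API URL are hypothetical placeholders for your own storage and endpoint.
<?php
// process_batch.php - run by cron every minute, e.g.:
// * * * * * php /path/to/process_batch.php

$pending = load_pending_game_ids();       // hypothetical: games not fetched yet
$batch   = array_slice($pending, 0, 10);  // stay within 10 requests / 10 seconds

foreach ($batch as $gameId) {
    // Hypothetical API endpoint and key - replace with the real ones
    $json = file_get_contents("https://api.example.com/games/{$gameId}?api_key=YOUR_KEY");
    save_result($gameId, json_decode($json, true)); // hypothetical storage helper
}
?>
At 10 games per minute, roughly 300 games finish in about half an hour, comfortably inside both the 10-second and 10-minute limits.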
This sounds like a perfect use for the set_time_limit() function. This function allows you to specify how long your script can execute, in seconds. For example, if you say set_time_limit(45); at the beginning of your script, then the script will be allowed to run for up to 45 seconds. One great feature of this function is that you can allow your script to execute indefinitely (no time limit) by saying: set_time_limit(0);.
You may want to write your script using the following general structure:
<?php
// Ignore user aborts and allow the script
// to run forever
ignore_user_abort(true);
set_time_limit(0);

// Define a constant for how much time must pass between batches of connections:
define('TIME_LIMIT', 10); // Seconds between batches of API requests

$tLast = 0;
while ( /* some condition to check if there are still API connections that need to be made */ ) {
    if (time() >= ($tLast + TIME_LIMIT)) {
        // TIME_LIMIT seconds have passed since the last batch of connections
        /* Use cURL multi to make 10 asynchronous connections to the API */
        // Once all of those connections are made and processed, save the current time:
        $tLast = time();
    } else {
        // TIME_LIMIT seconds have not yet passed
        // Calculate the number of seconds remaining until TIME_LIMIT seconds have passed:
        $timeDifference = $tLast + TIME_LIMIT - time();
        sleep($timeDifference); // Sleep for the calculated number of seconds
    }
} // END WHILE-LOOP

/* Do any additional processing, computing, and output */
?>
Note: In this code snippet, I am also using the ignore_user_abort() function. As noted in the comment on the code, this function just allows the script to ignore a user abort, so if the user closes the browser (or connection) while your script is still executing, the script will continue retrieving and processing the data from the API anyway. You may want to disable that in your implementation, but I will leave that up to you.
Obviously this code is very incomplete, but it should give you a decent understanding of how you could possibly implement a solution for this problem.
Don't slow the individual requests down.
Instead, you'd typically use something like Redis to keep track of requests per-IP or per-user. Once the limit is hit for a time period, reject (with an HTTP 429 status code, perhaps) until the count resets.
http://redis.io/commands/INCR coupled with http://redis.io/commands/expire would easily do the trick.
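As a rough sketch of that INCR/EXPIRE counter, assuming the phpredis extension and example key names and limits (none of these values come from the question):
<?php
// Minimal fixed-window rate limiter using Redis INCR + EXPIRE (phpredis extension).
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

function allowRequest(Redis $redis, string $userId, int $limit = 10, int $window = 10): bool
{
    $key   = "ratelimit:{$userId}:" . intdiv(time(), $window); // one counter per window
    $count = $redis->incr($key);
    if ($count === 1) {
        $redis->expire($key, $window); // let the counter disappear after the window
    }
    return $count <= $limit;
}

if (!allowRequest($redis, 'user-123')) {
    http_response_code(429); // Too Many Requests
    exit;
}
// ...handle the request normally...
Requests beyond the limit get the 429 response; the counter key simply expires at the end of each window, so nothing needs to be cleaned up.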

Fatal error: Maximum execution time of 30 seconds exceeded in joomla solution without changing ini file

I created a Joomla extension in which I'm copying records from table A to table B. My script works fine if table A contains little data.
If table A contains a large amount of data, the execution time is exceeded while inserting it and I get this error: 'Fatal error: Maximum execution time of 30 seconds exceeded in
/mysite/libraries/joomla/database/database/mysqli.php on line 382'.
I can overcome this problem by changing the ini file, but it's a Joomla extension that people are going to use on their own sites, so I can't (and don't want to) ask them to change their ini file.
Take a look at this:
http://davidwalsh.name/increase-php-script-execution-time-limit-ini_set
ini_set('max_execution_time', 300);
Use it this way, or:
set_time_limit(0);
Put the code below at the start of the page where you wrote the query code:
set_time_limit(0);
Technically, you can increase the maximum execution time using set_time_limit. Personally, I wouldn't mess with limits other people set on their servers, assuming they put them in for a reason (performance, security - especially in a shared hosting context, where software like Joomla! is often found). Also, set_time_limit won't work if PHP is run in safe mode.
So what you're left with is splitting the task into multiple steps. For example, if your table has 100000 records and you measure that you can process about 5000 records in a reasonable amount of time, then do the operation in 20 individual steps.
Execution time for each step should be a good deal less than 30 seconds on an average system. Note that the number of steps is dynamic, you programmatically divide the number of records by a constant (figure out a useful value during testing) to get the number of steps during runtime.
You need to split your script into two parts: one that finds out the number of steps required, displays them to the user and sequentially runs one step after another, by sending AJAX requests to the second script (like: "process records 5001 to 10000"), and marking steps as done (for the user to see) when the appropriate server response arrives (i.e. the request is complete).
The second part is entirely server-sided and accepts AJAX requests. This script does the actual work on the server. It must receive some kind of parameters (the "process records 5001 to 10000" request) to understand which step it's supposed to process. When it's done with its step, it returns a "success" (or possibly "failure") code to the client script, so that it can notify the user.
There are variations on this theme, for instance you can build a script which redirects the user to itself, but with different parameters, so it's aware where it left off and can pick up from there with the next step. In general, you'd want the solution that gives the user the most information and control possible.
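To illustrate the second, server-sided part, here is a minimal sketch assuming Joomla 3's database API; the table names, columns, and request parameters are made up for the example:
<?php
// step.php - handles one AJAX request such as: step.php?start=5001&count=5000
$start = (int) ($_GET['start'] ?? 0);
$count = (int) ($_GET['count'] ?? 5000);

$db = JFactory::getDbo();

// Read one chunk from table A (example table and columns)
$db->setQuery("SELECT * FROM #__table_a ORDER BY id LIMIT {$start}, {$count}");
$rows = $db->loadObjectList();

foreach ($rows as $row) {
    // ...transform $row as needed, then insert it into table B...
    $db->setQuery(
        "INSERT INTO #__table_b (title, data) VALUES ("
        . $db->quote($row->title) . ", " . $db->quote($row->data) . ")"
    );
    $db->execute();
}

// Tell the client-side script that this step finished
echo json_encode(['status' => 'success', 'processed' => count($rows)]);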

Repeat php script after time out

My PHP script creates thumbnails of images. Sometimes, when it handles a lot of images, the script has to run for a long time and is terminated after 60 seconds because of the time limit on my server.
Can I tell the script to stop after 59 seconds and then restart itself?
I need some ideas.
Edit:
I don't think my web host allows me to change max_execution_time.
I can't believe this is my answer..
loopMe.php:
<meta http-equiv="refresh" content="65;url=loopMe.php">
<?php
/* code that is timing out here */
?>
Your best bet though may be to look at set_time_limit() like others have suggested.
Important Note: this loop will never end. You could however have a session variable, say, $_SESSION['stopLoop'] to flag whenever the script successfully ends. Before printing the meta-refresh line you can check the value of that.
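A small sketch of that flag, where $allImagesDone stands in for whatever condition your own code uses to detect completion:
<?php
// loopMe.php - meta-refresh loop with a stop flag
session_start();

// Only print the refresh tag while the work is not finished yet
if (empty($_SESSION['stopLoop'])) {
    echo '<meta http-equiv="refresh" content="65;url=loopMe.php">';
}

// ... code that processes the next chunk of images here ...

if ($allImagesDone) {              // hypothetical condition set by your own code
    $_SESSION['stopLoop'] = true;  // the next page load will not refresh again
}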
If you don't want to modify your code, use set_time_limit(0), which removes the time limit entirely.
set_time_limit(0)
http://php.net/manual/en/function.set-time-limit.php
I wouldn't recommend using it, though; if your system grows, this will take a very long time to process. It is better to recode things so that the job runs only when your server has low traffic and you control how much time it spends processing data.
For instance, you could store the data you need to process in a queue and process as much as you can in a time window once per night.
set_time_limit(0);
Yeah, as suggested above, use set_time_limit. It will increase the script execution timeout.
If you are in a loop processing multiple files, then call set_time_limit each time before you process a file, set to the expected duration; i.e. if a file is assumed to take at most 60 seconds to process, use set_time_limit(60) before processing it.
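A minimal sketch of that per-file reset; createThumbnail() is a hypothetical placeholder for the actual image work:
<?php
foreach ($files as $file) {
    // Give each file its own 60-second budget; the timer restarts on every call
    set_time_limit(60);

    createThumbnail($file); // hypothetical helper doing the actual image work
}
?>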
If your server is not running PHP in safe mode, you can use set_time_limit($time); where $time is the time you need to execute the script. For example, set_time_limit(240); sets the timeout to 4 minutes.
Or, for example, you can find out in your script how much time has passed and, before the timeout expires, add some more seconds (set_time_limit(10);) until the script finishes (each call to set_time_limit() restarts the timer, so repeated calls effectively add more time).
Using this approach you can calculate the elapsed time for each image and compare it against the server timeout.
You can also modify your script to log the images that have already been processed so that, in case of a timeout, it can carry on with the task from the same point.
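A rough sketch of that log-and-resume idea; the processed.log path and the createThumbnail() helper are illustrative, not part of the answer:
<?php
// Resume-friendly loop: skip images already listed in a simple log file.
$logFile   = __DIR__ . '/processed.log'; // placeholder path
$processed = file_exists($logFile)
    ? file($logFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
    : [];

foreach ($images as $image) {
    if (in_array($image, $processed, true)) {
        continue; // already done in a previous run
    }

    createThumbnail($image); // hypothetical helper

    // Record success immediately, so a timeout does not lose this work
    file_put_contents($logFile, $image . PHP_EOL, FILE_APPEND);
}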
You can run your script from a cron job on your server (if your host allows it) to make it run periodically. Take a look at this article.

Can 1330316 AJAX requests crash my server?

I'm building a small PHP/Javascript app which will do some processing for all cities in all US states. This rounds up to a total of (52 x 25583) = 1330316 or fewer items that will need to be processed.
The processing of each item will take about 2-3 seconds, so it's possible that the user could have to stare at this page for 1-2 hours (or at least keep it minimized while doing other stuff).
In order to give the user the maximum feedback, I was thinking of controlling the processing of the page via javascript, basically something like this:
var current = 1;
var max = userItems.length; // 1330316 or less

process();

function process()
{
    if (current >= max)
    {
        alert('done');
        return;
    }

    $.post("http://example.com/process", {id: current}, function()
    {
        $("#current").html(current);
        current++;
        process();
    });
}
In the html i will have the following status message which will be updated whenever the process() function is called:
<div id="progress">
Please wait while items are processed.
<span id="current">0</span> / <span id="max">1330316</span> items have been processed.
</div>
Hopefully you can all see how I want this to work.
My only concern is that, if those 1330316 requests are made simultaneously to the server, is there a possibility that this crashes/brings down the server? If so, if I put in an extra wait of 2 seconds per request using sleep(3); in the server-side PHP code, will that make things better?
Or is there a different mechanism for showing the user the rapid feedback such as polling which doesn't require me to mess with apache or the server?
If you can place a cronjob in the server, I believe it'd work much better. What about using a cronjob to do the actual processing and use Javascript to update periodically the status (say, every 10 seconds)?
Then, the first step would be to trigger some flag that the cronjob PHP script will check. If it's active, then the task must be performed (you could use some temporary file to tell the script which records must be processed).
The cronjob would do the task and then, when its iteration is complete, turn off the flag.
This way, the user can even close your application and check it back later, and the server will handle all the processing, uninterrupted by client activity.
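For illustration, a sketch of the flag-checked cron script; the file paths, batch size, and the loadCities()/processCity() helpers are assumptions standing in for your own code:
<?php
// process_cities.php - run by cron, e.g. every minute.
$flagFile     = '/tmp/process_cities.flag';      // created when the user starts the job
$progressFile = '/tmp/process_cities.progress';  // last processed offset

if (!file_exists($flagFile)) {
    exit; // nothing to do
}

$offset = file_exists($progressFile) ? (int) file_get_contents($progressFile) : 0;
$batch  = 500; // how many cities to handle per cron run

$cities = loadCities($offset, $batch); // hypothetical data-access helper
foreach ($cities as $city) {
    processCity($city); // hypothetical 2-3 second task from the question
}

if (count($cities) < $batch) {
    // All done: turn the flag off so the next cron run exits immediately
    unlink($flagFile);
    unlink($progressFile);
} else {
    file_put_contents($progressFile, $offset + $batch);
}
The page's JavaScript would then only poll a tiny status endpoint (say, every 10 seconds) that reads the progress file and reports how many cities are done.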
Putting a sleep inside your server-side PHP script can only make it worse. It leads to more processes sticking around, which increases the number of parallel working/sleeping processes, which adds up to increased memory usage.
Don't fear that so many requests will be processed in parallel. Usually an Apache server is configured to process no more than 150 requests in parallel. A well-configured server does not process more requests in parallel than it has resources for (good administrators do some calculations beforehand). The other requests have to wait, and given your count of requests it's probable that they are going to time out before being processed.
Your concerns should, however, be about client-side resources, but it looks like your script only starts a new request when the previous one has returned. BTW: well-behaved HTTP clients (which your browser should be) start no more than 6 requests in parallel to the same IP.
Update: Besides the above, you should seriously consider redesigning your approach to mass-processing (similar to what #Joel suggested), but this should go in another question.

Avoid PHP execution time limit

I need to create a script in PHP which computes permutations of numbers, but PHP has an execution time limit set to 60 seconds. How can I run the script so that, if it needs more than 60 seconds, it is not interrupted by the server? I know I can change the maximum execution time limit in PHP, but I want to hear another approach that does not require knowing the execution time of the script in advance.
A friend suggested that I sign in and log out of the server frequently, but I have no idea how to do this.
Any advice is welcome. Some example code would be useful.
Thanks.
First I need to enter a number, let's say 25. After this the script is launched and it needs to do the following: for every number <= 25 it will create a file with the numbers generated at the current stage; for the next number it will open the previously created file and create another file based on the lines of the opened file, and so on. Because this takes too long, I need to keep the script from being interrupted by the server.
#emanuel:
I guess when your friend suggested that you "sign in and log out frequently from the server", he/she must have meant "split your script's computation into x pieces of work and run them separately".
For example, with this script you can execute it 150 times to compute 150! (factorial) and show the result:
// script name: calc.php
<?php
session_start();

if (!isset($_SESSION['times'])) {
    // first request: initialise the counters
    $_SESSION['times']  = 1;
    $_SESSION['result'] = 1; // must start at 1, not 0, or every product stays 0
} elseif ($_SESSION['times'] < 150) {
    $_SESSION['times']++;
    $_SESSION['result'] = $_SESSION['result'] * $_SESSION['times'];
    header('Location: calc.php');
} elseif ($_SESSION['times'] == 150) {
    echo "The result is: " . $_SESSION['result'];
    die();
}
?>
BTW (#Davmuz), you can only use the set_time_limit() function on Apache servers; it's not a valid function on Microsoft IIS servers.
set_time_limit(0)
You could try putting the calls you want to make in a queue, which you serialize to a file (or memory cache?) when an operation is done. Then you could use a cron daemon to execute this queue every sixty seconds, so it keeps doing the work and eventually finishes the task.
The drawbacks of this approach are problems with adding to the queue, with file locking and such, and if you need the results immediately, this can prove troublesome. If you are adding stuff to a DB, it might work out. Also, it is not very efficient.
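Here is a rough sketch of that serialized-queue idea, assuming a simple queue.dat file and a hypothetical runPermutationStep() function for one unit of work:
<?php
// queue_worker.php - run by cron every minute; file name and job format are illustrative.
$queueFile = __DIR__ . '/queue.dat';

$fp = fopen($queueFile, 'c+');
if (!$fp || !flock($fp, LOCK_EX | LOCK_NB)) {
    exit; // another run is already working on the queue
}

$raw   = stream_get_contents($fp);
$queue = $raw !== '' ? unserialize($raw) : [];

// Work for at most ~50 seconds, then write back whatever is left
$deadline = time() + 50;
while ($queue && time() < $deadline) {
    $job = array_shift($queue);
    runPermutationStep($job); // hypothetical function doing one piece of work
}

ftruncate($fp, 0);
rewind($fp);
fwrite($fp, serialize($queue));
flock($fp, LOCK_UN);
fclose($fp);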
Use set_time_limit(0), but you have to disable safe_mode:
http://php.net/manual/en/function.set-time-limit.php
I suggest using a fixed limit (set_time_limit(300)) so that, if there is a problem in the script (endless loops or memory leaks), it cannot become a source of trouble.
The web server (Apache, for example) also has a maximum time limit, typically 300 seconds, so you may have to change that as well. If you want to build a Comet application, it may be better to choose a web server other than Apache that can handle long request times.
If you need a long execution time for a heavy algorithm, you can also implement a parallel processing: http://www.google.com/#q=php+parallel+processing
Or store the input data and compute it with another external script via cron or something similar.
