I have a PHP script that contains a long loop to list about 60,000 email addresses from MySQL, but after 30,000 the script stops working and breaks, and sometimes the whole page comes back white (my page has an image background). I increased the PHP memory limit to unlimited, but it didn't help.
What's the problem?
The default execution time limit is 30 seconds, or the *max_execution_time* value from php.ini. Is your script taking longer than that? If so, you can increase the limit with set_time_limit.
The better question is, what are you doing with 60,000 email addresses that is taking so long? Chances are, there is a much better way to do whatever is taking too long.
There are two things you can try here. First, set_time_limit(0). Setting it to zero means there is essentially no time limit.
Second, you might want to use ignore_user_abort(true). This sets whether a client disconnect should abort script execution; with true, the script keeps going even if the visitor disconnects.
Both will ensure that your script keeps running for as long as it needs... or until your server gives out :)
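A minimal sketch of both calls together at the top of the script:
<?php
// No execution time limit (use with care)
set_time_limit(0);

// Keep running even if the visitor disconnects
ignore_user_abort(true);

// ... the long loop over the 60,000 addresses goes here ...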
I don't think the problem is memory; it's the execution time.
set_time_limit(VERY_BIG_NUMBER);
Update:
Add
ini_set("display_errors", "on");
error_reporting(E_ALL);
and inspect the errors, if any.
Also check "max_execution_time".
How long does the script run before breaking? 30 seconds?
Use set_time_limit() and increase the max execution time for the script.
Every time I run it, my PHP script gives me "Maximum execution time of 30 seconds exceeded". Is there any way to show a customized error to the visitor instead of this big error message? I mean, can we use our own error message in place of it?
You probably have an endless loop somewhere. You could debug the PHP on your own machine (adding debug printing inside, etc.)
Add this at the top of your PHP script:
set_time_limit(0);
ini_set('max_execution_time', 0);
This means your script has unlimited time to execute, but you can limit it to x seconds if you replace the 0 with a number of seconds, like 500. But if you have a bug like an endless loop, the script would never stop, and your server might die if hundreds of visitors hit the script.
Pin down your endless loop by inserting
die('got here...');
at the top and moving it downwards through your script; you'll find the problematic code quite quickly. Real debugging is smoother but needs some setting up.
A well-behaved PHP script should normally take less than a second to execute.
My PHP script creates thumbnails of images. Sometimes, when it handles a lot of images, the script has to run a long time and is killed after 60 seconds because of the time limit on my server.
Can I tell the script to time out after 59 seconds and then restart itself?
I need some ideas.
Edit:
I don't think my web hosting allows me to change max_execution_time
I can't believe this is my answer..
loopMe.php:
<meta http-equiv="refresh" content="65;url=loopMe.php">
<?php
/* code that is timing out here */
?>
Your best bet though may be to look at set_time_limit() like others have suggested.
Important note: this loop will never end on its own. You could, however, have a session variable, say $_SESSION['stopLoop'], to flag whenever the script successfully finishes, and check its value before printing the meta-refresh line.
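A minimal sketch of that check (the $_SESSION['stopLoop'] flag name is just an invented example):
<?php
session_start();

// Somewhere in the thumbnail code, set $_SESSION['stopLoop'] = true;
// once the last image has been processed.
if (empty($_SESSION['stopLoop'])) {
    echo '<meta http-equiv="refresh" content="65;url=loopMe.php">';
}

/* code that is timing out here */
?>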
If you don't want to modify your code, use set_time_limit(0), which removes the time limit entirely.
set_time_limit(0)
http://php.net/manual/en/function.set-time-limit.php
I wouldn't recommend using it, though; as your system grows, the processing would take a very long time. It is better to recode things so that the heavy work runs only when your server has low traffic, and to control how much time it spends processing data.
For instance, you could store the data you need to process in a queue and process as much as you can in a time window once per night.
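A rough sketch of that queue idea, assuming a hypothetical email_queue table with a processed flag (all names here are invented for illustration):
<?php
// Process a bounded batch per run, e.g. from a nightly cron job
$start  = time();
$window = 300; // spend at most 5 minutes per run

$result = mysql_query("SELECT id, address FROM email_queue WHERE processed = 0 LIMIT 1000");
while ($row = mysql_fetch_assoc($result)) {
    if (time() - $start > $window) {
        break; // out of time; the next run picks up where we left off
    }
    // ... process $row['address'] here ...
    mysql_query("UPDATE email_queue SET processed = 1 WHERE id = " . (int)$row['id']);
}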
set_time_limit(0);
Yeah, as suggested above, use set_time_limit. It will increase the script execution timeout.
If you are in a loop processing multiple files, call set_time_limit every time before you process a file, with the expected duration for that file. I.e., if a file is assumed to take at most 60s to process, use set_time_limit(60) before processing it.
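For example, a sketch of that pattern ($files and processFile() are placeholders for your own file list and thumbnail code):
<?php
foreach ($files as $file) {
    // Each call restarts the timeout counter, so every file
    // gets up to 60 seconds of its own
    set_time_limit(60);
    processFile($file); // placeholder for the real work
}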
If your server is not running PHP in safe mode, you can use set_time_limit($time), where $time is the number of seconds you need to execute the script; for example, set_time_limit(240) sets the timeout to 4 minutes.
Or you can check in your script how much time has passed and, before the timeout expires, add some seconds (set_time_limit(10)) until the script finishes. (Each call to set_time_limit() restarts the timeout counter from zero, so repeated calls effectively extend the total time.)
This way you can measure the elapsed time after each image and compare it against the server timeout.
You can also modify your script to log the images that have already been processed, so that in case of a timeout it can carry on with the task from the same point.
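A sketch combining both ideas (the 240-second budget and the processed.log file name are arbitrary choices; $images stands in for your own image list):
<?php
set_time_limit(240);
$lastReset = microtime(true);

foreach ($images as $image) {
    // ... create the thumbnail for $image ...

    // Record each finished image so a later run can skip it
    file_put_contents('processed.log', $image . "\n", FILE_APPEND);

    // Getting close to the limit? set_time_limit() restarts the
    // timeout counter from zero, buying another 240 seconds
    if (microtime(true) - $lastReset > 230) {
        set_time_limit(240);
        $lastReset = microtime(true);
    }
}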
You can run your script from a cron job on your server (if it allows you to) to make it run periodically. Take a look at this article.
My web site page contains a lot of queries, so it takes a lot of time to execute and ends in an error. Could you please tell me how to increase the execution time (maybe the query execution time) through PHP code?
Thanks in advance.
Use max_execution_time:
ini_set('max_execution_time', 14000); // adjust value
Also make sure that your code is written correctly, and try to see if there is any room for improvement in the speed. For example, you should have a look at:
Query optimization techniques
PHP Optimization Tricks
PHP Micro Optimizations
If this website is supposed to be for normal users, you will have to optimize your queries, no way around it. No user will want to wait more than a few seconds for a page to load, and if you're already surpassing the execution time limit, it's already way too slow! Extending the time limit is not a solution.
If, OTOH, this page is supposed to be more of a maintenance script, you should run it from the CLI or as a cron job where execution time limits don't exist.
You can use the set_time_limit function to set the number of seconds a script is allowed to run.
But my advice would be to rethink/rewrite your queries, because if you hit the PHP execution limit, you are probably doing too much work at the database, or doing the work inefficiently.
There is always room for improvement. Raising the PHP execution time is only a temporary fix, and "temporary fix" is one of the most hated phrases in computer programming. So again, look at your database schema and analyze your queries, check for indexes, etc.
For high-concurrency I/O you may also consider an in-memory key-value store like APC or Memcached.
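For instance, with the Memcached extension you can serve a hot result from memory instead of hitting MySQL on every request (the server address, key name, and runHeavyQuery() helper are invented for illustration):
<?php
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$rows = $cache->get('heavy_query_result');
if ($rows === false) {
    // Cache miss: run the expensive query once...
    $rows = runHeavyQuery(); // placeholder for your query code
    // ...and keep the result around for 10 minutes
    $cache->set('heavy_query_result', $rows, 600);
}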
I have this code:
$theQuery = mysql_query("SELECT phrase, date FROM wordList WHERE `group`='nouns'");
while ($getWords = mysql_fetch_array($theQuery)) {
    echo "$getWords[phrase] created on $getWords[date]<br>";
}
The query has 75,000 results, and every time I run the code I get an error.
Several issues could be at play here, all of which are due to settings in your php.ini. Your script could be timing out, since PHP defaults to a 30-second maximum for script execution. The other (and just as likely) reason is that you're hitting the script memory limit, which defaults to 8MB per script execution.
Open php.ini and search for "Resource Limits" and make the appropriate modifications.
As a guess, I'd say you're timing out on the MySQL query rather than on echoing out the results (judging from your title).
Have you tried setting an index on the group column of your MySQL table? This will slow your inserts slightly, but should make a huge difference on the select.
See http://dev.mysql.com/doc/refman/5.0/en/mysql-indexes.html
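For instance, a one-time statement along these lines adds the index (the index name idx_group is arbitrary; note that group is a reserved word in MySQL, so it must be quoted with backticks):
<?php
// Run once, from PHP or directly in the mysql client
mysql_query("ALTER TABLE wordList ADD INDEX idx_group (`group`)");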
php.ini is a good first step, but you might find yourself on a server where you can't change those settings to something that will work. In that case it might make sense to break up what you're doing into chunks.
For example, if this is for output, you could fetch 25,000 results at a time using LIMIT, stick the output in a file, and then just use the file for your output. You can update the file once a night/hour/whatever with a cron job if you need the output to stay fresh.
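A sketch of that chunked approach, reusing the wordList table from the question above (the chunk size and output file name are arbitrary):
<?php
$chunk  = 25000;
$offset = 0;
$fh = fopen('wordlist_output.html', 'w');

do {
    $result = mysql_query("SELECT phrase, date FROM wordList WHERE `group`='nouns' LIMIT $offset, $chunk");
    $rows = 0;
    while ($row = mysql_fetch_array($result)) {
        fwrite($fh, "{$row['phrase']} created on {$row['date']}<br>\n");
        $rows++;
    }
    $offset += $chunk;
} while ($rows === $chunk); // a short chunk means we reached the end

fclose($fh);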
I need to create a script in PHP which computes permutations of numbers. But PHP has an execution time limit set to 60 seconds. How can I run the script so that, if it needs more than 60 seconds, it is not interrupted by the server? I know I can change the maximum execution time limit in PHP, but I want to hear another approach that does not require knowing the execution time of the script in advance.
A friend suggested I sign in and log out frequently from the server, but I have no idea how to do this.
Any advice is welcome. Example code would be useful.
Thanks.
First I need to enter a number, let's say 25. After this the script launches, and it needs to do the following: for every number <= 25 it will create a file with the numbers generated at the current stage; for the next number it will open the previously created file and create another file based on the lines of the opened file, and so on. Because this takes too long, I need to keep the script from being interrupted by the server.
#emanuel:
I guess when your friend suggested that you "sign in and log out frequently from the server", he/she must have meant "split your script's computation into x pieces of work and run them separately".
For example, you can let this script execute 150 times to compute 150! (the factorial of 150) and show the result:
// script name: calc.php
<?php
session_start();

if (!isset($_SESSION['times'])) {
    // First run: 1! = 1 (starting the product at 0 would make every result 0)
    $_SESSION['times']  = 1;
    $_SESSION['result'] = 1;
    header('Location: calc.php'); // kick off the next step
} elseif ($_SESSION['times'] < 150) {
    // One multiplication per request, then reload the page
    $_SESSION['times']++;
    $_SESSION['result'] *= $_SESSION['times'];
    header('Location: calc.php');
} elseif ($_SESSION['times'] == 150) {
    echo "The result is: " . $_SESSION['result'];
    die();
}
?>
BTW (#Davmuz), you can only use the set_time_limit() function on Apache servers; it's not a valid function on Microsoft IIS servers.
set_time_limit(0)
You could try to put the calls you want to make in a queue, which you serialize to a file (or a memory cache?) whenever an operation is done. Then you could use a cron daemon to execute this queue every sixty seconds, so it keeps doing the work and eventually finishes the task.
The drawbacks of this approach are the problems with adding to the queue, file locking and the like; and if you need the results immediately, this can prove troublesome. If you are adding stuff to a DB, it might work out. Also, it is not very efficient.
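A minimal sketch of that queue file, meant to be run by cron (the file name and job handling are invented for illustration):
<?php
// worker.php -- run from cron, e.g. once a minute:
// * * * * * php /path/to/worker.php
$queueFile = 'queue.dat';

$queue = file_exists($queueFile)
    ? unserialize(file_get_contents($queueFile))
    : array();

$start = time();
while ($queue && time() - $start < 50) { // stay inside the one-minute slot
    $job = array_shift($queue);
    // ... perform one unit of work described by $job ...
}

// Save whatever is left for the next run
file_put_contents($queueFile, serialize($queue));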
Use set_time_limit(0), but you have to disable safe_mode:
http://php.net/manual/en/function.set-time-limit.php
I suggest using a fixed time (set_time_limit(300)), because if there is a problem in the script (an endless loop or a memory leak), it cannot become a source of runaway problems.
The web server, such as Apache, also has a maximum time limit of 300 seconds, so you have to change that too. If you want to build a Comet application, it may be better to choose a web server other than Apache, one that can handle long-lived requests.
If you need a long execution time for a heavy algorithm, you can also implement parallel processing: http://www.google.com/#q=php+parallel+processing
Or store the input data and compute with another external script, via cron or whatever else.