I'm having a problem with my PHP file that takes more than 30 seconds to execute.
After searching, I added set_time_limit(0); at the start of the code, but the file still times out with a 500 error after 30 seconds.
log: PHP Fatal error: Maximum execution time of 30 seconds exceeded in /xxx/xx/xxx.php
safe-mode : off
Check the php.ini, or override the limit at runtime with ini_set():
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
ini_set('max_execution_time', 0);   // 0 = no limit
This is an old thread, but I thought I would post this link, as it helped me quite a bit on this issue. Essentially what it's saying is the server configuration can override the php config. From the article:
For example mod_fastcgi has an option called "-idle-timeout" which controls the idle time of the script. So if the script does not output anything to the fastcgi handler for that many seconds then fastcgi would terminate it. The setup is somewhat like this:
Apache <-> mod_fastcgi <-> php processes
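In practice that option is set on mod_fastcgi's FastCgiServer (or FastCgiConfig) directive; something like the following, where the path and value are just placeholders for your setup:
FastCgiServer /usr/lib/cgi-bin/php-cgi -idle-timeout 600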
The article has other examples and further explanation. Hope this helps somebody else.
I usually use set_time_limit(30) within the main loop (so each loop iteration is limited to 30 seconds rather than the whole script).
I do this in multiple database update scripts, which routinely take several minutes to complete but less than a second for each iteration - keeping the 30 second limit means the script won't get stuck in an infinite loop if I am stupid enough to create one.
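A minimal sketch of the pattern (the connection details, query and per-row work are placeholders, not my actual scripts):
<?php
// Each call to set_time_limit() restarts the timer, so the 30-second limit
// applies to a single iteration rather than to the whole script.
$db   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN
$stmt = $db->query('SELECT id FROM items');                         // placeholder query

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    set_time_limit(30);           // reset the clock for this iteration
    update_item($db, $row['id']); // placeholder for the real per-row update
}
?>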
I must admit that my choice of 30 seconds for the limit is somewhat arbitrary - my scripts could actually get away with 2 seconds instead, but I feel more comfortable with 30 seconds given the actual application - of course you could use whatever value you feel is suitable.
Hope this helps!
Use this:
ini_set('max_execution_time', 300);
Check out this example from the PHP manual; it may help you.
If you're using the PHP CLI SAPI and getting the error "Maximum execution time of N seconds exceeded", where N is an integer value, try calling set_time_limit(0) every M seconds or every iteration. For example:
<?php
require_once('db.php');
$stmt = $db->query($sql);
while ($row = $stmt->fetchRow()) {
set_time_limit(0);
// your code here
}
?>
I think you need to tell PHP to remove the execution time limit. Try this:
ini_set('max_execution_time', 0);
Related
I am using Laravel, and https://csv.thephpleague.com/ to parse the CSV.
My function is something like this:
$path = $request->file('import_file')->getRealPath();
$csv = Reader::createFromPath($path, 'r');
$csv->setHeaderOffset(0);
$csv_header = $csv->getHeader();
$sample_data = $csv->fetchOne();
$sample_data = array_values($sample_data);
$records = $csv->getRecords();
$csv_file_id = Csv_data::create([
'csv_filename' => $request->file('import_file')->getClientOriginalName(),
'csv_header' => json_encode($csv_header),
'csv_data' => json_encode($records)
]);
How can I parse large data sets without running into the execution time limit?
Well, I am pretty new to these things, so please don't just comment "use this and that". Up to now time has just been passing by while I try this and that package, so a solution with code snippets would be better.
Also, I tried this:
$stmt = (new Statement())
->offset($offset)
->limit($limit)
;
But with no success. The first reason: even when limiting with offset and running in a loop with an increasing offset, it shows the same execution time error. The second reason: it's a little difficult for me to end the loop with good logic.
Looking for some help. I will be available for instant reply.
Are you using a console command for this?
https://laravel.com/docs/5.6/artisan
If you run into memory limits when doing this through the console, you can first try to increase the PHP memory limit.
If that is still not enough, last option is to cut up .csv into parts.
But this should not be necessary unless you are dealing with a very, very large .csv (unlikely).
By default, PHP is usually set to let an HTTP request execute for 30 seconds before timing out -- to prevent a runaway script or infinite loop from processing forever. It sounds like this is what you're running into.
The quick and dirty method is to add ini_set('max_execution_time', 300); at the top of your script. This will tell php to run for 300 seconds (5 minutes) before timing out.
You can adjust that time as needed, but if it regularly takes longer than that, you may want to look at other options -- such as creating a console command (https://laravel.com/docs/5.6/artisan) or running it on a schedule.
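Either way, a rough sketch of processing the CSV in chunks with league/csv's Statement could look like the following. Csv_row is a placeholder model, $path is the uploaded file's path from the question, and bulk-inserting each chunk replaces json_encoding everything into a single row:
<?php
use League\Csv\Reader;
use League\Csv\Statement;

ini_set('max_execution_time', 300);      // or move this into a console command
$csv = Reader::createFromPath($path, 'r');
$csv->setHeaderOffset(0);

$offset = 0;
$limit  = 1000;                          // rows per chunk (placeholder value)
do {
    $records = (new Statement())
        ->offset($offset)
        ->limit($limit)
        ->process($csv);

    $chunk = iterator_to_array($records, false);
    if ($chunk === []) {
        break;                           // nothing left to read
    }

    Csv_row::insert($chunk);             // placeholder: bulk-insert this chunk

    $offset += $limit;
} while (count($chunk) === $limit);      // a short chunk means we reached the end
?>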
I have this file of 10 million words, one word on every line. I'm trying to open that file, read every line, put it in an array and count the number of occurrences of each word.
wartek
mei_atnz
sommerray
swaggyfeed
yo_bada
ronnieradke
… and so on (10M+ lines)
I can open the file, read its size, even parse it line by line and echo the line to the browser (it's very long, of course), but when I try to perform any other operation, the script just refuses to execute. No error, no warning, no die(…), nothing.
Accessing the file is always OK, but it's the other operations that don't succeed. I tried this and it worked…
while(!feof($pointer)) {
$row = fgets($pointer);
print_r($row);
}
… but this didn't:
while(!feof($pointer)) {
$row = fgets($pointer);
array_push($dest, $row);
}
I also tried SplFileObject and file($source, FILE_IGNORE_NEW_LINES), with the same result every time (not okay with the big file, okay with a small file).
Guessing that the issue is not the size (150 KB), but probably the length (10M+ lines), I chunked the file down to ~20k lines without any improvement, then reduced it again to ~8k lines, and it worked.
I also removed the time limit with set_time_limit(0); and removed (almost) any memory limit, both in php.ini and in my script with ini_set('memory_limit', '8192M');. Regarding the errors I could have, I set error_reporting(E_ALL); at the top of my script.
So the questions are:
Is there a maximum number of lines that can be read by PHP's built-in functions?
Why can I echo or print_r the lines, but not perform any other operation on them?
I think you might be running into a long execution time:
How to increase the execution timeout in php?
Different operations take different amounts of time. Printing is a lot cheaper than pushing 10M+ rows into an array one by one. It's strange that you don't get any error messages, though; you should see a "maximum execution time exceeded" error somewhere.
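For what it's worth, a rough sketch that counts the occurrences while streaming the file, instead of pushing every line into an array first (the file name is a placeholder):
<?php
set_time_limit(0);                       // 10M+ lines can still take a while
$counts  = [];                           // word => number of occurrences
$pointer = fopen('words.txt', 'r');      // placeholder file name

while (($row = fgets($pointer)) !== false) {
    $word = trim($row);
    if ($word === '') {
        continue;                        // skip blank lines
    }
    // Incrementing a counter keeps memory proportional to the number of
    // distinct words, not to the 10M+ lines themselves.
    $counts[$word] = ($counts[$word] ?? 0) + 1;
}
fclose($pointer);

arsort($counts);                         // most frequent words first
print_r(array_slice($counts, 0, 10, true));
?>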
I'm trying to develop a crontab task that checks my email every 5 seconds. Normally I could request it every minute instead of every 5 seconds, but while reading some other posts with no solution I found one with the same problem as mine: the script, after a period of time, was stopping. This is not a real problem because I can configure a crontab task and use sleep(5). I also have the same 1and1 server as in the other question, which I'm including here:
PHP script stops running arbitrarily with no errors
The real problem I had when I tried to solve this via crontab is that every minute a new PID was created, so within an hour I could have almost 50 processes running at the same time, all doing the same thing.
Here I include the .php file called by crontab every minute:
date_default_timezone_set('Europe/Madrid');
require_once ( $_SERVER['DOCUMENT_ROOT'] . '/folder1/path.php' );
require_once ( CLASSES . 'Builder.php');
$UIModules = Builder::getUIModules();
$UIModules->getfile();
So I found a solution by checking the process table. The idea is: if 2 processes are running in the process table, that means the previous run is still working, so this run just exits without doing anything. If there's just 1 process running, that means the previous run has finished, so this new one can do the work. It looks something like the following code:
$var_aux = exec("ps -A | grep php");
if (!is_array($var_aux)) {
    date_default_timezone_set('Europe/Madrid');
    require_once ( $_SERVER['DOCUMENT_ROOT'] . '/folder1/path.php' );
    require_once ( CLASSES . 'Builder.php');
    $UIModules = Builder::getUIModules();
    $UIModules->getfile();
}
I'm not sure about the condition is_array($var_aux), because $var_aux always gives me only the last process, so it is a string of 28 characters; but in this case we want more than one process back, so the condition could even be changed to if (strlen($var_aux) < 34). Note: I've given the length some margin, because sometimes a PID goes above 9999, so that's one character more.
The main problem I found with this is that the exec statement only gives me the last process; in other words, it always returns a string with a length of 28 (the PID line for that script).
I don't know if what I've proposed is a crazy idea, but is it possible to get the whole process table with PHP?
You can use a much simpler solution than emulating crontab in PHP: use crontab.
Make multiple entries so the check runs every 5 seconds, and then call your PHP program.
A good description of how to set up crontab to perform sub-minute actions can be found here:
https://usu.li/how-to-run-a-cron-job-every-x-seconds
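For illustration, the crontab entries would look something like this (the script path is a placeholder):
* * * * * /usr/bin/php /path/to/checkmail.php
* * * * * sleep 5; /usr/bin/php /path/to/checkmail.php
* * * * * sleep 10; /usr/bin/php /path/to/checkmail.php
(and so on, up to "sleep 55")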
This solution requires at most 12 processes running every minute.
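As for the side question about reading the whole process table from PHP: exec() only returns the last line of output, but its optional second argument is filled with every line, one element per line. A minimal sketch:
<?php
exec('ps -A | grep php', $lines);           // $lines receives every line of output
foreach ($lines as $line) {
    echo $line . PHP_EOL;                   // one "ps" row per running PHP process
}
echo 'PHP processes found: ' . count($lines) . PHP_EOL;
?>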
I have developed a metasearch engine and one of the optimisations I would like to make is to process the search APIs in parallel. Imagine that results are retrieved from Search Engine A in 0.24 seconds, SE B in 0.45 seconds and from SE C in 0.5 seconds. With other overheads the metasearch engine can return aggregated results in about 1.5 seconds, which is viable. Now what I would like to do is to send those requests in parallel rather than in series, as at present, and get that time down to under a second. I have investigated exec, forking and threading, and all, for various reasons, have failed. Now I have only spent a day or two on this, so I may have missed something. Ideally I would like to implement this on a WAMP stack on my development machine (localhost) and see about implementing it on a Linux webserver thereafter. Any help appreciated.
Let's take a simple example: say we have two files we want to run simultaneously. File 1:
<?php
// file1.php
echo 'File 1 - Test 1'.PHP_EOL;
$sleep = mt_rand(1, 5);
echo 'Start Time: '.date("g:i:sa").PHP_EOL;
echo 'Sleep Time: '.$sleep.' seconds.'.PHP_EOL;
sleep($sleep);
echo 'Finish Time: '.date("g:i:sa").PHP_EOL;
?>
Now, imagine file two is the same... the idea is that, if run in parallel, the command-line output for the times should be the same, for example:
File 1 - Test 1
Start Time: 9:30:43am
Sleep Time: 4 seconds.
Finish Time: 9:30:47am
But whether I use exec, popen or whatever, I just cannot get this to work in PHP!
I would use socket_select(). Doing so, only the connection time would be cumulative, as you can read from the sockets in parallel. This will give you a big performance boost.
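A rough sketch of the idea using the stream equivalent, stream_select() (the hosts and query are placeholders; the connections are still opened one after another, but the responses are read in parallel):
<?php
$hosts = ['a' => 'example.com', 'b' => 'example.org'];   // placeholder search engines

$streams = [];
foreach ($hosts as $key => $host) {
    $fp = stream_socket_client("tcp://$host:80", $errno, $errstr, 5);
    if ($fp === false) {
        continue;                       // skip engines we cannot reach
    }
    fwrite($fp, "GET /search?q=test HTTP/1.0\r\nHost: $host\r\n\r\n");
    stream_set_blocking($fp, false);    // so one slow engine cannot block another
    $streams[$key] = $fp;
}

$responses = array_fill_keys(array_keys($streams), '');
while ($streams) {
    $read  = array_values($streams);
    $write = $except = null;
    $ready = stream_select($read, $write, $except, 5);
    if ($ready === false || $ready === 0) {
        break;                          // error or timeout, give up
    }
    foreach ($read as $fp) {
        $key   = array_search($fp, $streams, true);
        $chunk = fread($fp, 8192);
        if ($chunk !== false && $chunk !== '') {
            $responses[$key] .= $chunk;
        }
        if (feof($fp)) {
            fclose($fp);
            unset($streams[$key]);      // this engine is done
        }
    }
}
// $responses now holds one raw HTTP response per engine, collected in parallel.
?>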
There is one viable approach. Make a CLI PHP file that receives what it has to do as arguments and returns whatever result it produces, serialized.
In your main app you may popen as many of these workers as you need and then in a simple loop collect the outputs:
[edit] I used your worker example, just had to chmod +x and add a #!/usr/bin/php line on top:
#!/usr/bin/php
<?php
echo 'File 1 - Test 1'.PHP_EOL;
$sleep = mt_rand(1, 5);
echo 'Start Time: '.date("g:i:sa").PHP_EOL;
echo 'Sleep Time: '.$sleep.' seconds.'.PHP_EOL;
sleep($sleep);
echo 'Finish Time: '.date("g:i:sa").PHP_EOL;
?>
I also modified the run script a little bit - ex.php:
#!/usr/bin/php
<?php
$pha=array();
$res=array();
$pha[1]=popen("./file1.php","r");
$res[1]='';
$pha[2]=popen("./file2.php","r");
$res[2]='';
foreach ($pha as $id => $ph) {
    while (!feof($ph)) {
        $res[$id] .= fread($ph, 8192);
    }
    pclose($ph);
}
echo $res[1].$res[2];
Here is the result when tested in the CLI (it's the same when ex.php is called from the web, but the paths to file1.php and file2.php should be fixed):
$ time ./ex.php
File 1 - Test 1
Start Time: 11:00:33am
Sleep Time: 3 seconds.
Finish Time: 11:00:36am
File 2 - Test 1
Start Time: 11:00:33am
Sleep Time: 4 seconds.
Finish Time: 11:00:37am
real 0m4.062s
user 0m0.040s
sys 0m0.036s
As seen in the result, one script takes 3 seconds to execute and the other takes 4; run together in parallel, both complete in 4 seconds.
[end edit]
In this way the slow operations will run in parallel; you only collect the results serially.
In total it will take (the slowest worker's time) + (the time for collecting) to execute. Since the time for collecting the results, unserializing, etc. can be ignored, you get all the data in the time of the slowest request.
As a side note, you may try the igbinary serializer, which is much faster than the built-in one.
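For example, assuming the igbinary extension is installed, it is a drop-in replacement:
$payload = igbinary_serialize($result);    // in the worker, instead of serialize()
$result  = igbinary_unserialize($payload); // in the collector, instead of unserialize()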
As noted in comments:
worker.php is executed outside of the web request, so you have to pass all of its state via arguments. Passing arguments can also be a problem (escaping, security, etc.), so an inefficient but simple way is to use base64.
A major drawback in this approach is that it is not easy to debug.
It can be further improved by using stream_select instead of fread and also collecting data in parallel.
Is there a way to make the loading of a page go slower? There are some processes which happen too fast to get a grip on, and I would like to see them a bit slower.
Is there anything I can do to slow down the loading-time of a page?
I need this because there is one CSS selector on which I need to change something, but I can't catch it with Firebug because the page loads too fast.
You can just use sleep() in PHP to make it delay the loading.
Here is an example from the PHP Manual:
<?php
// current time
echo date('h:i:s') . "\n";
// sleep for 10 seconds
sleep(10);
// wake up !
echo date('h:i:s') . "\n";
?>
http://uk.php.net/sleep
You can use sleep(seconds); (see HERE), but I suspect your application design should be improved if you need it...
Solution 1 (seconds based)
You could use
sleep($seconds);
where $seconds, as the variable name explains, is the number of seconds the script has to wait.
Solution 2 (microseconds based)
You can also use
usleep($microseconds);
to delay the execution in microseconds instead of seconds.
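For example, a quarter-second delay:
usleep(250000); // 250,000 microseconds = 0.25 seconds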
References
sleep()
usleep()
sleep().