Is there a way to make the loading of a page go slower? Some processes happen too fast to follow, and I would like to watch them a bit more slowly.
Is there anything I can do to slow down the loading time of a page?
I need this because there is one CSS selector on which I need to change something, but I can't catch it with Firebug because the page loads too fast.
You can just use sleep() in PHP to delay the page's loading.
Here is an example from the PHP Manual:
<?php
// current time
echo date('h:i:s') . "\n";
// sleep for 10 seconds
sleep(10);
// wake up !
echo date('h:i:s') . "\n";
?>
http://uk.php.net/sleep
You can use sleep($seconds) (see the PHP manual), but I suspect your application design needs improvement if you have to rely on it...
Solution 1 (seconds based)
You could use
sleep($seconds);
where $seconds, as the variable name suggests, is the number of seconds the script has to wait.
Solution 2 (microseconds based)
You can also use
usleep($microseconds);
to delay the execution in microseconds instead of seconds.
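As a quick illustration of both, a minimal sketch (the delay values are arbitrary):

```php
<?php
// Pause for 2 whole seconds, then for half a second more.
echo date('h:i:s'), "\n";

sleep(2);        // seconds-based delay
usleep(500000);  // microseconds-based delay: 500000 µs = 0.5 s

echo date('h:i:s'), "\n";  // roughly 2.5 seconds after the first timestamp
```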
References
sleep()
usleep()
Related
I am using Laravel, with https://csv.thephpleague.com/ to parse CSV files.
My function looks something like this:
$path = $request->file('import_file')->getRealPath();
$csv = Reader::createFromPath($path, 'r');
$csv->setHeaderOffset(0);
$csv_header = $csv->getHeader();
$sample_data = $csv->fetchOne();
$sample_data = array_values($sample_data);
$records = $csv->getRecords();
$csv_file_id = Csv_data::create([
'csv_filename' => $request->file('import_file')->getClientOriginalName(),
'csv_header' => json_encode($csv_header),
'csv_data' => json_encode($records)
]);
How can I parse large data sets while staying within the execution time limit?
Well, I'm pretty new to these things, so please don't just comment "use this or that". Up to now I've only been passing time trying one package after another, so a solution with code snippets would be better.
I also tried:
$stmt = (new Statement())
->offset($offset)
->limit($limit)
;
But with no success. The first reason: even when limiting with an offset and running in a loop with an increasing offset, it shows the same execution time error. The second reason: it's a little difficult for me to end the loop with good logic.
Looking for some help. I will be available for an instant reply.
Are you using a console command for this?
https://laravel.com/docs/5.6/artisan
If you run into memory limits when running it through the console, you can first try to increase the PHP memory limit.
If that is still not enough, the last option is to cut the .csv up into parts.
But this should not be necessary unless you are dealing with a very, very large .csv (unlikely).
By default, PHP is usually set to execute a HTTP request for 30 seconds before timing out -- to prevent a runaway script or infinite loop from processing forever. It sounds like this is what you're running into.
The quick and dirty method is to add ini_set('max_execution_time', 300); at the top of your script. This tells PHP to run for 300 seconds (5 minutes) before timing out.
You can adjust that time as needed, but if it regularly takes longer than that, you may want to look at other options -- such as creating a console command (https://laravel.com/docs/5.6/artisan) or running it on a schedule.
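If you stay inside a web request, the Statement approach from the question can be made to terminate cleanly by stopping once a chunk comes back empty. A sketch, assuming league/csv 9 and the Csv_data model from the question ($path and $filename are assumed to be set as in the question; note this stores one database row per chunk, unlike the single insert in the original code):

```php
<?php
// Sketch: process a large CSV in fixed-size chunks with league/csv's Statement.
use League\Csv\Reader;
use League\Csv\Statement;

ini_set('max_execution_time', 300); // or better: run this in an artisan command

$csv = Reader::createFromPath($path, 'r');
$csv->setHeaderOffset(0);
$header = $csv->getHeader();

$offset = 0;
$limit  = 500;                     // rows per chunk; tune to taste
while (true) {
    $stmt    = (new Statement())->offset($offset)->limit($limit);
    $records = $stmt->process($csv);

    $chunk = iterator_to_array($records, false);
    if (count($chunk) === 0) {
        break;                     // no more rows: this ends the loop cleanly
    }

    Csv_data::create([
        'csv_filename' => $filename,
        'csv_header'   => json_encode($header),
        'csv_data'     => json_encode($chunk),
    ]);

    $offset += $limit;
}
```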
I am writing a simple script to test that I can echo inside my while loop before it reaches 60 seconds, but the problem is that nothing is echoed inside the loop. I don't know if the body of the while loop is even being executed. Then my browser crashes.
$timelimit = 60; //seconds
set_time_limit($timelimit);
$start_time = time(); //set startup time;
while(((time() - $start_time) < $timelimit) || !$timelimit){
echo "executing..<br/>";
}
Thank you in advance.
This is a very tight loop. It will run very fast and will create a very large output, which will eventually kill the browser (it will have hundreds of thousands of lines). You may add some delay to your loop:
while(((time() - $start_time) < $timelimit) || !$timelimit){
sleep(1); // pause for 1 second
echo "executing..<br/>";
}
In this case the output will be only 60 lines, and the browser should render it after a minute of waiting.
CPU execution is very fast (roughly 10^-9 seconds per instruction). Your loop runs for 60 seconds, so consider how many iterations occur (possibly hundreds of millions). If you try to print something on every one of them, your browser will be killed.
If you're expecting to see the output as the script generates it, you'll want to add flush(); after your echo. However, if I recall correctly, PHP may still wait to send the output until a certain number of bytes (1024, maybe?) have accumulated.
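Putting the pieces together, a sketch of the loop with incremental output (the 1024-byte padding is an assumption; the exact buffering threshold depends on the SAPI and any proxies in between):

```php
<?php
$timelimit  = 60;                 // seconds
$start_time = time();

// Disable PHP's own output buffering so each echo can be sent immediately.
while (ob_get_level() > 0) {
    ob_end_flush();
}

while ((time() - $start_time) < $timelimit) {
    echo "executing..<br/>\n";
    echo str_repeat(' ', 1024);   // padding: some setups hold small writes back
    flush();                      // push the buffered bytes to the client
    sleep(1);                     // one line per second instead of millions
}
```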
I have developed a metasearch engine, and one of the optimisations I would like to make is to query the search APIs in parallel. Imagine that results are retrieved from Search Engine A in 0.24 seconds, from SE B in 0.45 seconds, and from SE C in 0.5 seconds. With other overheads, the metasearch engine can return aggregated results in about 1.5 seconds, which is viable. Now what I would like to do is send those requests in parallel rather than in series, as at present, and get that time down to under a second. I have investigated exec, forking, threading and more, and all of them, for various reasons, have failed. I have only spent a day or two on this, so I may have missed something. Ideally I would like to implement this on a WAMP stack on my development machine (localhost) and look at implementing it on a Linux webserver thereafter. Any help appreciated.
Let's take a simple example: say we have two files we want to run simultaneously. File 1:
<?php
// file1.php
echo 'File 1 - Test 1'.PHP_EOL;
$sleep = mt_rand(1, 5);
echo 'Start Time: '.date("g:i:sa").PHP_EOL;
echo 'Sleep Time: '.$sleep.' seconds.'.PHP_EOL;
sleep($sleep);
echo 'Finish Time: '.date("g:i:sa").PHP_EOL;
?>
Now, imagine file two is the same... the idea is that, if they run in parallel, the command-line output of the times should overlap, for example:
File 1 - Test 1
Start Time: 9:30:43am
Sleep Time: 4 seconds.
Finish Time: 9:30:47am
But whether I use exec, popen or whatever, I just cannot get this to work in PHP!
I would use socket_select(). That way only the connection time is cumulative, since you can read from the sockets in parallel. This will give you a big performance boost.
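If the engines are plain HTTP APIs, curl_multi gives the same effect without hand-rolling sockets, and it works on both WAMP and Linux. A sketch (the URLs are placeholders, not real endpoints):

```php
<?php
// Sketch: fire several HTTP requests in parallel with curl_multi.
// The URLs below are placeholders for your search-engine API endpoints.
$urls = [
    'A' => 'https://example.com/search-a?q=term',
    'B' => 'https://example.com/search-b?q=term',
    'C' => 'https://example.com/search-c?q=term',
];

$mh      = curl_multi_init();
$handles = [];
foreach ($urls as $id => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_multi_add_handle($mh, $ch);
    $handles[$id] = $ch;
}

// Drive all transfers until every one has finished.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);  // wait for activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

$results = [];
foreach ($handles as $id => $ch) {
    $results[$id] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

// Total wall time is roughly the slowest request, not the sum of all three.
```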
There is one viable approach: make a CLI PHP file that receives what it has to do as arguments and prints whatever result it produces, serialized.
In your main app you may popen() as many of these workers as you need and then collect their output in a simple loop:
[edit] I used your worker example; I just had to chmod +x it and add a #!/usr/bin/php line on top:
#!/usr/bin/php
<?php
echo 'File 1 - Test 1'.PHP_EOL;
$sleep = mt_rand(1, 5);
echo 'Start Time: '.date("g:i:sa").PHP_EOL;
echo 'Sleep Time: '.$sleep.' seconds.'.PHP_EOL;
sleep($sleep);
echo 'Finish Time: '.date("g:i:sa").PHP_EOL;
?>
I also modified the run script a little bit, ex.php:
#!/usr/bin/php
<?php
$pha=array();
$res=array();
$pha[1]=popen("./file1.php","r");
$res[1]='';
$pha[2]=popen("./file2.php","r");
$res[2]='';
foreach ($pha as $id => $ph) { // each() was removed in PHP 8
    while (!feof($ph))
        $res[$id] .= fread($ph, 8192);
    pclose($ph);
}
echo $res[1].$res[2];
Here is the result when tested in the CLI (it's the same when ex.php is called from the web, but the paths to file1.php and file2.php should be fixed):
$ time ./ex.php
File 1 - Test 1
Start Time: 11:00:33am
Sleep Time: 3 seconds.
Finish Time: 11:00:36am
File 2 - Test 1
Start Time: 11:00:33am
Sleep Time: 4 seconds.
Finish Time: 11:00:37am
real 0m4.062s
user 0m0.040s
sys 0m0.036s
As seen in the result, one script takes 3 seconds to execute and the other takes 4, yet run in parallel both complete within 4 seconds.
[end edit]
In this way the slow operations run in parallel; you only collect the results serially.
In total it will take (the slowest worker's time) + (the time for collecting) to execute. Since the time for collecting the results, unserializing and so on can be ignored, you get all the data in the time of the slowest request.
As a side note, you may try the igbinary serializer, which is much faster than the built-in one.
As noted in comments:
worker.php is executed outside of the web request, and you have to pass all of its state via arguments. Passing arguments can itself be a problem (you have to handle all the escaping, security and so on), so a simple, if inefficient, way is to base64-encode them.
A major drawback of this approach is that it is not easy to debug.
It can be further improved by using stream_select() instead of fread(), so the data is also collected in parallel.
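For what it's worth, that stream_select() improvement could look roughly like this (a sketch reusing the file1.php/file2.php workers from the example above; error handling kept minimal):

```php
<?php
// Sketch: drain several worker pipes as data arrives, instead of reading
// them one after another. Assumes the file1.php / file2.php workers above.
$pha = [
    1 => popen('./file1.php', 'r'),
    2 => popen('./file2.php', 'r'),
];
$res = [1 => '', 2 => ''];

foreach ($pha as $ph) {
    stream_set_blocking($ph, false);  // never block on a single pipe
}

while ($pha) {
    $read  = array_values($pha);
    $write = $except = null;
    if (stream_select($read, $write, $except, 1) === false) {
        break;                        // select error
    }
    foreach ($read as $ph) {
        $id   = array_search($ph, $pha, true);
        $data = fread($ph, 8192);
        if (($data === '' || $data === false) && feof($ph)) {
            pclose($ph);
            unset($pha[$id]);         // this worker is done
        } else {
            $res[$id] .= $data;
        }
    }
}
echo $res[1] . $res[2];
```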
I'm having a problem with a PHP file that takes more than 30 seconds to execute.
After searching, I added set_time_limit(0); at the start of the code, but the file still times out with a 500 error after 30 seconds.
log: PHP Fatal error: Maximum execution time of 30 seconds exceeded in /xxx/xx/xxx.php
safe-mode : off
Check your php.ini, or set the limit at runtime:
ini_set('max_execution_time', 300); //300 seconds = 5 minutes
ini_set('max_execution_time', 0); //0=NOLIMIT
This is an old thread, but I thought I would post this link, as it helped me quite a bit with this issue. Essentially, what it says is that the server configuration can override the PHP config. From the article:
For example, mod_fastcgi has an option called "-idle-timeout" which controls the idle time of the script. If the script does not output anything to the FastCGI handler for that many seconds, FastCGI will terminate it. The setup looks somewhat like this:
Apache <-> mod_fastcgi <-> php processes
The article has other examples and further explanation. Hope this helps somebody else.
I usually use set_time_limit(30) within the main loop (so each loop iteration is limited to 30 seconds rather than the whole script).
I do this in multiple database update scripts, which routinely take several minutes to complete but less than a second per iteration; keeping the 30-second limit means the script won't get stuck in an infinite loop if I am stupid enough to create one.
I must admit that my choice of 30 seconds for the limit is somewhat arbitrary; my scripts could actually get away with 2 seconds instead, but I feel more comfortable with 30 seconds given the actual application. Of course, you can use whatever value you feel is suitable.
Hope this helps!
Use this:
ini_set('max_execution_time', 300);
Check out the following; it is from the PHP manual and may help you.
If you're using the PHP_CLI SAPI and getting the error "Maximum execution time of N seconds exceeded" (where N is an integer), try calling set_time_limit(0) every M seconds or on every iteration. For example:
<?php
require_once('db.php');
$stmt = $db->query($sql);
while ($row = $stmt->fetchRow()) {
set_time_limit(0);
// your code here
}
?>
I think you must raise PHP's execution time limit; try this:
ini_set('max_execution_time', 0);
I just want to print a count from 1 to 10 with an interval of 10 seconds between each integer.
eg.
$i=10; //Time delay
for($j=1;$j<11;$j++)
{
echo $j;
//do something to delay the execution by $i seconds
}
I have tried everything, including flush(), ob_flush() and ob_implicit_flush(), but all I get is a frozen screen until the whole time has elapsed.
http://php.net/manual/en/function.sleep.php
The sleep function will pause execution of your script.
But have you considered using JavaScript for something like this? Your script may hit the maximum execution time and will hog resources on the server. Use the client's resources instead!
What you want is much more JavaScript-related than PHP. Because PHP is server-side, it is not designed for this kind of operation. You COULD get it to work, but it would not be pretty.
In my view, counting from 1 to 10 should not involve the server at all. You can do it directly in the browser, hence use JavaScript.
Do you want to print the countdown while your PHP script is running?
If yes, then try this non-recommended fragment:
ob_start();
for($i=0;$i<10;$i++) {
echo str_repeat(" ",10000);
echo 'printing...<br />';
ob_flush();
flush();
sleep(1);
}
You see the strange line:
echo str_repeat(" ",10000);
It seems that browsers need some "data" before deciding to actually flush your output.
Use JavaScript for real-time counters.
Use jQuery. On $(document).ready, add a delay of 10 seconds before showing a specific div that contains the content meant to appear after 10 seconds.
For ready - http://api.jquery.com/ready/
For delay - http://api.jquery.com/delay/
Yes, use JavaScript, as it's not possible to accomplish this reliably with PHP over HTTP because of output buffering.