Microtime inaccurate time measurement? - php

I just found out about microtime() in PHP.
I tried to check how long it would take to execute a basic image load.
Here is the code:
<?php
$start = microtime(true);
echo("<img src='http://example.com/public/images/new.png'/>");
$time_elapsed_secs = microtime(true) - $start;
echo($time_elapsed_secs);
?>
On average it returns "8.8214874267578E-6", which I assume means 8.82 seconds?
Did I do something wrong? I am sure the image loads faster than 8 seconds; I would definitely notice 8 seconds.

The E-6 at the end of that string means you move the decimal point six places to the left, so the value is about 0.0000088 seconds (roughly 9 microseconds), not 8.8 seconds.
Also, the echo statement executes almost instantly: all it does is write that HTML to the output stream. That says nothing about how long the image takes to load in some remote browser that later reads the HTML and requests the image.
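If the scientific notation is what threw you off, you can print the elapsed time in fixed-point form instead; here is a minimal sketch using the same measurement (the image URL is just the one from your question):
<?php
$start = microtime(true);
echo("<img src='http://example.com/public/images/new.png'/>");
$time_elapsed_secs = microtime(true) - $start;

// Force fixed-point output: 8.8214874267578E-6 becomes 0.0000088215,
// which is easier to read at a glance than scientific notation.
printf("%.10f seconds\n", $time_elapsed_secs);
?>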

Related

executing script before 60 seconds

I am creating a simple script to test whether I can echo inside my while loop before it reaches 60 seconds, but the problem is that nothing is echoed inside the loop. I don't know if the loop body really executes, and then my browser crashes.
$timelimit = 60; // seconds
set_time_limit($timelimit);
$start_time = time(); // set start time
while (((time() - $start_time) < $timelimit) || !$timelimit) {
    echo "executing..<br/>";
}
Thank you in advance.
This is a very tight loop. It runs very fast and produces a huge amount of output (hundreds of thousands of lines), which will eventually kill the browser. You can add some delay to the loop:
while (((time() - $start_time) < $timelimit) || !$timelimit) {
    sleep(1); // pause for 1 second
    echo "executing..<br/>";
}
In this case the output will be only 60 lines, and the browser should render it after a minute of waiting.
CPU execution is very fast (approximately 10^-9 s per instruction). Your loop runs for 60 seconds, so an enormous number of iterations (perhaps around 300,915,626) will occur. If you try to print something on every one of them, your browser will be killed.
If you're expecting to see the output as the script generates it, you'll want to add a flush(); after your echo. However, if I recall correctly, PHP will still hold the output back until a certain number of bytes (1024, maybe?) has accumulated.
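Putting those suggestions together, here is a minimal sketch of a loop that throttles itself and tries to push each line out as it is produced; whether you actually see the lines arrive one by one also depends on web server and browser buffering, which varies by setup:
<?php
$timelimit = 60; // seconds
set_time_limit($timelimit);
$start_time = time();

while ((time() - $start_time) < $timelimit) {
    echo "executing..<br/>\n";
    echo str_repeat(' ', 1024);  // padding to get past small internal buffers
    if (ob_get_level() > 0) {
        ob_flush();              // flush PHP's output buffer, if one is active
    }
    flush();                     // ask the SAPI / web server to send it now
    sleep(1);                    // throttle to roughly 60 iterations total
}
?>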

PHP performance file_get_contents() vs readfile() and cat

I am doing some benchmarking with PHP file reading functions just for my overall knowledge.
So I tested three different ways to read the whole content of a file that I thought would be very fast.
file_get_contents(), well-known for its very high performance
readfile() known to be a very good alternative to file_get_contents() when it comes to outputting the data directly to stdout
exec('cat filename') one very handy and fast UNIX command
So here is my benchmarking code. Note that for readfile() I enabled output buffering to suppress the direct output, which would otherwise totally falsify the results.
<?php
/* Using a quick PNG file to benchmark with a big file */
/* file_get_contents() benchmark */
$start = microtime(true);
$foo = file_get_contents("bla.png");
$end = microtime(true) - $start;
echo "file_get_contents() time: " . $end . "s\n";
/* readfile() benchmark */
ob_start();
$start = microtime(true);
readfile('bla.png');
$end = microtime(true) - $start;
ob_end_clean();
echo "readfile() time: " . $end . "s\n";
/* exec('cat') benchmark */
$start = microtime(true);
$bar = exec('cat bla.png');
$end = microtime(true) - $start;
echo "exec('cat filename') time: " . $end . "s\n";
?>
I have run this code several times to confirm the results, and every time I got the same ordering. Here is an example of one run:
$ php test.php
file_get_contents() time: 0.0006861686706543s
readfile() time: 0.00085091590881348s
exec('cat filename') time: 0.0048539638519287s
As you can see, file_get_contents() comes first, then readfile(), and finally cat.
As for cat, even though it is a UNIX command (so fast and everything :)), I understand that the overhead of spawning a separate binary can explain its relatively high result.
But what I have difficulty understanding is why file_get_contents() is faster than readfile(); the latter is about 1.3 times slower, after all.
Both functions are built-in and therefore pretty well optimized, and since I enabled output buffering, readfile() is not "trying" to write the data to stdout; just like file_get_contents(), it ends up putting the data in RAM.
I am looking for a technical, low-level explanation of the pros and cons of file_get_contents() and readfile(), beyond the fact that one is designed to write directly to stdout whereas the other allocates memory for the data.
Thanks in advance.
file_get_contents() only loads the data from the file into memory, while both readfile() and cat also output the data to the screen, so they simply do more work.
If you want to compare file_get_contents() to the others on equal terms, add an echo in front of it.
Also, you are not freeing the memory allocated for $foo. There is a chance that if you move the file_get_contents() test to the end, you will get a different result.
Additionally, you are using output buffering for only one of the tests, which also causes some difference; wrap the other functions in output buffering as well to remove that variable.
When comparing different functions, the rest of the code should be identical, otherwise you are open to all kinds of outside influences.
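As a minimal sketch of such a levelled benchmark (assuming bla.png exists in the working directory), every variant below runs inside the same output buffering and actually emits the data; note that passthru() stands in for exec() here, since exec() only returns the last line of output rather than streaming the file:
<?php
// Run each variant inside identical output buffering so all three do
// comparable work; the buffered output is discarded the same way each time.
function bench($label, callable $fn) {
    ob_start();
    $start = microtime(true);
    $fn();
    $elapsed = microtime(true) - $start;
    ob_end_clean();
    echo $label . " time: " . $elapsed . "s\n";
}

bench("file_get_contents()", function () { echo file_get_contents('bla.png'); });
bench("readfile()",          function () { readfile('bla.png'); });
bench("passthru('cat')",     function () { passthru('cat bla.png'); });
?>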
file_get_contents() is generally the better choice when you actually want the data in your script: it reads the whole file into a PHP string, which you can then manipulate, cache, or reuse as you like.
readfile(), by contrast, reads the file chunk by chunk and writes it straight to the output buffer, returning only the number of bytes read, so there is no string left to work with afterwards.
If for some reason you cannot use file_get_contents(), you can still capture readfile()'s output with PHP's output buffering: start a buffer with ob_start() before calling readfile(), then collect the buffer contents (for example with ob_get_clean()), or send them on to the client with ob_end_flush(). That way the file contents end up in memory, much like what file_get_contents() would have given you.
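A minimal sketch of that capture pattern, again assuming a local bla.png:
<?php
// Capture readfile()'s output into a string via output buffering, so it
// can be manipulated or cached much like the return value of
// file_get_contents('bla.png').
ob_start();
readfile('bla.png');
$data = ob_get_clean();   // buffer contents as a string; nothing is sent to the client

echo strlen($data) . " bytes read\n";
?>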

Small PHP Code - Causing Server Load? Help Optimize

I have this small bit of code (listed at the end of this message), that runs on page load. We get around 50,000 UNIQUE visitors per day (not counting repeats). It could be coincidental, but ever since implementation, there have been random server load issues.
So what I'm asking is...
1) Can someone confirm/deny whether or not the below code can in fact cause issues?
2) Can this be optimized?
Just fyi:
-- I have stuck this function in the HEADER file of a WordPress layout.
-- It is called 10+ times in the footer
-- It is a VPS server using NGINX
-- I have not checked the logs just yet
The code's purpose...
We specify a percentage to the function that tells the code to display a string that percent of the time (so if we put 60, then it means the string should show up 60% of the time). Each entry in the footer generates its own random number.
The code:
function writeRndString($theString, $percent) {
    $randno = rand(1, 100);
    if ($randno <= (int)$percent) {
        echo "Random String: " . $theString;
        echo "\n\n";
    }
}
This is a very simple function and should be fast, even if you call it several times per page. Even with 50,000 unique visitors per day, that is on the order of one or two page loads per second.
If you can, simply remove it for a few minutes and check the server load. It could be called a lot more times than you assume :)
Maybe...
You forgot a $ in: echo "Random String: " . theString;
And a little better, maybe: don't create variables you don't actually need.
Maybe also use return:
function writeRndString($theString, $percent) {
    if (rand(1, 100) <= (int) $percent) {
        return "Random String: " . $theString . "\n\n";
    }
}
PHP:
<?php
echo "blablabla" . writeRndString($x, $y);
?>

Inconsistent loading time for JS and PHP

I have a PHP script being loaded by JS through JQuery's $.ajax.
I measured the execution time of the PHP script using:
$start = microtime(); // top most part of code
// all other processes that includes AES decryption
$end = microtime(); // bottom part of code
file_put_contents('LOG.TXT','TIME IT TOOK: '.($end-$start)."\n",FILE_APPEND);
It measured somewhere less than 1 second. There are no prepend/append PHP scripts.
In the JS $.ajax code, I have measured the execution time by:
success: function(response) {
console.log(date('g:i:s a') + ' time received\n');
// all other processes including AES decryption
console.log(date('g:i:s a') + ' time processed\n');
}
The time is the same for the time received and the time processed.
However, when I check the Chrome Developer Tools, it claims that the PHP script loaded for about 8 seconds.
What could be wrong in how I measured these things?
I'm certain that PHP is loading fast but how come Chrome reports that it took more than 8 seconds?
I'm using localhost and my web server is fast and this is the only time I encountered this problem. All other AJAX calls are fast.
In the PHP section, make sure you're using microtime(true) so that you're working with floating-point numbers instead of strings.
Without the true argument, microtime() returns a string such as "0.12345600 1609459200" ("msec sec"), and subtracting two of those strings only uses the leading numeric part, which yields a meaningless result.
Example: http://ideone.com/FWkjF2
<?php
// Wrong
$start = microtime();
sleep(3);
$stop = microtime();
echo ($stop - $start) . PHP_EOL; // Prints 8.000000000008E-5
// Correct
$start = microtime(true);
sleep(3);
$stop = microtime(true);
echo ($stop - $start) . PHP_EOL; // Prints 3.0000791549683
?>
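For completeness, if you are stuck with the legacy string form of microtime() somewhere, you can convert it to a float before subtracting; a small sketch:
<?php
// microtime() without true returns "msec sec"; add the two parts to get
// a float timestamp that can be subtracted safely.
list($usec, $sec) = explode(' ', microtime());
$start = (float)$sec + (float)$usec;

sleep(3);

list($usec, $sec) = explode(' ', microtime());
$stop = (float)$sec + (float)$usec;

echo ($stop - $start) . PHP_EOL; // Prints roughly 3.0
?>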

Is there a way to put the number of seconds a script takes to execute (or "page load time") in the middle of the page?

I want to put the number of seconds a search script takes to search in PHP. But, I don't want to put that number at the bottom of the page. I put this at the top of the PHP file:
$time_start = microtime(true);
And then I put this where I am echoing out the number:
$time_end = microtime(true);
echo number_format($time_end - $time_start, 5);
The problem is that this doesn't count the whole script as there's still a lot of other code under that. Is there a way to determine how long it takes to execute but echo it in another place on the page rather than the bottom?
I can think of three possibilities:
Echo the value inside a JavaScript snippet at the bottom of the page. The script can modify the DOM as soon as the page is loaded, so the value can be inserted anywhere in the document. Disadvantage: won't work for clients that don't support, or ignore, JavaScript.
Cheat and stop the timer early. You'll have to do all the time-intensive work before the point in the document where you want the time (i.e. preload all results into memory and echo them afterwards), so that the part you're not measuring is negligible. Disadvantage: it's not completely accurate, and the preloading can be a hassle or memory-inefficient.
Buffer all output with the output control functions and perform a search-and-replace on the buffer contents at the end of the script (see the sketch after this list). Disadvantage: could be inefficient depending on your situation.
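A minimal sketch of that third option; the placeholder token ELAPSED_PLACEHOLDER is just an example name:
<?php
$time_start = microtime(true);
ob_start();

// Emit a placeholder where the timing should eventually appear.
echo "<p>Search took ELAPSED_PLACEHOLDER seconds.</p>\n";
// ... the rest of the page, including the slow search code ...

// At the very end, swap the placeholder for the real figure and send the page.
$elapsed = number_format(microtime(true) - $time_start, 5);
echo str_replace('ELAPSED_PLACEHOLDER', $elapsed, ob_get_clean());
?>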
Not precisely, but you can get a very close approximation by using output buffering:
<?php
$time_start = microtime(true);
ob_start();
// Do stuff here
$page = ob_get_contents();
ob_end_clean();
$time_end = microtime(true);
$elapsed = number_format($time_end - $time_start, 5);
var_dump($page, $elapsed);
?>
