I made a website recently and it's loading very slowly. My Firebug Page Speed score is 82/100, which I think is good. My website has 2 images of around 100 KB, plus some other small images for bullets, arrows and the like that amount to no more than 50 KB.
Anyway, my point is that the HTML itself is quite fast, but I'm running this HTML as a WordPress theme on a new (cheap) host.
My question: how can I find out how long it takes WordPress (or any PHP script) to echo out the requested page?
Can I know for sure whether it's the hosting or the script that is making my page slow?
Thank you!
Time your script:
<?php
$start = microtime(true);
//your script here
$end = microtime(true);
$time = $end - $start;
echo('script took ' . $time . ' seconds to execute.');
?>
I use Fiddler for this. It is a handy tool that can accurately time the entire request and response, to and from the server.
Just open Fiddler, go to your site, select all of the related requests, and click the Statistics tab.
I have a website developed with PHP from scratch, and I want to know the load time of my PHP script (my page) on localhost.
Is there any method or tool to check that? I tried using the Performance panel in Google Chrome, but it's very hard to understand, and after some searching on Google I found the Firebug plugin for Firefox, but it has been discontinued.
Is there any solution?
If you want to measure the loading time browser-side, as Madhawa commented, you can view it in the Network tab of Chrome's developer tools.
If you want the exact execution time of your script, it should be done server-side by checking the time before and after the script runs. Daniel's link should point you in the right direction.
Try this:
<?php
// Record the start time at the very top of the page
$time = explode(' ', microtime());
$start = $time[1] + $time[0];

// ... the rest of your page ...

// Record the end time at the very bottom and show the difference
$time = explode(' ', microtime());
$finish = $time[1] + $time[0];
echo 'Page generated in ' . round($finish - $start, 4) . ' seconds.';
?>
I had a site working nicely in PHP. Then an acquaintance who works at Google got in my ear about how AJAX is the thing. So now I've rejigged the site to be just HTML & JavaScript on the client side, making AJAX JSON calls to PHP scripts calling MySQL on the server side. Good caching, no page reloads. It was a good idea.
My question now, however, is this: how much faster might the site run if the server side were in a compiled language (say, C or C++) rather than PHP? (My original prototype was in C - in the terminal!) The PHP scripts are all basic security processing and then database calls to return JSON. How can you benchmark relative speeds? Just using Firebug POST data in the console? Or is there a better way?
Thanks :)
If it is that simple, PHP interpretation will be a tiny fraction of the execution time, most of which will be taken up by the connection to the DB server. You can always look at Facebook's HipHop - http://developers.facebook.com/blog/post/358/ - though.
For benchmarking, something like
// first line of your code
$start_time = microtime( $get_as_float = TRUE );
// last line of your code
$time_end = microtime( $get_as_float = TRUE );
$execution_time = $time_end - $start_time;
echo '<!-- executed in: ', $execution_time , ' seconds -->';
will go a long way
I wouldn't use the times in Firebug, as those depend on your connection speed/quality, etc.
I am looking for a way to check the user's connection speed. It is supposed to be saved as a cookie, and JavaScript files as well as CSS files will be adapted if the speed is slow.
The approach to testing the speed that I have at the moment is the following:
$kb = 512;
flush();

// open an HTML comment so the test data is not visible on the page
echo "<!--";
$time = explode(" ", microtime());
for ($x = 0; $x < $kb; $x++) {
    echo str_pad('', 512, '.');
    flush();
}
$time_end = explode(" ", microtime());
echo "-->";

$start  = $time[0] + $time[1];
$finish = $time_end[0] + $time_end[1];
$deltat = $finish - $start;

// approximate download speed in KB per second
return round($kb / $deltat, 3);
While it works, I don't like putting so many characters into my output. Also, once I have echoed all of this, I can no longer save the result in a cookie because output has already been sent.
Could one do something like this in a different file or something? Do you have any solution?
Thanks in advance.
Do you have any solution?
My solution is to not bother with the speed test at all. Here's why:
You stated that the reason for the test is to determine which JS/CSS files to send. You have to keep in mind that browsers will cache these files after the first download (so long as they haven't been modified). So in effect, you are sending 256K of test data to determine if you should send, say, an additional 512K?
Just send the data and it will be cached. Unless you have MBs of JS/CSS (in which case you need a site redesign, not a speed test) the download time will be doable. Speed tests should be reserved for things such as streaming video and the like.
The only idea I can come up with is a redirect:
- Measure the user's speed
- Redirect to the index page
While this isn't a nice solution, it only needs to measure the user's speed once, so I think it's excusable.
How about using JavaScript to time how long it takes to load a page, and then using JavaScript to set the cookie?
microtime in JavaScript: http://phpjs.org/functions/microtime:472
Using jQuery
<head>
<!-- include jquery & other html snipped -->
<script>
function microtime (get_as_float) {
// http://kevin.vanzonneveld.net
// + original by: Paulo Freitas
// * example 1: timeStamp = microtime(true);
// * results 1: timeStamp > 1000000000 && timeStamp < 2000000000
var now = new Date().getTime() / 1000;
var s = parseInt(now, 10);
return (get_as_float) ? now : (Math.round((now - s) * 1000) / 1000) + ' ' + s;
}
function setCookie(c_name, value, expiredays) {
var exdate=new Date();
exdate.setDate(exdate.getDate()+expiredays);
document.cookie=c_name+ "=" +escape(value)+
((expiredays==null) ? "" : ";expires="+exdate.toUTCString());
}
var start = microtime(true);
$(window).load(function () {
    // everything finished loading
    var end = microtime(true);
    var diff = end - start;
    // save in a cookie for the next 30 days
    setCookie('my_speed_test_cookie', diff, 30);
});
</script>
</head>
<body>
<p>some page to test how long it loads</p>
<img src="some_image_file.png">
</body>
Some pitfalls:
- The page would need to start loading first, and jQuery would need to be loaded (or you can rework the above code to avoid jQuery).
- Testing speed on ASCII/Latin data may not give the best result, because the characters may get compressed. Besides high-level gzip compression, some modems/lines (if not all) have basic compression that can detect repeating characters and tell the other end that the next 500 are a repeat of ' '. I guess it would be best to use binary data that has already been compressed.
The problem here is that you can't really solve this nicely, and probably not in pure PHP. The approach you've taken will make the user download (512x512) = 262 144 bytes of useless data, which is much bigger than most complete pages. If the user is on a slow connection, they may assume your site is down before the speed test is over (with 10 kB/sec, it'd take half a minute before anything interesting shows up on screen!).
You could make an AJAX request for a file of a known size and time how long that takes. The problem here is that the page needs to be already loaded for that to work, so it'd only work for subsequent pages.
You could make a "loading" page (like you see on GMail when accessing it from a slow connection) that preloads the data, with a link to the low-bandwidth version (or maybe a redirect if the loading is taking too long).
Or you could save the "start" time in the cookie and make an AJAX request when the page is done loading - that would give you the actual loading time of your page; if that's, say, over 10 seconds, you may want to switch to the low-bandwidth version.
None of these, however, will get you the speed on the very first access; and sending a big empty page up front is not a very good first impression either.
You visit the first page (maybe 100 kB with all external files), and a session is immediately started with
$_SESSION["start_time"] = time();
When the page has finished loading (jQuery window load or something similar) you send a request back with the time.
You compute the speed (the page size divided by the elapsed time, i.e. jQueryRequestTime - $_SESSION["start_time"]) and set another session variable; the next link the user clicks can then include the custom CSS/JS approved for that speed.
Of course this is not perfect :)
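A minimal sketch of that flow, using microtime(true) rather than time() for finer resolution; the endpoint name speed_report.php and the 100 kB page size are assumptions for illustration, not part of the original answer:
<?php
// index.php (sketch): remember when the page was generated
session_start();
$_SESSION['start_time'] = microtime(true);
// ... render the page; a small script calls speed_report.php on window load ...
?>

<?php
// speed_report.php (sketch): called via AJAX once the page has finished loading
session_start();
$elapsed  = microtime(true) - $_SESSION['start_time'];
$pageSize = 100; // approximate page weight in kB - an assumption, adjust to your page
$_SESSION['speed_kb_per_sec'] = $pageSize / max($elapsed, 0.001);
// later requests can read this value and pick the appropriate CSS/JS
?>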
After you've determined the user's speed, send JavaScript to the browser to set the cookie, and then do a refresh or redirect in cases where the speed is below what you'd like.
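For example (a rough sketch: the cookie name matches the jQuery example above, but the 10-second threshold and the file names are just placeholders), the PHP side could check that cookie on later requests:
<?php
// Decide which assets to emit based on the load time recorded in the cookie.
$loadTime = isset($_COOKIE['my_speed_test_cookie'])
    ? (float) $_COOKIE['my_speed_test_cookie']
    : 0.0;

if ($loadTime > 10) {
    // previous page load was slow: serve the lightweight bundle
    echo '<link rel="stylesheet" href="css/lite.css">';
} else {
    echo '<link rel="stylesheet" href="css/full.css">';
    echo '<script src="js/effects.js"></script>';
}
?>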
The only thing I can think of would be to subscribe to a service which offers an IP to net speed lookup. These services work by building a database of IP addresses and cataloging their registered intended use. They're not always accurate, but they do provide a starting point. Look up the user's IP address against one of these and see what it returns.
Ip2Location.com provides such a database, beginning with their DB13 product.
Of course, if your goal is a mobile version of the site, user agent sniffing is a better solution.
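A very rough sketch of user agent sniffing in PHP (the pattern and the mobile URL are placeholders; in practice you would use a maintained detection library):
<?php
// Redirect obvious mobile user agents to a hypothetical mobile version of the site.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (preg_match('/Mobile|Android|iPhone|iPad|BlackBerry|Opera Mini/i', $ua)) {
    header('Location: http://m.example.com/'); // placeholder URL
    exit;
}
?>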
A site I am working with is starting to get a little sluggish, and I would like to refine it. I think the problem is with the PHP, but I can't be sure. How can I see how long functions are taking to perform?
If you want to test the execution time:
<?php
$startTime = microtime(true);
// Your content to test
$endTime = microtime(true);
$elapsed = $endTime - $startTime;
echo "Execution time : $elapsed seconds";
?>
Try the profiler feature in Xdebug or Zend Debugger?
Two things you can do.
The first is to place microtime() calls everywhere, although that's not convenient if you want to test more than one function. There is a simpler way: a better solution if you want to test many functions, which I assume you would like to do.
Just use a class (click the link below to follow the tutorial) that can measure how long all your functions take. Rather than placing microtime() everywhere, you just use this class, which is very convenient. A minimal sketch of the idea is shown after the link:
http://codeaid.net/php/calculate-script-execution-time-%28php-class%29
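This is only a minimal sketch of the same idea, not the actual codeaid.net class; it just illustrates keeping named timers in one place:
<?php
// Collect named timings in one place instead of scattering microtime() calls around.
class Timer
{
    private $timers = array();

    public function start($name)
    {
        $this->timers[$name] = microtime(true);
    }

    public function stop($name)
    {
        return microtime(true) - $this->timers[$name];
    }
}

$timer = new Timer();
$timer->start('build_menu');
// ... call the function you want to measure ...
echo 'build_menu took ' . round($timer->stop('build_menu'), 4) . ' seconds';
?>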
The second thing you can do to optimize your script is to take a look at its memory usage. By observing the memory usage of your scripts, you may be able to optimize your code better.
PHP has a garbage collector and a pretty complex memory manager, so the amount of memory being used by your script can go up and down during its execution. To get the current memory usage, we can use the memory_get_usage() function, and to get the highest amount of memory used at any point, we can use the memory_get_peak_usage() function.
echo "Initial: ".memory_get_usage()." bytes \n";
/* prints
Initial: 361400 bytes
*/
// let's use up some memory
for ($i = 0; $i < 100000; $i++) {
    $array[] = md5($i);
}
// let's remove half of the array
for ($i = 0; $i < 100000; $i++) {
    unset($array[$i]);
}
echo "Final: ".memory_get_usage()." bytes \n";
/* prints
Final: 885912 bytes
*/
echo "Peak: ".memory_get_peak_usage()." bytes \n";
/* prints
Peak: 13687072 bytes
*/
http://net.tutsplus.com/tutorials/php/9-useful-php-functions-and-features-you-need-to-know/
You can also do it manually, by recording the microtime() value in various places, like this:
<?
$TIMER['start']=microtime(TRUE);
// some code
$query="SELECT ...";
$TIMER['before q']=microtime(TRUE);
$res=mysql_query($query);
$TIMER['after q']=microtime(TRUE);
while ($row = mysql_fetch_array($res)) {
// some code
}
$TIMER['array filled']=microtime(TRUE);
// some code
$TIMER['pagination']=microtime(TRUE);
// and so on
?>
and then visualize it
<?
if ('127.0.0.1' === $_SERVER['REMOTE_ADDR']) {
echo "<table border=1><tr><td>name</td><td>so far</td><td>delta</td><td>per cent</td></tr>";
reset($TIMER);
$start=$prev=current($TIMER);
$total=end($TIMER)-$start;
foreach($TIMER as $name => $value) {
$sofar=round($value-$start,3);
$delta=round($value-$prev,3);
$percent=round($delta/$total*100);
echo "<tr><td>$name</td><td>$sofar</td><td>$delta</td><td>$percent</td></tr>";
$prev=$value;
}
echo "</table>";
}
?>
The IP address check implies that we are doing this profiling on the live, working site.
Though I doubt it's PHP itself. Most likely it's the database, so pay the most attention to query execution timing.
However, the term "site" is very broad: it also includes JS, CSS, images and the like. So I'd suggest starting from Firebug's Net panel to see which part of the whole page takes the most time.
Of course, refining can only be done after analysing the profiling results, and cannot be advised here without them.
Your best bet is Xdebug. I'm happy, as it comes bundled with my PhpED IDE; I can get profiler data at the click of a button.
So maybe you could consider that.
I had similar issues, so I created two new tables in the database and two new functions: one was audit_sql and the other was audit_code. Because I used an SQL abstraction class it was easy to time every single SQL call (I used PHP's microtime(), as others have suggested). So I called microtime() before and after each SQL call and stored the results in the database.
Similarly with pages: I called microtime() at the start and end of each page, and if necessary at the start and end of functions, divs - whatever I thought might be a culprit.
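As a rough sketch of the SQL auditing part (assuming mysqli and an audit_sql table with query, seconds and logged_at columns; the original answer used its own SQL abstraction class instead):
<?php
// Wrap every query so its execution time is recorded in the audit_sql table.
function timed_query(mysqli $db, $sql)
{
    $start   = microtime(true);
    $result  = $db->query($sql);
    $elapsed = microtime(true) - $start;

    $db->query(sprintf(
        "INSERT INTO audit_sql (query, seconds, logged_at) VALUES ('%s', %f, NOW())",
        $db->real_escape_string($sql),
        $elapsed
    ));

    return $result;
}
?>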
The general results were:
SQL calls to MySQL were almost instantaneous and were not a problem at all. The only thing I would say is that even I was surprised at the number being executed! The site is generated from the database - even the menus, permissions, etc. To produce the home page, the SQL calls were measured in the hundreds.
PHP was not the culprit. It was even more instantaneous than MySQL.
The culprit was... (big build-up!) calls to YouTube, Picasa and other sites like that. I host videos and photo albums on the site (well, I don't actually store them - they are stored on YouTube etc.) and on the home page are thumbnails that are extracted from YouTube and the like via the YouTube PHP API/Zend Framework. Because these are all HTTP calls to other sites, each one was taking 1, 2 or 3 seconds. This was causing the divs containing them to take between 6 and 12 seconds, and the home page up to 17 seconds.
The solution: store all thumbnails on my own server. The first time, a thumbnail has to be served from the remote site (YouTube, Picasa etc.), so do that and then store it on your own site. On future requests, check whether you already have it and, if so, always serve it from your own server. That cuts the page load time down to 2-3 seconds tops. Granted, the first person to view the home page after someone has added more videos/images will get a slow load, but not thereafter. People will put a long one-off page load time down to their connection/the internet in general; too many slow loads of your site and they will stop visiting!
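A rough sketch of that thumbnail caching idea (the cache/ directory and the file naming are assumptions for illustration):
<?php
// Serve a locally cached copy of a remote thumbnail, fetching it once if needed.
function cached_thumbnail($remoteUrl)
{
    $localPath = 'cache/' . md5($remoteUrl) . '.jpg';

    if (!file_exists($localPath)) {
        // first request: fetch from YouTube/Picasa and store it locally
        $data = @file_get_contents($remoteUrl);
        if ($data === false) {
            return $remoteUrl; // fall back to the remote URL if the fetch fails
        }
        file_put_contents($localPath, $data);
    }

    // every later request is served from our own server
    return $localPath;
}
?>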
I hope that helps somewhat.
Is there a way to determine how long it takes a web page, and all its content, to load, using PHP?
I have already tried this:
$time_start = microtime(true);
(All the content of the web page here)
$time_end = microtime(true);
$time = $time_end - $time_start;
echo $time;
However, the problem with this (as far as I can tell) is that I'm only calculating the time it takes the PHP script to execute. This doesn't factor in any images or videos that are on the page.
Is there a way to determine how long it takes a web page to load, including images and videos, using PHP?
Basically, what I'm trying to do is test the speed of my server without factoring in my connection speed.
You need to download Firebug, then open the Net tab and wait for the page to finish loading; it will show you the total load time of all requests.
You might consider ab (ApacheBench). It's for testing the performance of your web server, but you can run it against a particular URL if you're just concerned about one page. One advantage is that it can run from the command line and issue multiple requests in parallel, enabling you to do some kind of load testing.
If you want to factor in how long it takes to actually load in the browser, you'll need some kind of JavaScript solution. One approach with code is presented in the article Optimizing Page Load Time, which is worth reading.
Google Chrome has an amazing Audit tool built in, which gives a good list of ways to improve a given site as far as speed goes.
(Screenshot: http://far.id.au/audit.png)
Try this:
// top of the page --
<?php
$time = microtime();
$time = explode(' ', $time);
$time = $time[1] + $time[0];
$start = $time;
?>
// end of the page --
<?php
$time = microtime();
$time = explode(' ', $time);
$time = $time[1] + $time[0];
$finish = $time;
$total_time = round(($finish - $start), 4);
echo 'Page generated in '.$total_time.' seconds.';
?>
You can either use the Navigation Timing API (https://dvcs.w3.org/hg/webperf/raw-file/tip/specs/NavigationTiming2/Overview.html) and beacon the timings back - this is how boomerang works (https://github.com/lognormal/boomerang).
Alternatively, use a synthetic testing tool like webpagetest.org to gather timings from various browsers/locations/network variations.