How can I determine the loading time of an external page using PHP?
For example, finding the loading time of https://stackoverflow.com/questions.
If the page is on another server, then you time it basically the same way you would time anything else; only instead of calling a locally defined function, you fetch the URL with file_get_contents:
$t = microtime( TRUE );
file_get_contents( "your tested url" );
$t = microtime( TRUE ) - $t;
print "It took $t seconds!";
As a warning, that will also include the time it takes to make the request and receive the response (time spent over the wire). Unfortunately, unless you actually have access to that server, that cannot be helped.
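If you want to break that wire time down further, here is a minimal sketch using cURL's built-in timers instead (the URL is only an example); curl_getinfo() can report connect time and time-to-first-byte separately:
<?php
// Sketch: timing breakdown via cURL (URL is a placeholder)
$ch = curl_init('https://stackoverflow.com/questions');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE); // don't print the body
curl_exec($ch);
$connect = curl_getinfo($ch, CURLINFO_CONNECT_TIME);       // TCP connect only
$ttfb    = curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME); // time to first byte
$total   = curl_getinfo($ch, CURLINFO_TOTAL_TIME);         // whole transfer
curl_close($ch);
print "Connect: {$connect}s, first byte: {$ttfb}s, total: {$total}s";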
Now, if you're trying to get the render time of your site, you can use ob_start and ob_end_clean:
$t = microtime( TRUE );
ob_start();
// do all of your rendering
ob_end_clean();
$t = microtime( TRUE ) - $t;
print "It took $t seconds!";
Measuring page load times must be done on the client. Implementing a full browser using PHP is not a trivial task.
Do you want to measure the page load times of some remote website?
Or do you want to instrument your own site to measure page load times?
Do you really mean page load times, or do you mean the time taken to process each request?
If you're talking about request times, do you mean the time taken from receiving the request to delivering the response to the client, or just the time for generating the response?
I have a PHP script making requests to a web site. I run this script from the command line, so no web server on my side is involved: just pure PHP and a shell.
The response is split into pages, so I need to make multiple requests to gather all the data in one script run. Obviously, the request URL is identical except for one parameter. Nothing complicated:
$base_url = '...';
$pages = ...; // a number I receive elsewhere
$delay = ...; // a delay to avoid too many requests
$p = 0;
while ($p < $pages) {
    $url = $base_url . "&some_param=$p";
    ... // Here cURL takes its turn because of cookies
    sleep($delay);
    $p++;
}
The pages I get this way all look the same, like the first one requested. (So I just get the first page's list repeated once per page.)
I decided that it happens because of some caching on the web server's end which persists despite the additional random parameter I pass. Closing and reinitializing the cURL session doesn't help either.
I also noticed that if I manually change the initial $p value (so requests start from a different page) and then launch the script again, the result changes, even when I do this more quickly than the $delay value.
That means two different requests made from the same script run give the same result, while two different requests made from two different script runs give different results, regardless of the delay between the requests. So it can't be just caching on the responding side.
I tried to work around that by wrapping the actual request in a separate script, which I run using exec() from the main script. So there is (or should be, I assume) a separate shell instance for each single page request, and those requests should not share any kind of cache between them.
Despite that, I keep getting the same page again. The code looks something like this:
$pages = ...;
$delay = ...;
$p = 0;
$command_stub = 'php get_single_page.php';
while ($p < $pages) {
    $command = $command_stub . " $p";
    unset($response);          // exec() appends to $response, so clear it between calls
    exec($command, $response);
    // $response is the same again for different $p's
    sleep($delay);
    $p++;
}
If I again change the starting page manually in the script, I get the result for that page over and over, until I change it once more, and so on. Several minutes may pass between two runs of the main script, and it still yields an identical result until I switch the number by hand.
I can't comprehend why this is happening. Can somebody explain it?
The short answer is no: cURL certainly doesn't retain anything between executions unless configured to do so (e.g. by setting a cookie file).
I suspect the server is expecting a session token of some sort (a cookie or another HTTP header would be my guess). Without the session token it will just ignore the request for subsequent pages.
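A minimal sketch of that fix, assuming the site hands out a session cookie on the first request; pointing cURL at a cookie jar (the path is a placeholder) keeps the session alive across the page requests, reusing $base_url/$pages/$delay from the question:
<?php
$jar = '/tmp/session_cookies.txt'; // placeholder path, must be writable
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);   // write cookies here on close
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);  // and send them on each request
for ($p = 0; $p < $pages; $p++) {
    curl_setopt($ch, CURLOPT_URL, $base_url . "&some_param=$p");
    $html = curl_exec($ch);
    // ... parse $html ...
    sleep($delay);
}
curl_close($ch);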
I want to set up a simple cache feature with PHP. I want the script to get data from somewhere, but not on every page view, only every hour.
I know I can have a cron job that runs a PHP script every hour.
But I was wondering if this can be achieved without cron, just inside the PHP script that builds the page based on the fetched (or cached) data. I'm really looking for the simplest solution possible; it doesn't have to be accurate.
I would use APC as well, but in either case you still need some logic. A basic file cache in PHP:
$cache_file = '/path/to/cache/file'; // any writable location

if (file_exists($cache_file) and time() - filemtime($cache_file) < 3600)
{
    $content = unserialize(file_get_contents($cache_file));
}
else
{
    $content = your_get_content_function_here();
    file_put_contents($cache_file, serialize($content));
}
You only need to serialize/unserialize if $content is not a string (e.g. an array or object).
Why not just use APC?
You can do:
apc_store('yourkey', 'yourvalue', 3600);
And then you can retrieve the content with:
apc_fetch('yourkey');
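Putting the two calls together with the hourly logic looks something like this sketch ('yourkey' and the fetch function from the answer above are placeholders):
<?php
// Sketch: APC as an hourly cache
$content = apc_fetch('yourkey');
if ($content === FALSE) { // cache miss or entry older than an hour
    $content = your_get_content_function_here();
    apc_store('yourkey', $content, 3600); // TTL of one hour
}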
I have a script that takes very long to execute, so when I run it, it hits the max execution time on my web server and ends up timing out.
To illustrate, imagine I have a for loop that performs some pretty intensive manipulation one million times. How could I spread this loop's execution over several parts so that I don't hit the max execution time of my web server?
Many thanks,
If you have an application that is going to loop a known number of times (i.e. you are sure that it's going to finish at some point), you can reset the time limit inside the loop:
foreach ($data as $row) {
    set_time_limit(10); // each iteration gets a fresh 10-second budget
    // do your stuff here
}
This solution will protect you from having one run-away iteration, but will let your whole script run undisturbed as long as you need.
The best solution is to use set_time_limit (http://php.net/manual/en/function.set-time-limit.php) to change the timeout. Otherwise, you can use HTTP redirects to send the browser to an updated URL when a time threshold is reached:
$threshold = 10; // seconds to run before redirecting to resume
$t = microtime( TRUE );
$i = isset( $_GET['i'] ) ? (int) $_GET['i'] : 0;
for( ; $i < 10000000; $i++ )
{
    if( microtime( TRUE ) - $t > $threshold )
    {
        header('Location: http://www.example.com/?i='.$i);
        exit;
    }
    // Your code
}
The browser will only follow a few redirects before it stops; you're better off using JavaScript to force a page reload.
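A sketch of that variant: emit the reload as a tiny script instead of a Location header, so the browser's redirect-chain limit never applies (the ?i= parameter is the same one used in the loop above):
// Inside the loop, replacing the header('Location: ...') call:
echo '<script>window.location.href = "?i=' . $i . '";</script>';
exit;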
I once used a technique where I split the work from one file into three parts. It was just an array of 120,000 elements with an intensive operation. I created a splitter script which stored the arrays in a database, 40,000 elements each. Then I created an HTML file with a redirect to the first PHP file to compute the first 40,000 elements. After computing the first 40,000 elements, there was again an HTML redirect to the next PHP file, and so on.
Not very elegant, but it worked :-)
If you have the right permissions on your hosting server, you could use the php interpreter to execute a php script and have it run in the background.
See Asynchronous shell exec in PHP.
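A minimal sketch of that approach on a Unix-like host ('long_task.php' is a placeholder name); redirecting output and appending & lets exec() return immediately instead of waiting for the script:
<?php
// Start the heavy script in the background and return at once
exec('php long_task.php > /dev/null 2>&1 &');
echo 'Job started; the web request is no longer bound by max_execution_time.';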
If you are running a script that needs to execute for an unknown length of time, you can use:
set_time_limit(0);
If possible, you can make the script handle a portion of the wanted operations. Once it completes, say, 10%, you call the script again via AJAX to execute the next 10%. But there are circumstances where this is not an ideal solution; it really depends on what you are doing.
I used this method to create a web-based crawler which only ran on my computer, for instance. If it had to do all the operations at once, it would time out as well. So it was split into 200 "tasks", each called via AJAX once the previous one completed. It works perfectly, and it's been over a year since it started running (crawling?).
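A rough sketch of that pattern; the parameter names, chunk size, and totals are assumptions. Each request processes one slice and tells the caller where to resume:
<?php
// Sketch: process one slice of the work per request
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$chunk  = 100000;   // items per request (assumed)
$total  = 1000000;  // total amount of work (assumed known)

for ($i = $offset; $i < min($offset + $chunk, $total); $i++) {
    // do one unit of the intensive work here
}

// The client-side AJAX caller reads this and re-requests until done
echo json_encode(array(
    'next' => $offset + $chunk,
    'done' => ($offset + $chunk) >= $total,
));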
I have this function that gives me an output of a number (the number is my total amount of downloads from my iPhone themes).
Because the code has to make so many requests, it loads the page very slowly.
What would be the best way to load the number into a variable and reuse it on the next page refresh, so it doesn't take so long to load?
Or any other method will do; I just want it to not take so long to load!
Also, this isn't on my server, so I can't use $.ajax.
<?php
function all_downloads() {
    $allThemes = array(
        'com.modmyi.batterytheme',
        'com.modmyi.connectiontheme',
        'com.modmyi.icontheme',
        'com.modmyi.percenttheme',
        'com.modmyi.statusnotifiertheme',
        'com.modmyi.cnote',
        'com.modmyi.iaccescnotekb',
        'com.modmyi.cnotelite',
        'com.modmyi.multibrowsericon',
        'com.modmyi.changeappstoreiconwithinstallous'
    );

    $total = 0;
    foreach ($allThemes as $com_modmyi) {
        // One HTTP request per theme -- this is what makes the page slow
        $theme = file_get_contents("http://modmyi.com/cstats/index.php?package=" . $com_modmyi . '&output=number');
        $theme = str_replace(",", "", $theme);
        $total += (int) $theme;
    }

    $rock_your_phone = 301; // From c-note and Multi Lock Screen Theme on Rock Your Phone
    $total += $rock_your_phone;

    echo number_format($total);
}
?>
Call the function with AJAX!
The basic idea of AJAX is to help make web applications function more like desktop applications.
Most actions that an end user takes in the browser send a request back to the web server. The server then processes that request, perhaps sends out further requests, and eventually responds with whatever the user requested. That full round trip on every page view is exactly your problem (it's slow!).
With AJAX you can call a PHP function without reloading the page.
Please go through this tutorial; it is really simple.
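A minimal sketch of that approach, assuming you can host the function above in its own endpoint (downloads.php is a placeholder name); the page renders instantly and the slow count arrives afterwards:
<!-- include jquery as usual -->
<div id="downloads">loading...</div>
<script>
// Fetch the slow total after the page has rendered;
// downloads.php is a hypothetical endpoint that just echoes all_downloads()
$.get('downloads.php', function (count) {
    $('#downloads').text(count);
});
</script>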
I am looking for a way to check the user's connection speed. It is supposed to be saved as a cookie, and JavaScript files as well as CSS files will be adapted if the speed is slow.
The method for testing speed I have at the moment is the following:
$kb = 512;
flush();
// wrap the test payload in an HTML comment so it stays invisible
echo "<!--";
$time = explode(" ", microtime());
for ($x = 0; $x < $kb; $x++) {
    echo str_pad('', 512, '.'); // 512 bytes per iteration
    flush();
}
$time_end = explode(" ", microtime());
echo "-->";
$start = $time[0] + $time[1];
$finish = $time_end[0] + $time_end[1];
$deltat = $finish - $start;
return round($kb / $deltat, 3); // KB per second
While it works, I do not like putting so many characters into my code; also, if I echo all this, I cannot save the result in a cookie because output has already been sent.
Could one do something like this in a different file or something? Do you have any solution?
Thanks in advance.
Do you have any solution?
My solution is to not bother with the speed test at all. Here's why:
You stated that the reason for the test is to determine which JS/CSS files to send. Keep in mind that browsers will cache these files after the first download (so long as they haven't been modified). So, in effect, you are sending 256 KB of test data to determine whether you should send, say, an additional 512 KB?
Just send the data and it will be cached. Unless you have megabytes of JS/CSS (in which case you need a site redesign, not a speed test), the download time will be doable. Speed tests should be reserved for things such as streaming video and the like.
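If the concern is repeat downloads, explicit cache headers make the browser's caching predictable; a sketch (the file name and lifetime are arbitrary):
<?php
// Serve a static asset with a 30-day client-side cache
header('Content-Type: application/javascript');
header('Cache-Control: public, max-age=2592000'); // 30 days in seconds
readfile('big-bundle.js'); // placeholder file name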
The only idea I can come up with is a redirect:
Measure the user's speed
Redirect to the index
While this isn't a nice solution, it only needs to measure the user's speed once, so I think it's excusable.
How about using JavaScript to time how long it takes to load a page, then using JavaScript to set the cookie?
There is a microtime port for JavaScript: http://phpjs.org/functions/microtime:472
Using jQuery
<head>
<!-- include jquery & other html snipped -->
<script>
function microtime(get_as_float) {
    // http://kevin.vanzonneveld.net
    // + original by: Paulo Freitas
    // * example 1: timeStamp = microtime(true);
    // * results 1: timeStamp > 1000000000 && timeStamp < 2000000000
    var now = new Date().getTime() / 1000;
    var s = parseInt(now, 10);
    return (get_as_float) ? now : (Math.round((now - s) * 1000) / 1000) + ' ' + s;
}

function setCookie(c_name, value, expiredays) {
    var exdate = new Date();
    exdate.setDate(exdate.getDate() + expiredays);
    document.cookie = c_name + "=" + escape(value) +
        ((expiredays == null) ? "" : ";expires=" + exdate.toUTCString());
}

start = microtime(true);
$(window).load(function () {
    // everything finished loading
    end = microtime(true);
    diff = end - start;
    // save in a cookie for the next 30 days
    setCookie('my_speed_test_cookie', diff, 30);
});
</script>
</head>
<body>
<p>some page to test how long it loads</p>
<img src="some_image_file.png">
</body>
Some pitfalls:
- The page would need to start loading first, and jQuery would need to be loaded (or you can rework the above code to avoid jQuery).
- Testing speed on ASCII/Latin data may not give the best result, because the characters may get compressed. Besides high-level gzip compression, some modems/lines (if not all) have basic compression that can detect repeating characters and tell the other end that the next 500 characters are repeats of ' '. It would probably be best to use binary data that has already been compressed.
The problem here is that you can't really solve this nicely, and probably not in pure PHP. The approach you've taken will make the user download (512 x 512) = 262,144 bytes of useless data, which is bigger than most complete pages. If the user is on a slow connection, they may assume your site is down before the speed test is over (at 10 kB/s, it'd take half a minute before anything interesting shows up on screen!).
You could make an AJAX request for a file of a known size and time how long it takes (see the sketch after this answer). The problem here is that the page needs to be already loaded for that to work, so it'd only work for subsequent pages.
You could make a "loading" page (like you see on GMail when accessing it from a slow connection) that preloads the data, with a link to the low-bandwidth version (or maybe a redirect if the loading is taking too long).
Or you could save the "start" time in the cookie and make an AJAX request when the page is done loading - that would give you the actual loading time of your page; if that's, say, over 10 seconds, you may want to switch to the low-bandwidth version.
None of these, however, will get you the speed on the very first access; and sending a big empty page up front is not a very good first impression either.
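For the known-size AJAX test mentioned above, a sketch in JavaScript (assuming jQuery is loaded); 'test-64k.bin' is a hypothetical test file of exactly 65,536 bytes on your own server:
<script>
// Time the download of a known-size file, then store the result
var t0 = new Date().getTime();
$.get('test-64k.bin?nocache=' + t0, function () {
    var seconds = (new Date().getTime() - t0) / 1000;
    var kbps = 64 / seconds; // 64 KB divided by elapsed time
    document.cookie = 'speed_kbps=' + kbps.toFixed(1);
});
</script>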
You visit the first page (maybe 100 kB with all external files), and a session is immediately started with:
$_SESSION["start_time"] = time();
When the page finishes loading (jQuery window load or similar), you send a request again with the time. You compute the speed (PageSize / (requestTime - $_SESSION["start_time"])) and set another session variable; the next link the user clicks can then include custom CSS/JS approved for that speed.
Of course, this is not perfect. :)
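A minimal sketch of that flow; the 'finished' parameter and the 100 kB page size are assumptions:
<?php
session_start();

if (isset($_GET['finished'])) {
    // Second request, fired by the onload handler
    $elapsed = microtime(TRUE) - $_SESSION['start_time'];
    $_SESSION['speed_kbps'] = 100 / $elapsed; // assumed ~100 kB first page
} else {
    // First page view: start the clock
    $_SESSION['start_time'] = microtime(TRUE);
}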
After you've determined the user's speed, send javascript to the browser to set the cookie and then do a refresh or redirect in cases where the speed is below what you'd like.
The only thing I can think of would be to subscribe to a service which offers an IP-to-connection-speed lookup. These services work by building a database of IP addresses and cataloguing their registered intended use. They're not always accurate, but they do provide a starting point. Look up the user's IP address against one of these and see what it returns.
Ip2Location.com provides such a database, beginning with their DB13 product.
Of course, if your goal is a mobile version of the site, user-agent sniffing is a better solution.