I'm curious: is there a way to intentionally slow down the page load?
I'm testing my HTML & PHP pages on my local host right now, and I want to see how my loading gif etc will perform when the page load is slower.
I realize this is a rare request as most developers are only concerned with speeding up page loads, but I thought there might be a method, using either javascript/jQuery or PHP, to do something like this for testing purposes.
Thanks for any help!
Note: I'm testing on MAMP, so that's Apache server running on Mac OS 10.7
You can use PHP's sleep($seconds) function to slow down a page load. However, you would need to turn implicit output buffer flushing on with ob_implicit_flush(true); if you want anything to be sent to the user's browser before the page is done processing. Otherwise your page won't have ANY content until it's done loading; calling sleep() alone won't do the trick.
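For example, a minimal sketch (the five-second delay and the buffer handling are only illustrative):

<?php
// Send output to the browser immediately instead of buffering it.
while (ob_get_level() > 0) {
    ob_end_flush(); // close any output buffers PHP has already opened
}
ob_implicit_flush(true);

echo "Content before the delay...<br>\n";
sleep(5); // simulate 5 seconds of slow server-side work
echo "Content after the delay.\n";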
In Chrome, you can simulate a slow Internet connection with the developer tools, under the "Network" tab on the far right. You can use a preset like "Fast 3G" or create your own profile with exact numbers for upload, download and ping.
Reference: https://helpdeskgeek.com/networking/simulate-slow-internet-connection-testing/
This is what I would try:
Use a php resource as source of the image:
<img src="images/gifLoager.php" />
In gifLoader.php, read your image file and output it in small chunks, with a delay inside the loop:
<?php
// gifLoader.php: stream the image slowly, one 1 KB chunk per second.
header('Content-Type: image/gif'); // set appropriate headers first (see below)
$fp = fopen($path, 'rb'); // $path points at your real gif file
while (!feof($fp)) {
    print(fread($fp, 1024)); // send the next chunk
    flush();                 // push it to the browser immediately
    sleep(1);                // throttle: one second between chunks
}
fclose($fp);
Don't forget to appropriately set the headers (as with the Content-Type line above) before outputting the binary data.
References:
http://stackoverflow.com/questions/1563069/stream-binary-file-from-mysql-to-download-with-php
http://php.net/manual/en/function.sleep.php
http://www.gamedev.net/topic/427622-php-and-file-streaming-script-with-resume-capability/
UPDATE 2015-04-09
Use Chrome 'Device Mode':
This tool has a network throttling feature that allows you to see how your page may render on a device with a slow network bandwidth. It has many other features that allow you to emulate features on various devices such as screen size and touch.
https://developer.chrome.com/devtools/docs/device-mode
Moussa has the right idea. The best way to test a slow page load is to use Chrome's developer tools. Select the Network tab, then click the dropdown that says "No throttling" and choose your desired speed.
This way is superior to using the sleep function because you don't have to mess with any code and then remember to take it out. If you want to change the speed, just change the throttling level and you're set.
For more information on throttling check out the docs.
A little bit of JavaScript setTimeout can do the trick:
setTimeout(function() {
    // Delayed code in here
    alert('You waited 5 seconds to see me');
}, 5000); // 5000 ms = 5 seconds
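For the loading-gif use case, a sketch along these lines could work (the #loader and #content ids are just assumptions about your markup):

// Simulate a slow load: keep the loading gif visible for 5 seconds,
// then reveal the real content.
setTimeout(function() {
    $('#loader').hide();
    $('#content').show();
}, 5000);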
You can use sleep():
<?php
// Delays for 10 seconds.
sleep(10);
?>
...
html here
...
Call sleep() in your PHP code to delay the request to the server.
You can use the Sloppy local web proxy to slow down your connection (it's written in Java, so it'll probably run on your development machine). You might also want to turn off mod_deflate for testing purposes if you want to see how your browser responds to a slow HTML load (for example, dynamically sized HTML tables, etc.).
Also, see this question on webmasters.stackexchange.com
Related
Is there a way to retrieve the fully rendered HTML from a page with JavaScript post-rendering? If I use curl, it simply retrieves the base HTML, but lacks the post-rendering of iframes, JavaScript processing, etc.
What would be the best way to accomplish this?
As no-one else has answered (except the comment above, but I'll come to that later), I'll try to help as much as possible.
There's no "simple" answer. PHP can't execute JavaScript or navigate the DOM natively, so you need something that can.
Your options as I see them:
If you are after a screen grab (which is what I'm hoping, as you also want Flash to load), I suggest you use one of the commercial APIs that are out there for doing this. You can find some in this list http://www.programmableweb.com/apitag/?q=thumbnail, for example http://www.programmableweb.com/api/convertapi-web2image
Otherwise you need to run something that can handle JavaScript and the DOM on, or connected to, your server. For this, you'd need an automated browser that you can run server-side and extract the information you need from. Follow the list in Bergi's comment above and test for a suitable solution; the main one, Selenium, is great for "unit testing" a known website, but I'm not sure how I'd script it to handle random sites, for example. As you would (presumably) only have one "automated browser", and you don't know how long each page will take to load, you'd need to queue the requests and handle them one at a time. You'd also need to ensure pop-up alert()s are handled, all the third-party plugins (you say you want Flash?!) are installed, and deal with redirects, timeouts and potential memory hogs (if running this non-stop, you'll periodically want to kill your browser and restart it to clean out the memory!). Also handle virus attacks, pop-up windows and requests to close the browser completely.
Thirdly, VB has a web-browser component. I used it for a project a long time ago to do something similar, but on a known site. Whether it's possible with .NET (to me, it's a huge security risk), and how you'd program for unknowns (e.g. pop-ups and Flash), I have no idea. But if you're desperate, an adventurous .NET developer may be able to suggest more.
In summary - if you want more than a screen grab and can choose option 1, good luck ;)
If you're looking for something scriptable with no GUI you could use a headless browser. I've used PhantomJS for similar tasks.
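For instance, a minimal PhantomJS script along these lines (dump.js is just an example name; pages that keep loading content asynchronously may need an extra wait) prints the rendered DOM after the page's JavaScript has run:

// dump.js -- run with: phantomjs dump.js http://example.com/
var page   = require('webpage').create();
var system = require('system');
var url    = system.args[1];

page.open(url, function(status) {
    if (status !== 'success') {
        console.log('Failed to load ' + url);
        phantom.exit(1);
    }
    // page.content is the current DOM serialized back to HTML,
    // i.e. after JavaScript has modified the page.
    console.log(page.content);
    phantom.exit();
});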
If this is still relevant, I found that the easiest way to do this is to use PhantomJS as a service:
public string GetPagePhantomJs(string url)
{
    using (var client = new System.Net.Http.HttpClient())
    {
        client.DefaultRequestHeaders.ExpectContinue = false;
        var pageRequestJson = new System.Net.Http.StringContent(
            @"{'url':'" + url + "','renderType':'plainText','outputAsJson':false }");
        var response = client.PostAsync(
            "https://PhantomJsCloud.com/api/browser/v2/SECRET_KEY/",
            pageRequestJson).Result;
        return response.Content.ReadAsStringAsync().Result;
    }
}
It is really simple: when you subscribe to the service there is a free plan that allows 500 pages/day. Replace SECRET_KEY with the key you receive when signing up.
Use a "terminal" browser like w3m or lynx. Even if the site you want to access needs login, this is possible, for example:
curl [-u login:pass] http://www.a_page.com | w3m -T text/html -dump
or
curl [-u login:pass] http://www.a_page.com | lynx -stdin -dump
This should give you the whole html with all frames etc.
Have a look at the command-line tool IECapt.exe.
Lynx has no JavaScript support, but it was useful for me in a situation where I needed to process data from a webpage: this way I got the (plain-text) rendering and didn't have to filter through the raw HTML tags as with curl.
lynx -nonumbers -dump -width=9999999 ${url} | grep ... et cetera.
Is it possible to have print_r() output displayed live? By live I mean while the script is executing; I do not want to wait for the end of the script to see it. I hope I am clear. Thank you in advance for your replies. Cheers, Marc
Likely you are using PHP through a webserver like Apache.
Webservers have output buffering implemented; they tend to send their data out in larger blocks.
Browsers buffer as well; they only refresh the display from time to time and again once they have finished loading the website.
Finally, PHP also has output buffering built in.
HTTP was not made for "live" display; it's more like a static page. That's why people invented "AJAX" and JavaScript polling for changed/live events after a page has loaded.
What you can do:
To make sure the data from PHP is sent to the webserver, you can call flush().
There is also a PHP setting called implicit_flush you might want to look up.
The webserver is likely using gzip/mod_gzip to compress output. You need to disable that behaviour.
Maybe this will do it: @apache_setenv('no-gzip', 1);
Add some more content than just pure text; if you put the data inside a simple <table>, including the closing </table>, it's more likely browsers will display it during load.
Look in php.ini for these:
output_buffering = Off
zlib.output_compression = Off
You can do this at runtime too: @ini_set('zlib.output_compression', 0);
Some browsers will only start displaying data once they have received a certain number of bytes; if I recall correctly, 256 bytes might help:
echo str_repeat(' ', 256); // or any other padding
I'd like to add that these steps can help solve the issue, but in my experience the results are not perfect.
Every browser and browser version may behave differently.
I have this:
echo "<font color=\"#000000\">text</font>";
usleep(2000000);
header("location:/otherpage.php");
?>
Please note that this will be included in an iframe...
The problem is that it isn't outputting the echo statement; it just sleeps for two seconds and then redirects (which is all correct except for the missing echo)...
Does anyone have an idea why this is happening? It's not the iframe, because the same thing happens when you go straight to the file (outside the iframe). Could it be the usleep()?
Thanks
What you are doing above will not work. First, you would need to do a flush to make sure the data was sent. Second, though, and more importantly, you can't send a header after the flush, which results in either the header not being sent or the text not being sent.
If all you want to do is change the data after a delay, did you consider doing the following:
header('Refresh: 2; url=http://my.site.com/otherpage.php');
echo "<font color=\"#000000\">text</font>";
This will send the information in the browser, instructing the browser to change to the new URL after 2 seconds.
This won't work, since you can't send headers after outputting text.
If you want to replicate this behaviour exactly, the only options are a meta refresh or JavaScript.
The output problem you can solve by flushing the buffer, but then, as mentioned before, no redirection via header() is possible.
Another very important thing:
DON'T USE USLEEP FOR SUCH THINGS.
Why? Because when your site is under heavy load, every request that takes too long is bad: you will run out of PHP processes (depending on your PHP webserver implementation). For such timeouts you should use client-side code whenever possible.
It sleeps on the server and then sends the output. Also, you can't send headers after echoing something.
You should use JavaScript or a meta redirect instead; either lets you wait a few seconds before redirecting, and the delay and URL for both can be generated by your PHP script, as in the sketch below.
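A minimal sketch of that idea (the $delay and $url values are placeholders):

<?php
$delay = 2;                // seconds to wait before redirecting
$url   = '/otherpage.php'; // redirect target
?>
<meta http-equiv="refresh" content="<?php echo $delay; ?>;url=<?php echo htmlspecialchars($url); ?>">
<font color="#000000">text</font>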
You can't do that. I have a feeling you're misunderstanding the purpose of PHP and its role as a server-side language.
If you want to "redirect" to another page in PHP, you do so using HTTP headers, which you did. The thing is, those are headers, so they must be sent before any response body (such as the output of an echo statement). You're trying to do on the server something that should be done client-side.
Instead, make an HTML page with some JavaScript code to do the redirection, like this:
<script>
setTimeout(function() {
    window.location = "otherpage.html";
}, 2000); // 2000 ms = 2 seconds
</script>
Expanding my comment into an answer:
It's not possible to redirect like this (outputting some content and then trying to send a header) because headers must be sent before any content. PHP will also complain about this; with default settings you should see a "headers already sent" warning.
Second, the usleep() delay might not be observable, because the server may buffer the content throughout the sleep and only send it in one big chunk afterwards. In general, it isn't reliable to make the browser display content "in steps" like this, although it can be made to work more or less if you pile the hacks high enough.
May I suggest that if this kind of behaviour is what you need, you look into making an appropriate AJAX call (or a series of them), which can be orchestrated precisely.
What everyone else said. I'll just add that usleep() makes clients hold connections open on your server, which is a very inefficient use of limited server resources. Your PHP should always send everything it can as quickly as possible, so your web server can close the connection.
Does going from
<script type="text/javascript" src="jquery.js"></script>
to
<script type="text/javascript">
<?php echo file_get_contents('jquery.js'); ?>
</script>
really speed things up?
I am thinking it does, because PHP can fetch and embed a file's contents faster than the client's browser can make a full request for the file, since PHP isn't going over the network.
Is the main difference that the traditional method can be cached?
It may be faster on the first page load, but on every subsequent load it will be much slower. In the first example the client browser can cache the result; in the second it cannot.
If your client only ever loads a single page from your site in its life, then yes, because you have one HTTP request instead of two.
If you are going to serve multiple pages which all link to the same JavaScript source file, then you're duplicating all that redundant data and not giving the client a chance to cache the file.
You need to transfer the bytes to the browser in both cases. The only difference is that you save an HTTP request in the latter case.
Also make sure the inlined JavaScript can't break your markup, e.g. by wrapping it in a CDATA section.
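For instance, a sketch of the inlined variant with a CDATA wrapper (the file path is just an example):

<script type="text/javascript">
/* <![CDATA[ */
<?php echo file_get_contents('jquery.js'); ?>
/* ]]> */
</script>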
If you include your JS lib in your HTML page, it cannot be cached by the browser. It's generally a good idea to keep the JS separate from the normal HTML code because the browser can cache it and does not need to fetch it on subsequent requests.
So to make it short, it's an optimization that works only if the page is called once by the user and jquery is not used on other pages.
Alternatively, you may want to use jQuery from the Google APIs CDN, with the effect that it is often in the browser's cache already, so there is no need to transfer the lib at all.
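Such a reference would look something like this (the version number is only an example; pick the one you need):

<script type="text/javascript" src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>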
It does so for that ONE PAGE.
All subsequent pages using the same library (jquery.js downloaded from the same URL) SUFFER. If you include the reference to the external file, yes, it has to be downloaded in an extra request (which is relatively cheap with HTTP/1.1 and pipelining). BUT, provided your webserver serves it with useful headers (an Expires header far in the future), the browser caches that download, while with the "optimization" it has to retrieve the library with every single content page.
Also see pages like this one:
http://www.stevesouders.com/blog/2008/08/23/revving-filenames-dont-use-querystring/
(the keyword here is "revving" in connection with those far-future expiration dates)
The first one is better, since the browser can cache the script. With the second version it will have to re-download the script every time it loads the page, even if the script didn't change.
The only time the second version is an improvement is for scripts that cannot be cached by the browser.
It depends on how many pages use the same file. In most situations, however, this will be slower than your first piece of code, mostly because jquery.js can be cached.
Yes, initially that would be a performance optimization in terms of the number of HTTP requests used to serve the page. However, every page load becomes a bit bigger, whereas an externally referenced jquery.js would be cached by the browser after the first download.
It does if your page is static.
But if it's not static, the browser will download the page every time, with jQuery included even though jQuery hasn't changed. If you use src="jquery.js" and the page changes, the browser will load jQuery from its cache rather than download it again, so using src="jquery.js" is actually faster.
What is faster to open and process:
having all the jQuery functions in one file, or
having each function in a separate file that is loaded wherever you need it?
E.g. I have a blabla.js file that has 4 functions in it,
and my xaxa.php includes blabla.js.
Now, when I first open my page it is fast enough. No problem (even with cookies cleared and all).
BUT... when I first (and afterwards) click a button that activates a part of my blabla.js, all my links and functions start working more slowly.
So should I separate my functions and load each JS file wherever I need it, or is my problem somewhere else?
Thank you
(As I said, I'm starting to suspect something in my structure.)
So here is a sample of my jQuery:
$(document).ready(function() {
    $(".avoid_ref_add").click(function() {
        var keyValues = {
            pid: $(this).parent().find('input[name="pid"]').val()
        };
        $.post('help_scripts/cartfunct.php', keyValues, function(rsp) {
            $('#content').load("p_body.php");
        });
        return false;
    });
    // function remove ...
    // function update ...
});
and I have my items.php, items2.php, items3.php...
Now, my cookies are cleared and nothing is cached. When I first open the site, it loads fast and all links are fast...
But if I click that add button, everything starts to work REALLY slowly...
If I just refresh the whole page, it starts working fast again, and so on...
To me this is quite strange and I cannot figure out what I did wrong... Because if the page were slow, it wouldn't load fast the first time, correct? Is it something in my code?
You should optimally put all your JavaScript in one JavaScript file.
This file can be served with gzip compression and far-future expiry headers to limit bandwidth usage (and produce faster page loads). You could even run a minifier on your JavaScript to reduce the file size.
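If you serve the file through PHP, a sketch of sending such headers might look like this (combined.js and the one-year lifetime are just examples):

<?php
// serve-js.php: serve the combined script gzipped, with far-future caching headers.
header('Content-Type: application/javascript');
header('Cache-Control: public, max-age=31536000'); // one year
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 31536000) . ' GMT');
ob_start('ob_gzhandler'); // gzip the output if the client supports it
readfile('combined.js');  // the pre-concatenated (and ideally minified) file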
What you are really asking about is the art of minimisation/optimisation, and this article is a good read: http://developer.yahoo.com/performance/rules.html
It is clearly better to keep all your JS functions in a single file, and this should be faster. Consider that whenever a different function needs to be called, you want to avoid the delay introduced by another round trip to the server.
Execution speed should not be affected by the way you distribute functions among the files. However, downloading all the resources related to a page is faster when fewer files need to be fetched. That's because each HTTP request comes with a request header and each HTTP response comes with a response header. If the keep-alive feature of HTTP is not used, additional overhead comes from establishing a TCP connection for each request.
You don't really have to put each function in a separate file. I would suggest keeping functions that are related to each other in the same JS file and putting the others in another JS file.
This makes it easier for you to manage the code and increases the reusability/modularity of the JS file.
After that, you can use Minify (http://code.google.com/p/minify/) to combine the necessary JS files when you need to, thus reducing the number of requests made to retrieve them.
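With Minify set up, the combined download is a single script tag; if I remember its URL scheme correctly, it looks something like this (the file names are placeholders):

<script type="text/javascript" src="/min/?f=js/jquery.js,js/blabla.js"></script>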