When a user clicks a button on my site, a particular part of the page is refreshed, but it takes a while to render the entire section. Is it possible to display/output parts of the section in real time rather than wait for the entire section to finish? How would I do this if I'm sending it as an AJAX request for HTML content? Thanks
I would recommend setting cache: true in your AJAX call (if you're using jQuery), and either way you're going to want to set the HTTP response headers. Here are some examples. By setting the Cache-Control and Expires headers, your AJAX responses, if unchanged, will be loaded from the cache instead. This will drastically speed things up.
A quick example:
if (!headers_sent()) {
    // 14 days, in seconds (seconds * minutes * hours * days)
    $expires = 60 * 60 * 24 * 14;
    header('Pragma: public');
    header('Cache-Control: max-age=' . $expires);
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $expires) . ' GMT');
}
Note: this will not work with POST requests, just GET.
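For reference, a minimal jQuery sketch of the client side (the URL and element ID are just placeholders); cache: true simply tells jQuery not to append its cache-busting "_" parameter, so the headers above can take effect:
$.ajax({
    url: '/partials/section.php', // placeholder endpoint
    type: 'GET',
    cache: true, // let the browser reuse the cached response
    success: function (html) {
        $('#section').html(html);
    }
});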
As said, caching your AJAX requests is a good option. Besides that, about all you can do to make your application seem faster is to show users a progress bar or spinner while (re)loading content with AJAX.
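A rough sketch of that idea (the element IDs are made up):
// show a loading indicator for the duration of the request
$('#loading').show();
$.get('/partials/section.php', function (html) {
    $('#section').html(html);
}).always(function () {
    $('#loading').hide();
});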
You could implement a pagelets technique. Essentially you could structure it something like this:
index.html
Ajax -> load content from PHP script which outputs the navigation bar
Ajax -> load content from PHP script which outputs the body
And have several different Ajax calls loading each different part of your site (see the sketch further below). This does have the disadvantage of increasing the number of HTTP requests the user's browser has to make, however.
This presumes, though, that the different parts can be generated separately from the rest of the page. If all of your content needs to be generated at the same time, then this technique will be of no use to you.
This is a good read (a Facebook project named "BigPipe"): http://www.facebook.com/note.php?note_id=389414033919
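For illustration, a bare-bones pagelet sketch; nav.php and body.php are assumed PHP scripts that each render only their own fragment:
<!-- index.html: empty placeholders, filled in independently -->
<div id="nav">Loading navigation...</div>
<div id="body">Loading content...</div>
<script>
// each fragment appears as soon as its own script finishes,
// so fast parts are shown without waiting for slow ones
$('#nav').load('nav.php');
$('#body').load('body.php');
</script>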
Related
I have a dynamic PHP page that generates a JSON string. So if a JavaScript client makes 10 requests to this page per minute, the JSON string is generated and echoed 10 times.
I want to limit the requests going through to my server to 1 request per minute.
I thought the Cache-Control header would do the job, but it looks like I'm wrong.
Here is what I tried. I set my php page to this:
<?php
header("Cache-Control: max-age=60");
echo "{'test':'abc'}";
?>
Loaded the site with the browser; it returned {'test':'abc'}
Then I quickly changed the php page to:
<?php
header("Cache-Control: max-age=60");
echo "{'qwe':'123'}";
?>
Reloaded the page quickly and got: {'qwe':'123'}
So the second request went through even though the minute wasn't over. I wanted the first result to be returned from cache for one minute, without doing another request.
What am I doing wrong?
I'm writing a script that will combine and minify requested CSS files and then echo them out. I have most of the work done, however I'm stuck on one small, yet very important piece: Leveraging browser caching.
Most visitors to our sites are new and rarely ever come back. So really what we're worried about is caching between page requests in the same session: they hit our main page, navigate to a few other pages, and leave.
The problem I'm having is that I'm storing a timestamp in the session for the last request time of each specific set of files. So if I request main.css and internet.css on this page view, and then main.css and phone.css on the next page view, the timestamp of the last request will be updated; but if I request the same set of files again, the timestamp will be unchanged from last time.
Hopefully I'm making sense. The issue is that when a file is unchanged from the last request to this one, I return 304 Not Modified. However, the browser is not caching the CSS like it should. Any ideas as to why not?
You can take a look at my code here: https://gist.github.com/4235836 (I would normally put it here, but it's kinda long)
I think you should check the If-Modified-Since request header before sending out a 304:
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $minifier->lastModified)
{
    header('HTTP/1.0 304 Not Modified');
    exit;
}
Also notice the exit. If you're sending out a 304, it means the client already has the latest version, so you should exit your script there.
Edit:
When using Expires headers, the browser will assume it already has the latest version, so it won't even make a request to the server, unlike with the If-Modified-Since approach.
So you might also want to add:
header('Expires: '.gmdate('D, d M Y H:i:s \G\M\T', time() + (60 * 60 * 24)));
Then, to make sure it will request a new version once the file has changed, you can do something like:
<link rel="stylesheet" type="text/css"
href="minify.php?v=<?php echo filemtime($theFileToMinify) ?>">
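Putting those pieces together, the top of minify.php could look roughly like this ($minifiedCss stands in for whatever your minifier produces):
<?php
// newest modification time of the requested CSS files
$lastModified = $minifier->lastModified;

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
    header('HTTP/1.0 304 Not Modified');
    exit;
}

// allow the browser to reuse this response for a day without asking again
header('Content-Type: text/css');
header('Cache-Control: max-age=86400');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s \G\M\T', $lastModified));
header('Expires: ' . gmdate('D, d M Y H:i:s \G\M\T', time() + 86400));

echo $minifiedCss;
?>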
I'm developing sites, and some visitors' browsers show up with an old cache.
Is there a way we can clear visitors' browser caches from the server side, or even with JavaScript, so they don't have to clear them themselves?
I cannot find the direct answer to this.
There must be a way big companies like Facebook, eBay etc. do this.
We have been using .htaccess to set the caching rules for clients. We explicitly give the cache a 24-hour lifetime, and we add no-cache rules the day before we do an update. It has helped, but it is tedious and not very reliable.
Just posting it to give you ideas if no one answers, but I would really love to get the answer too. :)
First Method:
You can actually save the output of the page before you end the script, then load the cache at the start of the script.
example code:
<?php
$cachefile = 'cache/'.basename($_SERVER['PHP_SELF']).'.cache'; // e.g. cache/index.php.cache
$cachetime = 3600; // time to cache, in seconds
if(file_exists($cachefile) && time()-$cachetime <= filemtime($cachefile)){
    // cache is still fresh: serve it and stop
    $c = @file_get_contents($cachefile);
    echo $c;
    exit;
}elseif(file_exists($cachefile)){
    // cache is stale: delete it and fall through to regenerate
    unlink($cachefile);
}
ob_start();
// all the coding goes here
$c = ob_get_contents();
file_put_contents($cachefile, $c);
?>
If you have a lot of pages needing this caching you can do this:
in cachestart.php:
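cachestart.php would simply hold the first half of the example above, something like:
<?php
$cachefile = 'cache/'.basename($_SERVER['PHP_SELF']).'.cache';
$cachetime = 3600;
if(file_exists($cachefile) && time()-$cachetime <= filemtime($cachefile)){
    echo @file_get_contents($cachefile);
    exit;
}elseif(file_exists($cachefile)){
    unlink($cachefile);
}
ob_start();
?>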
in cacheend.php:
<?php
$c = ob_get_contents();
file_put_contents($cachefile, $c);
?>
Then just simply add
include('cachestart.php');
at the start of your scripts, and add
include('cacheend.php');
at the end of your scripts. Remember to have a folder named cache and allow PHP to access it.
Also remember that if you're doing a full-page cache, your page should not have session-specific output (e.g. a logged-in member's bar), because that will be cached as well. Look at a framework for more specific caching (of a variable or a part of the page).
Second Method:
Use Squid or update the HTTP headers correctly to do browser caching.
Third Method:
PEAR has a caching package (actually two):
http://pear.php.net/package/Cache
Fourth Method:
Use http://memcached.org/. There's an explanation of how to do it on that site.
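A minimal sketch with the PECL Memcached extension (the server address, key name and build_homepage() helper are just examples):
<?php
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$html = $memcached->get('homepage_html');
if ($html === false) {
    // not cached yet (or expired): build the page and store it for 60 seconds
    $html = build_homepage(); // placeholder for your own rendering code
    $memcached->set('homepage_html', $html, 60);
}
echo $html;
?>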
I usually use a combination of techniques:
HTML generated by PHP code is not cached under the standard configuration, because PHP sends out the appropriate headers automatically.
Images and other binary assets get renamed if they change.
For JavaScript and CSS I add an automatically created unique code (e.g. an MD5 hash of the contents, or the file size) to the filename (e.g. /public/styles.f782bed8.css) and strip it off again with mod_rewrite. This way every change to the file results in a new file name. This can be done at runtime in PHP while outputting the HTML header, to have it fully automated; in that case, however, the MD5 might have a performance impact.
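A rough sketch of that renaming trick (the helper name and paths are made up; the mod_rewrite rule that strips the hash again is not shown):
<?php
// insert a short content hash into the filename,
// e.g. /public/styles.css -> /public/styles.f782bed8.css
function versioned_asset($path)
{
    $hash = substr(md5_file($_SERVER['DOCUMENT_ROOT'] . $path), 0, 8);
    return preg_replace('/\.(css|js)$/', '.' . $hash . '.$1', $path);
}
?>
<link rel="stylesheet" href="<?php echo versioned_asset('/public/styles.css'); ?>">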
Is there a way to force a client's cache to reload an HTML file if you can't change the URI referencing that file (e.g., can't add a timestamp param)?
Here's my situation:
A plugin deployed to a 1000 users
That plugin loads example.com/page.html which calls on script.js
The resource URI example.com/page.html cannot be changed (w/o plugin updates)
page.html has been changed. I need to clear the old page.html from users' cache so the new page.html can load.
Any ideas? Htaccess? The PHP API that the old & new page.html call on?
Thanks!
Well, if the page is already cached by a browser, it's difficult to tell it not to use its cached version, because it probably won't bother to check with the server again until it decides its cached copy is stale. You'll just have to send a snail-mail letter to all of your users telling them to press Ctrl+F5 :)
There is a chance that the browser might at least try a HEAD request to check the modified timestamp before it serves up its cached version, though. In this case the following will help you out.
Browsers negotiate content with your web server using standard HTTP headers. Going forward, if you want to tell a browser not to cache a file, you have to send the appropriate HTTP headers. If you want to do this in PHP, you can use the header function to send them to the browser:
header('Cache-Control: no-cache');
header('Pragma: no-cache');
If it has to be done via HTML you can do the following in your page header:
<meta http-equiv="Expires" content="Tue, 01 Jan 1995 12:12:12 GMT">
<meta http-equiv="Pragma" content="no-cache">
There is no way for you to be sure the browser will honor your request not to cache a page, however. There are some other things like ETags and whatnot, but frankly I don't think that will help you if the page is already cached.
UPDATE
From the HTTP/1.1 specification on Response Cacheability:
If there is neither a cache validator nor an explicit expiration time
associated with a response, we do not expect it to be cached, but
certain caches MAY violate this expectation (for example, when little
or no network connectivity is available).
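For what it's worth, an ETag check in PHP would look roughly like this (render_page() is a placeholder; as noted above, it only helps for copies the browser still revalidates):
<?php
$html = render_page(); // placeholder for whatever builds the page
$etag = '"' . md5($html) . '"';

header('ETag: ' . $etag);
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
    trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}
echo $html;
?>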
Perhaps PHP could be used to add a timestamp to the JavaScript call. You could then leave this on for the duration of the deployment. For example:
check_if_cache_should_be_disabled.php
<?php
$deployment_flag = true;    // leave this active to disable normal caching
// $deployment_flag = false; // switch to this to enable normal caching again
?>
page.php
<script src="/js/script.js<?php
require('check_if_cache_should_be_disabled.php');
// file_get_contents() can be used instead for a remote server:
// file_get_contents('http://yourserver.com/check_if_cache_should_be_disabled.php');
if ($deployment_flag == true) {
    print('?ts=' . time());
}
?>"></script>
You can change the file name, for example to page_v2.html, which will make the browser load the page as a new page.
This answer may be oversimplified, but you could always append a GET value containing a file version to the end of the file's URL. Modern browsers usually take that as a new resource and will ignore previous caches and fetch the newest one. So all you'd have to do is update the version when you want the most up-to-date code to be cached instead.
For example:
<script type="text/javascript" src="example.js?version=2.0.8"></script>
You could even go as far as writing a PHP script that automatically does this for you whenever a file changes, based on some specific parameter like "last updated".
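For example, a small helper based on the file's modification time (the path handling is an assumption about your setup):
<?php
// append the file's mtime, so the query string changes
// automatically whenever the file itself changes
function auto_version($file)
{
    return $file . '?version=' . filemtime($_SERVER['DOCUMENT_ROOT'] . '/' . ltrim($file, '/'));
}
?>
<script type="text/javascript" src="<?php echo auto_version('example.js'); ?>"></script>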
The problem is this: I am on main.php, where I call a PHP file, via a JavaScript script, to synchronize the user's image with a new one on the server. After all the file functions finish (with no problem) and I redirect with a header back to main.php, a very fast refresh is made but the image is not updated; I keep seeing the old file. If I refresh the page with F5, the new image is shown.
Cache problem? I have tried some HTML meta tags but no luck. Any idea?
Thanks a lot.
EDIT: To make it clearer, I have tried with headers and with a timestamp in the redirection, but no luck. Here is the process:
main.php: once the user clicks on his/her image, a JavaScript redirection (location.href) to update.php is launched.
I get a new image from the server and save it, overwriting the previous one (so the name is the same).
I add some headers to the code (no-cache headers, etc.) and a Location header redirecting back to main.php, and I pass some parameters via GET, including a timestamp (time()).
I reach main.php, but the reload is really fast and the new image is not shown.
Now I think it's much clearer.
Thanks for your help.
You could add something like ?t=foo to the URL of the picture, where foo is a random string or number, or maybe the current timestamp. That will make the browser request a completely new image, and no cache will get in the way.
Yes, it is a cache issue; IE usually does that. I always append a ?timestamp to the request to avoid such scenarios.
EDIT:
Yes, I get it. I had the same issue some time ago. Either way, a timestamp is the solution.
You have to add a timestamp to the image URL: <img src="path_to_image/image.jpg?edited-time-stamp" />
It worked for me; hope it works for you too.
You probably have to add an Expires or Cache-Control header using the header() function.
Try out this method; hope it helps you:
<?php
header("Cache-Control: no-cache, must-revalidate"); // HTTP/1.1
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT"); // Date in the past
header("Location: http://www.test.com");
exit;
?>