I have been using a basic caching system on my site based on this link.
It has so far worked well for everything I want to do.
$cachefile = 'cache/'. basename($_SERVER['QUERY_STRING']) . '.html';
$cachetime = 1440 * 60;
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile))) {
include($cachefile);
echo "<!-- Cached ".date('jS F Y H:i', filemtime($cachefile))." -->";
exit;
}
ob_start();
// My html/php code here
$fp = fopen($cachefile, 'w'); // open the cache file for writing
fwrite($fp, ob_get_contents()); // save the contents of output buffer to the file
fclose($fp); // close
ob_end_flush(); // Send to browser
However, I have a couple of pages with more detailed MySQL queries. I have spent a fair bit of time optimising them, but they still take about 10 seconds to run when I query MySQL directly, and even longer on the website. Sometimes the request seems to time out, and I get the message below.
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the requestGET http://www.example.com
Reason: Error reading from remote server
This isn't a huge issue because, with the caching system above, only the first person to visit the page each day gets the delay; everyone else gets the cached page, so it is actually quite fast for them.
I want to save myself from having to be the first visitor each day, and automate this process so that at 17:00 (server time) each day the file gets written to the cache.
How would I best achieve this?
I suggest you use PHP Speedy, or this may help:
<?php
function getUrl () {
    if (isset($_SERVER['REQUEST_URI'])) {
        $url = $_SERVER['REQUEST_URI'];
    } else {
        $url = $_SERVER['SCRIPT_NAME'];
        $url .= (!empty($_SERVER['QUERY_STRING'])) ? '?' . $_SERVER['QUERY_STRING'] : '';
    }
    return $url;
}
// getUrl gets the queried page with query string
function cache ($buffer) { // page's content is $buffer
    $url = getUrl();
    $filename = md5($url) . '.cache';
    $data = time() . '¦' . $buffer;
    $filew = fopen("cache/" . $filename, 'w');
    fwrite($filew, $data);
    fclose($filew);
    return $buffer;
}
function display () {
    $url = getUrl();
    $filename = md5($url) . '.cache';
    if (!file_exists("cache/" . $filename)) {
        return false;
    }
    $filer = fopen("cache/" . $filename, 'r');
    $data = fread($filer, filesize("cache/" . $filename));
    fclose($filer);
    $content = explode('¦', $data, 2);
    if (count($content) != 2 || !is_numeric($content[0])) {
        return false;
    }
    if (time() - 100 > $content[0]) { // 100 seconds is the cache lifetime here!!!
        return false;
    }
    echo $content[1];
    die();
}
// Display cache (if any)
display(); // if a cached copy is displayed, die() ends the program here.
// if no cache, register the cache callback
ob_start('cache');
?>
Just include this script anywhere you need caching, and set up a cron job to run it automatically.
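To cover the 17:00 refresh as well, the cron job can simply request each slow page so the caching code above regenerates its cache file before any real visitor arrives. A minimal sketch, assuming a warm_cache.php script; the URL below is a placeholder for your slow page:

```php
<?php
// warm_cache.php - request each slow page so its own caching code
// writes a fresh cache file. Schedule it from cron at 17:00 server time:
//   0 17 * * * /usr/bin/php /path/to/warm_cache.php
// warm_page() returns true when the page (or file) could be fetched.
function warm_page($url) {
    $html = @file_get_contents($url); // any URL or path readable by PHP
    return $html !== false;
}

$pages = array('http://www.example.com/slow-report.php'); // placeholder
foreach ($pages as $page) {
    if (!warm_page($page)) {
        error_log("cache warm failed for $page");
    }
}
```

The script never parses the response; requesting the page is enough, because the page's own ob_start() callback writes the cache file.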
My code:
<?php
$url = 'http://w1.weather.gov/xml/current_obs/KGJT.xml';
$xml = simplexml_load_file($url);
?>
<?php
echo $xml->weather, " ";
echo $xml->temperature_string;
?>
This works great, but I read that caching external data is a must for page speed. How can I cache this for, let's say, 5 hours?
I looked into ob_start(), is this what I should use?
The ob system is for in-script caching. It's not useful for persistent, multi-invocation caching.
To do this properly, you'd write the resulting XML out to a file. Every time the script runs, you'd check the last-modified time on that file; if it's more than 5 hours old, you fetch and save a fresh copy.
e.g.
$file = 'weather.xml';
if (!file_exists($file) || filemtime($file) < (time() - 5*60*60)) {
$xml = file_get_contents('http://w1.weather.gov/xml/current_obs/KGJT.xml');
file_put_contents($file, $xml);
}
$xml = simplexml_load_file($file);
echo $xml->weather, " ";
echo $xml->temperature_string;
ob_start would not be a great solution. That only applies when you need to modify or flush the output buffer. Your XML data is never sent to the output buffer, so there is no need for those calls.
Here's one solution, which I've used in the past. Does not require MySQL or any database, as data is stored in a flat file.
$last_cache = @filemtime( 'weather_cache.txt' ); // Last modified timestamp, or FALSE if the file is missing
if ($last_cache === false){ // If date stamp unattainable, force a refresh
$since_last_cache = time() * 9;
} else $since_last_cache = time() - $last_cache; // Measure seconds since cache last set
if ( $since_last_cache >= ( 3600 * 5) ){ // If it's been 5 hours or more since we last cached...
$url = 'http://w1.weather.gov/xml/current_obs/KGJT.xml'; // Pull in the weather
$xml = simplexml_load_file($url);
$weather = $xml->weather . " " . $xml->temperature_string;
$fp = fopen( 'weather_cache.txt', 'a+' ); // Write weather data to cache file
if ($fp){
if (flock($fp, LOCK_EX)) {
ftruncate($fp, 0);
fwrite($fp, "\r\n" . $weather );
flock($fp, LOCK_UN);
}
fclose($fp);
}
}
include_once('weather_cache.txt'); // Include the weather data cache
I have a website with the following architecture:
End user ---(web browser)---> Server A (PHP) ---(file_get_contents)---> Server B (ASP.NET & Database)
Server A is a simple web server, mostly serving static HTML pages. However, some content is dynamic, and this content is fetched from Server B. Example:
someDynamicPageOnServerA.php:
<html>
...static stuff...
<?php echo file_get_contents("http://serverB/somePage.aspx?someParameter"); ?>
...more static stuff...
</html>
This works fine. However, if Server B is down (maintenance, unexpected crash, etc.), those dynamic pages on Server A will fail. Thus, I'd like to
cache the last result of file_get_contents and
show this result if file_get_contents times out.
Now, it shouldn't be too hard to implement something like this; however, this seems to be a common scenario and I'd like to avoid re-inventing the wheel. Is there some PHP library or built-in feature that helps with such a scenario?
I would do something like this:
function GetServerStatus($site, $port){
    $fp = @fsockopen($site, $port, $errno, $errstr, 2);
    if (!$fp) {
        return false;
    }
    fclose($fp); // close the probe connection
    return true;
}
$tempfile = '/some/temp/file/path.txt';
if(GetServerStatus('ServerB',80)){
$content = file_get_contents("http://serverB/somePage.aspx?someParameter");
file_put_contents($tempfile,$content);
echo $content;
}else{
echo file_get_contents($tempfile);
}
You could check the modified time of the file and only request it when it is different, otherwise load the local copy. Also, there is a cache pseudo-example on the PHP website in the comments for filemtime ( from: http://php.net/manual/en/function.filemtime.php ):
<?php
$cache_file = 'URI to cache file';
$cache_life = '120'; //caching time, in seconds
$filemtime = @filemtime($cache_file); // returns FALSE if file does not exist
if (!$filemtime or (time() - $filemtime >= $cache_life)){
ob_start();
resource_consuming_function();
file_put_contents($cache_file,ob_get_flush());
}else{
readfile($cache_file);
}
?>
I accepted dom's answer, since it was the most helpful one. I ended up using a slightly different approach, since I wanted to account for the situation where the server is reachable via port 80 but some other problem prevents it from serving the requested information.
function GetCachedText($url, $cachefile, $timeout) {
$context = stream_context_create(array(
'http' => array('timeout' => $timeout))); // set (short) timeout
$contents = file_get_contents($url, false, $context);
// $http_response_header is only set when a response was received, e.g. "HTTP/1.1 200 OK"
$status = isset($http_response_header) ? explode(" ", $http_response_header[0]) : array("", "0");
if ($contents === false || $status[1] != "200") {
$contents = file_get_contents($cachefile); // load from cache
} else {
file_put_contents($cachefile, $contents); // update cache
}
return $contents;
}
I am looking for the best solution for caching my web pages, for example http://www.website.com/test.php?d=2011-11-01, which has a URL rewrite rule to become http://www.website.com/testd-2011-11-01.html.
The script below does not work for dynamic web pages; it gives the same page regardless of the query.
<?php
$cachefile = "cache/".$reqfilename.".html";
$cachetime = 240 * 60; // 4 hours
// Serve from the cache if it is younger than $cachetime
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile)))
{
    include($cachefile);
    echo "<!-- Cached ".date('jS F Y H:i', filemtime($cachefile))." -->";
    exit;
}
ob_start(); // start the output buffer?>
my website content here
<?php
// open the cache file for writing
$fp = fopen($cachefile, 'w');
// save the contents of output buffer to the file
fwrite($fp, ob_get_contents());
// close the file
fclose($fp);
// Send the output to the browser
ob_end_flush(); ?>
If your URL looks like testd-2011-11-01.html, you have two possible solutions :
Use a RewriteRule, so that the URL is rewritten to test.php?d=2011-11-01; your test.php script can then deal with the cache generation / invalidation
Or use a cronjob, that will regenerate the testd-2011-11-01.html static file every X minutes.
The first solution is the one that's generally used, as it only requires you to set up a RewriteRule (and those are often available, even on cheap hosting services).
The second solution might be a bit better for performance (no PHP code is ever executed, except when the cronjob runs); but the difference is probably not that important, unless you have a very big website with an awful lot of users.
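For completeness, the cronjob in the second solution can be a small CLI script; a sketch, where render_page() stands in for whatever code actually produces the page's HTML (the function name, the schedule, and the paths are placeholders):

```php
<?php
// regenerate.php - run from cron, e.g. every 10 minutes:
//   */10 * * * * /usr/bin/php /path/to/regenerate.php
// render_page() is a placeholder for the code that builds the HTML
// for a given date; write_static() publishes it via an atomic rename,
// so readers never see a half-written file.
function render_page($date) {
    return "<html><body>Report for $date</body></html>";
}

function write_static($html, $path) {
    $tmp = $path . '.tmp';
    file_put_contents($tmp, $html);
    return rename($tmp, $path); // atomic swap on the same filesystem
}

// Example (path is a placeholder):
// write_static(render_page(date('Y-m-d')), "/var/www/testd-" . date('Y-m-d') . ".html");
```

The rename() step is the design choice that matters: writing directly to the published file would let a visitor download a partially written page while the cronjob runs.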
Something like this could work:
class SimpleCache {
    private $_cacheDir = '/path/to/cache';
    private $_cacheTime = 240*60;

    public function isCached( $id ) {
        $cacheFilename = $this->_cacheDir . "/" . $id . "_*";
        $files = glob($cacheFilename, GLOB_NOSORT);
        if( !empty($files) ) {
            // There should always be one file in the array
            $filename = $files[0];
            $params = explode("_", $filename);
            $cacheTime = strtok($params[1], '.'); // timestamp embedded in the file name
            // Check if the cached file is too old
            if( time() - $cacheTime > $this->_cacheTime ) {
                @unlink($filename);
            }
            else {
                return file_get_contents($filename);
            }
        }
        return false;
    }

    public function cache( $id, $data ) {
        $filename = $this->_cacheDir . "/" . $id . "_" . time() . ".cache";
        if( !($fp = @fopen($filename, "w")) ) {
            return false;
        }
        if( !@fwrite($fp, $data) ) {
            @fclose($fp);
            return false;
        }
        @fclose($fp);
        return true;
    }
}
$cache = new SimpleCache();
if( !($buffer = $cache->isCached($reqfilename)) ) {
// Produce the contents of the file and save them in the $buffer variable
$cache->cache($reqfilename, $buffer);
}
echo $buffer;
But you could use memcached, APC, or more advanced caching techniques if you are up to it.
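As an illustration of that last point, here is the same get/set idea on top of APCu (the successor of APC), with an in-memory array fallback so the class still runs when the extension is absent. The backend choice, the key names, and the TTL are assumptions for this sketch, not part of the answer above:

```php
<?php
// Sketch of a memory-backed cache: uses APCu when it is loaded and
// enabled, otherwise falls back to a per-request array (no real
// persistence). TTL handling in the fallback is deliberately omitted.
class QuickCache {
    private $fallback = array();

    private function apcuReady() {
        return function_exists('apcu_enabled') && apcu_enabled();
    }

    public function set($key, $value, $ttl = 14400) {
        if ($this->apcuReady()) {
            return apcu_store($key, $value, $ttl);
        }
        $this->fallback[$key] = $value;
        return true;
    }

    public function get($key) {
        if ($this->apcuReady()) {
            $ok = false;
            $val = apcu_fetch($key, $ok);
            return $ok ? $val : false;
        }
        return isset($this->fallback[$key]) ? $this->fallback[$key] : false;
    }
}
```

Unlike the file-based SimpleCache, a memory cache disappears on restart, so it suits data that is cheap to regenerate but expensive to regenerate often.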
You can use HTTP caching. You send headers telling the client (browser) to cache the whole page for a certain period of time.
// 5 minutes
header('Cache-Control: max-age=300');
If you have control over your hosting environment, you can also add a reverse proxy like varnish or nginx in front of your webserver. This proxy will then cache these requests for you, making the cached version shared between all visitors of your site.
See also the HTTP/1.1 specification.
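For illustration, the headers can be built as plain strings first, then each passed to header(); the public directive and the Expires fallback for older caches are additions of this sketch, not part of the answer above:

```php
<?php
// Build the HTTP caching headers as strings; $maxAge is in seconds.
// 'public' lets shared caches (such as the reverse proxy mentioned
// above) store the response as well as the browser.
function build_cache_headers($maxAge) {
    return array(
        'Cache-Control: max-age=' . $maxAge . ', public',
        'Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT',
    );
}

foreach (build_cache_headers(300) as $h) { // 5 minutes
    header($h);
}
```

Note these headers must be sent before any page output, or PHP will report that headers were already sent.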
I have read a bit around the internet about PHP caching.
At the moment I am using this system to cache my pages.
This is placed at the start of the page:
<?php
// Settings
$cachedir = 'cache/'; // Directory to cache files in (keep outside web root)
$cachetime = 600; // Seconds to cache files for
$cacheext = 'html'; // Extension to give cached files (usually cache, htm, txt)
// Ignore List
$ignore_list = array(
'addedbytes.com/rss.php',
'addedbytes.com/search/'
);
// Script
$page = 'http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']; // Requested page
$cachefile = $cachedir . md5($page) . '.' . $cacheext; // Cache file to either load or create
$ignore_page = false;
for ($i = 0; $i < count($ignore_list); $i++) {
$ignore_page = (strpos($page, $ignore_list[$i]) !== false) ? true : $ignore_page;
}
$cachefile_created = ((@file_exists($cachefile)) and ($ignore_page === false)) ? @filemtime($cachefile) : 0;
@clearstatcache();
// Show file from cache if still valid
if (time() - $cachetime < $cachefile_created) {
//ob_start('ob_gzhandler');
@readfile($cachefile);
//ob_end_flush();
exit();
}
// If we're still here, we need to generate a cache file
ob_start();
?>
MY HTML CODE Goes here .............
and the code below is at the footer of my page.
<?php
// Now the script has run, generate a new cache file
$fp = @fopen($cachefile, 'w');
// save the contents of output buffer to the file
@fwrite($fp, ob_get_contents());
@fclose($fp);
ob_end_flush();
?>
There are some things that I need which this code doesn't have:
gzip
the expired cache is not auto-deleted after it expires.
Also, I wanted to ask if this code is secure to use; if someone can suggest a better one, or something to improve the current code, it would be just great.
Thank you for reading this post.
Best Regards
Meo
….
// Show file from cache if still valid
if (time() - $cachetime < $cachefile_created) {
//ob_start('ob_gzhandler');
echo gzuncompress(file_get_contents($cachefile));
//ob_end_flush();
exit();
} else {
if(file_exists($cachefile) && is_writable($cachefile)) unlink($cachefile);
}
….
and
// Now the script has run, generate a new cache file
$fp = @fopen($cachefile, 'w');
// save the contents of output buffer to the file
@fwrite($fp, gzcompress(ob_get_contents(), 9));
#fclose($fp);
ob_end_flush();
?>
Use ob_start("ob_gzhandler"); to initiate gzipped buffering (it'll take care of determining if the client can actually accept/wants gzipped data and adjust things accordingly).
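The gzcompress()/gzuncompress() pair used in the snippets above is a lossless round trip, which is easy to verify in isolation (the sample string here is just a placeholder for the buffered page):

```php
<?php
// Verify that compressing the cache file's contents and reading it
// back reproduces the original buffer exactly. Level 9 trades CPU for
// the smallest file, matching the snippets above.
$page = str_repeat('<p>MY HTML CODE Goes here</p>', 100);
$stored = gzcompress($page, 9);    // what gets written to the cache file
$restored = gzuncompress($stored); // what the read path echoes

// The compressed copy should be smaller for repetitive HTML.
$saved = strlen($page) - strlen($stored);
```

Note this only compresses the data on disk; to compress what is sent to the client, rely on ob_gzhandler as described above, since it negotiates with the browser.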
To delete the cached files:
if (time() - $cachetime < $cachefile_created) {
    @readfile($cachefile);
    //ob_end_flush();
    exit();
} else if (file_exists($cachefile)) {
    unlink($cachefile); // remove the stale copy, then let the page regenerate below
}
But there can be a delay, or maybe an error, when the file is being written and someone requests the page. You should use flock to overcome such problems, as mentioned at Error during file write in simple PHP caching.
Something like this at the end of page
<?php
$fp = @fopen($cachefile, 'w');
if ($fp && flock($fp, LOCK_EX | LOCK_NB)) { // skip writing if another request holds the lock
    fwrite($fp, gzcompress(ob_get_contents(), 9));
    flock($fp, LOCK_UN);
    fclose($fp);
}
ob_end_flush(); ?>
I have a Yahoo currency script on my site, but it takes too much time to load and is slowing my site down. How can I cache it, refreshing the cache every 3600 minutes?
You need some place to store these results. MySQL is a popular choice, but if the data does not need to stick around or have historical values, using memcache would be easier. Depending on your host, both of these options may be available.
The idea is:
create some sort of cache dir and set the defined cache age
then, at the very beginning of your function, check for the cache
if it exists, check its age:
if within range, use it
if the cache is too old,
fetch live data and write that data into the cache file.
Something like this should do the trick:
define('CACHE_DIR', 'E:/xampp/xampp/htdocs/tmp');
define('CACHE_AGE', 3600);
/**
 * Reads a value from the cache, if a fresh cache file exists.
 * @param string $path the path to the cache file (not the dir)
 * @return mixed FALSE if there is no cache file or it is older than CACHE_AGE; the cached data if the file exists and is within CACHE_AGE
 */
function get_cache_value($path){
if(file_exists($path)){
$now = time();
$file_age = filemtime($path);
if(($now - $file_age) < CACHE_AGE){
return file_get_contents($path);
} else {
return false;
}
} else {
return false;
}
}
function set_cache_value($path, $value){
return file_put_contents($path, $value);
}
function kv_euro () {
    $path = CACHE_DIR . '/euro.txt';
    $kveuro = get_cache_value($path);
    if (false !== $kveuro) {
        echo "\nFROM CACHE\n";
        return round($kveuro, 2);
    } else {
        echo "\nFROM LIVE\n";
        $from = 'EUR'; /* change these to your required currencies */
        $to = 'ALL';
        $url = 'http://finance.yahoo.com/d/quotes.csv?e=.csv&f=sl1d1t1&s=' . $from . $to . '=X';
        $result = false;
        $handle = @fopen($url, 'r');
        if ($handle) {
            $result = fgets($handle, 4096);
            fclose($handle);
        }
        if ($result === false) {
            return false; // fetch failed and no cache available
        }
        $allData = explode(',', $result); /* Split the CSV line into an array */
        $kveuro = $allData[1];
        set_cache_value($path, $kveuro);
        return round($kveuro, 2);
    }
}
Also, rather than fgets, which reads the file line by line and is rather slow, and since you are not manipulating individual lines, you should consider using the file_get_contents function instead.
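As a sketch of that suggestion, the whole response can be fetched in one call and the line parsed with str_getcsv, which also copes with the quoted fields; parse_quote() is a name made up for this example, and the sample line only mimics the Yahoo CSV format:

```php
<?php
// Parse one line of the quotes CSV and return the rate field as a
// float. str_getcsv handles the quoted symbol/date fields, unlike a
// bare explode(',', ...). Returns false when no numeric rate is found.
function parse_quote($csvLine) {
    $fields = str_getcsv(trim($csvLine));
    return isset($fields[1]) && is_numeric($fields[1])
        ? (float) $fields[1]
        : false;
}

// In kv_euro() above, the fopen()/fgets() block could then become:
//   $result = file_get_contents($url);
//   $kveuro = parse_quote($result);
$sample = '"EURALL=X",139.80,"11/1/2011","10:30am"';
$rate = parse_quote($sample); // 139.8
```

Returning false on a malformed line also covers the case where Yahoo sends "N/A" instead of a number, which the original explode-based code would pass through unchecked.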