I have a Yahoo currency script on my site, but it takes too much time to load and is slowing my site down. How can I cache the results and refresh the cache every 3600 minutes?
You need some place to store these results. MySQL is a popular choice, but if the data does not need to stick around or have historical values, using memcache would be easier. Depending on your host, both of these options may be available.
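If memcache is available, a minimal sketch using the Memcached extension could look like this (this assumes a memcached server on localhost; fetch_live_euro_rate() is a hypothetical stand-in for the Yahoo lookup shown further below):

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

function kv_euro_cached(Memcached $mc) {
    $rate = $mc->get('kv_euro');          // returns false on a cache miss
    if ($rate === false) {
        $rate = fetch_live_euro_rate();   // hypothetical live lookup
        $mc->set('kv_euro', $rate, 3600); // expire after one hour
    }
    return round((float) $rate, 2);
}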
The idea for the file-based approach is:
create some sort of cache dir and define a cache age
then, at the very beginning of your function, check for a cache file
if it exists, check its age
if it is within range, use it
if the cache is too old,
fetch live data and write it to the cache file.
Something like this should do the trick:
define('CACHE_DIR', 'E:/xampp/xampp/htdocs/tmp');
define('CACHE_AGE', 3600); // seconds

/**
 * Reads a value from the cache, if it is still fresh.
 * @param string $path the path to the cache file (not dir)
 * @return string|false false if there is no cache file or the cache file is older than CACHE_AGE; the cached data if the file exists and is within CACHE_AGE
 */
function get_cache_value($path) {
    if (file_exists($path)) {
        $now = time();
        $file_age = filemtime($path);
        if (($now - $file_age) < CACHE_AGE) {
            return file_get_contents($path);
        } else {
            return false;
        }
    } else {
        return false;
    }
}

function set_cache_value($path, $value) {
    return file_put_contents($path, $value);
}
function kv_euro() {
    $path = CACHE_DIR . '/euro.txt';
    $kveuro = get_cache_value($path);
    if (false !== $kveuro) {
        echo "\nFROM CACHE\n";
        return round($kveuro, 2);
    } else {
        echo "\nFROM LIVE\n";
        $from = 'EUR'; /* change these to your required currencies */
        $to = 'ALL';
        $url = 'http://finance.yahoo.com/d/quotes.csv?e=.csv&f=sl1d1t1&s=' . $from . $to . '=X';
        $result = false;
        $handle = @fopen($url, 'r');
        if ($handle) {
            $result = fgets($handle, 4096);
            fclose($handle);
        }
        $allData = explode(',', $result); /* split the CSV line into an array */
        $kveuro = $allData[1];
        set_cache_value($path, $kveuro);
        return $kveuro;
    }
}
Also, fgets reads the file line by line, which is slower; since you are not manipulating individual lines, you should consider using the file_get_contents function instead.
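For instance, the whole network read collapses to a single call (this assumes allow_url_fopen is enabled, which the fopen call above already requires):

// One-shot fetch of the CSV response; returns false on failure
$result = @file_get_contents($url);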
I have a PHP function that reads a line (a random message) from a file and displays it on my web page. It displays a new message every time I click refresh, but I want it to display the same message for a whole day (it should change at midnight). Is it possible to do this with another function, perhaps using my database? Or with a JS function?
EDIT
This is the function (not my code):
function loadMessagesFromFile()
{
    $path = ROOT_PATH . '/messages.txt';
    $file = fopen($path, "r");
    $messages = array();
    while ($data = fgets($file)) {
        $messages[] = $data;
    }
    fclose($file);
    return $messages;
}
This is how I use it to display the message:
$messages_from_file = loadMessagesFromFile();
$key = array_rand($messages_from_file);
$full_text = $messages_from_file[$key];
LATER EDIT
I found the answer here: https://stackoverflow.com/questions/6815614/generating-word-of-the-day-via-php-random-number (just seed with the current date, so you get the same number for one day).
You have multiple possibilities:
Create a file with only one line, regenerated every day from the other file.
Create a cron job that runs at midnight, picks a random message, and stores it in the database or cache (see the sketch after this list).
If you want a different message per visitor that still persists for a day, you can use the visitor's local storage to hold the message together with the current date; if the stored date differs from the current date, you change the message.
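For the first two options, the midnight cron script could be as small as this sketch (the paths are hypothetical; adjust them to your setup):

<?php
// pick_daily_message.php — run from cron at midnight: 0 0 * * * php pick_daily_message.php
$messages = file('/path/to/messages.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
// Pick one random line and store it as today's message
$today = $messages[array_rand($messages)];
file_put_contents('/path/to/message_of_the_day.txt', $today . PHP_EOL);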
Get the total number of lines in your file.
Use a database or a file to store a random number, updated daily.
Use that number to read a message from the file.
Use cron to update the random number at midnight in your database or file.
This way, the same message will be displayed for 24 hours.
function getRandomNo(): int {
    $path = ROOT_PATH . '/randomno.txt';
    $file = fopen($path, "r");
    $data = fread($file, 100);
    fclose($file);
    if ($data) {
        $parts = explode('=', $data);
        if ($parts[0] != date("Y-m-d")) {
            // Stored date is stale: draw a new number for today
            $randomNo = rand(0, 100);
            overWrite($randomNo);
        } else {
            $randomNo = (int) $parts[1];
        }
    } else {
        $randomNo = rand(0, 100);
        overWrite($randomNo);
    }
    return $randomNo;
}

function overWrite(int $randomNo): void {
    $path = ROOT_PATH . '/randomno.txt';
    $file = fopen($path, "w+");
    $data = date("Y-m-d") . '=' . $randomNo;
    fwrite($file, $data);
    fclose($file);
}
$messages_from_file = loadMessagesFromFile();
$key = getRandomNo();
$full_text = $messages_from_file[$key];
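The seeding idea from the linked answer can also be done without the extra file at all; a minimal sketch (getDailyMessageKey is a hypothetical helper, not part of the original code):

function getDailyMessageKey(int $messageCount): int {
    // Seed the generator with today's date so every request draws
    // the same index until midnight
    mt_srand(crc32(date('Y-m-d')));
    return mt_rand(0, $messageCount - 1);
}

$messages_from_file = loadMessagesFromFile();
$key = getDailyMessageKey(count($messages_from_file));
$full_text = $messages_from_file[$key];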
I am trying to record hits to my website to a text file and I want to limit the size of the text file to some value. Once the limit is crossed I want a new file to be created dynamically. With my current code, a new file does get created after the set limit is passed, but the rest of the data is only stored on that new file.
public function storeActivityHitCount(Request $request)
{
    if ($request->ajax() && isset($request->data)) {
        $clientIP = request()->ip();
        $i = 1;
        $date = date("Y-m-d h:i:sa");
        $data = $clientIP . ', ' . $date . ', ' . $request->data;
        $file = public_path() . '/adminpanel/hits/activity/activity' . $i . '.txt';
        if (!file_exists($file)) {
            fopen($file, "w+");
        }
        $filesize = filesize($file);
        if ($filesize >= 76) {
            $i++;
            $file1 = public_path() . '/adminpanel/hits/activity/activity' . $i . '.txt';
            if (!file_exists($file1)) {
                fopen($file1, "w+");
            }
            $content = file_get_contents($file1);
            $content .= $data . PHP_EOL;
            $upload_success = file_put_contents($file1, $content);
        } else {
            $content = file_get_contents($file);
            $content .= $data . PHP_EOL;
            $upload_success = file_put_contents($file, $content);
        }
        if ($upload_success) {
            return Response::json(['status' => 'success'], 200);
        }
        return Response::json(['status' => 'failed'], 400);
    }
    abort(403);
}
Your original code, of course, was only trying one other file and then writing to it regardless of its size. You want to put that logic into a repeating structure; in other words, keep looking for a different file while you keep finding full ones.
public function storeActivityHitCount(Request $request)
{
    if ($request->ajax() && isset($request->data)) {
        $clientIP = request()->ip();
        $date = date('Y-m-d h:i:sa');
        $data = $clientIP . ', ' . $date . ', ' . $request->data;

        $i = 1;
        $file = public_path() . '/adminpanel/hits/activity/activity' . $i . '.txt';
        while (file_exists($file) && filesize($file) >= 76 && $i <= 20) {
            $i++;
            $file = public_path() . '/adminpanel/hits/activity/activity' . $i . '.txt';
        }
        if (file_exists($file) && filesize($file) >= 76) {
            // the loop got away from us
            // do something?
            $upload_success = false;
        } else {
            $upload_success = file_put_contents($file, $data . PHP_EOL, \FILE_APPEND);
        }
        if ($upload_success) {
            return Response::json(['status' => 'success'], 200);
        }
        return Response::json(['status' => 'failed'], 400);
    }
    abort(403);
}
I put an upper limit of 20 iterations on the loop; you usually don't want a while loop without some kind of escape mechanism.
file_put_contents will always create a file that doesn't exist, so you didn't need to use fopen (and you weren't calling fclose). Furthermore, if you pass the FILE_APPEND flag, it appends to the existing file; there is no need to read the contents and concatenate first.
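So the create-if-missing, read, concatenate, and write steps reduce to a single call:

// Creates the file if it is missing, appends otherwise
file_put_contents($file, $data . PHP_EOL, FILE_APPEND);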
Try to keep your code consistent and readable. Multiple commands on one line, inconsistent whitespace around control structures and operators, inconsistent indentation: all these things end up making you work harder than you have to.
And, of course, this would all be better handled by standard system tools like logrotate, which is probably already running on your server.
I would isolate the action of getting the activity log file, using a function like this:
function getAvailableActivityLogFile() {
    $i = 0;
    while (true) {
        $i++;
        $file_path = public_path() . "/adminpanel/hits/activity/activity{$i}.txt";
        // If the file does not exist, or is smaller than 76 bytes, we have the right candidate
        if (!file_exists($file_path) || filesize($file_path) < 76) {
            return fopen($file_path, 'a+');
        }
        // Otherwise keep looking on the next iteration. You could also add some logic
        // for a maximum number of iterations or a maximum number of files to look for.
    }
}
Then you can use this function to get the next available file without bothering with too much logic about which file is available. In my version above, you should use fwrite() to write to the file through the pointer returned by the function.
With the a+ mode, every fwrite to that pointer appends content to the file.
You could also change the function to return a path instead of a pointer.
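Usage could then look like this (a sketch; $data is the line built in the controller above):

$fp = getAvailableActivityLogFile();
fwrite($fp, $data . PHP_EOL); // appends, because the pointer was opened with 'a+'
fclose($fp);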
I have a process that writes a file using file_put_contents():
file_put_contents ( $file, $data, LOCK_EX );
I have added the LOCK_EX parameter to prevent concurrent processes from writing to the same file, and prevent trying to read it when it's still being written to.
I'm having difficulties testing this properly due to the concurrent nature, and I'm not sure how to approach this. I've got this so far:
if (file_exists($file)) {
    $fp = fopen($file, 'r+');
    if (!flock($fp, LOCK_EX|LOCK_NB, $wouldblock)) {
        if ($wouldblock) {
            // how can I wait until the file is unlocked?
        } else {
            // what other reasons could there be for not being able to lock?
        }
    }
    // does calling fclose automatically release all locks, even if a lock was not obtained above?
    fclose($fp);
}
Questions being:
Is there a way to wait until the file is not locked anymore, while keeping the option to give this a time limit?
Does fclose() automatically release all locks, even when another process had locked the file?
I wrote a small test that uses sleep() so that I could simulate concurrent read/write processes with a simple AJAX call. It seems this answers both questions:
when the file is locked, a sleep that approximates the estimated write duration, followed by another lock check, allows for waiting. This could even be put in a while loop with an interval.
fclose() does indeed not remove the lock held by a process that's already running, as confirmed in some of the answers.
PHP 5.5 and lower on Windows does not support the $wouldblock parameter, according to the docs.
I was able to test this on Windows + PHP 5.3 and concluded that the file_is_locked() from my test still worked in this scenario:
flock() would still return false (just without setting the $wouldblock parameter), so it would still be caught by my else check.
if (isset($_POST['action'])) {
    $file = 'file.txt';
    $fp = fopen($file, 'r+');
    if ($wouldblock = file_is_locked($fp)) {
        // wait and then try again
        sleep(5);
        $wouldblock = file_is_locked($fp);
    }
    switch ($_POST['action']) {
        case 'write':
            if ($wouldblock) {
                echo 'already writing';
            } else {
                flock($fp, LOCK_EX);
                fwrite($fp, 'yadayada');
                sleep(5);
                echo 'done writing';
            }
            break;
        case 'read':
            if ($wouldblock) {
                echo 'cant read, already writing';
            } else {
                echo fread($fp, filesize($file));
            }
            break;
    }
    fclose($fp);
    die();
}
function file_is_locked($fp) {
    if (!flock($fp, LOCK_EX|LOCK_NB, $wouldblock)) {
        if ($wouldblock) {
            return 'locked'; // file is locked
        } else {
            return 'no idea'; // can't lock for whatever reason (for example, Windows + PHP 5.3)
        }
    } else {
        return false;
    }
}
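For the first question (waiting with a time limit), a non-blocking retry loop is one option; this is a minimal sketch, not part of the original test:

/**
 * Try to obtain an exclusive lock on $fp, giving up after $timeoutSeconds.
 */
function wait_for_lock($fp, $timeoutSeconds = 10) {
    $deadline = microtime(true) + $timeoutSeconds;
    while (!flock($fp, LOCK_EX | LOCK_NB)) {
        if (microtime(true) >= $deadline) {
            return false; // still locked after the time limit
        }
        usleep(100000); // wait 100 ms before retrying
    }
    return true; // exclusive lock acquired
}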
I often use a small class... it is secure and fast: basically, you only write when you hold the exclusive lock on the file; otherwise you should wait until it is unlocked...
lock_file.php
<?php
/*
Reference Material
http://en.wikipedia.org/wiki/ACID
*/
class Exclusive_Lock {
    /* Public properties */
    public $filename;          // The file to be locked
    public $timeout = 30;      // The timeout value of the lock
    public $permission = 0755; // The permission value of the locked file

    /* Constructor */
    public function __construct($filename, $timeout = 1, $permission = null, $override = false) {
        // Append '.lck' extension to filename for the locking mechanism
        $this->filename = $filename . '.lck';
        // Timeout should be some factor greater than the maximum script execution time
        $temp = @get_cfg_var('max_execution_time');
        if ($temp === false || $override === true) {
            if ($timeout >= 1) $this->timeout = $timeout;
            set_time_limit($this->timeout);
        } else {
            if ($timeout < 1) $this->timeout = $temp;
            else $this->timeout = $timeout * $temp;
        }
        // Should some other permission value be necessary
        if (isset($permission)) $this->permission = $permission;
    }

    /* Methods */
    public function acquireLock() {
        // Create the lock file; the 'x' mode is used to detect a preexisting lock
        $fp = @fopen($this->filename, 'x');
        // If an error occurs, fail the lock
        if ($fp === false) return false;
        // If the permission set is unsuccessful, fail the lock
        if (!@chmod($this->filename, $this->permission)) return false;
        // If unable to write the timeout value, fail the lock
        if (false === @fwrite($fp, time() + intval($this->timeout))) return false;
        // If the lock file is successfully closed, the lock is valid
        return fclose($fp);
    }

    public function releaseLock() {
        // Delete the file with the extension '.lck'
        return @unlink($this->filename);
    }

    public function timeLock() {
        // Retrieve the contents of the lock file
        $timeout = @file_get_contents($this->filename);
        // If no contents retrieved, return error
        if ($timeout === false) return false;
        // Return the timeout value
        return intval($timeout);
    }
}
?>
Simple use as follows:
include("lock_file.php");
$file = new Exclusive_Lock("my_file.dat", 2);
if ($file->acquireLock()) {
    $data = fopen("my_file.dat", "w+");
    $read = "READ: YES";
    fwrite($data, $read);
    fclose($data);
    $file->releaseLock();
    chmod("my_file.dat", 0755);
    unset($data);
    unset($read);
}
If you want to add a more complex level you can use another trick: use while (1) to start an infinite loop that breaks only when the exclusive lock is acquired. This is not recommended, since it can block your server for an undefined time...
include("lock_file.php");
$file = new Exclusive_Lock("my_file.dat", 2);
while (1) {
if ($file->acquireLock()) {
$data = fopen("my_file.dat", "w+");
$read = "READ: YES";
fwrite($data, $read);
fclose($data);
$file->releaseLock();
chmod("my_file.dat", 0755);
unset($data);
unset($read);
break;
}
}
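A bounded variant of the same idea (a sketch along the same lines) caps the number of attempts instead of spinning forever:

include("lock_file.php");
$file = new Exclusive_Lock("my_file.dat", 2);
$acquired = false;
for ($attempt = 0; $attempt < 50; $attempt++) { // give up after 50 tries
    if ($file->acquireLock()) {
        $acquired = true;
        break;
    }
    usleep(100000); // wait 100 ms between attempts
}
if ($acquired) {
    file_put_contents("my_file.dat", "READ: YES");
    $file->releaseLock();
}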
file_put_contents() is very fast and writes directly to the file, but as you say it has a limit... a race condition exists and may happen even if you try to use LOCK_EX. I think a PHP class is more flexible and usable...
See this thread, which treats a similar question: php flock behaviour when file is locked by one process
The first question is answered here: How to detect the finish with file_put_contents() in php? Because PHP is single-threaded, the only solution is to use an extension of core PHP, pthreads; one good, simple article about it is https://www.mullie.eu/parallel-processing-multi-tasking-php/
The second question is answered here: Will flock'ed file be unlocked when the process die unexpectedly?
fclose() will only unlock a valid handle that was opened using fopen() or fsockopen(), so if the handle is still valid, yes, it will close the file and release the lock.
Here is a fix for @Alessandro's answer so that it works correctly and does not lock the file forever.
lock_file.php
<?php
/*
Reference Material
http://en.wikipedia.org/wiki/ACID
*/
class Exclusive_Lock {
    /* Public properties */
    public $filename;          // The file to be locked
    public $timeout = 30;      // The timeout value of the lock
    public $permission = 0755; // The permission value of the locked file

    /* Constructor */
    public function __construct($filename, $timeout = 1, $permission = null, $override = false) {
        // Append '.lck' extension to filename for the locking mechanism
        $this->filename = $filename . '.lck';
        // Timeout should be some factor greater than the maximum script execution time
        $temp = @get_cfg_var('max_execution_time');
        if ($temp === false || $override === true) {
            if ($timeout >= 1) $this->timeout = $timeout;
            set_time_limit($this->timeout);
        } else {
            if ($timeout < 1) $this->timeout = $temp;
            else $this->timeout = $timeout;
        }
        // Should some other permission value be necessary
        if (isset($permission)) $this->permission = $permission;
        // Clear any stale lock left behind by a crashed process
        if ($this->timeLock()) {
            $this->releaseLock();
        }
    }

    /* Methods */
    public function acquireLock() {
        // Create the lock file; the 'x' mode is used to detect a preexisting lock
        $fp = @fopen($this->filename, 'x');
        // If an error occurs, fail the lock
        if ($fp === false) return false;
        // If the permission set is unsuccessful, fail the lock
        if (!@chmod($this->filename, $this->permission)) return false;
        // If unable to write the timeout value, fail the lock
        if (false === @fwrite($fp, time() + intval($this->timeout))) return false;
        // If the lock file is successfully closed, the lock is valid
        return fclose($fp);
    }

    public function releaseLock() {
        // Delete the file with the extension '.lck'
        return @unlink($this->filename);
    }

    private function timeLock() {
        // Retrieve the contents of the lock file
        $timeout = @file_get_contents($this->filename);
        // If no contents were retrieved, treat the lock as expired
        if ($timeout === false) return true;
        // The lock is stale once its stored expiry time has passed
        return (intval($timeout) < time());
    }
}
?>
Use as follows:
include("lock_file.php");
$file = new Exclusive_Lock("my_file.dat", 2);
if ($file->acquireLock()) {
    $data = fopen("my_file.dat", "w+");
    $read = "READ: YES";
    fwrite($data, $read);
    fclose($data);
    $file->releaseLock();
    chmod("my_file.dat", 0755);
    unset($data);
    unset($read);
}
Hope that saves someone else some time.
I have been using a basic caching system on my site based on this link.
It has so far worked well for everything I want to do.
$cachefile = 'cache/' . basename($_SERVER['QUERY_STRING']) . '.html';
$cachetime = 1440 * 60; // 24 hours

if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile))) {
    include($cachefile);
    echo "<!-- Cached " . date('jS F Y H:i', filemtime($cachefile)) . " -->";
    exit;
}
ob_start();
// My html/php code here
$fp = fopen($cachefile, 'w');   // open the cache file for writing
fwrite($fp, ob_get_contents()); // save the contents of the output buffer to the file
fclose($fp);                    // close
ob_end_flush();                 // send to browser
However, I have a couple of pages with more detailed MySQL queries. I have spent a fair bit of time optimising them, yet one still takes about 10 seconds to run when I query it in MySQL, and even longer on the website. Sometimes it seems to time out, as I get the message below.
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the request GET http://www.example.com
Reason: Error reading from remote server
This isn't a huge issue because, with the caching system above, only the first person to click on the page each day gets the delay; the rest of the time users get the cached page, so it is actually quite fast for them.
I want to save myself from having to be the first person each day to go to the page, and automate this process so that at 17:00 (server time) each day the file gets written to the cache.
How would I best achieve this?
I suggest you use PHP Speedy, or this may help:
<?php
function getUrl() {
    if (isset($_SERVER['REQUEST_URI'])) {
        $url = $_SERVER['REQUEST_URI'];
    } else {
        $url = $_SERVER['SCRIPT_NAME'];
        $url .= (!empty($_SERVER['QUERY_STRING'])) ? '?' . $_SERVER['QUERY_STRING'] : '';
    }
    return $url;
}
// getUrl returns the queried page including the query string

function cache($buffer) { // the page's content is $buffer
    $url = getUrl();
    $filename = md5($url) . '.cache';
    $data = time() . '¦' . $buffer;
    $filew = fopen("cache/" . $filename, 'w');
    fwrite($filew, $data);
    fclose($filew);
    return $buffer;
}

function display() {
    $url = getUrl();
    $filename = md5($url) . '.cache';
    if (!file_exists("cache/" . $filename)) {
        return false;
    }
    $filer = fopen("cache/" . $filename, 'r');
    $data = fread($filer, filesize("cache/" . $filename));
    fclose($filer);
    $content = explode('¦', $data, 2);
    if (count($content) != 2 OR !is_numeric($content[0])) {
        return false;
    }
    if (time() - (100) > $content[0]) { // 100 seconds is the cache time here!!!
        return false;
    }
    echo $content[1];
    die();
}

// Display the cache (if any)
display(); // if a cached copy is displayed, die() ends the program here

// If there is no cache, register the cache callback
ob_start('cache');
?>
Just include this script anywhere you need caching and set up a cron job to run it automatically.
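The same idea can warm the cache at a fixed hour: have cron request the page once at 17:00, so the first human visitor always gets a cached copy. A sketch with hypothetical URL and path:

<?php
// warm_cache.php — crontab entry: 0 17 * * * php /path/to/warm_cache.php
$pages = ['http://www.example.com/slow-report.php']; // hypothetical slow pages
foreach ($pages as $url) {
    // Fetch and discard the output; the side effect is that the cache file is rewritten
    @file_get_contents($url);
}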
I am looking for the best solution for caching my web pages, for example http://www.website.com/test.php?d=2011-11-01, which a URL rewrite rule turns into http://www.website.com/testd-2011-11-01.html.
The script below does not work for dynamic web pages: it gives the same page regardless of the query.
<?php
$cachefile = "cache/" . $reqfilename . ".html";
$cachetime = 240 * 60; // 4 hours

// Serve from the cache if it is younger than $cachetime
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile))) {
    include($cachefile);
    echo "<!-- Cached " . date('jS F Y H:i', filemtime($cachefile)) . " -->";
    exit;
}
ob_start(); // start the output buffer
?>
my website content here
<?php
// open the cache file for writing
$fp = fopen($cachefile, 'w');
// save the contents of the output buffer to the file
fwrite($fp, ob_get_contents());
// close the file
fclose($fp);
// Send the output to the browser
ob_end_flush();
?>
If your URL looks like testd-2011-11-01.html, you have two possible solutions:
Use a RewriteRule, so that the URL is rewritten to test.php?d=2011-11-01; then your test.php script can deal with cache generation / invalidation.
Or use a cronjob that regenerates the testd-2011-11-01.html static file every X minutes.
The first solution is the one that's generally used, as it only requires you to set up a RewriteRule (and those are often available, even on cheap hosting services).
The second solution might be a bit better for performance (no PHP code is ever executed, except when the cronjob runs); but the difference is probably not that important, unless you have a very big website with an awful lot of users.
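Whichever you pick, note that the script in the question caches every request under the same file name; the cache key must include the query. A sketch of the fix inside test.php (assuming a writable cache/ directory):

// Derive the cache file name from the (sanitized) query parameter,
// so each date gets its own cache file
$d = preg_replace('/[^0-9-]/', '', isset($_GET['d']) ? $_GET['d'] : '');
$cachefile = 'cache/testd-' . $d . '.html';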
Something like this could work:
class SimpleCache {
    private $_cacheDir = '/path/to/cache';
    private $_cacheTime = 240 * 60;

    public function isCached($id) {
        $cacheFilename = $this->_cacheDir . "/" . $id . "_*";
        $files = glob($cacheFilename, GLOB_NOSORT);
        if (!empty($files)) {
            // There should always be one file in the array
            $filename = $files[0];
            $params = explode("_", $filename);
            $cacheTime = strtok($params[1], '.');
            // Check if the cached file is too old
            if (time() - $cacheTime > $this->_cacheTime) {
                @unlink($filename);
            } else {
                return $filename;
            }
        }
        return false;
    }

    public function cache($id, $data) {
        $filename = $this->_cacheDir . "/" . $id . "_" . time() . ".cache";
        if (!($fp = @fopen($filename, "w"))) {
            return false;
        }
        if (!@fwrite($fp, $data)) {
            @fclose($fp);
            return false;
        }
        @fclose($fp);
        return true;
    }
}
$cache = new SimpleCache();
if (($cacheFile = $cache->isCached($reqfilename)) !== false) {
    $buffer = file_get_contents($cacheFile);
} else {
    // Produce the contents of the page, save them in the $buffer variable
    $cache->cache($reqfilename, $buffer);
}
echo $buffer;
But you could also use memcached, APC, or more advanced caching techniques if you are up to it.
You can use HTTP caching. You send headers telling the client (browser) to cache the whole page for a certain period of time.
// 5 minutes
header('Cache-Control: max-age=300');
If you have control over your hosting environment, you can also add a reverse proxy like varnish or nginx in front of your webserver. This proxy will then cache these requests for you, making the cached version shared between all visitors of your site.
See also the HTTP/1.1 specification.