I used the following code, but it is taking time. I want to cache the result without storing it in a text file.
$file = 'cache_toppers.txt';
$expire = 3600; // cache lifetime in seconds (example value; define as appropriate)
if (file_exists($file) && filemtime($file) > (time() - $expire)) {
    $records = unserialize(file_get_contents($file));
} else {
    include("kalvidbconnect.php");
    $query = "SELECT * FROM vpfmsttoppers";
    $result = mysql_query($query) or die(mysql_error());
    while ($record = mysql_fetch_array($result)) {
        $records[] = $record;
    }
    $OUTPUT = serialize($records);
    $fp = fopen($file, "w");
    fputs($fp, $OUTPUT);
    fclose($fp);
}
Thanks,
Kamatchi.D
There are some ready-to-use PHP extensions that provide cache functionality. Some of them:
memcache http://pecl.php.net/package/memcache
APC http://php.net/manual/en/book.apc.php
eAccelerator
XCache
These are the ones I know of, but there are surely many more.
Just a thought, and I'm not sure it fits, but how about using CouchDB?
Here is a good tutorial on IBM developerWorks: http://www.ibm.com/developerworks/opensource/library/os-php-couchdb/index.html?ca=drs-
If you don't want to use file-based caching, then one option is to build a wrapper and store the data in shared memory:
http://se.php.net/manual/en/ref.sem.php
Maybe APC uses the same technique under the hood, I don't know, but if you don't want to install PECL extensions then building your own cache handling might be an option.
I would however consider caching rendered content to file, since that would put the least amount of load on the server.
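If you do go the shared-memory route, a minimal sketch using PHP's System V shared memory functions (sysvshm extension) might look like this; the segment size, variable slots and expiry value are all assumptions:

define('CACHE_VAR_DATA', 1); // slot for the cached payload (arbitrary key)
define('CACHE_VAR_TIME', 2); // slot for the timestamp (arbitrary key)

function shm_cache_get($expire = 300) {
    $shm = shm_attach(ftok(__FILE__, 'c'), 1024 * 1024); // attach a 1 MB segment
    if (shm_has_var($shm, CACHE_VAR_TIME)
        && shm_get_var($shm, CACHE_VAR_TIME) > time() - $expire) {
        $data = shm_get_var($shm, CACHE_VAR_DATA);
        shm_detach($shm);
        return $data;
    }
    shm_detach($shm);
    return false; // cache miss or expired
}

function shm_cache_set($data) {
    $shm = shm_attach(ftok(__FILE__, 'c'), 1024 * 1024);
    shm_put_var($shm, CACHE_VAR_DATA, $data); // values are serialized automatically
    shm_put_var($shm, CACHE_VAR_TIME, time());
    shm_detach($shm);
}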
Depending on a very long list of factors, I'd typically expect trying to unserialize the file to take longer than loading it fresh from the database.
Well, use a cache then, for example APC: apc_store() / apc_fetch().
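Applied to the query from the question, a hedged sketch might look like this (the cache key and the 600-second lifetime are just example values):

$records = apc_fetch('toppers'); // returns false on a cache miss
if ($records === false) {
    include("kalvidbconnect.php");
    $result = mysql_query("SELECT * FROM vpfmsttoppers") or die(mysql_error());
    $records = array();
    while ($record = mysql_fetch_array($result)) {
        $records[] = $record;
    }
    apc_store('toppers', $records, 600); // keep in shared memory for 10 minutes
}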
I would like to use exclusive/shared locks, both blocking and non-blocking, like those provided by flock(). How can this be achieved using semaphores?
It depends on the use case.
If you are on only one server, use a lock file.
function doSomething() {
    $file = '/temp/path/something.lock';
    if (file_exists($file)) {
        return false;
    }
    touch($file);
    // Safely mess with things
    unlink($file);
}
If you have multiple web servers, e.g. behind a load balancer, the same thing can be accomplished by using a table in MySQL.
function doSomething($link) { // $link: an existing mysqli connection
    $query = "SELECT * FROM locks WHERE name='something'";
    $res = mysqli_query($link, $query);
    if (mysqli_num_rows($res) > 0) {
        return false;
    }
    $query = "INSERT INTO locks (name) VALUES ('something')";
    mysqli_query($link, $query);
    // Safely mess with things
    $query = "DELETE FROM locks WHERE name='something'";
    mysqli_query($link, $query);
}
Memcache is another obvious candidate with multiple machine support.
You should not use APC, because it is intended for caching only. This means you have no control over when APC storage gets evicted; it could happen at any time.
You can also use semaphores, however the same caveats as with lock files apply if you use more than one server.
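On a single server, a blocking exclusive lock with System V semaphores (sysvsem extension) can be as simple as the sketch below; the ftok() path is an assumption, and shared or non-blocking semantics would need extra work on top of this:

$sem = sem_get(ftok(__FILE__, 'l'), 1); // max_acquire = 1, i.e. exclusive

if (sem_acquire($sem)) { // blocks until the semaphore is free
    // Safely mess with things
    sem_release($sem);
}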
I would recommend creating lock($key), is_locked($key) and release($key) functions, and then religiously use them throughout your project. That way you can start with lock files (quick to implement), but then upgrade to something better later without editing the rest of your code. If you want to get really fancy, you can implement them as methods of a logger object that you put in a known location of your code.
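A minimal file-based sketch of those three functions might look like this (the lock directory is an assumption and must already exist, and the same race-condition caveats as with the plain lock file apply):

define('LOCK_DIR', '/tmp/mylocks'); // assumed, pre-created lock directory

function lock($key) {
    if (is_locked($key)) {
        return false;
    }
    return touch(LOCK_DIR . '/' . md5($key) . '.lock');
}

function is_locked($key) {
    return file_exists(LOCK_DIR . '/' . md5($key) . '.lock');
}

function release($key) {
    return @unlink(LOCK_DIR . '/' . md5($key) . '.lock');
}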
I am looking to cache variables in PHP from a JSON file. Does anyone know of a good tutorial, or can anyone provide an example?
Save variable to file cache:
file_put_contents('cache.txt', json_encode($variable));
Read cache into variable:
$variable = json_decode(file_get_contents('cache.txt'));
Memcached is your best bet. It will save any serializable data in a very fast cache. You can find a tutorial at:
http://php.net/manual/en/memcache.examples-overview.php
It is lightning quick and has many other features that make it better than just saving a txt file to the server.
$memcache->set('key', $jsonstring, false, 10)
and
$get_result = $memcache->get('key');
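Put together, a minimal end-to-end sketch might look like this; the host, port, key name, data file and 10-second lifetime are all assumptions:

$memcache = new Memcache();
$memcache->connect('localhost', 11211) or die('Could not connect to memcached');

$jsonstring = file_get_contents('data.json');  // raw JSON read from the (assumed) file
$memcache->set('key', $jsonstring, false, 10); // flag = false (no compression), 10-second TTL

$get_result = $memcache->get('key');           // false if the key expired or was evicted
$variable = json_decode($get_result);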
A simple approach is:
function getMyJson()
{
    $data = apc_fetch('my_json', $wasCached);
    if ($wasCached) {
        return $data;
    }
    $data = json_decode(file_get_contents('/path/to/data.json'));
    apc_store('my_json', $data);
    return $data;
}
This uses APC's cache but you could work similarly with memcached, redis etc.
How can I benchmark certain pieces of code in PHP? I can use timers to calculate the differences; I'm just not sure if that is the best solution out there.
Have a look at the Xdebug profiler to benchmark the performance and more.
Xdebug's Profiler is a powerful tool that gives you the ability to analyze your PHP code and determine bottlenecks or generally see which parts of your code are slow and could use a speed boost.
You can use a profiler like the one built into Xdebug.
Xdebug is cool, but if you don't want to install that extension, you could try the following.
What I use to locate possible bottlenecks is:
$benchmark_start = microtime(true);
// Code goes here
$benchmark_stop = microtime(true);
$benchmark_total = $benchmark_stop - $benchmark_start;
echo "The script took ". $benchmark_total." seconds";
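If you want a more stable number for a small snippet, a hedged variation (requires PHP 5.3+ for the anonymous function; the iteration count is arbitrary) is to run it many times and average:

function benchmark($callback, $iterations = 1000) {
    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) {
        $callback();
    }
    return (microtime(true) - $start) / $iterations; // average seconds per call
}

echo benchmark(function () {
    // Code goes here
}) . " seconds per iteration";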
A slightly more sophisticated example of manual profiling using timers works perfectly for me, especially when I am asked to sort things out on some live server with FTP access only.
Needless to say, profiling is far more important (and useful) on a live server than on a hothouse developer's PC.
$TIMER['start'] = microtime(TRUE);
// some code
$query = "SELECT ...";
$TIMER['before q'] = microtime(TRUE);
$res = mysql_query($query);
$TIMER['after q'] = microtime(TRUE);
while ($row = mysql_fetch_array($res)) {
    // some code
}
$TIMER['array filled'] = microtime(TRUE);
// some code
$TIMER['pagination'] = microtime(TRUE);
if ('127.0.0.1' === $_SERVER['REMOTE_ADDR']) { // I set my IP here
    echo "<table border=1><tr><td>name</td><td>so far</td><td>delta</td><td>per cent</td></tr>";
    reset($TIMER);
    $start = $prev = current($TIMER);
    $total = end($TIMER) - $start;
    foreach ($TIMER as $name => $value) {
        $sofar = round($value - $start, 3);
        $delta = round($value - $prev, 3);
        $percent = round($delta / $total * 100);
        echo "<tr><td>$name</td><td>$sofar</td><td>$delta</td><td>$percent</td></tr>";
        $prev = $value;
    }
    echo "</table>";
}
The server side is PHP + Zend Framework.
Problem:
I have a large amount of data, approximately 5000 records with 5 columns each, in an input.txt file.
I would like to read all of the data into memory only once and send some of it with every browser request.
But if I update that input.txt file, the updated data must be automatically synchronized into that memory location.
So I need to solve this with a memory caching technique. A cache has an expiry time, but if input.txt is updated before the cache expires, I still need the memory copy to be synchronized automatically.
I am currently using Zend Framework 1.10. Is this possible with Zend Framework?
Can anybody give me a few lines of Zend Framework code?
I have no option to use a (distributed) memcached server.
Only Zend Framework.
It is possible to cache something like that using zend framework.
Check the Zend documentation online - it's not complete, but it can give you a head start:
http://framework.zend.com/manual/en/zend.cache.introduction.html
Use lazy loading like this (a 1-hour cache lifetime is usually OK).
function getData() {
    $cache = ...;                         // get your memory cache here
    $cacheId = 'MyCacheId';               // cache id for your data
    $loadTimeCacheId = 'dataLoadCacheId'; // cache id for data load timestamp
    $cacheLength = 3600;                  // seconds
    $data = $cache->load($cacheId);
    $loadTime = (int) $cache->load($loadTimeCacheId);
    if (!$data || filemtime('/path/to/your/file') > $loadTime) {
        $data = ...; // real loading here
        $cache->save($data, $cacheId, array(/*no tags*/), $cacheLength);          // save data to cache
        $cache->save(time(), $loadTimeCacheId, array(/*no tags*/), $cacheLength); // save load timestamp
    }
    return $data;
}
The best option is to use Zend_Cache_Frontend_File pointed at your file and Zend_Cache_Backend_Memcached. There is virtually no other way to store anything in memory than Memcache or APC; it cannot be done without an external extension IMO.
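A hedged sketch of wiring that up in ZF1 (the file path, server details, cache id and lifetime are assumptions):

// The File frontend invalidates the cache whenever the master file changes.
$frontendOptions = array(
    'master_file'             => '/path/to/input.txt',
    'automatic_serialization' => true,
    'lifetime'                => 3600,
);
$backendOptions = array(
    'servers' => array(
        array('host' => 'localhost', 'port' => 11211),
    ),
);
$cache = Zend_Cache::factory('File', 'Memcached', $frontendOptions, $backendOptions);

$data = $cache->load('input_data');
if ($data === false) {
    $data = file('/path/to/input.txt'); // re-parse the file on a miss
    $cache->save($data, 'input_data');
}

If a memcached server really is not an option, the 'Apc' backend (Zend_Cache_Backend_Apc) can be swapped in with the same frontend.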
I wanted to know if there is a way to log the MySQL queries CakePHP executes when we use the find method on the models. I know that Rails logs database queries; does Cake do the same, and if so, how can I enable or use it?
Shiv
This page has instructions on how to get Cake to log queries the same way as Rails.
A very simple method to log all the queries being executed:
in your cake\libs\model\datasources\dbo\dbo_mysql.php
find the _execute function:
function _execute($sql) {
    return mysql_query($sql, $this->connection);
}
add the line "$this->log($sql);" before "return mysql_query($sql, $this->connection);"
function _execute($sql) {
    $this->log($sql);
    return mysql_query($sql, $this->connection);
}
That's it! All your SQL queries get logged. Make sure the log file is configured properly and has sufficient permissions. Enjoy.
Assuming you are on a *nix OS, the best approach would actually be to tail the MySQL log itself.
You might learn some interesting things out of it.
The log location in Ubuntu, when MySQL is installed from the repository:
tail -f /var/log/mysql/mysql.log
Note that this is a huge performance killer (well, all logging has some performance impact), so make sure you use it only on your dev/QA machines, and only for short periods on your production machine.
CakePHP 1.3 uses the sql_dump element for that.
You can use the element directly when Configure::read('debug') is set to 2:
echo $this->element('sql_dump');
Or take its code directly if you need to do something else with it (like echo it from a ShellTask):
$sources = ConnectionManager::sourceList();
$logs = array();
foreach ($sources as $source):
    $db =& ConnectionManager::getDataSource($source);
    if (!$db->isInterfaceSupported('getLog')):
        continue;
    endif;
    $logs[$source] = $db->getLog();
endforeach;
Echo with e.g.:
print_r($logs)
This is what I use (put it in the elements folder, then include it in your layout):
<?php
ob_start();
echo $this->element('sql_dump');
$out = ob_get_contents();
ob_end_clean();
CakeLog::write('mysql' , $out);
?>
Then you will find the mysql.log file at TMP . 'logs' . DS . 'mysql.log' (i.e. app/tmp/logs/mysql.log).