I have a page that loads quite a lot of data from my db, and I'd like to speed up its loading time. I already cache the query, but the loading time is still longer than I'd like it to be.
Is it possible to render a table with the data and store it in a session, to be loaded on every page refresh? I was even thinking of writing it to an external text file using ob_start().
What's the best way to handle this?
Storing it in sessions is probably not the best idea. When you add data to a session, the data is (by default) written to a file on the OS, usually in /tmp/, which means you're going to be hitting the disk quite a lot and storing just as much data.
If the data is not user-specific, you could store it on disk or in memory (see php.net/apc).
If the data is user-specific, I recommend storing it in a distributed cache such as Memcached (memcached.org). PHP has an extension you can enable (php.net/memcached).
(By user-specific I mean data like a user's transactions, items, shopping cart, etc.)
The logic is basically the same for any method you choose:
Memcached, user-specific data example:
<?php
$memcached = new Memcached();
$memcached->addServer('localhost', 11211);
$data = $memcached->get('shopping-cart_' . $user_id);
if (!$data) {
    $result = $db->query("..");
    $data = array();
    while ($row = $result->fetch_assoc()) {
        $data[] = $row;
    }
    $memcached->set('shopping-cart_' . $user_id, $data);
}
?>
<table>
<?php
foreach ($data as $item) {
    echo '<tr><td>' . $item['name'] . '</td></tr>';
}
?>
</table>
Global data (not user-specific):
<?php
$cache_file = '/cached/pricing-structure.something';
if (file_exists($cache_file)) {
    // serve the cached copy
    readfile($cache_file);
} else {
    // heavy query here
    $h = fopen($cache_file, 'w+');
    fwrite($h, $data_from_query);
    fclose($h);
    echo $data_from_query;
}
If you are doing caching on a single web server (as opposed to multiple distributed servers), PHP APC is a very easy-to-use solution. It is an in-memory cache, similar to memcache, but runs locally. I would avoid storing any significant amount of data in session.
APC is a PECL extension and can be installed with the command
pecl install apc
You may need to enable it in your php.ini.
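For example, a minimal sketch of caching a query result in APC's user cache; the key name, the one-hour TTL and the run_heavy_query() helper are placeholders, not part of this answer:
<?php
// minimal APC user-cache sketch (assumes the apc extension is installed and enabled)
$key = 'pricing_structure';          // placeholder cache key
$data = apc_fetch($key, $success);
if (!$success) {
    $data = run_heavy_query();       // placeholder for your heavy DB query
    apc_store($key, $data, 3600);    // keep the result for one hour
}
// use $data as usual
?>
Since APC keeps the data in the server's shared memory, it survives across requests but is lost when the web server restarts.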
I've been searching for a suitable PHP caching method for MSSQL results.
Most of the examples I can find suggest storing the results in an array, which would then get included in the page. This seems great, unless a request for the content is made at the same time as it is being updated/rebuilt.
I was hoping to find something similar to ASP's application-level variables, but as far as I'm aware PHP doesn't offer this functionality?
The problem I'm facing is that I need to perform 6 queries per page to populate dropdown boxes, and this happens on the vast majority of pages. It's also not an option to combine the queries. The cached data will also need to be rebuilt sporadically, when the system changes; this could be once a day, once a week or once a month. Any advice will be greatly received, thanks!
You can use a Redis server and the phpredis PHP extension to cache results fetched from the database:
$redis = new Redis();
$redis->connect('/tmp/redis.sock');

$sql = "SELECT something FROM sometable WHERE condition";
$sql_hash = md5($sql);
$redis_key = "dbcache:{$sql_hash}";
$ttl = 3600; // values expire in 1 hour

if ($result = $redis->get($redis_key)) {
    $result = json_decode($result, true);
} else {
    $result = Db::fetchArray($sql);
    $redis->setex($redis_key, $ttl, json_encode($result));
}
(Error checks skipped for clarity)
I have an app with over 500k members, and every user's device fetches data from the server every hour.
All of my data is stored in a PHP file and the device gets it as JSON.
This is my simple PHP file:
<?php
$response = array();
header('Content-type: application/json');
$response["AppInf"] = array();
$product = array();
$product["apptitle"] = "string1";
$product["apps"] = "string2";
$product["apps2"] = "string3";
$product["apps4"] = "string4";
$product["idapp"] = "stringid";
array_push($response["AppInf"], $product);
$response["success"] = 1;
echo json_encode($response);
?>
But when more than 15k users access the server, my CPU load grows to 100%.
I have a good VPS with 64 GB of RAM and a Xeon CPU.
Can anyone help me manage and fix this problem?
If your content is really static, as in your example, store it in a static file and serve that with caching (see the sketch after these points). If the content is the same for at least a group of users, you only have to compute the desired result once and store the data for later retrieval.
Consider using a reverse proxy like Varnish to move load from your web server to another server.
If possible, don't let all users fetch data at the same time: add some random offset to the time at which the data is pulled.
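A minimal sketch of the first point, assuming a generator script run from cron (or whenever the data changes) and an assumed output path of /var/www/cache/app_info.json:
<?php
// build_app_info.php - run from cron or whenever the data changes (assumption)
$response = array();
$response["AppInf"] = array();
$product = array();
$product["apptitle"] = "string1";
$product["apps"] = "string2";
$product["apps2"] = "string3";
$product["apps4"] = "string4";
$product["idapp"] = "stringid";
array_push($response["AppInf"], $product);
$response["success"] = 1;
// write the encoded response once to a static file
file_put_contents('/var/www/cache/app_info.json', json_encode($response));
?>
The endpoint the devices call then only reads that file (or, better, point the devices at the file directly so the web server serves it without invoking PHP at all):
<?php
header('Content-type: application/json');
readfile('/var/www/cache/app_info.json');
?>
The random offset from the third point has to be added on the client side, so it is not shown here.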
I'm running a PHP script that pulls data from a MySQL table. I will be running this on a frequently visited server and would like to keep the data in cache for X amount of time, so that once it is pulled, the data stays saved on the server and is reused until that time has passed. Here's the script:
<?php
include('mysql_connection.php');
$c = mysqlConnect();
$locale = $_GET['locale'];
$last_news_id = (int)$_GET['news_id']; // cast to int so it is safe to embed in the query
sendQuery("set character_set_results='utf8'");
sendQuery("set collation_connection='utf8_general_ci'");
if (strcmp($locale, "ru") != 0)
    $locale = "en";
$result = sendQuery("SELECT * FROM news WHERE id > ".$last_news_id." and locale = '".$locale."' ORDER BY id DESC LIMIT 10");
echo '<table width="100%">';
while ($row = mysqli_fetch_array($result, MYSQLI_NUM))
{
    echo '<tr><td width="100%"><b>Date: </b>'.$row[2].'</td></tr>';
    echo '<tr><td width="100%">'.preg_replace('/#([^#]*)#(.*)/', ' $1', $row[3]).'</td></tr>';
    echo '<tr><td width="100%"><hr style="height: 2px; border: none; background: #515151;"></td></tr>';
}
echo '</table>';
mysqliClose($c);
?>
What php functions to use to cache the data? What are the best methods? Thank you!
You can use PHP Memcache:
Just add this code to your script after the sendQuery() call to store the result in the cache, like below:
$memcache_obj = memcache_connect('memcache_host', 11211);
// $result must be an array of fetched rows; a mysqli result resource cannot be cached directly
memcache_set($memcache_obj, 'var_key', $result, 0, 30); // expire after 30 seconds
print_r(memcache_get($memcache_obj, 'var_key'));
The two go-to solutions are APC and Memcache. The former is also an opcache and the latter can be distributed. Pick what suits you best.
As a matter of fact, your data is already saved on the server.
And such a query should be pretty fast.
So it seems that caching is unnecessary here, especially if you're experiencing no load problems and are only doing it just in case.
APC/Memcached can be used, and generally are used, for this type of thing. You have to be aware, though, of the problems that can arise from this approach: managing new inserts/updates and so on. As long as you don't really care about that, you can set an arbitrary interval after which the data expires, but if the information is really relevant to your application, then this approach will not work.
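For example, a minimal sketch of the expiry-interval approach with the Memcached extension, applied to the script above; the server address, the key name, the 10-minute TTL and the fetch_news_rows() helper are assumptions:
<?php
$memcached = new Memcached();
$memcached->addServer('localhost', 11211);

$key = 'news:' . $locale . ':' . $last_news_id;
$rows = $memcached->get($key);
if ($memcached->getResultCode() === Memcached::RES_NOTFOUND) {
    $rows = fetch_news_rows($locale, $last_news_id); // placeholder wrapping the SELECT from the script above
    $memcached->set($key, $rows, 600);               // expire after 10 minutes
}
// render the table from $rows as before
?>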
Also, MySQL already caches SELECTs whose tables have not been modified between two requests. So basically, if you run a query now and the exact same query in 10 minutes, and nothing in the table has changed, you will get the result from MySQL's query cache. There is still the overhead of issuing the request and receiving the data, but it is extremely fast. This approach has no problem with updates/deletes, because whenever a record in the table is modified, the associated query cache entries are erased, so you will always see modifications as they happen.
I'm developing on GAE using Resin, and it seems that my PHP session on the production site is short-lived and doesn't get refreshed (i.e., making requests doesn't seem to extend its expiry period). Locally it's fine: as long as I don't close the tab, the session persists.
Any pointers on this? My users are getting frustrated as they are kicked out very frequently :(
I think the code is the best tutorial :)
// global mem cache service handle
$MEM_CACHE_SERVICE = NULL;
// table to store session-like information
$MY_SESSION_TABLE = array();

function load_mcache($key) {
    global $MEM_CACHE_SERVICE;
    if (!$MEM_CACHE_SERVICE) {
        // Quercus (Resin) syntax for importing the App Engine Java classes
        import com.google.appengine.api.memcache.MemcacheServiceFactory;
        import com.google.appengine.api.memcache.Expiration;
        $MEM_CACHE_SERVICE = MemcacheServiceFactory::getMemcacheService();
    }
    return $MEM_CACHE_SERVICE->get($key);
}

function save_mcache($key, $value, $cache_time) {
    global $MEM_CACHE_SERVICE;
    if (!$MEM_CACHE_SERVICE) {
        import com.google.appengine.api.memcache.MemcacheServiceFactory;
        import com.google.appengine.api.memcache.Expiration;
        $MEM_CACHE_SERVICE = MemcacheServiceFactory::getMemcacheService();
    }
    $expiration = Expiration::byDeltaSeconds($cache_time);
    return $MEM_CACHE_SERVICE->put($key, $value, $expiration);
}
// unserialize the array from memcache
// if nothing is found (first load, or after the minute has passed), repopulate the table
if (!($MY_SESSION_TABLE = unserialize(load_mcache($_REQUEST['JSESSIONID'])))) {
    $MY_SESSION_TABLE = array();
    // save something to the cache on first page load because we didn't have anything
    $MY_SESSION_TABLE['key1'] = date('m/d/Y H:i:s');
    // using the jsessionid as the memcache key, serializing the array and setting cache time to one minute
    save_mcache($_REQUEST['JSESSIONID'], serialize($MY_SESSION_TABLE), 60);
}
// now my session table is available for a minute until it is initialized again
print_r($MY_SESSION_TABLE);
Now, for proper session functionality you would add set and get methods, or even better a small class to handle it; a minimal sketch follows. With a little abstraction you could choose which session mechanism to use with the same library in different web app scenarios.
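A minimal sketch of such a class on top of the load_mcache()/save_mcache() helpers above; the one-minute default TTL and using JSESSIONID as the key are assumptions:
<?php
class McacheSession {
    private $id;
    private $ttl;
    private $data = array();

    public function __construct($id, $ttl = 60) {
        $this->id = $id;
        $this->ttl = $ttl;
        $raw = load_mcache($this->id);
        if ($raw && is_array($cached = unserialize($raw))) {
            $this->data = $cached;
        }
    }

    public function get($key, $default = null) {
        return isset($this->data[$key]) ? $this->data[$key] : $default;
    }

    public function set($key, $value) {
        $this->data[$key] = $value;
        // write back on every set so the expiry window is refreshed
        save_mcache($this->id, serialize($this->data), $this->ttl);
    }
}

// usage
$session = new McacheSession($_REQUEST['JSESSIONID']);
$session->set('last_seen', date('m/d/Y H:i:s'));
echo $session->get('last_seen');
?>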
I'm attempting to implement APC caching as a datastore in my web application.
Currently, the system retrieves data directly from the MySQL database and requires a database call per request.
I'm trying to change this by prepopulating the cache with data, which is then intercepted and served from the cache on each request.
Here's the current method:
if (!empty($_GET['id'])) {
    $app = $db->real_escape_string($_GET['id']);
    $result = $db->query("SELECT * FROM pages_content WHERE id = $app");
    $rawdata = $result->fetch_assoc();
}
The data is then output with:
$title = stripslashes($rawdata['title']);
$meta['keywords'] = stripslashes($rawdata['htmlkeywords']);
$meta['description'] = stripslashes($rawdata['htmldesc']);
$subs = stripslashes($rawdata['subs']);
$pagecontent = "<article>".stripslashes($rawdata['content'])."</article>";
What I need the prepopulating script to do is, for each row in the data table, cache that row's data. The serving script would then be able to pluck the data from the cache as and when required, using something such as apc_fetch('[columnname][id]').
How could I devise this?
I assume I'd need to serialize the data?
I don't really know your cache schema, so this reply can't be exact.
First: remember that APC uses the server's shared memory, so if you have multiple servers, each of them will hit the db at least once to fetch the data.
If you try to store per column, you need to create some sort of lock, otherwise you will have race conditions, since while you are saving one column another could change.
What I recommend is to store the complete row, just doing:
<?php
while ($row = $mysql->get_row()) {
    $key = 'content_' . $row['id'];
    apc_store($key, $row);
}
On the other hand, that means if you have 1000 articles you will store all of them in the cache, even though many of them may never be read at all.
In that case I would recommend lazy loading:
<?php
$id = $_GET['id'];
$key = 'content_' . $id;
$data = apc_fetch($key);
if (!is_array($data)) {
    // call mysql here
    $data = $mysql->get_row();
    apc_store($key, $data);
}
// your script here using $data
This way you only cache the content that is actually hit.
On the other hand, please be consistent with your cache invalidation, to avoid serving stale data from the cache.
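For example, a minimal sketch of that invalidation, reusing the 'content_' . $id key scheme from above; update_row() is a hypothetical stand-in for your real UPDATE code:
<?php
function invalidate_page_cache($id) {
    // drop the cached copy so the next read repopulates it from MySQL
    apc_delete('content_' . $id);
}

// e.g. in the admin/save handler:
// $mysql->update_row($id, $fields); // perform the real UPDATE here (hypothetical helper)
// invalidate_page_cache($id);
?>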