I'm attempting to implement APC as a caching datastore in my web application.
At the moment the system retrieves data directly from the MySQL database, requiring a database call per request.
I want to change this by prepopulating the cache with the data, so that each request is intercepted and served from the cache instead.
Here's the current method:
if (!empty($_GET['id'])) {
    // the id is numeric, so cast it; escaping alone does not protect an unquoted value
    $app = (int) $_GET['id'];
    $result  = $db->query("SELECT * FROM pages_content WHERE id = $app");
    $rawdata = $result->fetch_assoc();
}
The data is then output with:
$title = stripslashes($rawdata['title']);
$meta['keywords'] = stripslashes($rawdata['htmlkeywords']);
$meta['description'] = stripslashes($rawdata['htmldesc']);
$subs = stripslashes($rawdata['subs']);
$pagecontent = "<article>".stripslashes($rawdata['content'])."</article>";
What I need the prepopulating script to do is, for each row in the data table, cache that row's data. The serving script would then be able to pluck the data from the cache as and when required, using something such as apc_fetch('[columnname][id]').
How could I devise this?
I assume I'd need to serialize the data?
I don't know your cache schema, so this reply can't be exact.
First: remember that APC uses the server's shared memory, so if you have multiple servers each of them will hit the database at least once to populate its own cache.
If you try to store data per column, you need to create some sort of lock, otherwise you will have race conditions: while you are saving one column, another process could be changing it.
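If you do go that route, one common trick (a sketch on my part, not part of the original schema) is to use apc_add() as a lightweight lock, since it is atomic and only succeeds when the key does not already exist:

<?php
// apc_add() is atomic: only one process wins the lock for this key.
$lock_key = 'lock_content_' . $id;
if (apc_add($lock_key, 1, 10)) {   // lock auto-expires after 10 seconds
    // safe to rebuild the cached row here
    apc_store('content_' . $id, $row);
    apc_delete($lock_key);          // release the lock
} else {
    // another process is rebuilding; fall back to the database for this request
}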
What I recommend instead is to save the row as a whole, doing something like:
<?php
// Prime the cache: store every row under a per-row key.
// apc_store() serializes the array for you, no manual serialization needed.
while ($row = $mysql->get_row()) {
    $key = 'content_' . $row['id'];
    apc_store($key, $row);
}
On the other hand, that means if you have 1000 articles you will cache all of them, and maybe many of them are never read at all.
In that case I would recommend lazy loading:
<?php
$id   = (int) $_GET['id'];
$key  = 'content_' . $id;
$data = apc_fetch($key);
if (!is_array($data)) {
    // cache miss: query MySQL here and prime the cache
    $data = $mysql->get_row();
    apc_store($key, $data);
}
// your script here using $data
This way you only cache the content that is often hit.
Also, please be consistent with your cache invalidation, to avoid serving stale data from the cache.
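For example (a minimal sketch, assuming your update code path is the only writer), drop the key whenever the row changes, so the next request repopulates it from MySQL:

<?php
// Hypothetical helper: call this after a successful UPDATE of pages_content.
function invalidate_page_cache($id)
{
    apc_delete('content_' . (int) $id);
}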
I've been searching for a suitable PHP caching method for MSSQL results.
Most of the examples I can find suggest storing the results in an array, which would then get included into the page. This seems great, unless a request for the content is made at the same time as it is being updated/rebuilt.
I was hoping to find something similar to ASP's application-level variables, but as far as I'm aware, PHP doesn't offer this functionality?
The problem I'm facing is that I need to perform six queries per page to populate dropdown boxes, and this happens on the vast majority of pages. Combining the queries is not an option. The cached data will also need to be rebuilt sporadically, when the system changes; this could be once a day, once a week or once a month. Any advice will be greatly received, thanks!
You can use a Redis server and the phpredis PHP extension to cache results fetched from the database:
$redis = new Redis();
$redis->connect('/tmp/redis.sock');

$sql       = "SELECT something FROM sometable WHERE condition";
$sql_hash  = md5($sql);
$redis_key = "dbcache:{$sql_hash}";
$ttl = 3600; // values expire in 1 hour

if ($result = $redis->get($redis_key)) {
    // cache hit: decode the stored JSON back into an array
    $result = json_decode($result, true);
} else {
    // cache miss: query the database and store the encoded result with a TTL
    $result = Db::fetchArray($sql);
    $redis->setex($redis_key, $ttl, json_encode($result));
}
(Error checks skipped for clarity)
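Since you mentioned ASP's application-level variables: the same read-through pattern works with APC's shared memory and needs no extra server (a sketch, assuming the APC extension is loaded; Db::fetchArray is the same placeholder as above):

<?php
$sql = "SELECT something FROM sometable WHERE condition";
$key = 'dbcache:' . md5($sql);
$ttl = 3600; // rebuild at most once an hour

$result = apc_fetch($key, $success);
if (!$success) {
    // cache miss: hit the database and store the array directly
    // (APC serializes PHP values itself, no json_encode needed)
    $result = Db::fetchArray($sql);
    apc_store($key, $result, $ttl);
}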
I have an app with over 500k members, and all of their devices fetch data from the server every hour.
All of my data is stored in a PHP file and devices fetch it as JSON.
This is my simple PHP file:
<?php
header('Content-type: application/json');

$response = array();
$response["AppInf"] = array();

$product = array();
$product["apptitle"] = "string1";
$product["apps"]     = "string2";
$product["apps2"]    = "string3";
$product["apps4"]    = "string4";
$product["idapp"]    = "stringid";

array_push($response["AppInf"], $product);
$response["success"] = 1;

echo json_encode($response);
?>
But when more than 15k users access the server, my CPU load grows to 100%.
I have a good VPS with 64 GB of RAM and a Xeon CPU.
Can anyone help me manage and fix this problem?
If your content is really static, as in your example, store it in a static file and use caching. If the content is the same for at least a group of users, you only have to calculate the desired result once and store the data for later retrieval (see the sketch after this list).
Consider using a reverse proxy like Varnish to move load from your web server to another server.
If possible, don't let all users fetch data at the same time; add some random offset to the time when each device pulls data.
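A minimal sketch of the first point (the file name and rebuild trigger are assumptions): build the JSON once, write it to disk, and serve every subsequent request from the file:

<?php
header('Content-type: application/json');

$cache_file = __DIR__ . '/appinf.json'; // hypothetical cache path

if (!file_exists($cache_file)) {
    // Build the response once (your existing array-building code goes here)
    $response = array('AppInf' => array(/* ... */), 'success' => 1);
    file_put_contents($cache_file, json_encode($response), LOCK_EX);
}

// Serving a static file is far cheaper than rebuilding the response per request
readfile($cache_file);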
Because a provider I use has quite unreliable MySQL servers, which are down at least once per week :-/ impacting one of the sites I made, I want to work around the outages in the following way:
dump the MySQL table to a file, and in case the connection with the SQL server fails,
read the file instead of the server, until the server is back.
This will avoid outages from the user experience point of view.
In fact things are not as easy as they seem, so I ask for your help please.
What I did was save the data in JSON format.
But this has issues, because much of the data in the DB is "in the clear", including escaped complex URLs with long argument strings, which cause problems during the decode from JSON.
CSV and TSV do not work correctly either:
CSV is delimited by commas or semicolons, and those are present in the original content taken from the DB;
TSV leaves double quotes that cannot be removed without going through every record's fields.
Then I tried to serialize each record read from the DB, store it, and retrieve it by unserializing it.
But the result is a bit catastrophic: all the records are stored in the file, yet when I retrieve them only one is returned, and then something blocks the program (code below please).
require_once('variables.php');
require_once("database.php");

$file = "database.dmp";
$myfile = fopen($file, "w") or die("Unable to open file!");

$sql = mysql_query("SELECT * FROM song ORDER BY ID ASC");

// store each record into the file
while ($row = mysql_fetch_assoc($sql)) {
    // records are written back-to-back, with no delimiter between them
    fwrite($myfile, serialize($row));
}
fclose($myfile);
mysql_close();

// Retrieving section
$myfile = fopen($file, "r") or die("Unable to open file!");

// till the file is not ended, continue to read it
while (!feof($myfile)) {
    $record = fgets($myfile);    // reads up to the next newline, not to a record boundary
    $row = unserialize($record); // unserialize it
    print_r($row);               // show if the variable has something in it
}
fclose($myfile);
I also tried uuencode and base64_encode, but they were even worse choices.
Is there any way to achieve my goal?
Thank you very much in advance for your help.
If you have your data layer well decoupled, you can consider using SQLite as a fallback storage.
It's just a matter of adding one more abstraction, with the same code accessing the storage and switching the storage target when the primary one is unavailable.
-----EDIT-----
You could also think about a caching strategy (a JSON or HTML file?) that returns stale data in case of a MySQL outage.
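A minimal sketch of that idea (the file path and connection details are assumptions): refresh the file while MySQL is up, and fall back to the stale copy when the connection fails:

<?php
$cache_file = __DIR__ . '/songs.json'; // hypothetical stale-data file

$link = mysql_connect('localhost', 'user', 'pass');
if ($link && mysql_select_db('mydb', $link)) {
    // MySQL is up: serve fresh data and refresh the fallback file
    $rows = array();
    $res = mysql_query("SELECT * FROM song ORDER BY ID ASC");
    while ($row = mysql_fetch_assoc($res)) {
        $rows[] = $row;
    }
    file_put_contents($cache_file, json_encode($rows), LOCK_EX);
} else {
    // MySQL is down: serve the last known good copy
    $rows = json_decode(file_get_contents($cache_file), true);
}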
-----EDIT 2-----
If it's not too much effort, please consider playing with PDO; I'm quite sure you'll never look back, and believe me, it will help you structure your db calls with little pain when switching between storages.
Please take the following only as an example; there are much better ways to design this architectural part of the code.
Just some small, basic code to demonstrate what I mean:
class StoragePersister
{
    private $driver = 'mysql';

    public function setDriver($driver)
    {
        $this->driver = $driver;
    }

    public function persist($data)
    {
        switch ($this->driver) {
            case 'mysql':
                $this->persistToMysql($data);
                break; // without break, execution would fall through to the sqlite case
            case 'sqlite':
                $this->persistToSqlite($data);
                break;
        }
    }

    public function persistToMysql($data)
    {
        // query to mysql
    }

    public function persistToSqlite($data)
    {
        // query to SQLite
    }
}
$storage = new StoragePersister;
$storage->setDriver('sqlite'); // eventually switch to sqlite
$storage->persist($somedata);  // calls the right method based on the selected storage driver
-----EDIT 3-----
Please have a look at the "strategy" design pattern; I think it can help you better understand what I mean.
After the SELECT... you need to build a correct structure for re-inserting the data; then you can serialize it or do whatever you want.
For example:
For each row, you could do: $sqls[] = "INSERT INTO `song` (field1, field2, ... fieldN) VALUES (field1_value, field2_value, ... fieldN_value);";
Then you could serialize $sqls, write it to a file, and when you need it, read it back, unserialize it and run the queries.
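A rough sketch of that idea (the quoting of values is simplified and assumes an open mysql connection):

<?php
// Dump: build one INSERT per row and serialize the whole batch to a file
$sqls = array();
$res  = mysql_query("SELECT * FROM song ORDER BY ID ASC");
while ($row = mysql_fetch_assoc($res)) {
    $values = array();
    foreach ($row as $value) {
        $values[] = "'" . mysql_real_escape_string($value) . "'";
    }
    $sqls[] = "INSERT INTO `song` (`" . implode('`, `', array_keys($row)) . "`) "
            . "VALUES (" . implode(', ', $values) . ");";
}
file_put_contents('song.dump', serialize($sqls), LOCK_EX);

// Restore: read the batch back and replay each statement
$sqls = unserialize(file_get_contents('song.dump'));
foreach ($sqls as $sql) {
    mysql_query($sql);
}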
Have you thought about caching your queries in a cache like APC? Also, you may want to use mysqli or PDO instead of mysql (the mysql extension is deprecated in recent versions of PHP).
To answer your question, this is one way of doing it.
var_export will export the variable as valid PHP code
require will put the content of the array into the $rows variable (because of the return statement)
Here is the code :
$file = "database.dmp"; // same dump file name as in your script

$sql = mysql_query("SELECT * FROM song ORDER BY ID ASC");
$content = array();

// collect every row, keyed by its ID
while ($row = mysql_fetch_assoc($sql)) {
    $content[$row['ID']] = $row;
}
mysql_close();

// var_export() writes the array out as valid PHP source code
$data = '<?php return ' . var_export($content, true) . ';';
file_put_contents($file, $data);

// Retrieving section
$rows = require $file;
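On each request you could then guard the rebuild with a file_exists() check, so MySQL is only touched when the dump is missing (a sketch; rebuild_dump is a hypothetical wrapper around the dump code above):

<?php
$file = "database.dmp";

if (!file_exists($file)) {
    rebuild_dump($file); // hypothetical: runs the dump code above
}
$rows = require $file;   // cheap: just parses a PHP file, no database call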
Is it possible to request all the data in my database once, build objects from it and save them into an array or something, so that I only call the database once and afterwards just use my local array? If yes, how is it done?
public function getAllProjects(){
    $query  = "SELECT * FROM projects";
    $result = mysql_query($query);
    $num    = mysql_num_rows($result);
    while ($row = mysql_fetch_object($result)) {
        // save object into array
    }
}
public function fetchRow($row){
    include("Project.php"); // note: include() re-declares the Project class on every call
    $project = new Project();
    $project->setId($row->id);
    $project->setTitle($row->title);
    $project->setInfo($row->info);
    $project->setText($row->text);
    $project->setCategory($row->category);
    return $project;
}
If I have, for example, this code: how do I store the objects correctly into an array that I can grab the data from? And why can't I make more than one object of type "Project"?
Let's ignore the fact that you will run out of memory.
If you have everything in an array you will no longer have the functionalities of a relational database.
Try a search over a multi-megabyte, multidimensional array in PHP and be prepared for an extended coffee break.
If you are thinking of doing something like that, it's because you feel that the database is slow... You should learn about data normalization and the correct use of indexes instead.
And no NoSQL is not the answer.
Sorry to pop your balloon.
Edited to add: what you CAN do is use memcache to store the final product of some expensive processes. Don't bother storing the results of trivial queries; MySQL's internal cache is very well optimized for those.
You could use the $_SESSION vars in PHP. To use them, add a session_start() at the beginning of your code; then you can set vars with $_SESSION['selectaname'] = $yourvar.
Nothing prevents you from making a SQL query like "SELECT username FROM users WHERE id = 1" and then setting $_SESSION['user'] = $queryresult.
Then you'll have this :
echo $_SESSION['user'];
"mylogin"
I have a page that loads quite a lot of data from my db, and I'd like to speed up the loading time. I already cache the query, but the loading time is still longer than I'd like.
Is it possible to render a table with the data and store it in a session, to be loaded on every page refresh? I was even thinking of putting it in an external text file using ob_start().
What's the best way to handle this?
Storing it in sessions is probably not the best idea: when you add data to a session, by default the data is written to a file on disk, usually in /tmp/, which means you're going to hit the disk quite a lot and store just as much data.
If the data is not user-specific, then you could store it on disk, or in memory (see: php.net/apc).
If the data is user-specific, I recommend storing it in a distributed cache such as Memcached (memcached.org); PHP has a library you can enable (php.net/memcached).
(By user-specific I mean data like a user's transactions, items, shopping cart, etc.)
The logic is basically the same for any method you choose:
Memcached, user specific data example:
<?php
$memcached = new Memcached();
$memcached->addServer('localhost', 11211); // connect to the memcached daemon

$data = $memcached->get('shopping-cart_' . $user_id);
if (!$data) {
    // cache miss: run the query and collect the rows
    $result = $db->query("..");
    $data = array();
    while ($row = $result->fetch_assoc()) {
        $data[] = $row;
    }
    $memcached->set('shopping-cart_' . $user_id, $data);
}
?>
<table>
<?php
foreach ($data as $item) {
    echo '<tr><td>' . $item['name'] . '</td></tr>';
}
?>
</table>
Global data (not user specific)
<?php
$cache_file = '/cached/pricing-structure.something';
if (file_exists($cache_file)) {
    // cache hit: output the stored file contents (not the file name)
    echo file_get_contents($cache_file);
} else {
    // heavy query here
    $h = fopen($cache_file, 'w+');
    fwrite($h, $data_from_query);
    fclose($h);
    echo $data_from_query;
}
If you are doing caching on a single web server (as opposed to multiple distributed servers), PHP APC is a very easy-to-use solution. It is an in-memory cache, similar to memcache, but runs locally. I would avoid storing any significant amount of data in session.
APC is a PECL extension and can be installed with the command
pecl install apc
You may need to enable it in your php.ini.
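For example (a sketch; the extension file name varies by platform), the relevant php.ini lines would look like:

; load the APC extension (apc.dll on Windows)
extension=apc.so
; make sure the cache is switched on
apc.enabled=1

After restarting the web server, apc_store()/apc_fetch() become available, following the same read-through pattern as the memcached example above.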