I'm running a PHP script that pulls data from a MySQL table. It will run on a frequently visited server, so I would like to keep the data cached for X amount of time: once pulled, the data should be saved on the server and reused until that time has passed. Here's the script:
<?php
include('mysql_connection.php');
$c = mysqlConnect();
$locale = $_GET['locale'];
$last_news_id = (int)$_GET['news_id']; // cast to int so the value is safe to embed in the query
sendQuery ("set character_set_results='utf8'");
sendQuery ("set collation_connection='utf8_general_ci'");
if(strcmp($locale,"ru") != 0)
$locale = "en";
$result = sendQuery("SELECT * FROM news WHERE id > ".$last_news_id." and locale = '".$locale."' ORDER BY id DESC LIMIT 10");
echo '<table width="100%">';
while($row = mysqli_fetch_array($result, MYSQLI_NUM))
{
echo '<tr><td width="100%"><b>Date: </b>'.$row[2].'</td></tr>';
echo '<tr><td width="100%">'.preg_replace('/#([^#]*)#(.*)/', ' $1', $row[3]).'</td></tr>';
echo '<tr><td width="100%"><hr style="height: 2px; border: none; background: #515151;"></td></tr>';
}
echo '</table>';
mysqliClose($c);
?>
Which PHP functions should I use to cache the data? What are the best methods? Thank you!
You can use PHP Memcache:
Just add code like this in your script after the sendQuery() call. Note that you cannot store the mysqli result resource itself in the cache, so fetch the rows into an array first:
$memcache_obj = memcache_connect('memcache_host', 11211);
$rows = array();
while ($row = mysqli_fetch_array($result, MYSQLI_NUM)) { $rows[] = $row; }
memcache_set($memcache_obj, 'var_key', $rows, 0, 30); // expires after 30 seconds
$rows = memcache_get($memcache_obj, 'var_key');
The two go-to solutions are APC and Memcache. The former is also an opcache and the latter can be distributed. Pick what suits you best.
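For illustration, a minimal APC sketch (assuming the APC extension is installed; the key name and the 5-minute TTL are arbitrary choices, and sendQuery() is the helper from the question):

$key = 'news_' . $locale;
$rows = apc_fetch($key, $success);
if (!$success) {
    $rows = array();
    $result = sendQuery("SELECT ..."); // the same query as above
    while ($row = mysqli_fetch_array($result, MYSQLI_NUM)) {
        $rows[] = $row;
    }
    apc_store($key, $rows, 300); // expire after 5 minutes
}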
As a matter of fact, your data is already saved on the server.
And such a query should be pretty fast.
So caching seems unnecessary here, especially if you are not experiencing any load problems and are only doing it just in case.
APC/Memcached can be used, and they are generally used for this kind of thing. You have to be aware, though, of the problems that can arise from this approach: managing new inserts/updates and so on. As long as you don't really care about that information being fresh, you can set arbitrary intervals after which the data expires, but if the information is really relevant to your application, this approach will not work.
Also, MySQL already caches selects when the table is not modified between two requests. So basically, if you run a select now and the exact same query in 10 minutes, and nothing in the table has changed, you will get the result from MySQL's query cache. (Note that the query cache was deprecated in MySQL 5.7 and removed in 8.0.) There is still the overhead of issuing the request and receiving the data, but it is extremely fast. This approach also handles the update/delete problem by default: whenever a record in the table is modified, the associated query cache entries are invalidated, so you always see modifications as they happen.
I've been searching for a suitable PHP caching method for MSSQL results.
Most of the examples I can find suggest storing the results in an array, which would then get included in the page. This seems great, unless a request for the content is made at the same time as it is being updated/rebuilt.
I was hoping to find something similar to ASP's application-level variables, but as far as I'm aware, PHP doesn't offer this functionality?
The problem I'm facing is that I need to perform 6 queries per page to populate dropdown boxes, and this happens on the vast majority of pages. It's also not an option to combine the queries. The cached data will also need to be rebuilt sporadically, whenever the system changes; this could be once a day, once a week or once a month. Any advice will be greatly appreciated, thanks!
You can use a Redis server and the phpredis PHP extension to cache results fetched from the database:
$redis = new Redis();
$redis->connect('/tmp/redis.sock');
$sql = "SELECT something FROM sometable WHERE condition";
$sql_hash = md5($sql);
$redis_key = "dbcache:${sql_hash}";
$ttl = 3600; // values expire in 1 hour
if ($result = $redis->get($redis_key)) {
$result = json_decode($result, true);
} else {
$result = Db::fetchArray($sql);
$redis->setex($redis_key, $ttl, json_encode($result));
}
(Error checks skipped for clarity)
I'm writing a chat program for a site that does live broadcasting, and as you can guess, any non-application-driven chat relies on a looping AJAX call to get new information (messages), in my case once every 2 seconds. The JSON that is being created via PHP and populated from SQL is of some concern to me: while it shows no noticeable impact on my server at present, I cannot predict what adding several hundred users to the mix may do.
<?php
require_once("../../../../wp-load.php");
global $wpdb;
$table_name = $wpdb->prefix . "chat_posts";
$posts = $wpdb->get_results("SELECT * FROM ". $table_name ." WHERE ID > ". (int)$_GET['last'] . " ORDER BY ID"); // cast to int so the value is safe to embed in the query
echo json_encode($posts);
?>
There obviously isn't much wiggle room as far as optimizing the code itself, but I am a little worried about how well the WordPress SQL engine is written, and whether it will bog my SQL server down once it is receiving 200 requests every 2 seconds. Would it be better to cache the JSON-encoded results of the DB query to a file, then age-check it on each call to the PHP script, and either reconstruct the file with a new query or pass the file's contents based on its last modification date? At that point I am putting a bigger load on my file system, but reducing my SQL load to one query every 2 seconds regardless of the number of users.
Or am I already on the right path with just querying the server on every call?
So this is what I came up with. I went the DB-only route for a few tests, and while response was snappy, it didn't scale well and connections quickly got eaten up. So I decided to write a quick little bit of caching logic. So far it has worked wonderfully and seems to allow me to scale my chat as big as I want.
$last = (int)$_GET['last']; // cast to int: the value is used in both the file name and the query
$cacheFile = 'cache/chat_'.$last.'.json';
if (file_exists($cacheFile) && filemtime($cacheFile) + QUERY_REFRESH_RATE > time())
{
readfile($cacheFile);
} else {
require_once("../../../../wp-load.php");
$timestampMin = gmdate("Y-m-d H:i:s", (time() - 7200));
$sql= "/*qc=on*/" . "SELECT * FROM ". DB_TABLE ."chat_posts WHERE ID > ". $_GET['last'] . " AND timestamp > '".$timestampMin."' ORDER BY ID;";
$posts = $wpdb->get_results($sql);
$json = json_encode($posts);
echo $json;
file_put_contents($cacheFile,$json);
}
It's also great in that it allows me to run my formatting functions against messages, such as parsing URLs into actual links, with much less overhead.
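One caveat I should note: when the cache file expires, every request that arrives before the first one finishes rebuilding it will also run the query. A rough sketch of one way to guard against that, using a non-blocking exclusive flock() (this assumes a single web server):

$fp = fopen($cacheFile, 'c+');
if (flock($fp, LOCK_EX | LOCK_NB)) {
    // We won the race: rebuild the cache file
    $json = json_encode($wpdb->get_results($sql));
    file_put_contents($cacheFile, $json);
    flock($fp, LOCK_UN);
} else {
    // Another request is already rebuilding: serve the stale copy instead
    $json = file_get_contents($cacheFile);
}
fclose($fp);
echo $json;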
This is more of a logic question than language question, though the approach might vary depending on the language. In this instance I'm using Actionscript and PHP.
I have a flash graphic that is getting data stored in a mysql database served from a PHP script. This part is working fine. It cycles through database entries every time it is fired.
The graphic is not on a website, but is being used at 5 locations, set to load and run at regular intervals (all 5 locations fire at the same time, or at least within <500ms of each other). This is real-time info, so time is of the essence; currently the script loads and parses at all 5 locations in between 30ms and 300ms (depending on the distance from the server).
I was originally having a pagination problem, where each of the 5 locations would pull a different database entry, since I was moving to the next entry every time the script ran. I solved this by setting the script to only move to the next entry after a certain amount of time has passed.
However, I also need the script to send an email every time it displays a new entry, and I only want it to send one email. I've attempted to solve this by adding a "has been emailed" boolean to the database, but since all the scripts run at the same time, this rarely works (it does sometimes); most of the time I get 5 emails. The timeliness of this email doesn't have to match how fast the graphic gets info from the script; a 5-10 second delay is fine.
I've been trying to come up with a solution for this. Currently I'm thinking of spawning a python script through PHP with a random delay (between 2 and 5 seconds), hopefully alleviating the problem. However, I'm not quite sure how to run the exec() command from PHP without the script waiting for the command to finish. Or is there a better way to accomplish this?
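(For reference: one common way to run a command from PHP without waiting for it, on a Unix-like host, is to background it and redirect its output; the script path below is hypothetical.)

// exec() returns immediately because the process is backgrounded
// and its output is redirected away from PHP
exec('python /path/to/send_email.py > /dev/null 2>&1 &');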
UPDATE: here is my current logic (relevant code only):
//get the top "unread" entry from the database ("Read" is a reserved word in MySQL, hence the backticks)
$query = "SELECT * FROM database WHERE `Read` = '0' ORDER BY Entry ASC LIMIT 1";
$result = $mysqli->query($query);
$row = $result->fetch_assoc();
//DATA
$emailed = $row["emailed"];
$Entry = $row["databaseEntryID"];
if($emailed == 0)
{
**CODE TO SEND EMAIL**
$EmailSent="UPDATE database SET emailed = '1' WHERE databaseEntryID = '$Entry'";
$mysqli->query($EmailSent);
}
Thanks!
You need to use some kind of locking, e.g. database locking:
function send_email_sync($message)
{
sql_query("UPDATE email_table SET email_sent=1 WHERE email_sent=0");
$result = FALSE;
if(number_of_affected_rows() == 1) {
send_email_now($message);
$result = TRUE;
}
return $result;
}
The functions sql_query and number_of_affected_rows need to be adapted to your particular database.
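For example, adapted to mysqli the same idea might look like this (a sketch; email_table and send_email_now() are carried over from the snippet above):

function send_email_sync($mysqli, $message)
{
    // Atomic test-and-set: only one concurrent caller sees affected_rows == 1
    $mysqli->query("UPDATE email_table SET email_sent = 1 WHERE email_sent = 0");
    if ($mysqli->affected_rows == 1) {
        send_email_now($message);
        return TRUE;
    }
    return FALSE;
}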
Old answer:
Use file-based locking (this only works if the script runs on a single server):
function send_email_sync($message)
{
$fd = fopen(__FILE__, "r");
if(!$fd) {
die("something bad happened in ".__FILE__.":".__LINE__);
}
$result = FALSE;
if(flock($fd, LOCK_EX | LOCK_NB)) {
if(!email_has_already_been_sent()) {
actually_send_email($message);
mark_email_as_sent();
$result = TRUE; //email has been sent
}
flock($fd, LOCK_UN);
}
fclose($fd);
return $result;
}
You will need to lock the row in your database by using a transaction.
pseudo code:
Start transaction
select row .. for update
update row
commit
if (mysqli_affected_rows($connection) == 1)
send_email();
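In PHP with mysqli, that pseudo code might look roughly like this (a sketch; it assumes an InnoDB table and reuses the table/column names from the question):

$connection->begin_transaction();
// SELECT ... FOR UPDATE locks the row; concurrent clients block here until commit
$result = $connection->query("SELECT emailed FROM `database` WHERE databaseEntryID = '$Entry' FOR UPDATE");
$row = $result->fetch_assoc();
if ($row && $row['emailed'] == 0) {
    $connection->query("UPDATE `database` SET emailed = '1' WHERE databaseEntryID = '$Entry'");
    $connection->commit();
    send_email();
} else {
    $connection->rollback();
}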
I am doing this animation tool where I fetch a value from my database and a picture then animates to a certain position. My question is whether it is possible to retrieve data constantly, or, say, every 5 seconds?
Somehow like this:
while($autoretrieve){
$data = mysql_query("select * from ......");
}
UPDATE:
Thanks for your answers! They made it a bit clearer what to do! Maybe I can explain better what I'm doing in my code.
I am doing this animation program, as said, where balls of information move around to different locations. I have one value that will be updated frequently in the database; let's call it 'city'.
First, at the previous page, I post the balls of information I want based on the 'city', like this (simplified):
$pid = $_POST['id'];
$pcity[0] = $_POST['city'];
$pcity[1] = $_POST['city'];
$pcity[2] = $_POST['city'];
//...
while($autoretrieve) { // HOW TO?
$data = mysql_query("SELECT * FROM table WHERE city = '$pcity[0]' OR city = '$pcity[1]' /* ... */");
while($rows = mysql_fetch_array($data)){
$city = $rows['city'];
$id = $rows['id'];
if($city == "example1"){
"animate to certain pos"; //attached to image
}
else if($city == "example2"){
"animate to certain pos"; //attached to image
}
}
}
So for every update in the database, the image will animate to a new position, so a time interval of 5 seconds would be great. I'm not an expert in coding, so sorry for the deprecated code. I'm not so familiar with AJAX either, so what would need to be added to the code? It is also important that the page does not reload; only the fetch from the database should happen.
You can do it with AJAX and JavaScript:
make one JavaScript function which contains AJAX code to retrieve data from the database,
and at page load, use setTimeout to call your AJAX function every 5 seconds.
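On the PHP side, the script that the AJAX call polls can simply return the rows as JSON; a minimal sketch (connection details and table/column names are placeholders):

<?php
// poll.php - fetched by the JavaScript function every 5 seconds
$mysqli = new mysqli('localhost', 'user', 'password', 'database');
$result = $mysqli->query("SELECT id, city FROM balls");
$rows = array();
while ($row = $result->fetch_assoc()) {
    $rows[] = $row;
}
header('Content-Type: application/json');
echo json_encode($rows);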
You can use the sleep function to control how often you want to fetch data.
while($autoretrieve){
$data = mysql_query("select * from ......");
//output your data here; see the note on server-sent events below
sleep(5);
}
Since you haven't specified how you plan to access the data, I'm writing this answer assuming Server-Sent Events, as they are the only ones that make sense given your question.
All this follows from your question, which wasn't very clear on how you plan to use the data. You'll most likely want to fetch the data using AJAX, but Server-Sent Events can also be a good way to achieve this.
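A bare-bones Server-Sent Events endpoint in PHP looks roughly like this (a sketch; fetch_positions() is a placeholder for your actual query):

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
while (true) {
    $rows = fetch_positions(); // placeholder for your PDO/mysqli query
    echo "data: " . json_encode($rows) . "\n\n";
    @ob_flush(); // push the event to the browser right away
    flush();
    sleep(5);    // wait 5 seconds before the next fetch
}

On the client, new EventSource('events.php') subscribes to this stream without reloading the page.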
And don't use mysql_*; it's deprecated. Switch to PDO or mysqli_*.
Alright, so I have no idea how to even begin doing this.
Basically, I have one of the menus that displays on every page getting its text and links from a MySQL database.
Here's the code:
<table class="LeftMenuTable">
<?php
// Generates the left menu from the LeftMenu_items table
$result = MySqlQuery("SELECT * FROM Menu_LeftMenu;");
while ($row = mysqli_fetch_assoc($result))
{
if ((int)$row['header'] == 0)
{
// regular menu item
echo "<tr><td class='LeftMenu'><a href='" . $row['url'] . "'>" . $row['text'] . "</a></td></tr>";
}
else if ((int)$row['header'] == 1)
{
// header
echo "<tr><td style='border:0px; height:5px;'></td></tr>"; // adds extra empty tabel
echo "<tr><td class='LeftMenuHeader'><b><strong>" . $row['text'] . "</strong></b></td></tr>";
}
}
?>
</table>
function MySqlQuery($Query)
{
global $mysqli; // without this, the function cannot see the connection object
$result = $mysqli->query($Query) or die(ReportMysqlError(mysqli_error($mysqli), $Query));
return $result;
}
I feel like the SQL queries that could be replaced by some form of HTML cache are reducing the site's speed.
If anyone has any information or suggestions, it's much appreciated.
If you were keen enough, I would suggest a slightly different approach.
If you build your pages using the templating engine Smarty (find out more at http://smarty.net), you should find that Smarty will manage the caching for you.
How is this related to your question?
It will make your web development easier, as you will stop echoing content into your HTML.
Smarty will do the caching for you: when a Smarty template loads, Smarty keeps a copy of it (or you can tell it to cache a file for x hours, days, etc.).
As your site grows, Smarty's caching will help keep it running and loading fast.
You have to do very little work to make caching work; just build your site with Smarty templates, as the sketch below shows.
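To give a feel for how little that is, enabling Smarty's caching is roughly this much code (a sketch against the classic Smarty API; menu.tpl and fetch_menu_rows() are hypothetical):

require_once('Smarty.class.php');
$smarty = new Smarty();
$smarty->caching = 1;            // turn output caching on
$smarty->cache_lifetime = 3600;  // keep the cached page for an hour

if (!$smarty->is_cached('menu.tpl')) {
    // Only hit the database when the cached copy has expired
    $smarty->assign('menu_items', fetch_menu_rows());
}
$smarty->display('menu.tpl');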
Lastly, you may find it a LOT simpler to build sites using Smarty.
John.
In the method that calls your database (to fetch menu items and build the corresponding html):
1. Check if there is an item in the current $_SESSION with the html of your menu.
2. If (1) returns nothing, execute the query, build the html and store the result in $_SESSION.
3. Return the html of your menu.
A bit more information about how you can use the session can be found here.
This way, your menu query will only be fired once per session. I doubt it's a real performance killer though (if your query is more or less in normal form).
Note that with the mechanism described above, changes to your menu will not get picked up until the session expires.
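A minimal sketch of that pattern (it assumes session_start() has already been called, and build_menu_html() stands in for your existing query-plus-markup code):

function get_menu_html()
{
    if (!isset($_SESSION['menu_html'])) {
        // First request in this session: query the database and build the markup
        $_SESSION['menu_html'] = build_menu_html();
    }
    return $_SESSION['menu_html'];
}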
I usually use memcached (or some other similar solution) when I need to organize a caching layer.
But in your case it may be done in a simpler way: use an intermediate 'generator' script which is fired each time the table in question is updated (it's updated from some form of admin panel, right?) and generates a static file; then include this file from your main view script.
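A sketch of that generator idea, reusing the MySqlQuery() helper from the question (the file names are made up): the admin panel runs the first part after every menu change, and the pages just include the result.

// regenerate_menu.php - run whenever the menu table changes
$html = '';
$result = MySqlQuery("SELECT * FROM Menu_LeftMenu;");
while ($row = mysqli_fetch_assoc($result)) {
    $html .= "<tr><td class='LeftMenu'><a href='" . $row['url'] . "'>"
           . $row['text'] . "</a></td></tr>";
}
file_put_contents('cache/left_menu.html', $html);

// in the view script, instead of querying:
readfile('cache/left_menu.html');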