memcaching php resource objects

Say I have this code in PHP:
$query = mysql_query("SELECT ...");
The statement returns a resource. Normally it would get passed to mysql_fetch_array() or one of the other mysql_fetch_* functions to populate the result set.
I'm wondering whether that resource (the $query variable in this case) can be cached in Memcache and then fetched a while later and used just as if it had just been created.
// cache it
$memcache->set('query', $query);
// restore it later
$query = $memcache->get('query');
// reuse it
while(mysql_fetch_array($query)) { ... }
I have googled the question but didn't have much luck.
I'm asking because this looks much more lightweight than the usual "populate the result array first, then cache it" approach.
So is it possible?

I doubt it. From the serialize() manual entry:
serialize() handles all types, except the resource-type.
Edit: resources are generally tied to the service that created them. I don't know whether Memcached uses serialize(), but I'd guess it would be subject to the same limitation.
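You can see the limitation directly; here's a minimal sketch using a stream resource as a stand-in for a MySQL result resource:
$fp = fopen('php://temp', 'r');       // any resource will do for the demonstration
var_dump($fp);                        // resource(...) of type (stream)
$copy = unserialize(serialize($fp));
var_dump($copy);                      // int(0) - the resource does not survive the round trip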

The Memcache extension serializes objects and arrays before sending them to the Memcached server. As the other poster mentioned, resources can't be serialized. A result resource is essentially a handle tied to a live connection to the database server: you can't cache a network connection and reuse it later, you can only cache the data that gets transmitted over it.
With queries like that, you should fetch all the rows and cache the resulting array.
$all_results = array();
while ($row = mysql_fetch_array($query)) {
$all_results[] = $row;
}
$memcache->set('query', $all_results);
Modern database drivers have a fetch-all method (mysqli_result::fetch_all() in MySQLi, PDOStatement::fetchAll() in PDO) that returns every row at once, which simplifies this a bit.
When you retrieve that array later, you can just iterate over it with foreach(). This doesn't work for very large query results (Memcached has a 1 MB item-size limit by default), but for most purposes you shouldn't have any problem with it.
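For example, with PDO the whole cache-aside pattern might look roughly like this; this is only a sketch, and the DSN, credentials, cache key and TTL are placeholders:
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');  // placeholder DSN/credentials

$all_results = $memcache->get('query');
if ($all_results === false) {
    $stmt = $pdo->query('SELECT ...');                 // hit the database only on a cache miss
    $all_results = $stmt->fetchAll(PDO::FETCH_ASSOC);  // plain arrays serialize without trouble
    $memcache->set('query', $all_results, 0, 300);     // flags = 0, expire after 5 minutes
}
foreach ($all_results as $row) {
    // use $row exactly as you would a freshly fetched row
}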

Related

PHP DB caching, without including files

I've been searching for a suitable PHP caching method for MSSQL results.
Most of the examples I can find suggest storing the results in an array, which would then get included on the page. This seems fine unless a request for the content comes in at the same time as the array is being updated/rebuilt.
I was hoping to find something similar to ASP's application-level variables, but as far as I'm aware, PHP doesn't offer this functionality?
The problem I'm facing is that I need to perform six queries per page to populate dropdown boxes, and this happens on the vast majority of pages. It's also not an option to combine the queries. The cached data will need to be rebuilt sporadically, when the system changes: that could be once a day, once a week or once a month. Any advice will be gratefully received, thanks!
You can use Redis server and phpredis PHP extension to cache results fetched from database:
$redis = new Redis();
$redis->connect('/tmp/redis.sock');
$sql = "SELECT something FROM sometable WHERE condition";
$sql_hash = md5($sql);
$redis_key = "dbcache:${sql_hash}";
$ttl = 3600; // values expire in 1 hour
if ($result = $redis->get($redis_key)) {
$result = json_decode($result, true);
} else {
$result = Db::fetchArray($sql);
$redis->setex($redis_key, $ttl, json_encode($result));
}
(Error checks skipped for clarity)
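The question also mentions that the cached data needs rebuilding whenever the system changes; with the key scheme above, that can be a simple delete performed from wherever the data is modified (a sketch, reusing the same connection and key naming):
function invalidate_cached_query(Redis $redis, $sql)
{
    $redis->del('dbcache:' . md5($sql));  // the next read repopulates the cache
}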

Cache data from MySQL in PHP?

Is it possible to fetch all the data from my database, make objects from it, and save them into an array or something, so that I only need to call the database once and afterwards just use my local array? If yes, how is it done?
public function getAllProjects(){
$query="SELECT * FROM projects";
$result=mysql_query($query);
$num=mysql_numrows($result);
while ($row = mysql_fetch_object($result)) {
// save object into array
}
}
public function fetchRow($row){
include("Project.php");
$project = new Project();
$id=$row->id;
$project->setId($id);
$title=$row->title;
$project->setTitle($title);
$infos=$row->info;
$project->setInfo($infos);
$text=$row->text;
$project->setText($text);
$cate=$row->category;
$project->setCategory($cate);
return $project;
}
If I have this code, for example: how do I store the objects correctly into an array that I can grab the data from later? And why can't I make more than one object of type "Project"?
Let's ignore the fact that you will run out of memory.
If you have everything in an array you will no longer have the functionalities of a relational database.
Try a search over a multi-megabyte, multi-dimensional array in PHP and be prepared for an extended coffee break.
If you are thinking of doing something like that, it's because you feel the database is slow... You should learn about data normalization and the correct use of indexes instead.
And no NoSQL is not the answer.
Sorry to pop your balloon.
Edited to add: what you CAN do is use memcache to store the final product of some expensive processes. Don't bother storing the results of trivial queries; MySQL's internal cache is well optimized for those.
You should use the $_SESSION vars in PHP. To use them, call session_start() at the beginning of your code. Then you can set variables with $_SESSION['selectaname'] = $yourvar;
Nothing prevents you from running an SQL query like "SELECT username FROM users WHERE id = 1" and then setting $_SESSION['user'] = $queryresult;
Then you'll have this:
echo $_SESSION['user']; // "mylogin"

How to "release" memory in loop?

I have a script running in a shared hosting environment where I can't change the amount of PHP memory available. The script consumes a web service via SOAP. I can't get all my data at once or it runs out of memory, so I've had some success caching the data locally in a MySQL database so that subsequent queries are faster.
Basically instead of querying the web service for 5 months of data I am querying it 1 month at a time and storing that in the mysql table and retrieving the next month etc. This usually works but I sometimes still run out of memory.
my basic code logic is like this:
connect to web service using soap;
connect to mysql database
query web service and store result in variable $results;
dump $results into mysql table
repeat steps 3 and 4 for each month of data
The same variables are used in each iteration, so I would assume each batch of results from the web service would overwrite the previous one in memory? I tried using unset($results) between iterations, but that didn't do anything. I am outputting the memory used with memory_get_usage(true) each time, and with every iteration the memory used increases.
Any ideas how I can fix this memory leak? If I wasn't clear enough leave a comment and I can provide more details. Thanks!
***EDIT
Here is some code (I am using NuSOAP, not PHP 5's native SOAP client, if that makes a difference):
$startingDate = strtotime("3/1/2011");
$endingDate = strtotime("7/31/2011");
// connect to database
mysql_connect("dbhost.com", "dbusername", "dbpassword");
mysql_select_db("dbname");
// configure nusoap
$serverpath ='http://path.to/wsdl';
$client = new nusoap_client($serverpath);
// cache soap results locally
while($startingDate<=$endingDate) {
$sql = "SELECT * FROM table WHERE date >= ".date('Y-m-d', $startingDate)." AND date <= ".date('Y-m-d', strtotime($startingDate.' +1 month'));
$soapResult = $client->call('SelectData', $sql);
foreach($soapResult['SelectDateResult']['Result']['Row'] as $row) {
foreach($row as &$data) {
$data = mysql_real_escape_string($data);
}
$sql = "INSERT INTO table VALUES('".$row['dataOne']."', '".$row['dataTwo']."', '".$row['dataThree'].")";
$mysqlResults = mysql_query($sql);
}
$startingDate = strtotime($startingDate." +1 month");
echo memory_get_usage(true); // MEMORY INCREASES EACH ITERATION
}
Solved it. At least partially. There is a memory leak using nusoap. Nusoap writes a debug log to a $GLOBALS variable. Altering this line in nusoap.php freed up a lot of memory.
change
$GLOBALS['_transient']['static']['nusoap_base']->globalDebugLevel = 9;
to
$GLOBALS['_transient']['static']['nusoap_base']->globalDebugLevel = 0;
I'd prefer to just use PHP 5's native SOAP client, but I'm getting strange results that I believe are specific to the web service I am trying to consume. If anyone is familiar with using PHP 5's SOAP client with www.mindbodyonline.com's SOAP API, let me know.
Have you tried unset() on $startingDate and mysql_free_result() for $mysqlResults?
Also SELECT * is frowned upon even if that's not the problem here.
EDIT: perhaps also free the SOAP result. Some simple things to start with, to see whether they help.
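Inside the loop, that would look something like the sketch below. Whether it helps depends on where the memory is actually held (which here turned out to be NuSOAP's debug log); note that mysql_free_result() only applies to SELECT result resources, while mysql_query() returns a plain boolean for INSERTs, so there is nothing to free on that side:
while ($startingDate <= $endingDate) {
    $soapResult = $client->call('SelectData', $sql);

    // ... escape and INSERT the rows into MySQL as before ...

    unset($soapResult);                                     // drop the reference to the (large) SOAP response
    $startingDate = strtotime('+1 month', $startingDate);
    echo memory_get_usage(true), "\n";
}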

From PHP/MySQL/JSON to iOS/Objective-C/SQLite

I'm trying to create an iOS application which, upon loading, will initially connect via HTTP to a PHP web service that outputs data as JSON from a MySQL database. I would then like it to import this data into a local SQLite database within the iOS app. I've already downloaded the JSON framework for Objective-C.
My question is twofold.
1) What is the best way to output the JSON from PHP so that I can send multiple database tables in the same JSON file? I have 4 tables of data that I'm trying to send (user, building, room, device).
Here is how I am currently outputting the JSON data:
// Users
$query = "SELECT * from user";
$result = mysql_query($query,$conn) or die('Errant query: '.$query);
$users = array();
if(mysql_num_rows($result)) {
while($user = mysql_fetch_assoc($result)) {
$users[] = array('user'=>$user);
}
}
// Buildings
$query = "SELECT * from building";
$result = mysql_query($query,$conn) or die('Errant query: '.$query);
$buildings = array();
if(mysql_num_rows($result)) {
while($building = mysql_fetch_assoc($result)) {
$buildings[] = array('building'=>$building);
}
}
// Rooms
$query = "SELECT * from room";
$result = mysql_query($query,$conn) or die('Errant query: '.$query);
$rooms = array();
if(mysql_num_rows($result)) {
while($room = mysql_fetch_assoc($result)) {
$rooms[] = array('room'=>$room);
}
}
// Devices
$query = "SELECT * from device";
$result = mysql_query($query,$conn) or die('Errant query: '.$query);
$devices = array();
if(mysql_num_rows($result)) {
while($device = mysql_fetch_assoc($result)) {
$devices[] = array('device'=>$device);
}
}
header('Content-type: application/json');
echo json_encode(array('users'=>$users));
echo json_encode(array('buildings'=>$buildings));
echo json_encode(array('rooms'=>$rooms));
echo json_encode(array('devices'=>$devices));
I fear that this method isn't the right way to send multiple objects.
2) In the iOS app, how can I automatically take this JSON data and insert it into the corresponding local database tables in SQLite?
Thanks for any help.
On 1: instead of JSON, you could use binary property lists. They are natively supported on the iPhone, and there is a library to turn PHP data into a binary plist: https://github.com/rodneyrehm/CFPropertyList
There are many benefits to using binary property lists: they are probably 1/5 the size of the equivalent JSON, and you don't need an external library to parse them, so all the code is much simpler.
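On the PHP side, generating the plist from the arrays already built in the question could look roughly like the sketch below. The class and method names (CFTypeDetector::toCFType(), CFPropertyList::toBinary()) are taken from my recollection of that library's README, so double-check them against the version you install:
require_once 'CFPropertyList/CFPropertyList.php';

$data = array('users' => $users, 'buildings' => $buildings, 'rooms' => $rooms, 'devices' => $devices);

$plist    = new CFPropertyList();
$detector = new CFTypeDetector();          // converts native PHP arrays/scalars into CF* types
$plist->add($detector->toCFType($data));

header('Content-Type: application/x-plist');
echo $plist->toBinary();                   // binary plist, readable on iOS with NSPropertyListSerialization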
On 2: there is no easy way to take the JSON/plist structure and insert it into an SQL database, because JSON/plist allow much more flexibility than SQL tables. You would first have to create the right tables in your SQLite DB and then use normal INSERTs to load the data row by row, exactly as you would do with PHP.
Yeah, Luke's recommendation is good, but you will be fine with the way you are exporting your tables. You may just have to dig "deeper" into the structure to get what you want; i.e. your output will return a "dictionary of dictionaries of arrays" which will then contain the data for each table.
As for downloading them first:
1) NSURLConnection and its delegate methods: you can send an asynchronous request to your web server to get this file and get notified when the data has been downloaded, so the user interface is never blocked in your app.
Here's the documentation with some good examples from Apple: http://developer.apple.com/library/mac/#documentation/Cocoa/Reference/Foundation/Classes/NSURLConnection_Class/Reference/Reference.html
At the end of the download, you will have an NSData object which can then be converted back to a string using NSString *jsonContents = [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding].
You can then use a JSON parser library - I recommend SBJSON https://github.com/stig/json-framework - which will parse the data and return it as a dictionary or array depending on your structure.
From there you can access your tables and values with valueForKey: in dictionaries or objectAtIndex: in arrays, and then map them into your chosen local storage, for which I recommend Core Data (or you could use SQLite if you are familiar with it).
Hope it helps.
Rog
I can't speak to 2), but for 1), I would recommend combining your JSON into a single array. One of the nice things about JSON (and arrays) is the ability to nest elements as deeply as you like.
echo json_encode(array(
'users'=>$users,
'buildings'=>$buildings,
'rooms'=>$rooms,
'devices'=>$devices,
));
What about preparing the SQLite database on the webserver and downloading that to the iOS application? This way, you do the heavy lifting on the PHP side. If the data is relatively static, you can even setup a scheduled task to generate the SQLite database on a regular interval.
We've done this for one of our apps and it worked very well.
One thing to keep in mind then is that you should enable gzip compression on the webserver to minimize the data transfer. Remember that you have to do some extra stuff to use gzip compression with NSURLConnection:
http://www.bigevilempire.com/codelog/entry/nsurlconnection-with-gzip/
You can use a REST server and RestKit.
If you would like a more full-featured solution than what is offered by a standalone parsing library, you may want to take a look at RestKit: http://restkit.org/
The framework wraps the operations of fetching, parsing, and mapping JSON payloads into objects. It can handle deeply nested structures and can map directly back to Core Data for persistence.
At a high level, here's what your fetch & post operations would feel like in RestKit:
- (void)loadObjects {
    [[RKObjectManager sharedManager] loadObjectsAtResourcePath:@"/path/to/stuff.json" delegate:self];
}
- (void)objectLoader:(RKObjectLoader*)loader didLoadObjects:(NSArray*)objects {
    NSLog(@"These are my JSON decoded, mapped objects: %@", objects);
    // Mutate and PUT the changes back to the server
    MyObject* anObject = [objects objectAtIndex:0];
    anObject.name = @"This is the new name!";
    [[RKObjectManager sharedManager] putObject:anObject delegate:self];
}
The framework takes care of the JSON parsing/encoding on a background thread and lets you declare how attributes in the JSON map to properties on your object. A number of people in the community are working with PHP backends + RestKit with great success.

store objects in $_GET

How can I store an object in the $_GET array in PHP? I want to pass an object containing database information from one page to another page via the $_GET array, so there would be no need to access the database again.
To pass any sort of object, you'd have to serialize it on one end, and unserialize it on the other end.
But note that this will not work for database connections themselves: the connection to the database is automatically closed when a PHP script ends.
You could pass some connection information, like the login, host, or the like (but that would not be a good idea; it is not at all safe to expose such critical information!), but you cannot pass a connection resource.
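For a plain data object (one that contains no resources), the round trip could look roughly like this sketch; keep in mind the URL length limit and, more importantly, that calling unserialize() on a user-supplied parameter is a well-known injection risk, which is one more reason to prefer the session-based approaches below:
// page1.php - pack the object into the query string
$payload = urlencode(base64_encode(serialize($object)));
echo '<a href="page2.php?data=' . $payload . '">next page</a>';

// page2.php - unpack it again
$object = unserialize(base64_decode($_GET['data']));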
Really, you should be passing data from one page to another via the $_SESSION variable instead, if possible. That is what sessions are for. Ideally just store an id in the session and look up the data on each page. If you do use $_SESSION then it is as simple as ...
$_SESSION['myarray'] = $myarrayobject;
$_SESSION['someotherthing'] = 42;
If you have to use $_GET, then I would recommend just passing an id of some kind, then re-looking up the data on each page refresh.
Keep in mind, it would be easy for a malicious user to change the values that are sent via $_GET between pages, so make sure there is nothing that can be abused in this information.
You would need to serialize it to text (possibly using json_encode), then generate a URL that includes it in the query string (making sure that it is urlencoded).
That's a very bad idea.
Databases were invented to serve each request, while query strings were designed to pass only commands and identifiers, not tons of data between the browser and the server.
Instead of using GET, another way to pass something from one page to another is to use the $_SESSION array and then unset that variable on the other side once you're done with it. I've found that to be a pretty good approach.
As everyone else has said, though, passing database information around can be a bad idea, unless the information you're passing is just something like a username, first name, etc. If that's what you mean by "database information", then I would store all of that in $_SESSION and destroy the session when the user logs out. Don't try to store the entire connection or anything like that.
As has been said before, you should avoid (mis)using GET parameters as a "cache" for overly complex and long data.
On the other hand, your question is vague enough to assume that you want to transmit only a few values, and that your "second" script needs nothing else from the database; in fact, it might not even have to check whether those values came from a database at all.
In that case, extract the values from your database result and append them as parameters to the URL. Try to make the parameter list as simple as possible yet unambiguous; http_build_query() can help you with that.
But keep in mind that you want to keep GET operations idempotent as described in http://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html.
if ( isset($_GET['a'], $_GET['b'], $_GET['c']) ) {
    // this doesn't care about the values in the database at all.
    echo 'product: ', $_GET['a'] * $_GET['b'] * $_GET['c'], "\n";
}
else {
    $pdo = new PDO("mysql:host=localhost;dbname=test", 'localonly', 'localonly');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    // let's make it a self-contained example (and extra costly) by creating a temporary table ...
    $pdo->exec('CREATE TEMPORARY TABLE foo (id int auto_increment, a int, b int, c int, primary key(id))');
    // ... with some garbage data
    $stmt = $pdo->prepare('INSERT INTO foo (a,b,c) VALUES (?,?,?)');
    for ($i = 0; $i < 10; $i++) {
        $stmt->execute(array(rand(1,10), rand(1,10), rand(1,10)));
    }
    // print a link per row that carries a, b and c as GET parameters, plus the expression it stands for
    foreach ($pdo->query('SELECT a,b,c FROM foo', PDO::FETCH_ASSOC) as $row) {
        printf('<a href="?%s">%s</a><br />',
            http_build_query($row),
            join(' * ', $row)
        );
    }
}
