My AJAX search program keeps asking PHP for results for a particular search term. The start of the PHP script reads through a MySQL DB and initializes, so if the PHP script keeps restarting, it will have to read the DB millions of times. If I could keep it alive and let multiple silly AJAX requests be served by a single running PHP script instance, I'm sure performance would improve.
How do you typically do this? Can it be done with a service?
PHP has no concept of long-lived objects or shared state between threads or requests; every request starts from zero (except for session state, of course). You can emulate long-lived objects by caching to disk or memory (see memcached).
Do you have a particular reason to read the entire database when your script initializes?
How about storing the DB results in a session variable? You'd first check whether the keyword is already in the session (sessions let you carry variable values across page refreshes), and if not, do a DB query.
To store it:
$_SESSION['storedQueries']['keyword'] = 'its definition, from the database';
To look for it:
$result = isset($_SESSION['storedQueries']['keyword'])
    ? $_SESSION['storedQueries']['keyword']
    : 'nothing found, you should do a db query';
The AJAX part is pretty easy if you use a JavaScript library such as jQuery:
$('#resultZone').load('find.php',{keyword: $('input.search').val() });
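Putting the pieces together, a minimal find.php might look something like this (the PDO credentials and the "definitions" table are invented for illustration):

<?php
session_start();

$keyword = $_POST['keyword'] ?? '';

if (!isset($_SESSION['storedQueries'][$keyword])) {
    // Not in the session cache yet: query the database once.
    $db   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $stmt = $db->prepare('SELECT definition FROM definitions WHERE keyword = ?');
    $stmt->execute([$keyword]);
    $_SESSION['storedQueries'][$keyword] = $stmt->fetchColumn() ?: 'nothing found';
}

echo $_SESSION['storedQueries'][$keyword];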
If you know the results are the same every time, just move those results to a session variable.
PHP sessions are explained pretty well in their documentation:
http://us3.php.net/manual/en/book.session.php
If the search result would be similar for multiple users, I usually create a cache file and serialize the result set into it. As a filename I might use an md5 hash of a string containing the search query, and perhaps the user group. Then when an Ajax request needs the data, I just check whether the file is too old; if not, I send it to the client, or maybe even just redirect the Ajax HTTP request to the file (assuming it is formatted properly). If the file is too old, I refresh it with new content.
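A rough sketch of that pattern (the /tmp path, the 10-minute lifetime and run_search() are assumptions):

<?php
// Cache filename: md5 of the search query plus the user group.
$cacheFile = '/tmp/search_' . md5($query . '|' . $userGroup);
$maxAge    = 600; // seconds

if (is_file($cacheFile) && time() - filemtime($cacheFile) < $maxAge) {
    // Fresh enough: serve the serialized result set straight from the file.
    $results = unserialize(file_get_contents($cacheFile));
} else {
    // Missing or too old: refresh the file with new content.
    $results = run_search($query); // hypothetical DB query
    file_put_contents($cacheFile, serialize($results));
}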
For very high-volume sites memcached is usually a better option. Some kind of PHP cache also helps, and SQL connection pooling lowers the overhead of opening SQL connections.
Connecting to the DB is a very expensive operation, and you can get around that by caching the results. Take a look at Zend_Cache and see how it can save you a lot of headache.
The page really needs to load fast, but the DB is slow, so we split the work into two DB calls, one faster and one slower; the faster one runs first and lets us serve a part of the page that is quite usable by itself.
But then we want the second request to go off, and we know it will ALWAYS be necessary whenever the first request goes off. So the first part of the page contains a script which fires off an HTTP request, which then makes the second DB call, and finally the rest loads.
But this is a serial operation, which means the first part of the page load needs to finish its DB call, return over HTTP, render in the browser, run the script, make another HTTP request, then wait for the DB, and finally return the whole page to us.
How do you go about solving this in PHP? We don't have memcache, and I looked into FIFOs but we don't have the posix_mkfifo function either.
I want to make both DB calls on the first request, serve the first part of the page, and let the second DB call continue running; when it's finished I want to keep the result in /tmp/ or a buffer or wherever is fast, in memory. When the script asks for it, perhaps its HTTP request will have to wait a bit longer, or perhaps it's lucky and gets the result served from memory already.
But where in memory do you keep it, across requests and PHP instances? Not in a global, not in the session, not in memcached. Where? Sockets? Should I fork and pipe?
EDIT: Thanks, everybody. I went with the two-async-http-requests route.
I think you could use AJAX.
First send the HTML page with two JavaScript AJAX calls, one for each SQL query, triggered by page load.
Then fill the page in asynchronously with those results.
The problem is that your problem is too complex to solve without extra tools like memcache. Directly in PHP you can save small amounts of data in SHM (shared memory), but that's not the best solution.
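A minimal sketch of the SHM idea via PHP's shmop extension (the key and the 1 KiB segment size are arbitrary, and real code would need locking and error checks):

<?php
$key  = 0xCAFE; // arbitrary integer identifying the segment
$size = 1024;

// Request 1: create the segment and write a value into it.
$shm = shmop_open($key, "c", 0644, $size);
shmop_write($shm, str_pad("cached value", $size, "\0"), 0);

// Request 2, in a later PHP process: attach and read it back.
$shm  = shmop_open($key, "a", 0, 0);
$data = rtrim(shmop_read($shm, 0, shmop_size($shm)), "\0");
echo $data; // prints "cached value"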
The best solution is to build a better database structure, so you get a better result and a faster response from your database.
For better database performance you can look at MySQL MEMORY tables. But be careful: those tables are cleared after a restart, so only use them to cache data you can refill.
And you can send more than one request at a time with Ajax.
My website sends curl requests to an external service and gets XML responses.
The requests are user-specific and the responses are rather heavy (with several requests on the same page), so it takes time to load the page and uses too much of the server's traffic.
How I tried to solve the problem:
Sending the requests from the client side (JS). Unluckily for me, it becomes rather messy to parse the received data and integrate it into the page's objects.
Putting the responses in the session (as they are specific to each user). The session files on the server grow too large too fast. I implemented a counter that erases all the responses from the session if their number gets too big (using this now).
Memcache? Too much data to save.
Do you think I should use one of the solutions or is there another way to do it?
Use a combination of
cache
database
You push things into your "data store" (this is the cache plus the database). Then you look up in your data store whether the item is available. The data store looks into the cache first; if it's there, return it; if not, look in the database. And if everything misses, fetch the info from the original source and store it.
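A sketch of that lookup cascade (the "datastore" table and the fetch_from_api() helper are invented for illustration):

<?php
function get_data(Memcached $cache, PDO $db, string $key)
{
    // 1. Try the cache first.
    $value = $cache->get($key);
    if ($value !== false) {
        return $value;
    }

    // 2. Fall back to the database.
    $stmt = $db->prepare('SELECT payload FROM datastore WHERE cache_key = ?');
    $stmt->execute([$key]);
    $value = $stmt->fetchColumn();

    // 3. If both miss, get the info from the origin and store it.
    if ($value === false) {
        $value = fetch_from_api($key); // hypothetical helper
        $db->prepare('REPLACE INTO datastore (cache_key, payload) VALUES (?, ?)')
           ->execute([$key, $value]);
    }

    $cache->set($key, $value, 3600); // warm the cache for an hour
    return $value;
}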
You could also increase the size of the cache (but that is not a good solution).
Try something like this:
$key = "User_id_".$user_id."category_".$category_id;
then store each piece of data under this key:
$memcache->set($key, $data, 0, 3600); // flags = 0, expire after one hour
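And to read it back later (Memcache::get returns false on a miss):

$data = $memcache->get($key);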
I've never really cached data before, but I feel as though it will greatly improve my site.
Basically, I pull JSON data from an external API that helps the basic functionality of my website. This info doesn't really change that often yet my users pull the info from the API thousands of times a day. If it updated once a day, that would be fine. I want to run a cron job daily that will pull the info and update the cache.
I already tried a few different things, both pulled using PHP:
1) Store data in an SQL table
I had it working, but there's no reason I should ping the database each time when I can just store the data as basic HTML/JSON.
2) .JSON file (using fwrite)
I had it stored, but the only way this worked was if the .getJSON() callback function was prepended to the JSON data and the data was surrounded by parentheses (making it JSONP, I believe).
Does anyone have any advice or any directions to lead me in? As I said, I've never really done anything like this so I don't even know if I'm remotely headed in the right direction.
Edit:
Okay, so I talked to my host, and since I'm on shared hosting (DreamHost) I can't install memcached, which sucks. The only info they could give me was that if it is on http://pecl.php.net/ then I can most likely use it. They said APC is available. I'm not sure if this fits my problem. I'd like to be able to access the cache directly from jQuery. Thanks.
You can use memcached. Memcached is an in-memory key-value store for small chunks of arbitrary data (strings, objects) from results of database calls, API calls, or page rendering. Very easy to implement and has a low system footprint.
Since you can't use memcached, go back to your database option and store it in a table using the MEMORY engine.
Try memcached. You can easily generate a string key and store whatever blob of JSON you want with it. It works like a dictionary, except it persists in memory.
There's an option to give it an expiration time (otherwise it just stays cached until evicted). So when the user requests the data, just check whether it's stored in memcached. If it is, great, return it. If it's not, do whatever you do to build it, then put it in memcached with a 24-hour expiration.
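A short sketch of that flow with the Memcached extension (build_json() stands in for your external API call, and the server address is an assumption):

<?php
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$json = $cache->get('api_data');
if ($json === false) {                     // cache miss: rebuild it
    $json = build_json();                  // hypothetical API fetch
    $cache->set('api_data', $json, 86400); // expire after 24 hours
}

header('Content-Type: application/json');
echo $json;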
If your data varies per user:
I've done this by storing an object in the $_SESSION array.
I attach a quick bit of logic that determines whether the data's expiry period is up. If so, it pulls new data, serves it, and caches it. If not, it pulls from $_SESSION and serves that.
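Roughly like this (the 5-minute TTL and load_user_data() are placeholders):

<?php
session_start();

$ttl = 300; // seconds
if (!isset($_SESSION['data'], $_SESSION['data_time'])
    || time() - $_SESSION['data_time'] > $ttl) {
    $_SESSION['data']      = load_user_data(); // hypothetical per-user fetch
    $_SESSION['data_time'] = time();
}

echo $_SESSION['data'];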
Try Redis.
And to store data easily without unexpected errors on set/get, encode it using base64 first.
Use this to store:
file_put_contents($path, $json_text);
and this to restore:
$json_text = file_get_contents($path);
echo $json_text;
echo can be used to pass the JSON exactly as it comes from the HTTP request. If you need to parse it into a variable in JavaScript, you can use array = JSON.parse('<?php echo $json_text; ?>');. If you need to parse it in PHP, use $array = json_decode($json_text);.
I'm just playing around with some PHP and was wondering what happens when an object of a class is created within another PHP script?
I assume once it's created and been processed there is no way of then going back and 'playing' around with it from another script?
The idea is I'm trying to create a kind of deck of cards using a card class; each card has specific data added to each individual object to make it unique: suit, value, etc. Once it's created I need to be able to go back to specific cards to use them. In Java I'd have an ArrayList of card objects; I'm not sure how to approach the same thing in PHP.
Thanks.
There is no problem passing objects around inside a PHP script; your problem is that PHP, as invoked by the webserver, is essentially "stateless", i.e. every time someone requests the URL from a browser, a complete fresh copy of the PHP program is fired up.
To save data between requests there are several options:
One is to use $_SESSION variables, which are associated with a user session. But $_SESSION itself is an array, so it gets really clumsy holding complex structures; also, it sounds like you want to share the deck between users.
You could serialize your object and store it in a file, which is OK as long as it's not updated very often; but if it's updated by every user, they will start overwriting each other's changes.
Much better is to store the deck in a database (SQLite is usually built into PHP) so that several users can share and update it in a controlled manner.
Another good option would be to use one of the popular data caches such as "memcached" which will cache the data between calls to the script.
Reusing an object between page calls seems to be your issue. Maybe you can serialize the object, store it in a database, and pick it back up? Check php.net/serialize. Let us know how it goes.
What you could do to keep the objects available is serialize them and store them in a database table. If you link a game ID or something similar to the cards, you can retrieve them later using that game ID.
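A rough sketch of that idea, assuming PHP 8, SQLite via PDO, and an invented Card class and games table:

<?php
class Card
{
    public function __construct(public string $suit, public string $value) {}
}

$db = new PDO('sqlite:cards.db');
$db->exec('CREATE TABLE IF NOT EXISTS games (id TEXT PRIMARY KEY, deck BLOB)');

// Save the deck under a game ID...
$deck = [new Card('hearts', 'A'), new Card('spades', '10')];
$db->prepare('REPLACE INTO games (id, deck) VALUES (?, ?)')
   ->execute(['game42', serialize($deck)]);

// ...and restore it on a later request.
$stmt = $db->prepare('SELECT deck FROM games WHERE id = ?');
$stmt->execute(['game42']);
$deck = unserialize($stmt->fetchColumn());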
I don't know if the card game you are writing is realtime; if so, using a database might be too much overhead. Another possibility is to use an existing caching solution, such as Memcache.
So you want to create a server-side coded card game? Good luck!
It is possible to do this, though I think a script like the JavaScript you are talking about is much more suitable.
You could make a function that initialises a deck of cards and work with indexes, etc. Save your state in cookies/sessions and work with postbacks. It's going to be a hell of a job, though, in my opinion, compared to JavaScript.
Though when you think about it, you could use AJAX to make this game feel better for the user :).
PHP scripts are not like Java server apps.
Where your Java server runs for a long time, your PHP script is just a one-time thing.
Instead of this kind of process: the user makes a request to the Java-run server, the server receives the request in one of its infinite loops, processes it, sends the response, and waits for a new request; you have this kind of thing: a webserver (Apache, Nginx, whatever other webserver) receives the user's request, understands it needs to be interpreted by PHP, starts a PHP child, the child does what's in the script and sends its answer, then dies, and the webserver waits for new requests.
So when a PHP script ends, nothing (in the good case) is left of it.
But a PHP script can use persistent storage on the server so another request can read from it. That's why you have files, databases, and even shared-memory functions.
If the game's state is for one user only, you can use sessions (usually files) to store your deck object. If it's meant to be used by multiple players, you should serialize it and store it in a database.
I'm running a web application that allows a user to log in. The user can add/remove content in his/her 'library', which is displayed on a page called "library.php". Instead of querying the database for the contents of the user's library every time they load "library.php", I want to store it globally in PHP when the user logs in, so that the query is only run once. Is there a best practice for doing this? E.g. storing their library in an array in a session?
Thanks for your time
If you store each user's library in $_SESSION as an array, as you suggested (which is definitely possible), you will have to make sure that any updates the user makes to the library are instantly reflected in that session variable.
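A minimal sketch of keeping the two in sync (fetch_library_from_db() and add_to_library_db() are hypothetical helpers):

<?php
session_start();
$userId = $_SESSION['user_id'] ?? 0; // assumption: set at login

if (!isset($_SESSION['library'])) {
    $_SESSION['library'] = fetch_library_from_db($userId); // runs once per login
}

function addItem(int $userId, array $item): void
{
    add_to_library_db($userId, $item); // update the database...
    $_SESSION['library'][] = $item;    // ...and mirror it in the session
}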
Honestly, unless there is some seriously heavy querying going on to fetch a library, or you have tons of traffic, I would just stick to 'execute the query whenever the user hits library.php'.
Consider the size of the data. Multiply that by the maximum number of concurrent users.
Then compare that to the memory available on your server. Also consider whether or not this is a shared server; other sites need resources too.
Based on this, it is probably best either to create a file that can be used (as per Remi's comment), or to remain in the default stateless form and read the data every time. I doubt that reading the data each time creates much of an overhead.
When the user logs in you can generate an XML file (USER_ID.xml, for instance) that you display with XSLT.
http://php.net/manual/en/book.xslt.php
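A minimal sketch using PHP's XSLTProcessor (the library.xsl stylesheet and the file naming are assumptions):

<?php
session_start();
$userId = (int) $_SESSION['user_id']; // assumption: set at login

$xml = new DOMDocument();
$xml->load("cache/{$userId}.xml"); // the generated USER_ID.xml

$xsl = new DOMDocument();
$xsl->load('library.xsl');

$proc = new XSLTProcessor();
$proc->importStylesheet($xsl);
echo $proc->transformToXML($xml);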
Each PHP script dies when it completes, so data cannot be kept permanently live in a web application the way you would in a desktop application.
One way could be sessions, but it depends on the amount of data you want to save. Going by your example, you are talking about a library, so it sounds like a large quantity of data needs to be saved; in that case the DB is the way to go, and yes, you have to query it each time.
Another way could be to save the data in an array inside a PHP file, but just as you would have to query the DB each time, you would have to include that PHP file each time.
Since this is a DB performance optimization, I would suggest you take a look at memcached, which matches your problem perfectly:
memcached is [..] intended for use in speeding up dynamic web applications by alleviating database load.
I think it would be best to store it in a Session.
If the user logs in, the session is created and you can save data in it using the superglobal:
$_SESSION['key'] = "value";
You can also store arrays or anything else there, and it can be cleared when the user logs out.
You care about performance, so please note: sessions may use a database or files to store their data, and a database is preferable to files because of its performance and capabilities.
Use the database; it is designed to be used in exactly such situations!