I have to develop a RESTful API for an Android app and have decided to go with PHP using the Slim framework.
Some background: currently, every time the client app makes a request, the server does some DB operations and builds a payload. This was causing high load on the server during peak hours of app use, so I'm looking for a way to cache this payload and have it available whenever a request comes in. The cache will be updated rarely compared to the number of reads (only when the admin changes the DB).
To test this, I tried the following code.
In index.php
$app->flag = 1;
And the endpoint
$app->get("/getContent", function () use ($app) {
    if ($app->flag == 1) {
        echo 'Changing value';
        $app->flag = 0;   // this change does not survive the request
        return;
    }
    echo $app->flag;
});
The ideal case would be for it to print "Changing value" the first time and 0 thereafter. But the value of $app->flag is always 1 at the start of the endpoint. How can I persist data between successive calls to the endpoint?
Also, would it be better to store the payload in a file and do file I/O to handle each endpoint request? (Will this throw an I/O exception if the admin updates the file while the endpoint is reading it for a client?)
I'm fairly new to PHP, so I'd really appreciate your insight, or even other ideas for achieving the same thing.
Here are my suggestions:
Can you use memcache for storing the data? You do the DB operations and processing once, build the JSON, and store it in memcache (which keeps the data directly in memory, so as a plus you avoid I/O operations). See the sketch after these suggestions.
Do all the DB processing, build the JSON, and save it in another table used for caching results. The main concern with this approach is the number of users hitting your DB at once: although these will be simple reads, and the query result will almost certainly be cached natively by the DB engine, you must still consider the number of DB connections in use at a time.
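For the first suggestion, here is a minimal sketch of what the cached endpoint could look like, assuming the ext/memcached extension is installed; buildPayload() is a hypothetical stand-in for your existing DB work:

// Sketch only: buildPayload() represents your current DB queries
// and JSON building, done once per cache miss instead of per request.
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$app->get("/getContent", function () use ($app, $memcached) {
    $json = $memcached->get('content_payload');
    if ($json === false) {                          // cache miss: rebuild once
        $json = buildPayload();                     // hypothetical helper
        $memcached->set('content_payload', $json);  // no TTL: admin invalidates
    }
    echo $json;
});

// In the admin's update handler, after changing the DB:
// $memcached->delete('content_payload');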
I'm trying to send certain data from iOS to an online MySQL database. PHP is used on the server to handle receiving and inserting the data.
The thing is that I have several data packages, and the key is to send them one by one, which means I need a mechanism to make the second data package in the queue wait until iOS receives feedback from the server confirming that the first set of data has already been stored in the database.
I initially tried creating a serial dispatch queue, aiming to have the iOS app execute the uploads in sequence. Although the iOS side did carry out the work in sequence, each task simply "finished" upon sending out its data package, without waiting to see whether the data had been inserted into the database. The problem is that there will always be some time lapse between sending out the data and the data being fully saved to MySQL on the server, due to issues like network latency.
So the result is that the data may not be saved in the desired sequence; some later data may be saved earlier than earlier data.
I guess what is missing is a "feedback" mechanism from the server side to the iOS side.
Can anybody suggest a way to realize this feedback mechanism, so I can control the serial sequence of the upload tasks?
Thank you very much!
Regards,
Paul
If you are sending data to a server, then most available frameworks offer callbacks. With AFNetworking (or its Swift successor, Alamofire) it would look like this:
[[ConnectionManager instance] GET:@"link" parameters:nil
    success:^(AFHTTPRequestOperation *operation, id responseObject) {
        // the server confirmed the insert: safe to send the next package
    }
    failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        // handle the error, e.g. retry or abort the queue
    }];
So you can put your code in the given handlers and chain your requests, firing the next upload only from the success block.
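On the server side, the "feedback" is just the HTTP response itself: only reply after the INSERT has actually run, so the success handler really means "saved". A rough PHP sketch (the table and column names are made up):

<?php
// Hypothetical insert endpoint: respond only after the row is committed,
// so the iOS success callback can safely trigger the next upload.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO packages (payload) VALUES (?)');
$ok = $stmt->execute([isset($_POST['payload']) ? $_POST['payload'] : '']);

header('Content-Type: application/json');
echo json_encode(['saved' => $ok]);   // iOS checks this before sending more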
You may also want to create NSOperations, put them on an NSOperationQueue, and set the proper dependencies between them, but that is surely more time-consuming.
Maybe it's a stupid question, but anyway, here is my problem. I have multiple classes in my project.
At the beginning, the constructor of the class Calculate($param1, $param2, ...) is called.
This Calculate class is called multiple times via jQuery events (click, change, ...), depending on which form field was just filled. The prices and values are calculated in the background by PHP and displayed on the website via AJAX (live, while typing).
The connection between the AJAX and the Calculate class is a single file (jsonDataHandler). This file receives the POST values from the AJAX call and returns a JSON string for the website output. So every time I call this jsonDataHandler, a new Calculate object is created with the updated values, but never the first created object. As you may imagine, I am now experiencing multiple problems.
How can I always access the same object, without creating a new one?
EDIT: for technical reasons, I cannot use sessions.
Here is the PHP application lifetime:
The browser sends an HTTP request to the web server
The web server (for example, Apache) accepts the request and launches your PHP application (in this case, your jsonDataHandler file)
Your PHP application handles the request and generates the output
Your PHP application terminates
The web server sends the response generated by the PHP application to the browser
So the application "dies" at the end of each request; you cannot create an object that will persist between requests.
Possible workarounds:
Persist the data on the server - use sessions or the database (as you said this is not an option for you)
Persist the data on the client - you still create your object for each request, but you keep additional information client-side to be able to restore the state of your object (see more on this below)
Use something like ReactPHP to have your application run persistently (this may also not be an option, because you will need a different environment); a minimal sketch follows this list. A variant of this option: switch to another technology that doesn't re-launch the server-side application on each request (Node.js, Python + Flask, etc.)
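For the ReactPHP route, here is a minimal sketch, assuming react/http is installed via Composer; the Calculate constructor arguments and the currentValues() method are placeholders for your real API:

<?php
// Sketch only: with react/http the PHP process stays alive, so
// $calculator persists across requests (unlike PHP-FPM/Apache).
require __DIR__ . '/vendor/autoload.php';

$calculator = new Calculate(/* ... initial params ... */);

$http = new React\Http\HttpServer(function (Psr\Http\Message\ServerRequestInterface $request) use ($calculator) {
    // the same $calculator instance serves every request here
    return new React\Http\Message\Response(
        200,
        ['Content-Type' => 'application/json'],
        json_encode($calculator->currentValues())   // hypothetical method
    );
});

$socket = new React\Socket\SocketServer('127.0.0.1:8080');
$http->listen($socket);   // the event loop keeps the process alive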
So, if you can't persist the data on the server, the relatively simple option is to persist the data on the client.
But this will only work if you need to keep the state of your calculator for each individual client (vs. keeping the same state for all clients; in that case you do need to persist data on the server).
The flow with client-side state can be this:
Client sends the first calculation request, for example param1=10
Your script responds with value=100
Client-side code stores both param1=10 and param1_value=100 into cookies or browser local storage
Client sends the next calculation, for example param2=20; this time the client-side code finds the previous results and sends everything together (param1=10&param1_value=100&param2=20)
On the server you can now re-create the whole sequence of calculations, so you get the same result as if you had a persistent Calculate object
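A minimal sketch of that server-side replay, assuming the client posts back everything it has accumulated (the setParam() and compute() methods are hypothetical):

<?php
// Sketch: rebuild the Calculate object from the accumulated client state.
// The client sends param1, param1_value, param2, ... on every request.
$calc = new Calculate();

foreach ($_POST as $name => $value) {
    if (strpos($name, '_value') === false) {   // skip the cached results
        $calc->setParam($name, $value);        // hypothetical setter
    }
}

header('Content-Type: application/json');
echo json_encode(['value' => $calc->compute()]);   // hypothetical method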
Maybe you should try saving the values of the Calculate object's parameters in the database, and every time you make an AJAX call, take the latest values from the DB.
My website sends curl requests to an external service and gets XML responses.
The requests are user-specific and the responses are rather heavy (and there are several requests on the same page), so the page takes time to load and uses too much of the server's bandwidth.
How I tried to solve the problem:
Sending the requests from the client side (JS). Unluckily for me, it becomes rather messy to parse the received data and integrate it into the page's objects
Putting the responses in the session (as they are specific to the user). The session files on the server get large too fast. I implemented a counter that erases all the responses from the session if their number gets too big (using this now)
Memcache? Too much data to save
Do you think I should use one of the solutions or is there another way to do it?
Use a combination of
cache
database
You push things into your "data store" (the cache plus the database). When you need the data, you look it up in the data store: it checks the cache first and, if the data is available, returns it; if not, it looks in the database. Only if both miss do you fetch the info from the external service again; see the sketch below.
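A rough sketch of that lookup chain in PHP, assuming ext/memcached, plus a hypothetical response_cache table and fetchFromService() curl wrapper:

<?php
// Sketch of the chain: memcache first, then a cache table in the DB,
// and only as a last resort the external curl request.
function getResponse($userId, Memcached $mc, PDO $pdo) {
    $key = 'xml_response_' . $userId;

    $data = $mc->get($key);                    // 1. cache
    if ($data !== false) {
        return $data;
    }

    $stmt = $pdo->prepare('SELECT body FROM response_cache WHERE user_id = ?');
    $stmt->execute([$userId]);                 // 2. database
    $data = $stmt->fetchColumn();

    if ($data === false) {                     // 3. the external service
        $data = fetchFromService($userId);     // hypothetical curl wrapper
        $insert = $pdo->prepare('INSERT INTO response_cache (user_id, body) VALUES (?, ?)');
        $insert->execute([$userId, $data]);
    }

    $mc->set($key, $data, 3600);               // warm the cache on the way out
    return $data;
}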
You could also increase the size of the cache (but that is not a good solution).
Try something like this: build a key from the identifying parameters,
$key = "User_id_" . $user_id . "_category_" . $category_id;
then store the data under this key:
$memcache->set($key, $data, 0, 3600);   // flags = 0, expire after one hour
I have a Flex program that gets a JSON array from a PHP script. The PHP script doesn't just output a simple JSON array; it grabs data from ActiveCollab and does some work on the data before encoding it.
The first test involved a small JSON array that took PHP a short time to encode. However, when I try to scale up the test, the Flash movie crashes trying to load the JSON data from PHP. There's no code difference between the tests, just the amount of data and the amount of time it takes PHP to encode it. Am I looking at a memory problem or a timeout problem?
PS: When I call the PHP script in Firefox, it doesn't time out and still returns a JSON array. It just takes a while to return it.
I'm assuming that hitting your PHP service in a browser does not time out. If it does, then you need to change your PHP settings to allow the script to execute longer.
Otherwise you may try a different strategy altogether: have Flex call your PHP service and tell it to start the data processing, and have the PHP service return a token ID to Flex to use for polling. Have another PHP service track the progress of the processing (receiving the token to identify the job). This second service returns a progress report on each request until the processing is done; when it is, it returns the data on the next request.
Having PHP generate an ID to track the 'job' lets Flex poll and retrieve the data when the job is done. This at least eliminates the wait that Flex incurs while server-side processing happens.
Of course you'll need to store your pre-processed output server-side somewhere while waiting for the request with the matching token. You'll also need a cleanup mechanism that occasionally clears this map/cache with a timeout.
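A rough sketch of the two endpoints, using plain files under /tmp as the job store (the filenames and the status protocol are made up for illustration):

<?php
// start.php (sketch): kick off the job and hand back a token.
$token = md5(uniqid('', true));
file_put_contents("/tmp/job_$token.status", 'running');
// ... launch the real processing in the background, writing its result
// to /tmp/job_$token.data and flipping the status file to 'done' ...
echo json_encode(['token' => $token]);

<?php
// progress.php (sketch): Flex polls this with the token until done.
$token  = preg_replace('/[^a-f0-9]/', '', $_GET['token']);   // sanitize
$status = @file_get_contents("/tmp/job_$token.status");

if ($status === 'done') {
    echo json_encode(['done' => true, 'data' => file_get_contents("/tmp/job_$token.data")]);
} else {
    echo json_encode(['done' => false]);
}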
You can check your memory usage on the client by using a Flash memory profiler. Does your total memory usage go up a lot when you try to load the data?
My AJAX search program keeps asking PHP for the results of a particular search term. The start of the PHP script reads through a MySQL DB and initializes, so if the PHP script keeps restarting, it will have to read the DB millions of times. If I could keep it alive and let multiple silly AJAX requests be served by a single running PHP script instance, I'm sure performance would improve.
How do you do this typically? Use a service? Can this be done with services?
PHP has no concept of long-lived objects or shared state between threads or requests; every request always starts at zero (except for the session state, of course). You can emulate long-lived objects by caching to disk or memory (see memcached).
Do you have a particular reason to read the entire database when your script initializes?
How about storing the DB results in a session variable? You'd first check whether the keyword is in the session (sessions let you carry variable values between page refreshes), and if it's not, do a DB query.
To store it:
$_SESSION['storedQueries']['keyword']= 'its definition, from the database';
To look for it:
// isset() is the right check here, since the entries are keyed by keyword
$result = isset($_SESSION['storedQueries']['keyword'])
    ? $_SESSION['storedQueries']['keyword']
    : 'nothing found, you should do a db query';
The AJAX part is pretty easy if you use a JavaScript library such as jQuery:
$('#resultZone').load('find.php',{keyword: $('input.search').val() });
If you know the results are the same every time, just move those results to a session variable.
PHP sessions are explained pretty well in their documentation:
http://us3.php.net/manual/en/book.session.php
If the search results would be similar for multiple users, I usually create a cache file and serialize the result set there. As a filename, I might use an md5 hash of a string containing the search query and perhaps the user group. Then, when the AJAX call needs the data, I just check whether the file is too old; if not, I send it to the client, or maybe even just redirect the AJAX HTTP request to the file (assuming it is formatted properly). If the file is too old, I refresh it with new content.
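A sketch of that idea, with a made-up query and a 10-minute freshness window:

<?php
// Sketch of the file cache: one serialized result set per query,
// keyed by an md5 of the query (plus user group), refreshed when stale.
function cachedSearch($query, $userGroup, PDO $pdo) {
    $file = '/tmp/search_' . md5($query . '|' . $userGroup) . '.cache';

    if (file_exists($file) && time() - filemtime($file) < 600) {
        return unserialize(file_get_contents($file));   // still fresh
    }

    $stmt = $pdo->prepare('SELECT * FROM items WHERE title LIKE ?');  // hypothetical query
    $stmt->execute(['%' . $query . '%']);
    $results = $stmt->fetchAll(PDO::FETCH_ASSOC);

    file_put_contents($file, serialize($results), LOCK_EX);   // avoid partial writes
    return $results;
}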
For very-high-volume sites, memcached is usually a better option. Some kind of PHP cache helps as well, and SQL connection pooling lowers the overhead of opening SQL connections.
Connecting to the DB is a very expensive operation, and you can get around that by caching the results. Take a look at Zend_Cache and see how it can save you a lot of headache.