Cache XML API response in PHP (server-side caching)

Is it possible to cache the response from a POST request to an XML API? I want to do this without having to create actual files on my server. I was told this is possible, but I've been unsuccessful.
Essentially, I want to check if that call was already made; if so, pull that data; if not, do the API call and save the result for future use.

To do this without files you would need to store the call and the result in a database. It is probably the best way to do it anyway. I have done this with the Google geolocation API because there are limits on how many requests you can make per day.
So I would just store the request (i.e. a zip code like 37206) and the result from Google. Then the next time 37206 was requested, I would just return the result stored in my database.
Further, to guard against something like a zip code's lat/lon changing, I would store the date at which the result was saved and, after say a year, use a cron job to remove it.
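A minimal sketch of that pattern, assuming a PDO connection, a geo_cache table, and a hypothetical fetchCoordinatesFromGoogle() helper (none of these names come from the original answer):

function getCoordinates(PDO $db, $zip)
{
    // 1. Look for a cached result that is less than a year old.
    $stmt = $db->prepare(
        'SELECT lat, lon FROM geo_cache
         WHERE zip = :zip AND created_at > DATE_SUB(NOW(), INTERVAL 1 YEAR)'
    );
    $stmt->execute(['zip' => $zip]);
    if ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        return $row; // cache hit: no API call needed
    }

    // 2. Cache miss: call the external API (hypothetical helper).
    $result = fetchCoordinatesFromGoogle($zip);

    // 3. Store the result with a timestamp so a cron job can purge old rows later.
    $insert = $db->prepare(
        'REPLACE INTO geo_cache (zip, lat, lon, created_at)
         VALUES (:zip, :lat, :lon, NOW())'
    );
    $insert->execute(['zip' => $zip, 'lat' => $result['lat'], 'lon' => $result['lon']]);

    return $result;
}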

Related

How can I secure a JSON web service?

I have location data containing lat, long, and location_name to be shown on a map. Only logged-in users can see this map. What I did was use PHP with a SELECT * against the MySQL DB, format the data into a usable shape with json_encode, and echo it to be grabbed on the front end and used in the map API. This PHP file echoing the JSON is called mapData.php.
I want this file to not be accessible even to a logged-in user. I came across checking the session and request headers in the mapData.php file (the internal API file), but then again, if a hacker signs up for my service and opens the dev console, he/she can see the received file, and with any client-side request tool can set the header and see the data. Or maybe the access level could be changed in Linux, but I have no idea how.
Another method is uglifying and minifying the JSON, but since I have 29,000 rows in my DB plus another inner join, I think it will slow down the process. Any suggestions for securing this internal API so that even a logged-in user cannot access it?
I would hide the map data file in a subdirectory, then use a service to access the data file and retrieve just the data you need. If you absolutely need the 29,000 rows at once, then there's not much you can do. Even if you encrypt it, eventually the data is going to be in native JavaScript format, and then it's just a matter of running a debugger and peering into the data structures.
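One way to read "retrieve just the data you need": have mapData.php return only the markers inside the viewport the client asks for, after checking the session. This is a sketch under assumed table and column names (locations, lat, lng, location_name) and hypothetical bounding-box parameters, not the asker's actual schema:

session_start();
if (empty($_SESSION['user_id'])) {   // assumed session key
    http_response_code(403);
    exit;
}

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare(
    'SELECT lat, lng, location_name FROM locations
     WHERE lat BETWEEN :south AND :north
       AND lng BETWEEN :west AND :east
     LIMIT 500'
);
$stmt->execute([
    'south' => (float) $_GET['south'],
    'north' => (float) $_GET['north'],
    'west'  => (float) $_GET['west'],
    'east'  => (float) $_GET['east'],
]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

This does not make the data secret from a determined logged-in user, but it avoids handing all 29,000 rows to every client in one response.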

What's the best way to store a lot of data in cache?

My website sends curl requests to an external service and gets XML responses.
The requests are user-specific and the responses are rather heavy (and there are several requests on the same page), so the page takes time to load and uses too much of the server's bandwidth.
How I tried to solve the problem:
Sending the requests from the client side (JS). Unluckily for me, it becomes rather messy to parse the received data and integrate it into the page's objects.
Putting the responses in the session (as they are user-specific). The session files on the server get too large too fast. I implemented a counter that erases all the responses from the session if their number gets too big (using this now).
Memcache? Too much data to store.
Do you think I should use one of the solutions or is there another way to do it?
Use a combination of:
cache
database
You push things into your "data store" (that is, the cache plus the database). Then you look the item up in your data store: the data store checks the cache first and returns the item if it is there; if not, it looks in the database; and if everything fails, it fetches the info from the external service.
You could also increase the size of the cache (but that is not a good solution).
Try something like this:
$key = "User_id_".$user_id."_category_".$category_id;
then, according to this key, store each piece of data like:
$memcache->set($key, $data, 0, 3600); // 0 = no compression flag, 3600 s lifetime
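Putting the two answers together, the lookup order could look roughly like this (a sketch; fetchFromDatabase(), storeInDatabase(), and fetchFromExternalApi() are hypothetical helpers, and the key format simply follows the answer above):

$memcache = new Memcache();
$memcache->connect('127.0.0.1', 11211);

$key  = "User_id_".$user_id."_category_".$category_id;
$data = $memcache->get($key);                              // 1. try the cache

if ($data === false) {
    $data = fetchFromDatabase($user_id, $category_id);     // 2. try the database copy

    if ($data === false) {
        // 3. last resort: the slow external service
        $data = fetchFromExternalApi($user_id, $category_id);
        storeInDatabase($user_id, $category_id, $data);
    }

    $memcache->set($key, $data, 0, 3600);                  // keep it cached for an hour
}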

PHP: understanding cache with dynamic content and an API

I have a PHP file, and I'm using an API where, if I have an id, I can obtain data through the API. However, I'm currently learning how to create a cache system. The API data is retrieved as JSON. So I was wondering if it's possible to constantly add JSON data to the existing cache file that already has JSON data in it, so that next time I have an id, I'll search the cache file for a match instead of querying the API (which has a limit, like any API does).
Maybe create multiple arrays and search for the id key?
I hope someone can understand this. I'm still looking around for help with a caching script; if anyone has any ideas where I can look, that'll be very helpful as well.
Thanks!
No need to do that. A cache is usually based on the time that has passed since the last request. In your case, since you are requesting data via an API, I think it would be best to cache the result page for a few minutes or not to cache it at all.
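For illustration, a simple time-based cache along those lines might look like this; the cache directory, the five-minute TTL, and the API URL are assumptions, not part of the answer:

function getItem($id, $ttl = 300)
{
    $cacheFile = __DIR__ . '/cache/' . md5($id) . '.json';

    // Serve from the cache file if it is newer than $ttl seconds.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return json_decode(file_get_contents($cacheFile), true);
    }

    // Otherwise call the API and refresh the cache file.
    $json = file_get_contents('https://api.example.com/items/' . urlencode($id));
    file_put_contents($cacheFile, $json);

    return json_decode($json, true);
}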

Getting a big amount of data from a very slow external data source

I need to receive a big amount of data from an external source. The problem is that the external source sends data very slowly. The workflow is like this:
The user initiates some process from the app interface (commonly it is fetching data from a local XML file). This is quite a fast process.
After that we need to load information connected with the fetched data from the external source (basically, it is external statistics for the data from the XML). And it is very slow. But the user needs this additional information to continue working. For example, he may perform filtering according to the external data, or something else.
So we need to do it asynchronously. The main idea is to show external data as it becomes available. The question is how we could organise this async process. Maybe some queues or something else? We're using PHP + MySQL as the backend and jQuery on the front end.
Thanks a lot!
Your two possible strategies are:
Do the streaming on the backend, using a PHP script that curls the large external resource into a database or memcache, and responds to periodic requests for new data by flushing that DB row or cache into the response.
Do the streaming on the frontend, using a cross-browser JavaScript technique explained in this answer. In Gecko and WebKit, the XmlHttpRequest.onreadystatechange event fires every time new data is received, making it possible to stream data slowly into the JavaScript runtime. In IE, you need to use an iframe workaround, also explained in the Ajax Patterns article linked from the above SO post.
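A rough sketch of the backend half of the first strategy, assuming a separate worker script is filling an external_stats table while the page polls this endpoint with jQuery (table and column names are made up for illustration):

$pdo    = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$userId = (int) $_GET['user_id'];
$since  = isset($_GET['since']) ? (int) $_GET['since'] : 0;  // last row id the client already has

$stmt = $pdo->prepare(
    'SELECT id, payload FROM external_stats
     WHERE user_id = :uid AND id > :since
     ORDER BY id'
);
$stmt->execute(['uid' => $userId, 'since' => $since]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));  // only the rows that arrived since the last poll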
One possible solution would be to make the cURL call using system() with the output being redirected to a file. Thus PHP would not hang until the call is finished. From the PHP manual for system():
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
This would split the data gathering from the user interface. You could then work with the gathered local data by several means, for example:
employ an iFrame in the GUI that would refresh itself at intervals and fetch data from the locally stored file (and possibly store it in the database or whatever),
use jQuery to make AJAX calls to get the data and manipulate it,
use some CGI script that would run in the background and handle the database writes too and display the data using one of the above from the DB directly,
dozens more I can't think of now...
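A sketch of the system() approach described above; the worker-script path and output file location are assumptions:

$userId     = (int) $_GET['user_id'];
$outputFile = '/tmp/external_stats_' . $userId . '.json';

// Redirecting stdout/stderr (and backgrounding with &) lets PHP return
// immediately instead of hanging until the slow external call finishes,
// as the manual quote above explains.
system('php /var/www/workers/fetch_stats.php ' . escapeshellarg($userId)
     . ' > ' . escapeshellarg($outputFile) . ' 2>&1 &');

// The page can then poll (e.g. with a jQuery AJAX call or the iFrame trick
// above) until $outputFile appears, and render the data as it becomes available.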

PHP Get Cookie by Session ID (or otherwise pass data between two different connections)

Normally I try to format my question as a basic question and then explain my situation, but the solution I'm looking for might be the wrong one altogether, so here's the problem:
I'm building a catalog application for an auction website that has the ability to save individual lots. So far this has worked great by simply creating a cookie with a comma-separated list of IDs for those lots, via something like this:
$_COOKIE["MyLots_$AuctionId"] = implode(",",$arrayOfIds);
The problem I'm now hitting is that when I go to print the lots, I'm using wkhtmltopdf through the command-line to request the url of the printout I want, like this:
exec("wkhtmltopdf '$urlofmylots' filename.pdf");
The problem is that I can't pass a cookie to this call, because Apache sees an internal request, not the request of the user. I tried putting it in the GET string, but once I exceed a pre-set limit for GET parameters, that value disappears from the $_GET array on the target URL. I can't seem to find a way to send POST data between them. My next possible ideas are the following:
Maybe just pass the session ID in the URL, and see if there's a way that I can use PHP to dig through the cookies for that session and pull the right cookie, but that sounds like it'd be risky security-wise for a PHP server to allow (letting one session be aware of another). Example:
exec("wkhtmltopdf '$urlofmylots?sessionId=$sessionIdFromThisRequest' filename.pdf");
Possibly set a session variable and then pass that session ID, and see if I can use PHP to wade through that information instead (rather than using the cookie).
Would I be able to just create an array and somehow have that other script be aware of it, possibly by including it? That doesn't really solve the problem of wkhtmltopdf expecting a web-facing address as its first parameter.
(Not really an idea, but some reasoning.) In other instances of using this, I've just passed an ID to the script that generates the markup for wkhtmltopdf to parse, and the script uses that ID to get data from the database. I don't want to store this data in a file or the database for the simple purpose of transferring data from the caller to the callee in this case. Cookies and sessions seem cleaner, since Apache/PHP handle memory allocation for these sessions.
The ultimate problem here is that I'm trying to get my second script (referenced here by $urlofmylots) to be aware of data available to the calling script, but it's being executed as if it were an external web request, not two PHP scripts being called from the web root.
Can anyone offer some insight here?
You might consider rendering whatever the output of $urlofmylots?lots=$lots_to_print would be to a temporary file and running wkhtmltopdf against that file.
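A sketch of that suggestion, assuming a hypothetical renderLotsHtml() helper that builds the printout markup from the lot IDs the current request already has:

$AuctionId = (int) $_GET['auction_id'];                    // assumed to arrive with the print request
$lotIds    = explode(',', $_COOKIE["MyLots_$AuctionId"]);  // the cookie is readable in *this* request

// Render the markup server-side now, so nothing (cookie, session, long GET
// string) has to cross over into the internal wkhtmltopdf request.
$html     = renderLotsHtml($lotIds);
$htmlFile = sys_get_temp_dir() . '/lots_' . uniqid() . '.html';
file_put_contents($htmlFile, $html);

// wkhtmltopdf accepts a local file path just as happily as a URL.
exec('wkhtmltopdf ' . escapeshellarg($htmlFile) . ' filename.pdf');
unlink($htmlFile);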
