PHP: understanding cache with dynamic content and an API

I have a PHP file, and I'm using an API where, given an id, I can obtain data. The API returns JSON data. I'm currently learning how to build a cache system, so I was wondering: is it possible to keep appending JSON data to an existing cache file that already holds JSON data, so that the next time I have an id, I can search the cache file for a match instead of querying the API (which has a request limit, like any API does)?
Maybe create multiple arrays and search for the id key?
I hope someone can understand this. I'm still looking around for help with a caching script, so if anyone has ideas on where I can look, that would be very helpful as well.
Thanks!

No need to do that. A cache is often invalidated based on the time that has passed since the last request. In your case, since you are requesting data via an API, I think it would be best to cache the result page for a few minutes, or not to cache it at all.
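If you do want a simple cache anyway, here is a minimal sketch of a time-based file cache; getFromApi() is a hypothetical stand-in for your real API call, and the cache/ directory is assumed to exist and be writable:

function getCachedApiData($id, $ttl = 300) {
    // One cache file per id, refreshed when it is older than $ttl seconds.
    $file = __DIR__ . "/cache/api_" . (int)$id . ".json";
    if (is_file($file) && time() - filemtime($file) < $ttl) {
        return json_decode(file_get_contents($file), true); // fresh enough: serve from cache
    }
    $json = getFromApi($id); // hypothetical helper returning the raw JSON string
    file_put_contents($file, $json);
    return json_decode($json, true);
}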

Related

How to handle a large amount of JSON from a URL

I use an API that returns a product's information, but I would like to get all of them -- there are over 75,000 products. When I open the URL that returns the JSON, my browser just keeps loading.
Can someone help me retrieve all of the products as fast as possible using Laravel 5.2?
Below is what I believe should be the controller code:
return view('Product.List')->with('Products', $Countries->getData()->Data);
But I am not sure this is the correct way to fix the issue. I am reading up on JsonResponse.
I think the best thing to do in your case is to use limit and offset in your queries to that server; you could then use lazy loading with AJAX, pagination, or similar methods to retrieve the data from a given offset with a limit.
This makes your program more efficient and faster, and you also don't confuse your user with a sudden huge amount of data on the screen.
Edit: You could provide more information about the API and the method used to access it to get better help.
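For illustration, a rough sketch of that limit/offset loop, assuming the API accepts limit and offset query parameters and returns a JSON array (the URL is a placeholder):

$all = [];
$limit = 500;  // page size
$offset = 0;
do {
    $url = "https://api.example.com/products?limit=$limit&offset=$offset"; // placeholder endpoint
    $page = json_decode(file_get_contents($url), true);
    $all = array_merge($all, $page);
    $offset += $limit;
} while (count($page) === $limit); // a short page means we reached the end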

Cache XML API response in PHP (server-side caching)

Is it possible to cache the response from a POST request to an XML API? I want to do this without having to create actual files on my server. I was told this is possible, but I've been unsuccessful.
Essentially, I want to check whether that call was already made; if so, pull that data; if not, do the API call and save the result for future use.
To do this without files, you would need to store the call and the result in a database. That is probably the best way to do it anyway. I have done this with the Google geolocation API, because there are limits on how many requests you can make per day.
So I would just store the request (i.e. a zip code like 37206) and the result from Google. The next time 37206 was requested, I would return the result stored in my database.
Further, to guard against something like a zip code's lat/lon changing, I would store the date at which the result was saved and, after say a year, use a cron job to remove it.
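A minimal sketch of that pattern, assuming a MySQL table geo_cache(zip, result, created_at), an open PDO connection, and callGeocodingApi() as a hypothetical wrapper around the actual request:

function geocode(PDO $pdo, $zip) {
    // Serve from the database if we cached this zip code within the last year.
    $stmt = $pdo->prepare(
        'SELECT result FROM geo_cache WHERE zip = ? AND created_at > NOW() - INTERVAL 1 YEAR'
    );
    $stmt->execute([$zip]);
    if ($result = $stmt->fetchColumn()) {
        return $result;
    }
    // Cache miss: call the API and store the raw result with a timestamp.
    $result = callGeocodingApi($zip); // hypothetical API wrapper
    $pdo->prepare('REPLACE INTO geo_cache (zip, result, created_at) VALUES (?, ?, NOW())')
        ->execute([$zip, $result]);
    return $result;
}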

What's the best way to store a lot of data in cache?

My website sends curl requests to an external service and gets XML responses.
The requests are user-specific and the responses rather heavy (with several requests on the same page), so the page takes a long time to load and uses too much of the server's traffic.
How I tried to solve the problem:
Sending the requests from the client side (JS). Unluckily for me, it becomes rather messy to parse the received data and integrate it into the page's objects.
Putting the responses in the session (as they are user-specific). The session files on the server grow too large too fast. I implemented a counter that erases all the responses from the session if their number gets too big (using this now).
Memcache? Too much data to save.
Do you think I should use one of the solutions or is there another way to do it?
Use a combination of
cache
database
You push things into your "data store" (the cache plus the database), then look them up there when you need them. The data store checks the cache first: if the data is available, it returns it; if not, it checks the database; and only if both fail does it fetch the information from the origin. See the sketch below.
You could also increase the size of the cache, but that is not a good solution.
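A rough sketch of that lookup order, assuming a Memcache connection, a PDO handle, an api_store table, and fetchFromOrigin() as a placeholder for the real API call:

function fetchFromStore($key, Memcache $memcache, PDO $pdo) {
    // 1. Cheapest first: the in-memory cache.
    $payload = $memcache->get($key);
    if ($payload !== false) {
        return $payload;
    }
    // 2. Then the database.
    $stmt = $pdo->prepare('SELECT payload FROM api_store WHERE cache_key = ?');
    $stmt->execute([$key]);
    if ($payload = $stmt->fetchColumn()) {
        $memcache->set($key, $payload, 0, 3600); // repopulate the cache for an hour
        return $payload;
    }
    // 3. Both missed: fetch from the origin and fill both layers.
    $payload = fetchFromOrigin($key); // hypothetical wrapper for the external request
    $pdo->prepare('INSERT INTO api_store (cache_key, payload) VALUES (?, ?)')
        ->execute([$key, $payload]);
    $memcache->set($key, $payload, 0, 3600);
    return $payload;
}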
Try it like this:
$key = "User_id_".$user_id."category_".$category_id;
then, using that key, store each piece of data like so:
$memcache->set($key, $data, 0, 3600); // Memcache::set(key, value, flags, expire): cache for one hour

PHP Get Cookie by Session ID (or otherwise pass data between two different connections)

Normally I try to format my question as a basic question and then explain my situation, but the solution I'm looking for might be the wrong one altogether, so here's the problem:
I'm building a catalog application for an auction website that has the ability to save individual lots. So far this has worked great by simply creating a cookie with a comma-separated list of IDs for those lots, via something like this:
$_COOKIE["MyLots_$AuctionId"] = implode(",",$arrayOfIds);
The problem I'm now hitting is that when I go to print the lots, I'm using wkhtmltopdf through the command-line to request the url of the printout I want, like this:
exec("wkhtmltopdf '$urlofmylots' filename.pdf");
The problem is that I can't pass a cookie to this call, because Apache sees an internal request, not the request of the user. I tried putting the data in the GET string, but once I exceed a pre-set limit on GET parameters, the value disappears from the $_GET array on the target URL. I can't seem to find a way to send POST data between them. My next possible ideas are the following:
Maybe just pass the session ID in the URL, and see if there's a way to use PHP to dig through the cookies for that session and pull out the right one; but that sounds risky security-wise for a PHP server to allow (letting one session be aware of another). Example:
exec("wkhtmltopdf '$urlofmylots?sessionId=$sessionIdFromThisRequest' filename.pdf");
Possibly set a session variable and then pass that session Id, and see if I can use PHP to wade through that information instead (rather than using the cookie).
Would I be able to just create an array and somehow have that other script be aware of it, possibly by including it? That doesn't really solve the problem of wkhtmltopdf expecting a web-facing address as its first parameter.
(Not really an idea, but some reasoning.) In other instances of using this, I've just passed an ID to the script that generates the markup for wkhtmltopdf to parse, and the script uses that ID to get data from the database. I don't want to store this data in a file or the database for the simple purpose of transferring data from the caller to the callee in this case. Cookies and sessions seem cleaner, since Apache/PHP handle memory allocation for these sessions.
The ultimate problem here is that I'm trying to get my second script (referenced here by $urlofmylots) to be aware of data available to the calling script, but it's being executed as if it were an external web request, not as two PHP scripts called from the web root.
Can anyone offer some insight here?
You might consider rendering whatever the output of $urlofmylots?lots=$lots_to_print would be to a temporary file and running wkhtmltopdf against that file.
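For example (an untested sketch reusing $urlofmylots and $lots_to_print from the question; adjust paths to your setup):

// Render the page to a temporary HTML file while the lot data is in scope,
// then point wkhtmltopdf at the file instead of the live URL.
$html = file_get_contents($urlofmylots . '?lots=' . urlencode($lots_to_print));
$tmp  = sys_get_temp_dir() . '/lots_' . uniqid() . '.html';
file_put_contents($tmp, $html);
exec('wkhtmltopdf ' . escapeshellarg($tmp) . ' filename.pdf');
unlink($tmp); // clean up the temporary file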

Caching JSON data

I've never really cached data before, but I feel as though it would greatly improve my site.
Basically, I pull JSON data from an external API that supports the basic functionality of my website. This info doesn't really change that often, yet my users pull it from the API thousands of times a day. If it updated once a day, that would be fine, so I want to run a daily cron job that pulls the info and updates the cache.
I already tried a few different things, both pulled using PHP:
1) Store the data in an SQL table
I had it working, but there's no reason to ping the database on every request when I could just store it as basic HTML/JSON.
2) A .json file (using fwrite)
I had it stored, but the only way this worked was if the .getJSON() callback function was prepended to the JSON data and the data was wrapped in parentheses (making it JSONP, I believe).
Does anyone have any advice or any directions to lead me in? As I said, I've never really done anything like this so I don't even know if I'm remotely headed in the right direction.
Edit:
Okay, so I talked to my hosting, and since I'm on shared hosting (DreamHost) I can't install memcached, which sucks. The only info they could give me was that if it's on http://pecl.php.net/ then I can most likely use it. They said APC is available, but I'm not sure if that fits my problem: I'd like to be able to access the cache directly from jQuery. Thanks
You can use memcached. Memcached is an in-memory key-value store for small chunks of arbitrary data (strings, objects) from results of database calls, API calls, or page rendering. Very easy to implement and has a low system footprint.
Since you can't use memcached, go back to your database option and store it in a table using the MEMORY engine.
Try memcached. You can easily generate a string key and store whatever blob of JSON you want with it. It works like a dictionary, except it persists in memory.
There's an option to give entries an expiration time (otherwise they stay cached forever). So when the user requests the data, check whether it's stored in memcached. If it is, great, return it. If it's not, do whatever you do to build it, then put it in memcached with a 24-hour expiration; a sketch follows.
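A minimal sketch of that check-then-build pattern using the Memcached extension (the API URL is a placeholder):

$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$key  = 'api_products_json';
$json = $memcached->get($key);
if ($json === false) {
    // Not cached or expired: rebuild and store for 24 hours.
    $json = file_get_contents('https://api.example.com/products'); // placeholder API call
    $memcached->set($key, $json, 86400);
}
echo $json;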
If your data varies per user:
I've done this by storing an object in the $_SESSION array.
I attach a quick bit of logic that determines whether the data's expiry period is up. If so, it draws new data, serves it, and caches it; if not, it draws from $_SESSION and serves that.
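A rough sketch of that session-based approach, with fetchUserData() standing in for the per-user API call:

session_start();

$ttl = 3600; // keep per-user data for an hour
if (!isset($_SESSION['api_cache'])
    || time() - $_SESSION['api_cache']['stored_at'] > $ttl) {
    // Expired or absent: draw fresh data and cache it with a timestamp.
    $_SESSION['api_cache'] = [
        'stored_at' => time(),
        'data'      => fetchUserData(), // hypothetical per-user API call
    ];
}
$data = $_SESSION['api_cache']['data'];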
Try Redis.
And to store and retrieve data without unexpected errors on set/get, encode it using base64.
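A minimal sketch using the phpredis extension, with getFromApi() as a placeholder for the real API call:

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$key    = 'api_json_' . $id;
$cached = $redis->get($key);
if ($cached !== false) {
    $json = base64_decode($cached); // cache hit: decode and use
} else {
    $json = getFromApi($id); // hypothetical API helper
    $redis->setex($key, 86400, base64_encode($json)); // cache for a day, base64-encoded
}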
Use this to store the data:
file_put_contents($path, $json_text);
and this to restore it:
$json_text = file_get_contents($path);
echo $json_text;
echo can be used to pass the JSON exactly as it came from the HTTP request. If you need to parse it into a variable in JavaScript, you can use array = JSON.parse('<?php echo $json_text; ?>');. If you need to parse it in PHP, use $array = json_decode($json_text);.
