How to handle a large amount of JSON from a URL - php

I use an API that returns a product's information, but I would like to get all of the products -- there are over 75,000 of them. When I open the URL that returns the JSON, my browser just keeps loading.
Can someone help me retrieve all of the products as quickly as possible using Laravel 5.2?

Here is the controller code:
return view('Product.List')->with('Products', $Countries->getData()->Data);
But I am not sure this is the correct way to fix the issue. I have been reading up on JsonResponse.

I think the best thing to do in your case is to use limit and offset in your requests to that server, and then use lazy loading with AJAX, pagination, or a similar technique to retrieve the data from a given offset with a limit.
This makes your program faster and more efficient, and it also avoids confusing your user with a sudden huge amount of data on the screen.
Edit: You could provide more information about the API and the method you use to access it to get better help.
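Here is a minimal sketch of a controller method along those lines, assuming the remote API accepts hypothetical limit and offset query parameters (the URL and parameter names are placeholders, and use Illuminate\Http\Request; is needed at the top of the controller):

public function index(Request $request)
{
    $perPage = 50;                                   // products shown per page
    $page    = max(1, (int) $request->input('page', 1));
    $offset  = ($page - 1) * $perPage;

    // "limit" and "offset" are assumed API parameters.
    $url = 'https://api.example.com/products?' . http_build_query([
        'limit'  => $perPage,
        'offset' => $offset,
    ]);
    $products = json_decode(file_get_contents($url), true);

    // Hand only the current page to the view instead of all 75,000 products.
    return view('Product.List')->with('Products', $products);
}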

Related

Call to an API to retrieve JSON data and store it in the database

What do I need to do in order to make a call to an API, retrieve some data in JSON format, and then store that data in a database? I know the specifics of this process, but the thing is that I need it done at an interval of 5 minutes and without placing a burden on the loading time of the website. So I don't know how this is done conceptually, because it obviously can't be done when the page is loaded; it has to be done by the server, but how?
Any useful reading material regarding this topic would also come in handy. Thanks.
Take a look at the new web workers method in HTML 5.
http://www.w3schools.com/html/html5_webworkers.asp
You tagged this as a jQuery question, so I assume you already know how to make an AJAX call. If you know how to make an AJAX call, I'm sure you can figure out how to use setInterval to make it repeat every x seconds. But just in case, take a look at this:
http://www.w3schools.com/jsref/met_win_setinterval.asp
This is assuming you want to do this on the page itself, otherwise as others have commented just set up a cron job.
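For the cron-job route, a minimal sketch might look like the following; the API URL, table, and column names are all placeholders, and the script would be scheduled with a crontab entry such as */5 * * * * php /path/to/fetch_api.php:

<?php
// fetch_api.php -- run by cron every 5 minutes, completely outside page loads.
$json = file_get_contents('https://api.example.com/data');   // placeholder URL
$rows = json_decode($json, true);

if (!is_array($rows)) {
    exit(1);                                                  // bad response, try again next run
}

$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT INTO api_data (item_id, payload, fetched_at)
     VALUES (:id, :payload, NOW())
     ON DUPLICATE KEY UPDATE payload = VALUES(payload), fetched_at = NOW()'
);

foreach ($rows as $row) {
    $stmt->execute(array(
        ':id'      => $row['id'],
        ':payload' => json_encode($row),
    ));
}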

PHP: understanding cache with dynamic content and API

I have a PHP file, and I'm using an API where, if I have an id, I can obtain data through the API. However, I'm currently learning how to create a cache system. The API data is retrieved as JSON. So I was wondering if it's possible to keep adding JSON data to an existing cache file that already contains JSON data, so that next time I have an id I can search the cache file for a match instead of hitting the API (which has a limit, like any API does).
Maybe create multiple arrays and search for the id key?
I hope someone can understand this. I'm still looking around for help with a caching script; if anyone has any ideas where I can look, that would be very helpful as well.
Thanks!
No need to do that. Caching is usually based on the time that has passed since the last request. In your case, since you are requesting data via an API, I think it would be best to cache the result page for a few minutes, or not to cache it at all.
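A minimal sketch of a time-based, per-id file cache along those lines might look like this; the cache directory, TTL, and API URL are placeholders:

<?php
function getItem($id, $ttl = 300)
{
    $cacheFile = __DIR__ . '/cache/' . md5($id) . '.json';

    // Serve from the cache while it is fresher than $ttl seconds.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return json_decode(file_get_contents($cacheFile), true);
    }

    // Otherwise hit the API once and refresh the cache file for this id.
    $json = file_get_contents('https://api.example.com/items/' . urlencode($id));
    file_put_contents($cacheFile, $json);

    return json_decode($json, true);
}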

Safely execute queries via AJAX on document ready

Greetings everybody,
My problem is this: I have some custom-made statistics on my site where I log information about what users do (Google Analytics-like). Obviously I aggregate the information over a couple of months, but the tables I store it in have grown too large and have a negative impact on page loading. The flow is like this (in index.php, so it affects all pages):
1. Get the included files
2. Execute part of statistics queries
3. Effective page code
4. Execute the last part of statistics queries
To get rid of this problem I want to make those queries on <body onload="execQueries();"> or on document ready with javascript / AJAX.
How can I safely and securely make those queries using AJAX, so that they cannot be abused by a client with good knowledge of JavaScript/AJAX? Because if I simply expose that JS function, it can be called at any time by a user with Firebug.
The solution I am thinking about uses $_SESSION: at the top of my index.php I store information about those queries (id, info), and in the script called by AJAX I check whether $_SESSION['query_info'] is set, execute the queries using the info from there, and then call unset($_SESSION['query_info']);. So if the AJAX endpoint is called again, that specific $_SESSION['query_info'] no longer exists and I do not do anything in my DB.
Do you think this is a secure solution or do you have other ideas? Anything viable is welcomed.
Thank you
Try putting your related JavaScript code into closures.
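For what it's worth, a minimal sketch of the one-time $_SESSION token approach described in the question could look like the following; the statistics queries themselves and the file names are placeholders:

<?php
// index.php -- queue the statistics work for this page view.
session_start();
$_SESSION['query_info'] = array('page' => $_SERVER['REQUEST_URI'], 'time' => time());

<?php
// stats.php -- the endpoint called once via AJAX on document ready.
session_start();
if (empty($_SESSION['query_info'])) {
    header('HTTP/1.1 403 Forbidden');   // nothing queued: the token was already used
    exit;
}
$info = $_SESSION['query_info'];
unset($_SESSION['query_info']);         // consume the token so a replay does nothing

// ...run the heavy statistics queries here using $info...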

Caching JSON data

I've never really cached data before, but I feel as though it will greatly improve my site.
Basically, I pull JSON data from an external API that helps the basic functionality of my website. This info doesn't really change that often yet my users pull the info from the API thousands of times a day. If it updated once a day, that would be fine. I want to run a cron job daily that will pull the info and update the cache.
I already tried a few different things, both pulled using PHP:
1) Store data in an SQL table
I had it working, but there's no reason why I should ping the database each time when I can just store it in basic HTML/JSON.
2) .JSON file (using fwrite)
I had it stored, but the only way this worked was if the .getJSON() callback function name was prepended to the JSON data and the data was surrounded by parentheses (making it JSONP, I believe).
Does anyone have any advice or any directions to lead me in? As I said, I've never really done anything like this so I don't even know if I'm remotely headed in the right direction.
Edit:
Okay, so I talked to my host, and since I'm on shared hosting (DreamHost) I can't install memcached, which sucks. The only info they could give me was that if it is on http://pecl.php.net/ then I can most likely use it. They said APC is available. I'm not sure if this fits my problem. I'd like to be able to access the cache directly in jQuery. Thanks
You can use memcached. Memcached is an in-memory key-value store for small chunks of arbitrary data (strings, objects) from results of database calls, API calls, or page rendering. Very easy to implement and has a low system footprint.
Since you can't use memcached, go back to your database option and store it in a table using the MEMORY engine.
Try memcached. You can easily generate a string key and store whatever blob of JSON you want with it. It works like a dictionary, except it persists in memory.
There's an option to give it an expiration time (otherwise it just stays cached forever). So when the user requests the data, just check if it's stored in memcached. If it is, great, return it. If it's not, do whatever you do to build it, then put it in memcached with a 24 hour expiration.
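A minimal sketch of that flow with the PHP Memcached extension, where the key name and API URL are placeholders:

<?php
$mc = new Memcached();
$mc->addServer('localhost', 11211);

$data = $mc->get('external_api_json');
if ($data === false) {                                    // cache miss: rebuild and store
    $data = file_get_contents('https://api.example.com/data');
    $mc->set('external_api_json', $data, 86400);          // expire after 24 hours
}

echo $data;                                               // serve the cached JSON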
If your data varies per user:
I've done this by storing an object in the $_SESSION array.
I attach a quick bit of logic that determines if the data expiry period is up. If so, it draws new data, serves and caches. If not, it draws from $_SESSION and serves it up.
Try Redis.
And to store the data without unexpected errors on set/get, encode it using base64.
Use this to store it:
file_put_contents($path, $json_text);
and this to restore:
$json_text = file_get_contents($path);
echo $json_text;
echo can be used to pass the JSON exactly as it comes from the HTTP request. If you need to parse it into a variable in JavaScript, you can use array = JSON.parse('<?php echo $json_text; ?>');. If you need to parse it in PHP, use $array = json_decode($json_text);.
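To tie this to the daily cron job mentioned in the question, a minimal refresh script might look like this; the API URL and cache path are placeholders, scheduled with something like 0 3 * * * php /path/to/refresh_cache.php:

<?php
// refresh_cache.php -- run once a day by cron to rebuild the cached JSON file.
$path      = __DIR__ . '/cache/api_data.json';
$json_text = file_get_contents('https://api.example.com/data');

// Only overwrite the cache when we actually received valid JSON.
if ($json_text !== false && json_decode($json_text) !== null) {
    file_put_contents($path, $json_text);
}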

Call PHP Function from Smarty with AJAX with no user action

I am hitting a lot of different sites to get a list of information and I want to display this information as I get it. Right now I am using a Smarty Template and what I would like to do is:
Pseudo code:
{foreach $page}
$smarty_var = use AJAX to call a PHP function
Render out a new table row on the fly w/ the newly assigned var
<tr><td>{$smarty_var}</td></tr>
{/foreach}
I don't know much about AJAX; I used it a long time ago, and it was similar to this, but not quite, since there was user action involved. No, I don't have a JS framework in place. Am I way off here on how this should go? Basically I want to display a table row as data becomes available; each table row will be a request to get the data from another site.
Sure, I will tell you about what I am trying to do: http://bookscouter.com/prices.php?isbn=0132184745+&x=19&y=6
If you click 'Click to view prices from all 43 links' at the bottom of that page you will see what I mean. I am using cURL to get all the pages I want a price from. Then, for each page, I want to get the price. So each page is going to fire off a function that runs some fun code like this:
function parseTBDOTpageNew($page, $isbn)
{
    // Split the page on <table> tags, then split the relevant fragment on <td> tags.
    $first_cut  = preg_split('/<table[^>]*>/', $page);
    $second_cut = preg_split('/<td[^>]*>/', $first_cut[2]);

    // The vendor is not buying this book at all.
    if (strstr($second_cut[4], "not currently buying this book") !== false) {
        return "\$0.00";
    }

    // Otherwise the price sits inside a <b> tag further along the row.
    $third_cut = preg_split('/<b[^>]*>/', $second_cut[9]);
    $last_cut  = preg_split('/</', $third_cut[3]);
    return $last_cut[0];
}
This function is called from another function, which puts the price returned from the function above, the name of the company, and a link into an array, which in turn is added to another, bigger array that is sent to Smarty. Instead of doing that, I just want to take each individual array as it is returned and add its values into a table row on the fly.
I will take your advice on jQuery. What I have started is an onload function that receives the $pages to be parsed, and I was just in the middle of writing: for each page, get the info and spit out some HTML with the info on the page.
Also, the function that calls the price-parsing function is in a PHP file, so I need the request to hit a function within a PHP file and NOT just call file.php?param1=foo; I need it to actually hit the function in the file. I have jQuery in place, and now I'm just trying to figure it out and get it to do what I need, ugh. I am searching; any help would be appreciated.
No, I don't have a JS framework in place
Fix that first. You don't want to juggle XMLHTTPRequests yourself. jQuery is SO's canonical JS library, and is pretty nifty.
Basically I want to display a table row as data comes available, each table row will be a request to get the data from another site.
How many rows will you be dealing with? Do they all have to be loaded asynchronously?
Let's tackle this in a braindead, straightforward way. Create a script that does nothing more than:
Take a site ID and fetch data from the corresponding URL
Render that data to some data transport format, either HTML or JSON.
Then it's a simple matter of making the page that the user gets, which will contain Javascript code that makes the ajax calls to the data fetcher, then either shoves the HTML in the page directly, or transforms the data into HTML and then shoves that into the page.
You'll note that at no point is Smarty really involved. ;)
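As a minimal sketch of that fetcher script, assuming a hypothetical getSiteUrl() helper and the asker's existing parseTBDOTpageNew() function live in an included file:

<?php
// fetch.php -- fetches and parses one vendor's page, returning JSON.
require 'parsers.php';                                     // hypothetical include

$siteId = isset($_GET['site']) ? (int) $_GET['site'] : 0;
$isbn   = isset($_GET['isbn']) ? $_GET['isbn'] : '';

$page  = file_get_contents(getSiteUrl($siteId, $isbn));    // or reuse your cURL code
$price = parseTBDOTpageNew($page, $isbn);

header('Content-Type: application/json');
echo json_encode(array('site' => $siteId, 'price' => $price));

On the page itself, a jQuery $.getJSON('fetch.php', {site: id, isbn: isbn}, callback) call per vendor can then append a table row as each response arrives.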
This solution is highly impractical for anything more than a trivial number of sites to be polled asynchronously. If you need rows for dozens or hundreds of sites, that means each client is going to need to make dozens or hundreds of requests to your site for every single normal pageview. This is going to slaughter your server if more than one or two people load the page at once.
Can you tell us more about what you're doing, and what you're trying to accomplish? There are lots of ways to mitigate this problem, but they all depend on what you're doing.
Update for your question edit.
First, please consider using an actual HTML parser instead of regular expressions. The DOM is very powerful and you can target specific elements using XPath.
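For example, a rough sketch with DOMDocument and DOMXPath, where the XPath expression is only a placeholder that depends on the vendor page's actual markup:

<?php
$doc = new DOMDocument();
@$doc->loadHTML($page);                            // silence warnings from messy real-world HTML

$xpath = new DOMXPath($doc);
$nodes = $xpath->query('//table[2]//td[10]//b');   // hypothetical path to the price cell

$price = $nodes->length ? trim($nodes->item(0)->textContent) : '$0.00';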
Instead of doing that, I just want to get the first array that is returned with individual information and add the values into a table row on the fly.
So, here's the ultimate problem. You want to do something asynchronously. PHP does not have a built-in generalized way to perform asynchronous tasks. There are a few ways to deal with this problem.
The first is as I've described above. Instead of doing any of the curl requests on page load, you farm the work out to the end user, and have the end user's browser make requests to your scraping script one by one, filling in the results.
The second is to use an asynchronous work queue, like Gearman. It has excellent PHP support via a PECL extension. You'd write one or more workers that can accept requests, and keep a pool of them running at all times. The larger the pool, the more things you can do at once. Once all of the data has returned, you can throw the complete set of data at your template engine, and call it good.
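A minimal sketch of that Gearman setup, where the function name and payload format are placeholders and parseTBDOTpageNew() is the asker's existing parser:

<?php
// worker.php -- keep one or more of these running to form the pool.
require 'parsers.php';                               // hypothetical include
$worker = new GearmanWorker();
$worker->addServer();                                // defaults to 127.0.0.1:4730
$worker->addFunction('fetch_price', function (GearmanJob $job) {
    $params = json_decode($job->workload(), true);
    $page   = file_get_contents($params['url']);
    return parseTBDOTpageNew($page, $params['isbn']);
});
while ($worker->work());

<?php
// client.php -- queue one job per vendor site and collect the results.
$client = new GearmanClient();
$client->addServer();
$price = $client->doNormal('fetch_price', json_encode(array(
    'url'  => 'http://vendor.example.com/buyback?isbn=0132184745',
    'isbn' => '0132184745',
)));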
You can even combine the two, having the user make only one or two or three extra requests via ajax to fetch part of the returned data. Heck, you can even kick off the jobs in the background and return the page immediately, then request the results of the background jobs later via ajax.
Regardless of which way you handle it, you have a giant, huge problem. You're scraping someone's site. You may well be scraping someone's site very often. Not everyone is OK with this. You should seriously consider caching results aggressively, or even checking with each of the vendors to see if they have an API or data export that you can query against instead.
