Download external JSON regularly - php

I'm currently using an API which returns a JSON object. I pay per hit, so I would like to minimize my hits. I use this object to fill in images and text on my page. The object that gets returned is very similar to an iTunes lookup hit.
A simplified version of my code is this:
<img id="test" src="" alt="Image" />
<script>
$.getJSON( "https://itunes.apple.com/lookup?id=284910350", function( data ) {
document.getElementById('test').setAttribute("src", data.results[0].screenshotUrls[0]);
});
</script>
Every time a user opens this page, a request gets sent to the server and a hit gets added to my account (obviously). I would like to store the object temporarily on my own server so I can request the data once and serve a 'local' version to the user. What is the best way to do this? Is it possible to have the file updated every week or so automatically?
Thanks in advance!

It's an easy cron job. Assuming that you can execute bash scripts on your server:
1 - On your server, put a bash script called fetchItune.sh. This script basically runs a curl request against the outside API and stores the response:
#!/bin/sh
curl -H "Accept: application/json" "https://itunes.apple.com/lookup?id=284910350" -o /path/to/storage/data.json
You can get fancy with this script e.g. putting the list of endpoints in an array or output to different files, etc. but at the core, just make sure they are valid HTTP requests that accept a JSON response.
2 - Set up a cron job to run it weekly. It can be as simple as putting this script in /etc/cron.weekly if you are using an Ubuntu server. Otherwise, search your server's documentation; there will be a section on cron jobs.
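If you want finer control over the schedule than the cron.weekly directory gives you, a crontab entry works too. A sketch (the path is a placeholder; this runs every Monday at 03:00):

```
# m  h  dom mon dow  command
  0  3  *   *   1    /path/to/fetchItune.sh
```

Add it with `crontab -e` for the user that should run the script.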
3 - From your JavaScript, request your server endpoint instead of the outside API:
<script>
$.getJSON( "/path/to/storage/data.json", function( data ) {
document.getElementById('test').setAttribute("src", data.results[0].screenshotUrls[0]);
});
</script>
EDIT: You can write a PHP script to make the request to the external API instead of a bash script. The principle is the same. This is taken directly from the PHP curl documentation: http://php.net/manual/en/curl.examples-basic.php
<?php
// Fetch the API response and write it straight to a file
$ch = curl_init("https://itunes.apple.com/lookup?id=284910350");
$fp = fopen("/path/to/storage/data.json", "w");
curl_setopt($ch, CURLOPT_FILE, $fp); // write the response body to $fp
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
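If cron isn't available on your host, a common alternative is to refresh the cached file lazily from the PHP page that serves it: hit the API only when the stored copy is older than some TTL. A sketch, with placeholder path and TTL:

```php
<?php
$cacheFile = '/tmp/data.json';   // placeholder path
$ttl = 7 * 24 * 60 * 60;         // one week, in seconds

// Refresh only when the cached copy is missing or older than the TTL
if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $ttl) {
    $ch = curl_init('https://itunes.apple.com/lookup?id=284910350');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $json = curl_exec($ch);
    curl_close($ch);
    if ($json !== false) {
        file_put_contents($cacheFile, $json);
    }
}

// Serve the cached copy to the browser
header('Content-Type: application/json');
readfile($cacheFile);
```

The trade-off versus cron: one unlucky visitor per week pays the slow API round trip, but you need no server configuration at all.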

Related

Why does this code so negatively affect my server's performance?

I have a Silverstripe site that deals with very big data. I made an API that returns a very large dump, and I call that API on the front-end via an ajax GET.
The ajax call to the API takes about 10 minutes for the data to return (very long JSON data, and the customer accepted that).
While they are waiting for the data to return, they open the same site in another tab to do other things, but the site is very slow until the previous ajax request has finished.
Is there anything I can do to avoid everything going unresponsive while waiting for the big JSON data?
Here's the code and an explanation of what it does:
I created a method named geteverything that resides on the web server, as below; it accesses another server (the data server) to get data via a streaming API sitting on the data server. There's a lot of data, and the data server is slow; my customer doesn't mind the request taking long, they mind how slow everything else becomes. Sessions are used to determine the particulars of the request.
protected function geteverything($http, $id) {
    if (($System = DataObject::get_by_id('ESM_System', $id))) {
        if (isset($_GET['AAA']) && isset($_GET['BBB']) && isset($_GET['CCC']) && isset($_GET['DDD'])) {
            /**
             * some condition check and data formatting for AAA, BBB, CCC and DDD goes here
             */
            $request = "http://dataserver/streaming?method=xxx";
            set_time_limit(120);
            $jsonstring = file_get_contents($request);
            echo($jsonstring);
        }
    }
}
How can I fix this, or what else would you need to know in order to help?
The reason it's taking so long is that you're downloading the entirety of the JSON to your server and THEN sending it all to the user. There's no need to wait until you have the whole file before you start sending it.
Rather than using file_get_contents make the connection with curl and write the output directly to php://output.
For example, this script will copy http://example.com/ exactly as is:
<?php
// Initialise cURL. You can specify the URL in curl_setopt instead if you prefer
$ch = curl_init("http://example.com/");
// Open a file handler to PHP's output stream
$fp = fopen('php://output', 'w');
// Turn off headers, we don't care about them
curl_setopt($ch, CURLOPT_HEADER, 0);
// Tell curl to write the response to the stream
curl_setopt($ch, CURLOPT_FILE, $fp);
// Make the request
curl_exec($ch);
// close resources
curl_close($ch);
fclose($fp);

Curl Requests in PHP - Using an API

I'm trying to figure out how to use the Cheddar API (http://cheddarapp.com/developer) in my PHP application.
Cheddar's API uses curl requests, which have been fine for me to use in the terminal, but not in my index.php.
I'd like to create a button that when clicked, creates a task in a list call Colors. If a list does not exist, it'll create the list.
Has anybody used Cheddar's API, or included curl requests in PHP, or even in JavaScript, which I'm guessing is what you'd use for things of this nature?
Update
Here's the Curl request for creating a task in Cheddar: https://cheddarapp.com/developer/tasks#create.
I'd like to make a button that, on click, will create a task. Is it not as simple as creating a function in JavaScript and using onclick on an anchor?
These days I am using curl in PHP. Since you asked how to make a curl request, here is a simple POST example:
$LOCAL_REST_URL = 'whatever-url-of-your-rest-api'; // the endpoint you want to POST to
$json_part = '...'; // the JSON body you want to send
$headers = array('Content-Type: application/json');
$curl_handle = curl_init();
curl_setopt($curl_handle, CURLOPT_URL, $LOCAL_REST_URL);
curl_setopt($curl_handle, CURLOPT_CONNECTTIMEOUT, 20);
curl_setopt($curl_handle, CURLOPT_POSTFIELDS, $json_part);
curl_setopt($curl_handle, CURLOPT_HTTPHEADER, $headers);
curl_setopt($curl_handle, CURLOPT_RETURNTRANSFER, 1);
$buffer = curl_exec($curl_handle);
curl_close($curl_handle);
Where $json_part is the request body and $LOCAL_REST_URL is your REST URL. You will get the response in $buffer in JSON format; decode it and iterate over it to use the response.
I am hoping this post will help you.
You could run your terminal curl command with shell_exec from PHP.
Or translate the curl commands into PHP curl requests: http://se2.php.net/manual/en/ref.curl.php
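The shell_exec route can be a sketch like the following; the command is whatever curl invocation already works for you in the terminal (the URL here is a placeholder, not the real Cheddar endpoint):

```php
<?php
// Run a terminal command from PHP and capture its stdout as a string.
// Replace the placeholder with your working Cheddar curl call.
$cmd = 'curl -s "https://example.com/api"'; // placeholder command
$response = shell_exec($cmd);

// Cheddar replies with JSON, so decode it before use
$data = json_decode($response, true);
var_dump($data);
```

Note that shell_exec is often disabled on shared hosting, in which case the PHP curl extension is the safer bet.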

Trying to find the best method

I am setting up a register page using MSSQL.
The system must work like this:
The user submits data at something.com/register.php.
The data is sent to host-ip-address/regsecond.php, where my database will be. (For security reasons, this PHP page won't access the database directly.)
The PHP page at the host will start another PHP page or an EXE file that will reach the database directly and securely.
As my PHP level is not high, I wanted to learn if I can start PHP scripts that do their job without appearing in the user's browser. To explain what I mean:
"I submit some data at x.php, and it starts another PHP script which does the job with the data submitted from x.php, but that other PHP script won't appear in the user's browser."
I hope that was clear. In summary: should I use an EXE (which will be harder), or can I start a PHP script without it appearing in the browser? And how, of course.
You can do this using the curl extension. You can find info on it here:
http://php.net/manual/en/book.curl.php
You can do something like the following:
$postdata = array(
    'item1' => 'data'
);
$ch = curl_init("http://host-ip-address/regsecond.php");
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postdata);
curl_exec($ch);
curl_close($ch);
This makes a call directly from your first script to your second script without exposing anything to the user. On the far side, the data will come in as regular post data ($_POST).
You can't post data through plain PHP to a different website without an extension like curl.
If you would like, you can configure your PHP script to connect to a different server for your MySQL, though I wouldn't say it's a huge amount safer. For example,
Instead of:
mysql_connect('localhost', $username, $password);
Try this:
mysql_connect('your-ip:portnumber', $username, $password);
I'm not sure I understand this correctly, but you may:
§1 use a "public" php script that invokes a private one:
<?php
//public register script
//now call private
//store data to txt-file or similar..
require('/path/outside/www-data/script_that_processes_further.php');
§2 request a script at another server,
<?php
file_get_contents('http://asdf.aspx?firstname=' . urlencode($theFirstName)); // simplistic
//other options would be curl, xml/soap or whatever.
§1 may be used with §2.
regards,
/t

Stream response from CURL request without waiting for it to finish

I have a PHP script on my server that is making a request to another server for an image.
The script is accessed just like a regular image source like this:
<img src="http://example.com/imagecontroller.php?id=1234" />
Browser -> Script -> External Server
The script is doing a CURL request to the external server.
Is it possible to "stream" the CURL response directly back to the client (browser) as it is received on the server?
Assume my script is on a slow shared hosting server and the external server is blazing fast (a CDN). Is there a way to serve the response directly back to the client without my script being a bottleneck? It would be great if my server didn't have to wait for the entire image to be loaded into memory before beginning the response to the client.
Pass the -N/--no-buffer flag to curl. It does the following:
"Disables the buffering of the output stream. In normal work situations, curl will use a standard buffered output stream that will have the effect that it will output the data in chunks, not necessarily exactly when the data arrives. Using this option will disable that buffering. Note that this is the negated option name documented. You can thus use --buffer to enforce the buffering."
Check out Pascal Martin's answer to an unrelated question, in which he discusses using CURLOPT_FILE for streaming curl responses. His explanation for handling "Manipulate a string that is 30 million characters long" should work in your case.
Hope this helps!
Yes, you can use the CURLOPT_WRITEFUNCTION option:
curl_setopt($ch, CURLOPT_WRITEFUNCTION, $callback);
Where $ch is the curl handle and $callback is the callback function. This will stream the response data from the remote site as it arrives. The callback can look something like:
$result = '';
$callback = function ($ch, $str) use (&$result) {
    $result .= $str; // $str holds the chunk of data just streamed back
    // here you can work with the stream data, either via $result or $str
    // (e.g. echo $str; flush(); to pass it straight to the browser)
    return strlen($str); // must return the number of bytes handled
};
If not interrupted, at the end $result will contain the full response from the remote site.
Not with curl; you could use fsockopen to do the streaming.
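A minimal fsockopen sketch (host and path are placeholders; no HTTPS or chunked-encoding handling): open a raw socket, send an HTTP request, skip the response headers, and echo each chunk to the client as it arrives instead of buffering the whole body:

```php
<?php
// Open a raw TCP connection to the remote server (placeholder host)
$fp = fsockopen('example.com', 80, $errno, $errstr, 10);
if (!$fp) {
    die("Connection failed: $errstr ($errno)");
}

// Send a plain HTTP/1.0 request so the server closes the connection when done
fwrite($fp, "GET /image.jpg HTTP/1.0\r\nHost: example.com\r\nConnection: close\r\n\r\n");

// Skip the response headers (everything up to the first blank line)
while (($line = fgets($fp)) !== false && trim($line) !== '') {
    // headers discarded; you may want to forward Content-Type to the browser instead
}

// Forward the body to the browser chunk by chunk as it arrives
while (!feof($fp)) {
    echo fread($fp, 8192);
    flush();
}
fclose($fp);
```

HTTP/1.0 is used deliberately here so the response isn't chunk-encoded and end-of-body is simply the connection closing.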

PHP: Remote Function Call and returning the result?

I'm not very experienced with PHP. I want to know how to communicate between 2 web servers. To clarify: from the 1st server, run a (querying) function on a remote server, and return the result to the 1st server.
The scheme will be:
Web Server (1) ----------------> Web Server (2) ---------------> Database Server
Web Server (1) <---------------- Web Server (2) <--------------- Database Server
The query function() will only be located on Web Server (2). I need to run that query function() remotely from Web Server (1).
What is this called? And is it possible?
Yes.
A nice way I can think of doing this would be to send a request to the 2nd server via a URL. In the GET (or POST) parameters, specify which method you'd like to call, and (for security) some sort of hash that changes with time. The hash is there to ensure no third party can run the function arbitrarily on the 2nd server.
To send the request, you could use cURL:
function get_url($request_url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $request_url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
This sends a GET request. You can then use:
$request_url = 'http://second-server-address/listening_page.php?function=somefunction&securityhash=HASH';
$response = get_url($request_url);
On your second server, set up listening_page.php (with whatever filename you like, of course) to check for GET requests and verify the integrity of each request (i.e. the hash, and correct & valid params).
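For the hash check, one simple scheme (entirely illustrative; the secret value and the time window are up to you) is an HMAC over the function name plus a coarse timestamp, recomputed and compared on the 2nd server:

```php
<?php
// Shared secret known to both servers (placeholder value)
const SECRET = 'replace-with-a-long-random-secret';

// Sender (server 1): sign the function name together with the current hour,
// so a captured URL stops working once the hour rolls over.
function sign($function) {
    return hash_hmac('sha256', $function . '|' . gmdate('YmdH'), SECRET);
}

// Receiver (server 2): recompute and compare in constant time
function verify($function, $hash) {
    return hash_equals(sign($function), $hash);
}

$hash = sign('somefunction');
var_dump(verify('somefunction', $hash));  // true
var_dump(verify('otherfunction', $hash)); // false
```

One edge case to be aware of: a request signed just before the hour boundary fails just after it, so in practice you'd also accept the previous hour's hash.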
You can do this using an API. Create a page on the second server that takes variables and queries the database server with those vars (depending on what you need), and have the standard reply from that page be either JSON or XML. Then, from server 1, request that file and read the reply from the 2nd server.
NOTE: if it's a private file, make sure you use an authentication method to prevent users from accessing it directly.
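Sketching that with made-up data (the endpoint name and fields are hypothetical): server 2 just json_encodes whatever it looked up, and server 1 json_decodes the reply:

```php
<?php
// Server 2 (e.g. api.php) would respond with:
//   header('Content-Type: application/json');
//   echo json_encode(array('status' => 'ok', 'rows' => $rows));

// Server 1: fetch and decode the reply. The string below stands in for
// file_get_contents('http://server2/api.php?...') so the sketch is self-contained.
$response = '{"status":"ok","rows":[{"id":1,"name":"alice"}]}';
$data = json_decode($response, true);
if ($data === null) {
    die('Invalid JSON from server 2');
}
echo $data['rows'][0]['name']; // alice
```

Always check the json_decode result for null, since a PHP error page or timeout on server 2 would otherwise be treated as data.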
What you are aiming to do is definitely possible. You will need to set up some sort of API in order for server 1 to make a request to server 2.
I suggest you read up on SOAP and REST APIs:
http://www.netmagazine.com/tutorials/make-your-own-soap-api
Generally you will use something like curl to contact server 2 from server 1.
Google curl and you should quickly get the idea.
It's not going to be easy to give you a complete solution, so I hope this nudge in the right direction is helpful.
