My website takes too long to load data from my API - PHP

It's my first time developing an API, which is why I'm not surprised it was running a little slow, taking 2-4 seconds to load (I used a microtime timer on my web page).
But then I found out how long the API commands themselves take to execute: around 0.002 seconds. So why, when I call the API with cURL in PHP, does it take another 2 seconds to load?
My API Connection Code:
function APIPost($DataToSend){
    $APILink = curl_init();
    curl_setopt($APILink, CURLOPT_URL, "http://api.subjectplanner.co.uk");
    curl_setopt($APILink, CURLOPT_POST, 4);
    curl_setopt($APILink, CURLOPT_POSTFIELDS, $DataToSend);
    curl_setopt($APILink, CURLOPT_HEADER, 0);
    curl_setopt($APILink, CURLOPT_RETURNTRANSFER, 1);
    return curl_exec($APILink);
    curl_close($APILink);
}
How I retrieve data in my web page:
$APIData = array(
    'com'  => 'todayslessons',
    'json' => 'true',
    'sid'  => $_COOKIE['SID']
);
$APIResult = json_decode(APIPost($APIData), true);
if($APIResult['functionerror'] == 0){
    $Lessons['Error'] = false;
    $Lessons['Data'] = json_decode($APIResult['data'], true);
}else{
    $Lessons['Error'] = true;
    $Lessons['ErrorDetails'] = "An error has occurred.";
}
The APIPost function is in a functions.php file, which is included at the beginning of my page. The time from the beginning of the second snippet to its end is about 2.0126 seconds. What is the best way to fetch my API data?

This is just a guess, so please don't beat me up about it. But maybe it's waiting for curl to complete, i.e. a timeout, since you don't close curl before doing the return.
Try this tiny amendment and see if it helps:
function APIPost($DataToSend){
    $APILink = curl_init();
    curl_setopt($APILink, CURLOPT_URL, "http://api.subjectplanner.co.uk");
    curl_setopt($APILink, CURLOPT_POST, 4);
    curl_setopt($APILink, CURLOPT_POSTFIELDS, $DataToSend);
    curl_setopt($APILink, CURLOPT_HEADER, 0);
    curl_setopt($APILink, CURLOPT_RETURNTRANSFER, 1);
    // capture the result so the handle can be closed before returning
    $ret = curl_exec($APILink);
    curl_close($APILink);
    return $ret;
}
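If that alone doesn't account for the delay, it can help to ask cURL itself where the time went. This is just a diagnostic sketch (the POST fields are placeholders; the URL is the one from the question): curl_getinfo() breaks the request down into phases, which usually points the finger at DNS resolution, the TCP connect, or a slow first byte from the server.
function APIPostTimed($DataToSend){
    $APILink = curl_init();
    curl_setopt($APILink, CURLOPT_URL, "http://api.subjectplanner.co.uk");
    curl_setopt($APILink, CURLOPT_POST, true);
    curl_setopt($APILink, CURLOPT_POSTFIELDS, $DataToSend);
    curl_setopt($APILink, CURLOPT_RETURNTRANSFER, 1);
    $ret = curl_exec($APILink);
    // per-phase timings, all in seconds measured from the start of the request
    echo "DNS lookup:  " . curl_getinfo($APILink, CURLINFO_NAMELOOKUP_TIME) . "\n";
    echo "TCP connect: " . curl_getinfo($APILink, CURLINFO_CONNECT_TIME) . "\n";
    echo "First byte:  " . curl_getinfo($APILink, CURLINFO_STARTTRANSFER_TIME) . "\n";
    echo "Total:       " . curl_getinfo($APILink, CURLINFO_TOTAL_TIME) . "\n";
    curl_close($APILink);
    return $ret;
}
One more thing worth checking: passing an array to CURLOPT_POSTFIELDS makes cURL send a multipart/form-data request with an Expect: 100-continue header, which some servers answer slowly; adding curl_setopt($APILink, CURLOPT_HTTPHEADER, array('Expect:')); suppresses that header and can shave off a noticeable chunk of the delay.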

Related

Run another curl call while one curl call is already in progress in PHP

I have to call an external script in which I make a first call with cURL to get data, which takes about 2-3 minutes. During this time I need to make another external call with cURL to get the progress of the first call. But the issue is that my next call waits until the reply to the first cURL call comes. I also checked curl_multi, but that didn't help me either, as I want to make many calls while the first call is in progress. Can anyone help me solve this, please?
I suppose there is no need to make a second call to track the cURL progress. You can achieve the same thing by using the cURL option CURLOPT_PROGRESSFUNCTION with a callback function.
The callback method takes five arguments:
cURL resource
Total number of bytes expected to be downloaded
Number of bytes downloaded so far
Total number of bytes expected to be uploaded
Number of bytes uploaded so far
In the callback method you can calculate the percentage downloaded/uploaded. An example is given below:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://stackoverflow.com");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, 'progress');
curl_setopt($ch, CURLOPT_NOPROGRESS, false); // progress callbacks are off by default
curl_setopt($ch, CURLOPT_HEADER, 0);
$html = curl_exec($ch);
curl_close($ch);

function progress($resource, $download_size, $downloaded, $upload_size, $uploaded)
{
    if($download_size > 0){
        echo $downloaded / $download_size * 100 . "%\n";
    }
    sleep(1); // throttle the output; cURL fires this callback very frequently
}
There is a way to do this: see the following links, which explain how to do it using curl_multi_init: php.net curl_multi_init and http://arguments.callee.info/2010/02/21/multiple-curl-requests-with-php/
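For what it's worth, here is a minimal curl_multi sketch along the lines those links describe (the two URLs are placeholders for the slow job and the progress endpoint); both transfers run concurrently instead of the second waiting for the first:
$urls = array(
    'http://example.com/slow-job',  // placeholder: the 2-3 minute call
    'http://example.com/progress'   // placeholder: the progress check
);

$mh = curl_multi_init();
$handles = array();
foreach($urls as $url){
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// drive all transfers until none are still running
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait briefly for activity instead of busy-looping
} while($running > 0);

foreach($handles as $ch){
    echo curl_multi_getcontent($ch), "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);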

What is the fastest way to get a vimeo thumbnail with PHP

I am using the following code, in PHP, to get the thumbnail (from the JSON) of a Vimeo video according to the Vimeo API:
private function curl_get($url)
{
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl, CURLOPT_TIMEOUT, 30);
    $return = curl_exec($curl);
    curl_close($curl);
    return $return;
}
After profiling the page I noticed that curl_exec takes about 220 milliseconds, which I find a lot considering that I only want the thumbnail of the video.
Do you know a faster way to get the thumbnail?
"curl_exec takes about 220 milliseconds"
It's probably the network overhead (DNS lookup - connect - transfer - fetching the transferred data). It may not be possible to speed this up any further.
Make sure you are caching the results locally, so they don't have to be fetched anew every time.
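A minimal sketch of such a cache, assuming a simple file-based store (the temp-dir path, key scheme, and one-day TTL are made up for the example): wrap the existing curl_get() logic so repeated requests for the same URL never hit the network.
function cached_curl_get($url, $ttl = 86400)
{
    $cacheFile = sys_get_temp_dir() . '/vimeo_' . md5($url) . '.cache'; // hypothetical cache location
    if(is_file($cacheFile) && time() - filemtime($cacheFile) < $ttl){
        return file_get_contents($cacheFile); // cache hit: no network round trip at all
    }
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl, CURLOPT_TIMEOUT, 30);
    $return = curl_exec($curl);
    curl_close($curl);
    if($return !== false){
        file_put_contents($cacheFile, $return); // only cache successful responses
    }
    return $return;
}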

Effective way of fetching web pages

I have to fetch multiple web pages, let's say 100 to 500. Right now I am using curl to do so.
function get_html_page($url) {
    //create curl resource
    $ch = curl_init();
    //set url
    curl_setopt($ch, CURLOPT_URL, $url);
    //return the transfer as a string
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($ch, CURLOPT_HEADER, FALSE);
    //$html contains the output string
    $html = curl_exec($ch);
    //close curl resource to free up system resources
    curl_close($ch);
    return $html;
}
My major concern is the total time taken by my script to fetch all these web pages. I know that the time taken is directly proportional to my internet speed, and hence the majority of the time is taken by the $html = curl_exec($ch); call.
I was thinking that instead of creating and destroying a curl instance again and again for each web page, I could create it only once, reuse it for every page, and destroy it at the end. Something like:
<?php
function get_html_page($ch, $url) {
    //set the url for this request on the shared handle
    curl_setopt($ch, CURLOPT_URL, $url);
    //$html contains the output string
    $html = curl_exec($ch);
    return $html;
}
//create curl resource
$ch = curl_init();
//return the transfer as a string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_HEADER, FALSE);
.
.
.
<fetch web pages using get_html_page()>
.
.
.
//close curl resource to free up system resources
curl_close($ch);
?>
Will it make any significant difference in the total time taken? If there is a better approach, please let me know about it as well.
How about trying to benchmark it? It may be more efficient to do it the second way, but I don't think it will add up to much. I'm sure your system can create and destroy curl instances in microseconds. It has to initiate the same HTTP connections each time either way, too.
If you were running many of these at the same time and were worried about system resources, not time, it might be worth exploring. As you noted, most of the time spent doing this will be waiting for network transfers, so I don't think you'll notice a change in overall time with either method.
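If you do want to benchmark it, a quick sketch along those lines (the URL list is a placeholder) is to time both variants with microtime(true) and compare:
$urls = array_fill(0, 100, 'http://example.com/'); // placeholder list of pages

// variant 1: a fresh handle per page
$start = microtime(true);
foreach($urls as $url){
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    curl_exec($ch);
    curl_close($ch);
}
printf("fresh handle per page: %.3fs\n", microtime(true) - $start);

// variant 2: one handle reused for every page
$start = microtime(true);
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
foreach($urls as $url){
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_exec($ch);
}
curl_close($ch);
printf("reused handle: %.3fs\n", microtime(true) - $start);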
For web scraping I would use: YQL + JSON + XPath, implemented using cURL. I think you'll save a lot of resources.

Stop a curl transfer in the middle

I could only think of calling curl_close() from one of the callback functions.
But PHP throws a warning:
PHP Warning: curl_close(): Attempt to close cURL handle from a callback.
Any ideas how to do that?
You can return false, or anything that is not the length of the currently downloaded chunk, from the write callback function to abort curl.
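A minimal sketch of that idea (the URL and the 1 MB cap are made-up examples): a CURLOPT_WRITEFUNCTION callback must return the number of bytes it handled, so returning anything else, such as -1, makes cURL abort the transfer and curl_exec() return false.
$received = 0;
$ch = curl_init('http://example.com/big-file'); // placeholder URL
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function($ch, $data) use (&$received){
    $received += strlen($data);
    if($received > 1024 * 1024){ // arbitrary 1 MB cap for the example
        return -1; // wrong length => cURL aborts with a write error
    }
    return strlen($data); // correct length => keep downloading
});
curl_exec($ch); // returns false once the callback aborts
curl_close($ch);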
I had a similar problem where I needed to be able to stop a curl transfer in the middle. This is easily in my personal top ten of 'dirty hacks that seem to work' of all time.
Create a curl read function that knows when it's time to cancel the upload.
function curlReadFunction($ch, $fileHandle, $maxDataSize){
    if($GLOBALS['abortTransfer'] == TRUE){
        sleep(1);
        return ""; // starve the upload so the low-speed limit kicks in
    }
    return fread($fileHandle, $maxDataSize);
}
And tell Curl to stop if the data read rate drops too low for a certain amount of time.
curl_setopt($ch, CURLOPT_READFUNCTION, 'curlReadFunction');
curl_setopt($ch, CURLOPT_LOW_SPEED_LIMIT, 1024); // abort below 1024 bytes/sec...
curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 5);     // ...sustained for 5 seconds
This will cause the curl transfer to abort during the upload. Obviously not ideal but it seems to work.
If the problem is that the curl request is taking too long to execute, you could set a timeout, for example:
<?php
$c = curl_init('http://slow.example.com/');
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
curl_setopt($c, CURLOPT_CONNECTTIMEOUT, 15); // give up if connecting takes longer than 15 seconds
$page = curl_exec($c);
curl_close($c);
echo $page;

Loading a page that sometimes 'hangs' via PHP (Curl)

I'm trying to get information from a site by parsing/scraping it via PHP & Curl. But sometimes the current page doesn't finish loading, so the script runs without anything happening. It's a simple script like this...
...
curl_setopt($curl, CURLOPT_URL, $url);
$page = curl_exec($curl);
...
Is there a way to simply retry loading the same page if it doesn't finish loading after (for example) 60 seconds, without interrupting the whole script?
It would be great if someone could help me out with a way to accomplish this.
You can use CURLOPT_TIMEOUT which is the maximum number of seconds to allow cURL functions to execute.
curl_setopt($ch, CURLOPT_TIMEOUT, 60); // e.g. give up after the 60 seconds from the question
Something simple would be..
$bol = true;
while($bol)
{
    $page = curl_exec($curl);
    if($page !== false) //curl_exec returns false on timeout or error
        $bol = false;
}
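Putting the two together, a sketch with a bounded retry (the URL and the three-attempt cap are made-up examples), so a hung page gets 60 seconds per attempt and the script simply moves on instead of looping forever:
$curl = curl_init('http://example.com/flaky-page'); // placeholder URL
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_TIMEOUT, 60); // abort a single attempt after 60 seconds

$page = false;
for($attempt = 1; $attempt <= 3 && $page === false; $attempt++){
    $page = curl_exec($curl); // false on timeout or any other failure
}
curl_close($curl);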
