PHP cURL request to Instagram API creates page lag - php

I am using cURL to access Instagram's API on a webpage I am building. The functionality works great; however, page load is sacrificed. For instance, consider this DOM structure:
Header
Article
Instagram Photos (retrieved via cURL)
Footer
When loading the page, the footer will not load until the Instagram photos have been fully loaded with cURL. Below is the cURL function that is being called:
function fetchData($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}
$result = fetchData("https://api.instagram.com/v1/media/search?lat={$lat}&lng={$lng}&distance={$distance}&access_token={$accessToken}");
$result = json_decode($result);
Only after this function has run is the rest of the DOM displayed. If I move the function call below the footer, it does not work.
Is there anything I can do to load the entire webpage first and have the cURL request run on top of the already-loaded site (i.e., not cause a lag or holdup)?
UPDATE: Is the best solution to load it after the footer, and then append it to another area with JS?

You can cache the resulting JSON in a file saved locally, and set up a cronjob that runs every minute to update the local cache file. This makes your page load much faster. The downside is that the cache is updated even when you have no visitors, and the Instagram data can be up to a minute stale.
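A minimal sketch of that approach, reusing fetchData() from the question (the cache path and cron script name are illustrative assumptions):

// update_cache.php - run by cron, e.g.:  * * * * * php /path/to/update_cache.php
$json = fetchData("https://api.instagram.com/v1/media/search?lat={$lat}&lng={$lng}&distance={$distance}&access_token={$accessToken}");
if ($json !== false) {
    // rewrite the cache only on a successful fetch
    file_put_contents('/path/to/instagram_cache.json', $json, LOCK_EX);
}

// page code: read the local file instead of hitting the API on every request
$result = json_decode(file_get_contents('/path/to/instagram_cache.json'));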

Related

Show CoreTemp Remote Monitoring Temperature on Webpage

I would like to show what CoreTemp is monitoring on my personal website. CoreTemp reports in what appears to be JSON format. However, when I make a curl or GET request, nothing ever shows up from the CoreTemp page. I think this may be because it must return 5 results before displaying them on the local webpage at http://localhost:5200/.
Example output from CoreTemp Monitoring page:
{"CpuInfo":
{"uiLoad":[2,3],
"uiTjMax":[100],
"uiCoreCnt":2,
"uiCPUCnt":1,
"fTemp":[49,48],
"fVID":1.065918,
"fCPUSpeed":3292.04028,
"fFSBSpeed":99.7588,
"fMultiplier":33,
"CPUName":"Intel Core i5 4310M (Haswell) ",
"ucFahrenheit":0,
"ucDeltaToTjMax":0,
"ucTdpSupported":1,
"ucPowerSupported":1,
"uiStructVersion":2,
"uiTdp":[37],
"fPower":[5.889584],
"fMultipliers":[33,33]},
"MemoryInfo":{
"TotalPhys":16282,
"FreePhys":8473,
"TotalPage":17304,
"FreePage":8473,
"TotalVirtual":8388608,
"FreeVirtual":8388003,
"FreeExtendedVirtual":1,
"MemoryLoad":47}}
Again, I'm stuck getting any data to show up at all from a simple GET or curl request in PHP.
Simple PHP code:
<?php
$exe = curl_init();
curl_setopt($exe, CURLOPT_URL, "http://localhost:5200/");
curl_setopt($exe, CURLOPT_HEADER, 0);
curl_setopt($exe, CURLOPT_RETURNTRANSFER, 1);
$raw = curl_exec($exe);
curl_close($exe);
echo $raw;
What am I missing, or is this a problem with the monitoring plugin itself?
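One way to narrow this down is to ask cURL what actually happened; a minimal diagnostic sketch around the same request (the error handling is generic, not specific to CoreTemp):

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://localhost:5200/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$raw = curl_exec($ch);
if ($raw === false) {
    // the request itself failed (connection refused, timeout, ...)
    echo 'cURL error: ' . curl_error($ch);
} else {
    // the request succeeded; inspect the HTTP status and the raw body
    echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
    var_dump($raw);
}
curl_close($ch);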

curl in PHP does not load full page content, only a loading GIF

I am trying to crawl a page, but only the loading GIF is retrieved, not the page content.
$url = "https://www.truecaller.com";
$request = $url;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $request);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 120);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
$data = curl_exec($ch);
print_r($data);
curl_close($ch);
Is there any way to retrieve the full page?
There is a reason for that.
cURL is not a browser, so it does not have the ability to run JavaScript.
cURL does not care what the response is: it returns whatever the link you give it serves. If that is a GIF it will return a GIF; if it's a document, a video, or anything else, it will return that response.
So what's happening is that cURL gets its response as soon as it hits the page. A loading GIF is served first, and the rest of the page is then filled in by JavaScript. Since cURL cannot execute that JavaScript, the only response you get is the loading GIF.
If you want to load the full page content, there is a full WebKit browser without an interface that lets programs get the same results a browser would: PhantomJS - Scriptable Headless Browser.
I see you have already tried adding a delay to your curl, but the fact is curl is not the right tool for this job. I would investigate http://phantomjs.org/, which will allow you to capture the page more robustly.
As hassan added below, this site has an API, so that is also an option. Thanks hassan.
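For completeness, a sketch of the headless-browser route driven from PHP. This uses the chrome-php/chrome package rather than PhantomJS; the package choice is an assumption, not part of the original answers, and any headless browser would do.

// composer require chrome-php/chrome
require 'vendor/autoload.php';

use HeadlessChromium\BrowserFactory;

$browserFactory = new BrowserFactory();
$browser = $browserFactory->createBrowser();

try {
    $page = $browser->createPage();
    // navigate and wait until the JavaScript-driven page has finished loading
    $page->navigate('https://www.truecaller.com')->waitForNavigation();
    // the DOM now contains the content that plain cURL never sees
    echo $page->getHtml();
} finally {
    $browser->close();
}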

Simple HTML DOM parser with curl hangs too long on different URLs

I use the Simple HTML DOM parser together with curl (I do not have much experience with curl), and I am trying to figure out why it hangs so long on different URL requests. I have tried logging with verbose mode, but I did not get back any useful information. It seems like a caching problem, because after one long response all my other requests behave the same way until I clear the browser cache.
str_get_html(get_data($target));
function get_data($url)
{
    $ch = curl_init();
    $timeout = 30;
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    curl_setopt($ch, CURLOPT_USERAGENT, 'some useragent');
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
You are using the CURLOPT_NOBODY curl option in your request. Are you sure you know what it does? It sends a HEAD request to the target URL instead of a GET. There are a lot of web servers on the Internet which accept the HEAD request and then leave it stuck until the timeout occurs. And this is what you are experiencing right now.
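A minimal sketch of the fix, assuming the CURLOPT_NOBODY call sits somewhere in the full code (it is not shown in the excerpt above): drop it, or explicitly force a GET, and add an overall timeout so a stuck server cannot hang the page.

function get_data($url)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    // no CURLOPT_NOBODY here: that option turns the request into a HEAD request
    curl_setopt($ch, CURLOPT_HTTPGET, true);       // explicitly force a plain GET
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);  // max seconds to establish the connection
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);         // max seconds for the whole request
    curl_setopt($ch, CURLOPT_USERAGENT, 'some useragent');
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}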

PHP GET Method without output

What I am trying to do is open a URL and pass variables using PHP.
Every time I use the cURL method, it opens the page and displays the output, and it won't follow the redirects afterwards.
The HttpRequest class is not found.
What can I do?
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);
For privacy reasons I can't post the URL, but that's the code that's giving me the problem: when it curls the page, it actually displays what it's curling.
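For reference, a minimal sketch of the usual way to suppress the output and follow redirects; whether it resolves this exact case depends on the URL that can't be shared:

$ch = curl_init($url);
// capture the response body as a string instead of printing it to the browser
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// follow any Location: redirects the server sends back
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$body = curl_exec($ch); // nothing is echoed; $body holds the final response
curl_close($ch);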

PHP mirror a webpage

I am trying to create a mirror of a weather widget which I use for a website. Presently, it is used on an HTTPS page, but the widget server does not support HTTPS (and IE throws a tantrum with dialogs because the widget is not HTTPS).
To solve this, I would like to mirror the page over HTTPS to silence the security warnings. I would normally use file_get_contents() for this; however, the page contains images, which makes it a little more complicated.
Also, as a side note: there aren't any ads on my website or theirs, so there is no revenue stealing.
Use cURL to grab the page's content. You can put this in a file on your server, then use that file's URL in place of the widget's URL:
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/");
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab URL and pass it to the browser
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
See the docs: http://www.php.net/manual/en/function.curl-exec.php
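A sketch of what that mirror file might look like, extended to cover the images the asker mentions. Note that cURL only fetches the HTML document itself, so the embedded http:// image URLs must also be routed through the HTTPS mirror to silence mixed-content warnings. The file name, query parameter, and rewriting regex are illustrative assumptions:

<?php
// mirror.php - serves the HTTP-only widget (and its images) over HTTPS
$widgetUrl = 'http://www.example.com/'; // illustrative widget URL

function fetchRaw($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $body = curl_exec($ch);
    $type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
    curl_close($ch);
    return array($body, $type);
}

if (isset($_GET['img'])) {
    // second role: proxy a single image over HTTPS
    // NOTE: in real use, whitelist the widget's host here to avoid an open proxy
    list($body, $type) = fetchRaw($_GET['img']);
    header('Content-Type: ' . $type);
    echo $body;
    exit;
}

// first role: fetch the widget HTML and rewrite http:// images through this script
list($html, $type) = fetchRaw($widgetUrl);
$html = preg_replace_callback(
    '#src="(http://[^"]+)"#i',
    function ($m) { return 'src="mirror.php?img=' . urlencode($m[1]) . '"'; },
    $html
);
header('Content-Type: text/html');
echo $html;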
