Increase cURL speed in PHP

I am using an API provided by flipkart.com that allows me to search and get the results as JSON.
The code I am using is:
$time_start = microtime(true); // was missing in the original, but is used below

$snapword = $_GET['p'];
$snapword = str_replace(' ', '+', $snapword);

// Strip any parenthesized text from the search term
$pattern = "#\(.*?\)#";
$snapword = preg_replace($pattern, '', $snapword);

$headers = array(
    'Fk-Affiliate-Id: myaffid',
    'Fk-Affiliate-Token: c0f74c4esometokesndad68f50666'
);

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://affiliate-api.flipkart.net/affiliate/search/json?query=' . $snapword . '&resultCount=5');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_ENCODING, "gzip");
curl_setopt($ch, CURLOPT_USERAGENT, 'php');
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
$snapdeal = curl_exec($ch);
curl_close($ch);

$time_end = microtime(true);
$time = $time_end - $time_start;
echo "Process Time: {$time}";
The time it takes is: Process Time: 5.3794288635254, which is far too long. Any ideas on how to reduce this?

Use curl_getinfo() to retrieve more accurate information; it also shows how much time was spent resolving DNS, etc.
You can see the exact time taken for each step with the following keys:
CURLINFO_TOTAL_TIME - Total transaction time in seconds for last transfer
CURLINFO_NAMELOOKUP_TIME - Time in seconds until name resolving was complete
CURLINFO_CONNECT_TIME - Time in seconds it took to establish the connection
CURLINFO_PRETRANSFER_TIME - Time in seconds from start until just before file transfer begins
CURLINFO_STARTTRANSFER_TIME - Time in seconds until the first byte is about to be transferred
CURLINFO_SPEED_DOWNLOAD - Average download speed
CURLINFO_SPEED_UPLOAD - Average upload speed
$info = curl_getinfo($ch); // call this before curl_close()
echo $info['connect_time']; // same keys as above, lowercase and without the CURLINFO_ prefix
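For example, a minimal sketch of printing the full breakdown for the request above (this assumes the handle $ch is still open, since curl_getinfo() must run before curl_close()):
$info = curl_getinfo($ch);
printf("DNS lookup:   %.4fs\n", $info['namelookup_time']);
printf("Connect:      %.4fs\n", $info['connect_time']);
printf("Pre-transfer: %.4fs\n", $info['pretransfer_time']);
printf("First byte:   %.4fs\n", $info['starttransfer_time']);
printf("Total:        %.4fs\n", $info['total_time']);
Whichever step dominates the total tells you where to look: a large namelookup_time points at DNS, a large starttransfer_time at a slow API.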
Most probably, the API itself is slow.
You could try switching to a faster DNS server (in Linux: /etc/resolv.conf, e.g. a nameserver 8.8.8.8 line for a public resolver).
Other than that, there is not much you can do.

I would check your server's connection speed from a terminal/console window, since it greatly affects the time it takes to fetch a resource over the web. Also consider the response time of the remote resource itself: the remote page has to gather the requested information and send it back.
I would also consider caching as much of the information as you need via a cron job late at night, so that you don't have to fetch it on demand.
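For example, a minimal sketch of that cron approach; the query, cache path, and schedule here are placeholders, not part of the original setup:
// cache.php - run nightly, e.g. via: 0 3 * * * php /path/to/cache.php
$ch = curl_init('https://affiliate-api.flipkart.net/affiliate/search/json?query=phone&resultCount=5');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$json = curl_exec($ch);
if ($json !== false) {
    // the page then reads this file instead of calling the API on every request
    file_put_contents('/tmp/flipkart_cache.json', $json);
}
curl_close($ch);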

Related

cURL: how to scrape over 500 URLs the safe and resource-wise way

I have a list of over 500 URLs that I have to scrape because my distributor doesn't offer an API or a CSV. The list is actually an array containing the IDs of the products I want to keep track of:
$arr = [1,2,3,...,564];
The URL is the same; only the ID at the end changes:
$url = 'https://shop.com/products.php?id='
Now, on localhost, I used a foreach loop to scrape each and every one of those URLs:
foreach ($arr as $id) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url . $id);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    // preg_match_all - extract the data I'm looking for
    // and put that data into an array
    curl_close($ch);
}
But the problem is that, first of all, I don't think that's wise at all, and I know that for a fact: when I (accidentally) ran the script on localhost, my access to shop.com was banned/blocked with the message: Too many requests... (HTTP 429) :D
I tried sleeping inside that foreach every 10 iterations, using a modulus check:
$x = 0;
foreach ($arr as $id) {
    // curl request - get data and add it into an array
    $x++;
    if ($x % 10 == 0) {
        sleep(2);
    }
}
But this takes forever to execute.
Even though I am able to connect and fetch the data I need from each individual product, I want a solution using cURL (since there's no API or CSV) that runs the whole script at once, but in a safe/wise way.
Is there something like that? If so, can you please help me understand how?
Thank you!
Have a daemon or cron job that is constantly updating a DB 24/7 at a safe pace, and whenever you need instant results, just query the DB instead of the actual website. If a safe pace is too slow, keep adding more IPs (use proxies) until it's at an acceptable pace.
UPDATE:
First of all, I want to say thank you to all those who answered my question :D
After a few days of reading and trial and error I've reached a conclusion which I'm not completely satisfied with, so I'll keep looking for a better solution, but for now this is my result:
First, I added a new column to my table in which I save a time() timestamp. Based on this, every time the cron job runs the script, it selects the next 30 products that have not been updated in the past 12 hours; I run that cron job every 10 minutes. This is the loop:
// Get the IDs of 30 products that haven't been updated in the past 12 hours
$now = time() - 43200;
$products = $pdo->query("SELECT id FROM products WHERE last_update < $now LIMIT 30");
$bind = [];
foreach ($products as $id) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, 'http://shop.com?product=' . $id[0] . '.html');
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    preg_match_all('!<span class="stock">(.*?)<\/span>!', $response, $data);
    $stock = isset($data[1][0]) ? $data[1][0] : null; // first captured match, not the whole match array
    array_push($bind, [$stock, time(), $id[0]]);
    curl_close($ch);
    sleep(2);
}
// Then I just update these results through a query;
// I'm using PDO to deal with my DB
I think the most important thing here is to sleep on every iteration of that loop. This way I didn't get the 429, and the execution for 30 products at a time is actually reasonably fast: it takes around 1.5 minutes to complete, but I'm avoiding the too-many-requests problem, and it is run by a cron job so I don't really have to do anything.
The limitation of this approach is that, by using time(), if you have more products than can fit in a 12-hour cycle, the script will simply start over with the first 30 products that haven't been updated in the past 12 hours. To solve this "problem" I'm thinking about saving a counter in a DB table, so each time the script runs it can start from X and then update the counter to X + 30, as sketched below.
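A hypothetical sketch of that counter idea (the settings table, its columns, and the wrap-around logic are all made up for illustration):
$total  = (int)$pdo->query("SELECT COUNT(*) FROM products")->fetchColumn();
$offset = (int)$pdo->query("SELECT value FROM settings WHERE name = 'crawl_offset'")->fetchColumn();
$products = $pdo->query("SELECT id FROM products ORDER BY id LIMIT 30 OFFSET $offset");
// ...crawl these 30 products exactly as in the loop above...
$next = ($offset + 30) % max($total, 1); // wrap around once every product has been visited
$pdo->prepare("UPDATE settings SET value = ? WHERE name = 'crawl_offset'")->execute([$next]);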
Using cURL for crawling websites doesn't strike me as the most resource-wise solution, but it is one that can take crawling off your hands.
Again, I'm not 100% satisfied with the way I've written this script, but for now it works.
If I ever find a better solution, I'll post it here.
Thank you!

Why does cURL time out when getting from a URL?

Why would cURL in PHP return a timeout message when getting HTML from a web page?
Here is the PHP code.
function getFromUrl($url)
{
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    $result = curl_exec($curl);
    if (curl_errno($curl)) {
        echo 'Error:' . curl_error($curl) . '<br>';
    }
    curl_close($curl);
    return $result;
}
I get the expected results when I run the function with www.google.com as the URL.
$url = 'http://www.google.com' ;
$result = getFromUrl($url) ;
But when I pass in the URL of a web page on a second web server, I get a timeout response. The URL works when I paste it into a browser. Why the timeout message?
$url = "http://xxx.54.20.170:10080/accounting/tester/hello.html" ;
echo $url . '<br>' ;
$rv = getFromUrl( $url ) ;
echo $rv . '<br>' ;
Here is the cURL error message:
Error:Failed to connect to xxx.54.20.170 port 10080: Connection timed out
I am looking to transfer data from one web server to another.
Thanks.
For PHP:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0); // 0 = wait indefinitely for the connection phase
curl_setopt($ch, CURLOPT_TIMEOUT, 400); // timeout for the whole request, in seconds
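Applied to the question's getFromUrl() function, a sketch might look like this (the 10-second connect timeout is an arbitrary choice, not a recommendation):
function getFromUrl($url)
{
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10); // fail fast instead of hanging on connect
    curl_setopt($curl, CURLOPT_TIMEOUT, 400);       // cap the whole transfer
    $result = curl_exec($curl);
    if (curl_errno($curl)) {
        echo 'Error:' . curl_error($curl) . '<br>';
    }
    curl_close($curl);
    return $result;
}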
From the terminal, first check whether curl works at all using the extra options below.
--connect-timeout
Maximum time in seconds that you allow the connection to the server to take. This only limits the connection phase; once curl has connected this option is of no more use. Since 7.32.0, this option accepts decimal values, but the actual timeout will decrease in accuracy as the specified timeout increases in decimal precision. See also the -m, --max-time option.
If this option is used several times, the last one will be used.
and
-m, --max-time
Maximum time in seconds that you allow the whole operation to take. This is useful for preventing your batch jobs from hanging for hours due to slow networks or links going down. Since 7.32.0, this option accepts decimal values, but the actual timeout will decrease in accuracy as the specified timeout increases in decimal precision. See also the --connect-timeout option.
If this option is used several times, the last one will be used.
Try using them to increase the timeout.
There are many possible reasons for curl not working. Some of them:
1) The response time is slow.
2) Some sites check certain header parameters before responding to a request. These include User-Agent, Referer, etc., to make sure the request is coming from a valid source and not from a bot.
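For case 2, a minimal sketch of sending browser-like headers with cURL (the User-Agent and Referer values are placeholders):
$ch = curl_init('http://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; Linux x86_64)'); // placeholder UA string
curl_setopt($ch, CURLOPT_REFERER, 'http://example.com/');
$html = curl_exec($ch);
curl_close($ch);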

My website takes too long to load data from my API

It's my first time developing an API, which is why I'm not surprised it was running a little slow, taking 2-4 seconds to load (I used a microtime timer on my web page).
But then I found out how long the API commands themselves take to execute: around 0.002 seconds. So why, when I use cURL in PHP, does it take another 2 seconds to load?
My API Connection Code:
function APIPost($DataToSend) {
    $APILink = curl_init();
    curl_setopt($APILink, CURLOPT_URL, "http://api.subjectplanner.co.uk");
    curl_setopt($APILink, CURLOPT_POST, 4);
    curl_setopt($APILink, CURLOPT_POSTFIELDS, $DataToSend);
    curl_setopt($APILink, CURLOPT_HEADER, 0);
    curl_setopt($APILink, CURLOPT_RETURNTRANSFER, 1);
    return curl_exec($APILink);
    curl_close($APILink); // never reached - the function returns on the line above
}
How I retrieve data in my web page:
$APIData = array(
    'com'  => 'todayslessons',
    'json' => 'true',
    'sid'  => $_COOKIE['SID']
);
$APIResult = json_decode(APIPost($APIData), true);
if ($APIResult['functionerror'] == 0) {
    $Lessons['Error'] = false;
    $Lessons['Data']  = json_decode($APIResult['data'], true);
} else {
    $Lessons['Error'] = true;
    $Lessons['ErrorDetails'] = "An error has occurred.";
}
The APIPost function is in a functions.php file, which is included at the beginning of my page. The time from the beginning of the second snippet to the end is about 2.0126 seconds. What is the best way to fetch my API data?
This is just a guess, so please don't beat me up about it, but maybe it's waiting for curl to complete (i.e. to time out), as you don't close curl before doing the return.
Try this tiny amendment and see if it helps:
function APIPost($DataToSend) {
    $APILink = curl_init();
    curl_setopt($APILink, CURLOPT_URL, "http://api.subjectplanner.co.uk");
    curl_setopt($APILink, CURLOPT_POST, 4);
    curl_setopt($APILink, CURLOPT_POSTFIELDS, $DataToSend);
    curl_setopt($APILink, CURLOPT_HEADER, 0);
    curl_setopt($APILink, CURLOPT_RETURNTRANSFER, 1);
    $ret = curl_exec($APILink); // the original was missing the assignment operator here
    curl_close($APILink);
    return $ret;
}

PHP file_get_contents very slow when using full url

I am working with a script (that I did not create originally) that generates a PDF file from an HTML page. The problem is that it now takes a very long time, like 1-2 minutes, to process. Supposedly this worked fine originally, but it has slowed down within the past couple of weeks.
The script calls file_get_contents on a PHP script, then outputs the result into an HTML file on the server and runs the PDF generator app on that file.
I seem to have narrowed the problem down to the file_get_contents call on a full URL, rather than a local path.
When I use
$content = file_get_contents('test.txt');
it processes almost instantaneously. However, if I use the full url
$content = file_get_contents('http://example.com/test.txt');
it takes anywhere from 30-90 seconds to process.
It's not limited to our server; it is slow when accessing any external URL, such as http://www.google.com. I believe the script calls the full URL because there are query-string variables that are necessary and that don't work if you call the file locally.
I also tried fopen, readfile, and cURL, and they were all similarly slow. Any ideas on where to look to fix this?
Note: This has been fixed in PHP 5.6.14. A Connection: close header will now automatically be sent even for HTTP/1.0 requests. See commit 4b1dff6.
I had a hard time figuring out the cause of the slowness of file_get_contents scripts.
By analyzing it with Wireshark, the issue (in my case and probably yours too) was that the remote web server DIDN'T CLOSE THE TCP CONNECTION UNTIL 15 SECONDS had passed (i.e. keep-alive).
Indeed, file_get_contents doesn't send a Connection HTTP header, so the remote web server considers it a keep-alive connection by default and doesn't close the TCP stream until 15 seconds have passed (this might not be a standard value; it depends on the server configuration).
A normal browser would consider the page fully loaded once the HTTP payload length reaches the length specified in the response's Content-Length header. file_get_contents doesn't do this, and that's a shame.
SOLUTION
SO, if you want to know the solution, here it is:
$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n"))); // double quotes, so \r\n is interpreted
file_get_contents("http://www.something.com/somepage.html", false, $context);
The point is simply to tell the remote web server to close the connection when the download is complete, since file_get_contents isn't intelligent enough to do it by itself using the response's Content-Length header.
I would use cURL to fetch the external content, as it is much quicker than the file_get_contents method. Not sure if this will solve the issue, but it's worth a shot.
Also note that your server's speed will affect the time it takes to retrieve the file.
Here is an example of usage:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://example.com/test.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
Sometimes it's because DNS resolution is too slow on your server. Try this: replace
echo file_get_contents('http://www.google.com');
with
$context=stream_context_create(array('http' => array('header'=>"Host: www.google.com\r\n")));
echo file_get_contents('http://74.125.71.103', false, $context);
I had the same issue. The only thing that worked for me was setting a timeout in the $options array ($headers here is assumed to be an array of header lines defined elsewhere):
$options = array(
    'http' => array(
        'header'  => implode("\r\n", $headers), // standard (glue, pieces) argument order
        'method'  => 'POST',
        'content' => '',
        'timeout' => .5
    ),
);
$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
$string = file_get_contents("http://localhost/testcall/request.php", false, $context);
Time: 50976 ms (average time over 5 attempts in total)

$ch = curl_init();
$timeout = 5;
curl_setopt($ch, CURLOPT_URL, "http://localhost/testcall/request.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
echo $data = curl_exec($ch);
curl_close($ch);
Time: 46679 ms (average time over 5 attempts in total)

Note: request.php is used to fetch some data from a MySQL database.
Can you try fetching that URL, on the server, from the command line? curl or wget come to mind. If those retrieve the URL at a normal speed, then it's not a network problem and is most likely something in the Apache/PHP setup.
I have a huge amount of data passed back by an API, and I was using file_get_contents to read it, which took around 60 seconds. Using KrisWebDev's solution, it took around 25 seconds.
$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n"))); // note: the context key is 'http' even for https:// URLs
file_get_contents($url, false, $context);
What I would also consider with cURL is that you can "thread" the requests. This has helped me immensely, as I do not have access to a version of PHP that allows threading at the moment.
For example, I was getting 7 images from a remote server using file_get_contents and it was taking 2-5 seconds per request. That process alone added 30 seconds or so while the user waited for the PDF to be generated.
Running the requests in parallel literally reduced the time to about that of one image. As another example, I now verify 36 URLs in the time it previously took to do one. I think you get the point. :-)
$timeout = 30;
$retTxfr = 1;
$user = '';
$pass = '';
$master = curl_multi_init();
$node_count = count($curlList);
$keys = array("url");

for ($i = 0; $i < $node_count; $i++) {
    foreach ($keys as $key) {
        if (empty($curlList[$i][$key])) continue;
        $ch[$i][$key] = curl_init($curlList[$i][$key]);
        curl_setopt($ch[$i][$key], CURLOPT_TIMEOUT, $timeout); // timeout after X seconds
        curl_setopt($ch[$i][$key], CURLOPT_RETURNTRANSFER, $retTxfr);
        curl_setopt($ch[$i][$key], CURLOPT_HTTPAUTH, CURLAUTH_ANY);
        curl_setopt($ch[$i][$key], CURLOPT_USERPWD, "{$user}:{$pass}");
        curl_multi_add_handle($master, $ch[$i][$key]);
    }
}

// Run all requests at once; finish when done or when the timeout is hit
do {
    curl_multi_exec($master, $running);
    curl_multi_select($master); // wait for activity instead of busy-looping
} while ($running > 0);
Then check over the results:
// (inside the same $i/$key loop as above)
$results[$i][$key] = curl_multi_getcontent($ch[$i][$key]); // collect the response body
if ((int)curl_getinfo($ch[$i][$key], CURLINFO_HTTP_CODE) > 399 || empty($results[$i][$key])) {
    unset($results[$i][$key]);
} else {
    $results[$i]["options"] = $curlList[$i]["options"];
}
curl_multi_remove_handle($master, $ch[$i][$key]);
curl_close($ch[$i][$key]);
Then close the multi handle:
curl_multi_close($master);
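For reference, the snippet above assumes $curlList is an array shaped roughly like this (the URLs are placeholders):
$curlList = array(
    array("url" => "http://example.com/image1.jpg", "options" => array()),
    array("url" => "http://example.com/image2.jpg", "options" => array()),
    // ...
);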
I know this is an old question, but I found it today and the answers didn't work for me. I didn't see anyone mention that the maximum number of connections per IP may be set to 1. In that case, you make an API request and the API makes another request back to you, because you use the full URL. That's why loading directly from disk works. For me, that fixed the problem:
if (strpos($file->url, env('APP_URL')) === 0) {
    $url = substr($file->url, strlen(env('APP_URL')));
} else {
    $url = $file->url;
}
return file_get_contents($url);

Retrieving web content multiple times simultaneously in PHP

I use cURL to retrieve web content from another site, but there are two problems:
first, it takes an average of 4 seconds to retrieve the content, and I don't know if there is any way to reduce that; second, sometimes the remote server returns a short error message instead of the full content.
I was thinking it would be very efficient to send 3 requests simultaneously, then check the first received response to see whether it's an error, and check the second response if the first one was an error. With this method there would be no need to wait the full 4 seconds for a retry if the first response was an error, but I don't know if it's possible.
I would appreciate it if anyone could write a simple example that sends multiple requests simultaneously.
Here is my initial code:
<?php
$then = microtime(true);
$url = 'http://www.example.com/StockInformationHandler.ashx?{%22Type%22:%22getstockprice%22,%22la%22:%22En%22,%22arr%22:%22IRO1LKGH0001,IRO1MADN0001,IRO1KAVR0001,IRO1BHMN0001,IRO1PNBA0001,...,IRO3BHLZ0001,,%22}'; // several hundred instrument codes elided here for readability
$ch = curl_init();
$timeout = 15;
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
// Get the URL content
$lines_string = curl_exec($ch);
curl_close($ch);
$now = microtime(true);
echo sprintf("Elapsed: %f", $now - $then);
echo $lines_string; // was a bare statement with no effect in the original
echo "\r\n";
?>
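One way to sketch the "send 3 requests at once" idea is curl_multi. This minimal version is an assumption-laden illustration, not a tested fix: the length check for "short error message" is a placeholder heuristic, and it waits for all three requests to finish; reacting to the very first completed response would need curl_multi_info_read.
$mh = curl_multi_init();
$handles = array();
for ($i = 0; $i < 3; $i++) {
    $handles[$i] = curl_init($url); // the same $url as above, requested three times
    curl_setopt($handles[$i], CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($handles[$i], CURLOPT_CONNECTTIMEOUT, 15);
    curl_multi_add_handle($mh, $handles[$i]);
}
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // block until there is activity instead of spinning
} while ($running > 0);
$content = false;
foreach ($handles as $h) {
    $body = curl_multi_getcontent($h);
    if ($content === false && strlen($body) > 100) { // crude "not a short error" heuristic
        $content = $body; // first plausible response wins
    }
    curl_multi_remove_handle($mh, $h);
    curl_close($h);
}
curl_multi_close($mh);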
