Is it possible to optimize this request with the cURL multi handler?
Thanks
$array_img = array(
'https://www.foooooobbbaaarrr.fr/images/1.jpeg',
'https://www.foooooobbbaaarrr.fr/images/2.jpeg',
'https://www.foooooobbbaaarrr.fr/images/3.jpeg',
'https://www.foooooobbbaaarrr.fr/images/4.jpeg');
foreach ($array_img as $k => $v)
{
$ch = curl_init($v);
$name = ($k + 1).'.jpeg';
$fp = fopen($name, 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
}
That's a very open-ended question, but with PHP's curl_multi functions you can at least fetch all of those files in parallel instead of one after another.
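A minimal sketch of that approach, reusing the URL list and numbered-filename scheme from your loop (no error handling, just the parallel structure):

```php
<?php
// Download a list of URLs in parallel with curl_multi, writing each
// response body to sequentially numbered files (1.jpeg, 2.jpeg, ...).
function downloadImagesParallel(array $urls)
{
    $mh  = curl_multi_init();
    $chs = array();
    $fps = array();

    foreach ($urls as $k => $url) {
        $fp = fopen(($k + 1) . '.jpeg', 'wb');
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_FILE, $fp);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_multi_add_handle($mh, $ch);
        $chs[] = $ch;
        $fps[] = $fp; // keep file handles open until the transfers finish
    }

    $running = 0;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // avoid busy-waiting
    } while ($running > 0);

    foreach ($chs as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    foreach ($fps as $fp) {
        fclose($fp);
    }
}

// Usage with the array from the question:
// downloadImagesParallel($array_img);
```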
When I use an fopen handle, loading times become drastically long: about 10 seconds, to be precise.
$handle = fopen("http://services.runescape.com/m=hiscore_oldschool/index_lite.ws?player={$pname}", "r");
if (FALSE === $handle) {
echo "Failed";
exit("Failed to open stream to URL");
}
$contents = '';
while (!feof($handle)) {
$contents .= fread($handle, 32);
}
Above is the code I am using.
Is this an inefficient way to do it? If so, what is the efficient way?
I also explode these results and print them, but even after removing the printing part, the load times are unchanged.
You may be able to speed it up by making the HTTP request a different way (reading the stream only 32 bytes per fread() call is also needlessly small). You can use cURL:
$url = "http://services.runescape.com/m=hiscore_oldschool/index_lite.ws?player={$pname}";
$c = curl_init();
curl_setopt($c, CURLOPT_URL, $url);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
curl_setopt($c, CURLOPT_TIMEOUT, 10);
$contents = curl_exec($c);
curl_close($c);
Or try it via:
$contents = file_get_contents( "http://services.runescape.com/m=hiscore_oldschool/index_lite.ws?player={$pname}" );
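If you stay with file_get_contents, you can at least bound how long the request is allowed to hang by passing a stream context with an explicit timeout (the 10-second value here is just an example):

```php
<?php
// A stream context lets file_get_contents fail fast instead of
// blocking for the full default socket timeout.
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'GET',
        'timeout' => 10, // seconds (example value)
    ),
));

// $contents = file_get_contents($url, false, $context);
// if ($contents === false) { /* handle the error */ }
```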
I want example_homepage.html to be updated once a day. How can I do that?
<?php
$ch = curl_init("http://rss.news.yahoo.com/rss/oddlyenough");
$fp = fopen("example_homepage.html", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
$xml = simplexml_load_file('example_homepage.html');
print "<ul>\n";
foreach ($xml->channel->item as $item){
print "<li>$item->title</li>\n";
}
print "</ul>";
?>
Do you know what I mean? This page is very slow, so I want example_homepage.html to update only once a day, but I don't know how to do it.
You can use a cron job for this in a Linux environment.
For more info:
https://help.ubuntu.com/community/CronHowto
Or you can use a third-party cron service, such as:
http://cronless.com/
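If you'd rather keep everything in PHP, a common alternative is to refresh the cached file only when it is older than a day, checked on each page load. A sketch (the 86400-second threshold, helper name, and crontab path below are all just illustrative):

```php
<?php
// Example crontab entry if you go the cron route (path is hypothetical):
//   0 3 * * * /usr/bin/php /path/to/refresh.php

// Return true when $path is missing or older than $maxAge seconds,
// i.e. when the cached copy should be re-downloaded.
function cacheIsStale($path, $maxAge = 86400)
{
    return !file_exists($path) || (time() - filemtime($path)) > $maxAge;
}

if (cacheIsStale('example_homepage.html')) {
    // ... run the cURL download from the question here ...
}
// ... then parse example_homepage.html with simplexml_load_file() as before.
```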
My current script to upload photos goes like this:
foreach($files as $file) {
$data = base64_encode(file_get_contents($file));
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
}
I can have up to 1000 files to upload to a remote server, and it can take a long time for cURL to process everything. The solution seems to be multi-cURL; however, there is one unique aspect: I need multi-cURL to save each response into an array like $upload_results[] = array($file, 'response').
How do I do this?
Thanks!
Essentially, this can be done by creating the handles in an array with the file names as keys and then reading the results into another array with the same keys.
function uploadLotsOfFiles($files) {
$mh = curl_multi_init();
$handles = array();
$results = array();
foreach ($files as $file) {
$handles[$file] = curl_init();
curl_setopt($handles[$file], CURLOPT_RETURNTRANSFER, true);
//File reading code and other options from your question go here
curl_multi_add_handle($mh, $handles[$file]);
}
$running = 0;
do {
curl_multi_exec($mh, $running);
curl_multi_select($mh); //Prevent eating CPU
} while($running > 0);
foreach($handles as $file => $handle) {
$results[$file] = curl_multi_getcontent($handle);
curl_multi_remove_handle($mh, $handle);
}
curl_multi_close($mh);
return $results;
}
If you really need the result in the format you specified (which I don't recommend, since it's less elegant than using the file as the key):
$results[] = array($file, curl_multi_getcontent($handle));
Can be used in place of $results[$file] = curl_multi_getcontent($handle);
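For completeness, the per-handle setup that the placeholder comment stands for might look like the following, reusing the base64 POST body from your original loop ($url is assumed to be the same endpoint variable, and the function name is just illustrative):

```php
<?php
// Configure one upload handle the way the original serial loop did:
// POST the base64-encoded file contents to $url.
function configureUploadHandle($ch, $file, $url)
{
    $data = base64_encode(file_get_contents($file));
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    return $ch;
}
```

With ~1000 files you may also want to add handles to the multi handle in batches (say, a few dozen at a time) rather than opening a thousand simultaneous connections.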
I'm trying to use cURL to download images from a URL with multiple connections to speed up the process.
Here's my code:
function multiRequest($data, $options = array()) {
// array of curl handles
$curly = array();
// data to be returned
$result = array();
// multi handle
$mh = curl_multi_init();
// loop through $data and create curl handles
// then add them to the multi-handle
foreach ($data as $id => $d) {
$path = 'image_'.$id.'.png';
if(file_exists($path)) { unlink($path); }
$fp = fopen($path, 'x');
$url = $d;
$curly[$id] = curl_init($url);
curl_setopt($curly[$id], CURLOPT_HEADER, 0);
curl_setopt($curly[$id], CURLOPT_FILE, $fp);
fclose($fp);
curl_multi_add_handle($mh, $curly[$id]);
}
// execute the handles
$running = null;
do {
curl_multi_exec($mh, $running);
} while($running > 0);
// get content and remove handles
foreach($curly as $id => $c) {
curl_multi_remove_handle($mh, $c);
}
// all done
curl_multi_close($mh);
}
And executing:
$data = array(
'http://example.com/img1.png',
'http://example.com/img2.png',
'http://example.com/img3.png'
);
$r = multiRequest($data);
It's not really working. It creates the 3 files, but they are zero bytes (empty), it gives me the following error (3 times), and it prints some of the content of the original PNGs:
Warning: curl_multi_exec(): CURLOPT_FILE resource has gone away, resetting to default in /Applications/MAMP/htdocs/test.php on line 34
Could you please let me know how to work this out? Thanks in advance for your help!
What you are doing is creating a file handle and then closing it before the transfer runs; since curl_multi_exec() executes the requests only after the loop, cURL has no open file to write to by then. Try something like this:
//$fp = fopen($path, 'x'); Remove
$url = $d;
$curly[$id] = curl_init($url);
curl_setopt($curly[$id], CURLOPT_HEADER, 0);
curl_setopt($curly[$id], CURLOPT_FILE, fopen($path, 'x'));
//fclose($fp); Remove
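Equivalently, you can keep the file handles in an array alongside the cURL handles and close them only after the transfers have finished. A sketch under the same assumptions as the question's code (the function name is illustrative):

```php
<?php
// Parallel download to files: keep every file handle open until
// curl_multi has finished writing, then close them all.
function multiDownload(array $urls)
{
    $mh  = curl_multi_init();
    $chs = array();
    $fps = array();

    foreach ($urls as $id => $url) {
        $path = 'image_' . $id . '.png';
        $fps[$id] = fopen($path, 'wb');
        $chs[$id] = curl_init($url);
        curl_setopt($chs[$id], CURLOPT_HEADER, 0);
        curl_setopt($chs[$id], CURLOPT_FILE, $fps[$id]);
        curl_multi_add_handle($mh, $chs[$id]);
    }

    $running = 0;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // avoid busy-waiting
    } while ($running > 0);

    foreach ($chs as $id => $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
        fclose($fps[$id]); // safe now: the transfer is complete
    }
    curl_multi_close($mh);
}
```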
I have tried to download an image from a PHP link. When I try the link in a browser, it downloads the image. I enabled cURL and set "allow_url_fopen" to true. I've used the methods discussed here: Saving image from PHP URL, but they didn't work. I've tried file_get_contents too, but it didn't work.
I made a few changes, but it still doesn't work. This is the code:
$URL_path='http://…/index.php?r=Img/displaySavedImage&id=68';
$ch = curl_init ($URL_path);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER,1);
$raw=curl_exec($ch);
curl_close ($ch);
$fp = fopen($path_tosave.'temp_ticket.jpg','wb');
fwrite($fp, $raw);
fclose($fp);
Do you have any idea how to make it work? Please help. Thanks.
<?php
if( ini_get('allow_url_fopen') ) {
//set the index url
$source = file_get_contents('http://…/index.php?r=Img/displaySavedImage&id=68');
$filestr = "temp_ticket.jpg";
$fp = fopen($filestr, 'wb');
if ($fp !== false) {
fwrite($fp, $source);
fclose($fp);
}
else {
// File could not be opened for writing
}
}
else {
// allow_url_fopen is disabled
// See here for more information:
// http://php.net/manual/en/filesystem.configuration.php#ini.allow-url-fopen
}
?>
This is what I used to save an image without an extension (a dynamic image generated by the server). Hope it works for you. Just make sure that the file path is fully qualified and points to an image. As @ComFreek pointed out, you can use file_put_contents(), which is equivalent to calling fopen(), fwrite() and fclose() successively to write data to a file.
You can use it as a function:
function getFile($url){
$ch = curl_init();
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$tmp = curl_exec($ch);
curl_close($ch);
if ($tmp !== false){
return $tmp;
}
return false;
}
And to call it:
$content = getFile(URL);
Or save its content to a file:
file_put_contents(PATH, getFile(URL));
You're missing a closing quote and semicolon on the first line:
$URL_path='http://…/index.php?r=Img/displaySavedImage&id=68';
Also, your URL is in $URL_path but you initialise cURL with $path_img which is undefined based on the code in the question.
Why use cURL when file_get_contents() does the job?
<?php
$img = 'http://…/index.php?r=Img/displaySavedImage&id=68';
$data = file_get_contents( $img );
file_put_contents( 'img.jpg', $data );
?>
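One caveat: file_get_contents() returns false on failure, so it's worth checking the result before writing, otherwise you would save an empty file. A small hedged sketch (the function name is illustrative):

```php
<?php
// Download $url and save it to $dest only if the fetch succeeded;
// file_get_contents() returns false on failure.
function saveImage($url, $dest)
{
    $data = @file_get_contents($url);
    if ($data === false) {
        return false;
    }
    return file_put_contents($dest, $data) !== false;
}

// saveImage($img, 'img.jpg');
```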