PHP curl download remote stream / chunked data - php

I have audio files on a remote server that are streamed / chunked to the user. This all works great in the client's browser.
But when I try to download and save the files locally from another server using curl, it only seems to be able to download small files (less than 10 MB) successfully; anything larger and it seems to only download the header.
I assume this is because of the chunking, so my question is: how do I make curl download the larger (chunked) files?
With wget on the CLI on Linux this is as simple as:
wget -cO - https://example.com/track?id=460 > mytrack.mp3
This is the function I have written using curl in PHP, but as I said, it only downloads the headers on large files:
private function downloadAudio($url, $fn) {
    $ch = curl_init($url);
    $path = TEMP_DIR . $fn;
    $fp = fopen($path, 'wb');
    curl_setopt($ch, CURLOPT_FILE, $fp);            // write the response body straight to the file handle
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_AUTOREFERER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    if (file_exists($path)) {
        return true;
    }
    return false;
}

In my case it was failing because I had forgotten to increase the default PHP memory_limit on the origin server.
It turned out after posting this question that it was actually downloading any file below roughly the 100 MB mark successfully, not 10 MB as I had stated in the question. As soon as I realised this I checked the memory_limit and, lo and behold, it was set to the default 128M.
I hadn't noticed any problems client side because the response was chunked, but when the server tried to push out an entire 300 MB file in less than a second the memory limit must have been reached.
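For reference, the two usual fixes on the origin server are raising memory_limit or streaming the file out in small chunks so the whole thing is never held in memory at once. A minimal sketch, where the file path and chunk size are just placeholders:
// Option 1: raise the limit for this request (only helps if the file fits in the new limit).
ini_set('memory_limit', '512M');

// Option 2 (better): stream the file out in small chunks so memory use stays flat.
$trackPath = '/path/to/track.mp3'; // hypothetical local path resolved from the requested track id

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($trackPath));

$in = fopen($trackPath, 'rb');
while (!feof($in)) {
    echo fread($in, 8192); // 8 KB at a time
    flush();
}
fclose($in);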

Related

How to upload a file from local to a server using curl in PHP?

I want to upload a file from my local machine to a server using curl in PHP, without using an HTML form.
My PHP version is 5.2.6.
I have tried many different ways and done a lot of research on uploading a file to a server using curl in PHP, but the solutions I found on Google did not solve my problem.
Below is my code:
// open file descriptor
$fp = fopen ("user.com/user/fromLocal.txt", 'w+') or die('Unable to write a file');
// file to download
$ch = curl_init('C:/wamp/www/user/fromLocal.txt');
// enable SSL if needed
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
// output to file descriptor
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
// set large timeout to allow curl to run for a longer time
curl_setopt($ch, CURLOPT_TIMEOUT, 1000);
curl_setopt($ch, CURLOPT_USERAGENT, 'any');
// Enable debug output
curl_setopt($ch, CURLOPT_VERBOSE, true);
echo $ch;
echo $fp;
curl_exec($ch);
curl_close($ch);
fclose($fp);
Expected output:
The file is uploaded to the server and can be viewed.
Actual output:
Unable to write the file
I think you are missing some important information.
fopen("user.com/user/fromLocal.txt", 'w+')
means nothing here: it is not a URL, so fopen just tries (and fails) to open it as a local path, which is where your "Unable to write a file" message comes from.
To send a file to a server, the server has to be configured to accept a POST request, and you have to send the file through that endpoint.
If you had an HTML form, would you submit it to "user.com/user/fromLocal.txt"?
You have to build the form data with curl and send it to a server that is ready to accept your request. There are different ways to accomplish that, and the simplest is exactly that: send the form using curl instead of HTML. But you absolutely cannot write a file to a server the way you are trying to.
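As a rough sketch of what that can look like (the upload.php endpoint and the userfile field name are assumptions, not something from your setup; CURLFile needs PHP 5.5+, so on 5.2.6 the legacy "@" prefix syntax applies):
$localPath = 'C:/wamp/www/user/fromLocal.txt';

// Hypothetical endpoint on the server that accepts the POST and saves the upload.
$ch = curl_init('http://user.com/user/upload.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

// PHP >= 5.5: $fields = array('userfile' => new CURLFile($localPath));
// PHP 5.2.6 (legacy syntax):
$fields = array('userfile' => '@' . $localPath);
curl_setopt($ch, CURLOPT_POSTFIELDS, $fields);

$response = curl_exec($ch);
curl_close($ch);
echo $response;
On the server side, that endpoint would then read the upload from $_FILES['userfile'] and move it into place with move_uploaded_file().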

Downloading a large file (~500 Mb) to local storage from FTP, without breaking the server

I'm required to download a large XML file from a remote FTP server to local storage so that I can process it.
I've defined an FTP driver that can access the file. However, because of the size of the file, PHP runs out of memory during the operation.
Storage::disk('ftp')->get('path/to/file/bigass.xml');
Is there a way to download the file without eating up all the memory?
I suggest switching to a "plain old" curl solution, using something like this:
$curl = curl_init();
$fh = fopen("localfilename.xml", 'w');
curl_setopt($curl, CURLOPT_URL, "ftp://{$ftp_username}:{$ftp_password}@{$ftp_server}/path/to/file/bigass.xml");
// Write the transfer straight to the file handle instead of buffering the
// whole ~500 MB response in memory with CURLOPT_RETURNTRANSFER.
curl_setopt($curl, CURLOPT_FILE, $fh);
curl_exec($curl);
fclose($fh);
curl_close($curl);

Slow Download Speed on Windows LocalHost

I installed a local web server (XAMPP, WampServer) on a VDS. When I try to get a file using PHP curl or file_get_contents, the download speed is very low: a 100 MB file takes about 10 minutes. If I download the same file with a browser, it takes only 3 seconds. What could be the reason?
Thanks for your interest.
Downloading content at a specific URL is common practice on the internet, especially due to the increased usage of web services and APIs offered by Amazon, Alexa, Digg, etc. PHP's cURL extension, which often comes enabled in default shared hosting configurations, lets web developers do this.
You can try:
/* gets the data from a URL */
function get_data($url) {
    $ch = curl_init();
    $timeout = 5;
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout); // give up if connecting takes longer than 5 s
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
Usage:
$returned_content = get_data('http://davidwalsh.name'); // something like this
Alternatively, you can use the file_get_contents function remotely, but many hosts don't allow this.
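For completeness, a minimal sketch of that alternative (it only works when allow_url_fopen is enabled in php.ini; the URL is just an example):
// Requires allow_url_fopen = On in php.ini.
$content = file_get_contents('http://davidwalsh.name');
if ($content === false) {
    echo 'Remote fetch failed or is blocked on this host.';
}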

Unable to save picture from url to my server

I'm trying to save some pictures from a URL to my server, but I'm not able to do it.
This is my code (it's fairly standard; I found it on the internet):
$ch = curl_init($url);
$fp = fopen($img, 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
But for every link I put in the $url variable, a 26-byte image (which is not the original image) is saved on my server.
What's wrong?
I can successfully download the image using your curl code, so it may be that your server is not allowed to connect to outside web links.
Here is equivalent code that downloads the image without curl. I suspect that from your server you will not be able to download the image with this code either.
file_put_contents("img.jpg", file_get_contents("http://www.letteratu.it/wp-content/uploads/2014/03/cielo.jpg"));
Run curl in verbose mode to see its debug messages, and show them to us:
curl_setopt($ch, CURLOPT_VERBOSE, 1);
I'm pretty sure you need to include the http:// in the URL. I'm fairly certain that without it curl treats the location as a local file (i.e., an implicit file://).
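A quick guard along those lines, as a sketch rather than anything from the original code:
// Prepend a scheme if the URL doesn't already have one,
// so curl doesn't treat it as a local path.
if (!preg_match('#^https?://#i', $url)) {
    $url = 'http://' . $url;
}
$ch = curl_init($url);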

php curl_exec doesn't react to server response

I'm trying to upload files to another server via curl. The script uploads a file to my own server, my server processes the upload and responds, and my script keeps going as it should. However, if the uploaded file is big (about 500 MB; the threshold seems to differ), the script keeps running even after my server has responded (as far as I can tell from the output logged by the server-side script). The client is a Windows 7 x64 machine running XAMPP (different versions tried). If I upload the same file to the same server-side script via an HTML form, everything works fine and I get my response. What could be the problem?
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    "MAX_FILE_SIZE" => "2147483648",
    "action"        => "upload",
    "userfile"      => "@" . $filepath)); // legacy "@" upload syntax (pre-CURLFile)
if ($uploadresponse = curl_exec($ch)) {
    echo "Upload done!";
} else {
    echo "Curl error no. " . curl_errno($ch) . " (" . curl_error($ch) . ")";
}
curl has a default timeout within which it has to finish, regardless of success. You can extend this time using the following curl option (I suspect you are hitting the timeout while uploading a large file):
curl_setopt($ch, CURLOPT_TIMEOUT, 5 * 60); // 5 minutes
It is also possible to remove the time limit entirely by setting CURLOPT_TIMEOUT to 0.
curl_setopt($ch, CURLOPT_TIMEOUT, 0);
