Downloading remote files with PHP/cURL: a bit more robustly - php

I have a script that pulls URLs from the database and downloads them (pdf or jpg) to a local file.
Code is:
$cp = curl_init($remote_url);
$fp = fopen($dest_temp, "w");
curl_setopt($cp, CURLOPT_FILE, $fp);
curl_setopt($cp, CURLOPT_HEADER, FALSE);
curl_exec($cp);
curl_close($cp);
fclose($fp);
If the remote file is there, it works fine. If the remote file is not there, the script just bombs and the browser hangs forever.
What's the best approach to handling this? Should I somehow check for the file first, or can I set options above that will handle it (see the sketch below)? I tried setting timeouts, but that had no effect.
This is my first experience using cURL.
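For reference, a minimal sketch of the option-based route asked about above. The timeout values are arbitrary, the error handling is only illustrative, and $remote_url / $dest_temp are the same variables as in the snippet:
$cp = curl_init($remote_url);
$fp = fopen($dest_temp, "w");
curl_setopt($cp, CURLOPT_FILE, $fp);            // write the response body straight to the file
curl_setopt($cp, CURLOPT_FAILONERROR, true);    // treat HTTP 4xx/5xx responses as cURL errors
curl_setopt($cp, CURLOPT_FOLLOWLOCATION, true); // follow redirects
curl_setopt($cp, CURLOPT_CONNECTTIMEOUT, 10);   // seconds to wait for the connection
curl_setopt($cp, CURLOPT_TIMEOUT, 60);          // seconds allowed for the whole transfer
$ok = curl_exec($cp);
$err = curl_error($cp);
$code = curl_getinfo($cp, CURLINFO_HTTP_CODE);
curl_close($cp);
fclose($fp);
if (!$ok || $code !== 200) {
    unlink($dest_temp); // don't keep an empty or partial file; log $err and move on
}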

I used to use wget much as you're using cURL and got frustrated with the lack of visibility into what is going on, because it's essentially calling out to an external program.
I use Perl's WWW::Mechanize, and the link below is a PHP version which might be a bit more robust for dealing with cases like this.
http://www.compasswebpublisher.com/php/www-mechanize-for-php
Hope this helps.

Related

PHP file_get_contents behavior across versions and operating systems

I am using file_get_contents in PHP to get information from a client's collections on contentDM. CDM has an API so you can get that info by making php queries, like, say:
http://servername:port/webutilities/index.php?q=function/arguments
It has worked pretty well thus far, across computers and operating systems. However, this time things work a little differently.
http://servername/utils/collection/mycollectionname/id/myid/filename/myname
For this query I fill in mycollection, myid, and myname with relevant values. myid and mycollection have to exist in the system, obviously. However, myname can be anything you want. When you run the query, it doesn't return a web page or anything to your browser. It just automatically downloads a file with myname as the name of the file, and puts it in your local /Downloads folder.
I DON'T WISH TO DOWNLOAD THIS FILE. I just want to read the contents of the file it returns directly into PHP as a string. The file I am trying to get just contains xml data.
file_get_contents works to get the data in that file if I use it with PHP 7 and Apache on my laptop running Ubuntu. But on my desktop, which runs Windows 10 and XAMPP (Apache and PHP 5), I get this error (I've replaced sensitive data with ###):
Warning:
file_get_contents(###/utils/collection/###/id/1110/filename/1111.cpd):
failed to open stream: No such file or directory in
D:\Titus\Documents\GitHub\NativeAmericanSCArchive\NASCA-site\api\update.php
on line 18
My coworkers have been unable to help me so I am curious if anyone here can confirm or deny whether this is an operating system issue, or a PHP version issue, and whether there's a solid alternative method that is likely to work in PHP5 and on both Windows and Ubuntu.
file_get_contents() is a simple screwdriver. It's very good for fetching data with simple GET requests where the headers, HTTP request method, timeout, cookie jar, redirects, and other important things do not matter.
fopen() with a stream context, or cURL with setopt, are power drills with every bit and option you can think of.
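For example, a rough sketch of the stream-context route (the timeout and header values here are purely illustrative, and $url stands for your contentDM query URL):
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'GET',
        'timeout' => 10,                          // seconds
        'header'  => "Accept: application/xml\r\n",
    ),
));
$xml = file_get_contents($url, false, $context);
if ($xml === false) {
    // request failed; $http_response_header holds the response headers, if any were received
}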
In addition to this, due to some recent website hacks, we had to secure our sites more. In doing so, we discovered that file_get_contents failed to work, while cURL still did.
I'm not 100% sure, but I believe this php.ini setting may have been blocking the file_get_contents request.
; Disable allow_url_fopen for security reasons
allow_url_fopen = 0
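If you want to confirm that on a given machine, a quick runtime check (just a sketch) is:
// allow_url_fopen is what lets file_get_contents() open http:// URLs at all
if (!ini_get('allow_url_fopen')) {
    // remote URLs will fail with "failed to open stream"; fall back to cURL here
}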
Either way, our code now works with curl.
References:
http://25labs.com/alternative-for-file_get_contents-using-curl/
http://phpsec.org/projects/phpsecinfo/tests/allow_url_fopen.html
So, you can solve this problem by using the PHP cURL extension. Here is an example that does the same thing you were trying to do:
function curl($url)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
$url = 'your_api_url';
$data = curl($url);
Finally, you can inspect the result with print_r($data). Hope this helps and makes things clearer.
Reference : http://php.net/manual/en/book.curl.php

How to upload big files via Storage-ftp in Laravel 5.1

I'm trying to upload a big file to an FTP server with Laravel's Storage facade, and it keeps giving me the error
Out of memory (allocated 473432064) (tried to allocate 467402752 bytes)
I've tried to change the memory limit in php.ini and it still won't work. When I upload a file to the server normally it works, and the size doesn't matter.
I've tried everything, but nothing works.
Again: I'm trying to upload via FTP.
One more question: is there a way to upload the file directly to the FTP server from the client? I see that Storage first uploads to my server and then transfers it to the second server...
It definitely sounds like a PHP limitation. Raising the memory limit probably isn't the best way to do it, though; that leads to nothing but hassle, trust me.
The best method I can think of off the top of my head is to use Envoy (the server script tool, not the deployment service) to put together an SSH task. That way your job is executed outside of PHP, so you're not subject to the same memory limitations. Your Envoy script (envoy.blade.php in your project root) would probably look something like this:
@servers(['your_server_name' => 'your.server.ip'])

@task('upload', ['on' => ['your_server_name']])
    # perform your FTP setup, login etc.
    put your_big_file.extension
@endtask
I've only got one of these set up for a deployment job which is called from Jenkins, so I'm not sure if you can launch it from within Laravel, but I launch it from the command line like this:
vendor/bin/envoy run myJobName
Like I said, the only thing I can't quite remember is whether you can run Envoy from within Laravel itself, and the docs seem a little hazy on it. Definitely an option worth checking out though :)
https://laravel.com/docs/5.1/envoy
Finally, I solved the problem by using cURL instead of Storage.
$ch = curl_init();
$localfile = $file->getRealPath();
$fp = fopen($localfile, 'r');

curl_setopt($ch, CURLOPT_URL, 'ftp://domain/' . $fileName);
curl_setopt($ch, CURLOPT_USERPWD, "user:pass");
curl_setopt($ch, CURLOPT_UPLOAD, 1);                        // switch the handle to upload mode
curl_setopt($ch, CURLOPT_INFILE, $fp);                      // stream from the local file handle
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($localfile));

curl_exec($ch);
$error_no = curl_errno($ch);                                // 0 means the transfer succeeded
curl_close($ch);
fclose($fp);
Works perfectly! Better than Laravel's Storage option.
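As a side note, the Storage facade (via Flysystem) can also take an open stream instead of a string, which avoids loading the whole file into memory. I'm not certain plain put() accepts a resource on 5.1, so treat this as a sketch; on older versions the underlying driver's putStream() does the same job:
use Illuminate\Support\Facades\Storage;

$stream = fopen($file->getRealPath(), 'r');    // same $file as in the cURL snippet above

Storage::disk('ftp')->put($fileName, $stream); // streamed to the FTP disk rather than buffered

if (is_resource($stream)) {
    fclose($stream);
}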

Downloading Large file to server with php

I need to download large files to my server. I have a dedicated server on a 100 Mbps line, but it's taking too much time to download an 8 MB file. I use the code below. Is there any class that downloads files more quickly, one that chunks the file and pulls it down really fast?
<?php
$url = 'http://www.example.com/a-large-file.zip';
$path = '/path/to/a-large-file.zip';
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
Edit: the file is an mp4.
If your line is as quick as you say, then it's likely to be a bottleneck at the other end.
Don't forget, file transfer speed is affected not only by your download speed, but also by the other end's upload speed.
As such, it's unlikely there is anything you can do to improve this. Certainly no class or code will help you improve it by more than a few milliseconds; your best bet is to look at the network.
Specifically, check the number of concurrent connections from your end. It's all well and good having a decent line, but if it's being used for 100 connections, it's always going to be slower than one being used for a single connection.
Likewise, check the download from another machine/server. If it's just as slow from a different server, then it's almost certainly a bottleneck at the other end.
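If you want numbers rather than guesses, cURL can report where the time went. A small sketch using the same handle as in the question (it must run before curl_close()):
$info = curl_getinfo($ch); // gather transfer statistics before closing the handle
printf("DNS: %.2fs, connect: %.2fs, total: %.2fs, average speed: %d bytes/s\n",
    $info['namelookup_time'],
    $info['connect_time'],
    $info['total_time'],
    $info['speed_download']
);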

PHP Downloading a Huge Movie File (500 MB) with cURL

Okay, I have a problem that I hope you can help me fix.
I am running a server that stores video files that are very large, some up to 650 MB. I need a user to be able to request this page and have it download the file to their machine. I have tried everything, but a plain readfile() request hangs for about 90 seconds before quitting and gives me a "No data received (error 324)" message, a chunked readfile script that I found on several websites doesn't even start a download, FTP-through-PHP solutions did nothing but give me errors when I tried to get the file, and the only cURL solutions that I have found just create another file on my server. That is not what I need.
To be clear I need the user to be able to download the file to their computer and not to the server.
I don't know if this code is garbage or if it just needs a tweak or two, but any help is appreciated!
<?php
$fn = $_GET["fn"];
echo $fn."<br/>";
$url = $fn;
$path = "dl".$fn;
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
I wouldn't recommend serving large binary files using PHP, or any other scripting technology for that matter; they were never designed for this. You can use Apache, nginx, or whatever standard HTTP server you have on the back end. If you still need to use PHP, then you should probably check out readfile_chunked.
http://php.net/readfile#48683
And here's a great tutorial:
http://teddy.fr/blog/how-serve-big-files-through-php
Good luck.
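The chunked idea those links describe boils down to something like this (a sketch; the path is hypothetical and the 8 KB chunk size is arbitrary):
$file = '/path/to/movie.mp4';                    // hypothetical path on the server

header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');

set_time_limit(0);                               // a 650 MB transfer can outlive max_execution_time
$handle = fopen($file, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192);                   // send 8 KB at a time
    flush();                                     // push the chunk out to the client
}
fclose($handle);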
readfile() doesn't buffer. However, PHP itself might buffer; turn buffering off:
while (ob_get_level()) {
    ob_end_clean();
}
readfile($file);
Your web server might buffer too. Turn that off as well; how you do it depends on the web server and why it's buffering.
I see two problems that can happen:
First: your web server may be closing the connection due to a timeout. You should look at the web server config.
Second: a timeout in cURL. I recommend looking at this post.

How can I download a file to my server from another FTP server

I am a realtor/web designer. I am trying to write a script to download a zip file to my server from an FTP server. My website is all PHP/MySQL. The problem I am having is that I cannot actually log in to the FTP server. The file is provided through a link that is available to me, but access to the actual server is not. Here is the link I have access to:
ftp://1034733:ze3Kt699vy14@idx.living.net/idx_fl_ftp_down/idx_ftmyersbeach_dn/ftmyersbeach_data.zip
PHP's normal functions for accomplishing this give me a connection error; PHP does not seem to have permission to access this server... Any solution to this problem would be a lifesaver for me. I am looking to use a cron job to run this script every day so I don't have to physically download this file and upload it to my (GoDaddy) server, which is my current solution (not a good one, I know!).
Also, I can figure out how to unzip the file myself, as I have done some work with php's zip extension, but any tips for an efficient way to do this would be appreciated as well. I am looking to access a text file inside of the zip archive called "ftmyers_data.txt"
Do you have shell access to the server on which your PHP application runs? If so, you might be able to automate the retrieval using a shell script and the ftp shell command.
Use php_curl
$curl = curl_init();
$fh = fopen("file.zip", 'w');

curl_setopt($curl, CURLOPT_URL, "ftp://1034733:ze3Kt699vy14@idx.living.net/idx_fl_ftp_down/idx_ftmyersbeach_dn/ftmyersbeach_data.zip");
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1); // return the download as a string

$result = curl_exec($curl);                    // note: the whole zip is held in memory here
fwrite($fh, $result);
fclose($fh);
curl_close($curl);
Easy as pi. :)
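For the unzip part of the question, the bundled ZipArchive class can read the text file straight out of the downloaded archive. A sketch, assuming the zip was saved as file.zip as in the snippet above:
$zip = new ZipArchive();
if ($zip->open('file.zip') === true) {
    $text = $zip->getFromName('ftmyers_data.txt'); // returns the file contents, or false if missing
    $zip->close();
    // $text now holds the raw contents of ftmyers_data.txt
}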
