I am trying to download a file using PHP and cURL; here is my code:
set_time_limit(0);
// Set file to write to
$file = fopen($nameGenerated['path'], 'w+');
$ch = curl_init();
if ($ch == false) {
die('Failed to create curl handle');
}
// Set Curl options
curl_setopt($ch, CURLOPT_URL, str_replace(" ", "%20", $uri));
curl_setopt($ch, CURLOPT_FILE, $file);
// get curl response
curl_exec($ch);
curl_close($ch);
fclose($file);
The file is empty and curl_exec() always returns false. When I tried to get the error using curl_error($ch), there was no error.
Note: I am using Yii2.
Can someone help me understand what the problem is?
I managed to fix the problem by using the machine's IPv4 address instead of the usual 127.0.0.1. Now I know the cause: cURL was expecting an actual IP address, not localhost or 127.0.0.1. Thanks to all the people who tried to help!
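For anyone who hits the same thing, a rough sketch of that workaround (the host name, output path, and URL below are placeholders, not the actual values from my project) could look like this:
// Resolve the host name to an IPv4 address up front instead of using localhost/127.0.0.1.
$host = 'myhost.example';                    // placeholder host name
$ip   = gethostbyname($host);                // returns the IPv4 address, or the name itself on failure
$uri  = 'http://' . $ip . '/path/to/file';   // build the URL with the IP address

$file = fopen('/tmp/downloaded-file', 'w+'); // placeholder output path
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, str_replace(' ', '%20', $uri));
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4); // optionally force IPv4 resolution
curl_setopt($ch, CURLOPT_FILE, $file);

if (curl_exec($ch) === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
fclose($file);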
Currently I am successfully downloading a file with a PHP cURL request. The issue is that the remote file seems to get deleted after each download. Is there a setting that tells cURL not to delete the file after the download? I'm not finding anything on this when researching online; in fact, I'm only finding questions asking how to delete files after downloading, which is obviously already happening for me but is not my desired result.
Here is basically the code I am running currently:
$url = 'ftp://ftp.example.com/file.txt';
$username = 'username';
$password = 'password';
$filename = dirname(__FILE__) . '/file.csv';
$fp = fopen($filename, 'w');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_USERPWD, $username . ':' . $password);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FILE, $fp);
$output = curl_exec($ch);
curl_close($ch);
fclose($fp);
Would anybody happen to be able to provide me with some suggestions as to how NOT to delete the remote files?
Thanks!
Well, I've contacted the remote source and, sure enough, they were archiving the files themselves, just to keep the directory clean.
Thanks to all!
I'm using cURL to get the contents of a page on the same website and write it to another file (to convert a dynamic PHP file into a static PHP file for menu-caching purposes).
$dynamic = 'http://mysite.in/menu.php';
$static = "../menu-static.php" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
file_put_contents($static, $file);
die();
It works perfectly on localhost, but it takes too much time when running on the hosted website, and in the end the output file ($static = "../menu-static.php") is empty.
I can't determine where the problem is. Please help.
I've also tried file_get_contents instead of cURL, with no luck.
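One hedged way to narrow this down, assuming the hosted server allows outbound HTTP requests back to itself, is to set explicit timeouts and check curl_error(), so a failure gets logged instead of silently producing an empty file (the URL and path are the placeholders from the question):
$dynamic = 'http://mysite.in/menu.php';
$static  = '../menu-static.php';

$ch = curl_init($dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // give up if the connection alone takes too long
curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // give up if the whole transfer takes too long

$html = curl_exec($ch);
if ($html === false) {
    error_log('menu fetch failed: ' . curl_error($ch)); // log the reason instead of writing an empty file
} else {
    file_put_contents($static, $html);
}
curl_close($ch);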
This is my code, but it is not working. What is the problem with it? Do you have other code, or an idea of how to download a file from a server using FTP? I am trying this code on my localhost and also on my own server.
$curl = curl_init();
$file = fopen("ftpfile/file.csv", 'w');
curl_setopt($curl, CURLOPT_URL, "ftp:http://www.address.com/file.csv"); #input
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl, CURLOPT_FILE, $file); #output
curl_setopt($curl, CURLOPT_USERPWD, "myusername:mypassword");
curl_exec($curl);
I'm guessing one of two things: the fopen call failed (and you're not checking whether it succeeded by testing $file !== false), or the double use of CURLOPT_RETURNTRANSFER and CURLOPT_FILE is not acting as you expect.
CURLOPT_RETURNTRANSFER tells cURL to return the retrieved data from the curl_exec() call instead of outputting it directly to the browser. I suspect that if you did $data = curl_exec($curl); file_put_contents('ftpfile/file.csv', $data); you'd end up with a properly populated file. So... either remove the CURLOPT_RETURNTRANSFER option, or eliminate the whole CURLOPT_FILE business and write the file yourself using what the curl_exec() call returns.
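To illustrate the second option, a minimal sketch (using the same placeholder URL and credentials as the question, with a plain ftp:// scheme):
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, 'ftp://www.address.com/file.csv'); // placeholder FTP URL
curl_setopt($curl, CURLOPT_USERPWD, 'myusername:mypassword');      // placeholder credentials
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // have curl_exec() return the data

$data = curl_exec($curl);
if ($data === false) {
    die('Download failed: ' . curl_error($curl));
}
curl_close($curl);

// Write the returned data to disk ourselves instead of using CURLOPT_FILE.
file_put_contents('ftpfile/file.csv', $data);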
On my web hosting server, the file_get_contents() function is disabled. I am looking for an alternative. Please help.
file_get_contents() pretty much does the following:
$filename = "/usr/local/something.txt";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
Since file_get_contents() is disabled, though, I'm pretty convinced the above won't work either.
Depending on what you are trying to read, you might have other options; in my experience, hosts usually disable remote file reading specifically. If you are trying to read remote files (over the network, i.e. HTTP etc.), you could look into the cURL library functions.
You can open the file with fopen, get the contents of the file, and use them. And maybe cURL is useful to you? http://php.net/manual/en/book.curl.php
A bit of everything.
function ff_get($f) {
    if (!file_exists($f)) { return false; }

    // First try file_get_contents().
    $result = @file_get_contents($f);
    if ($result) { return $result; }

    // Fall back to fopen()/fread().
    $handle = @fopen($f, "r");
    $contents = @fread($handle, @filesize($f));
    @fclose($handle);
    if ($contents) { return $contents; }

    // Finally, try cURL if it is available.
    if (!function_exists('curl_init')) { return false; }
    $ch = @curl_init();
    @curl_setopt($ch, CURLOPT_URL, $f);
    @curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $output = @curl_exec($ch);
    @curl_close($ch);
    return $output ? $output : false;
}
The most obvious reason why file_get_contents() gets disabled is that it loads the whole file into main memory first. The code from code_burgar could pose problems if your host has assigned you a very low memory limit.
As a general rule, use file_get_contents() (or a replacement) only when you are sure the file to be loaded is small. With SplFileObject you can walk through a file line by line with a convenient interface; use that in case your file is big.
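For example, a minimal sketch of walking a file line by line with SplFileObject (the path is just a placeholder):
$file = new SplFileObject('/path/to/large-file.txt', 'r'); // placeholder path
$file->setFlags(SplFileObject::DROP_NEW_LINE);             // strip trailing newlines

foreach ($file as $lineNumber => $line) {
    // Process one line at a time; the whole file is never loaded into memory.
    echo $lineNumber . ': ' . $line . PHP_EOL;
}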
Try this code:
$ch = curl_init();
$timeout = 5; // set to zero for no timeout
curl_setopt ($ch, CURLOPT_URL, $url);
curl_setopt ($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt ($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$content = curl_exec($ch);
curl_close($ch);
I assume you are trying to access a file remotely through http:// or ftp://.
In theory, there are alternatives like fread() and, if all else fails, fsockopen().
But if the provider is any good at what they do, those will be disabled too.
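For completeness, a hedged sketch of fetching a page over plain HTTP with fsockopen(), assuming outbound sockets are allowed (host and path are placeholders):
$host = 'www.example.com'; // placeholder host
$path = '/file.txt';       // placeholder path

$fp = fsockopen($host, 80, $errno, $errstr, 5);
if ($fp === false) {
    die("Connection failed: $errstr ($errno)");
}

// Send a minimal HTTP/1.1 GET request.
fwrite($fp, "GET $path HTTP/1.1\r\nHost: $host\r\nConnection: close\r\n\r\n");

// Read the raw response (headers plus body).
$response = '';
while (!feof($fp)) {
    $response .= fgets($fp, 4096);
}
fclose($fp);

// Keep only the body by cutting everything up to the blank line after the headers.
$body = substr($response, strpos($response, "\r\n\r\n") + 4);
echo $body;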
Use the PEAR package Compat. It is more or less an official replacement of native PHP functions with PHP-coded solutions.
require_once 'PHP/Compat.php';
PHP_Compat::loadFunction('file_get_contents');
Or, if you don't wish to use the class, you can load it manually.
require_once 'PHP/Compat/Function/file_get_contents.php';
All Compat functions are wrapped in if (!function_exists()), so it is really fail-safe if your web host upgrades the server features later.
All functions can be used exactly the same as the native PHP ones, and the related constants are available as well!
List of all available functions
If all you are trying to do is trigger a hit on a given URL and you don't need to read the output, you can use cURL, provided your web host has it enabled on your server.
The documentation here gives an example of calling a URL using cURL.
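As a rough sketch (the URL is a placeholder), firing a request and discarding the output could look like:
$ch = curl_init('http://www.example.com/ping');  // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // keep the response out of the browser
curl_setopt($ch, CURLOPT_TIMEOUT, 5);            // don't hang the script on a slow endpoint
curl_exec($ch);                                  // return value intentionally ignored
curl_close($ch);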
If all else fails, there's always cURL. There's a good chance it's installed.