file_get_contents not working for local files - php

I recently upgraded my XAMPP installation from PHP 5.2 to 5.3.1, and I seem to be having a problem with file_get_contents().
I can use the function to fetch something like "http://www.google.com", but it times out when I use it on a domain I have set up locally, e.g. "http://localhost/my_dir/my_css_file.css".
I'm not really sure what the problem is. If it's a bug, is there a viable alternative?
Kindly advise.

Try using include() instead of file_get_contents():
<?php include($_SERVER['DOCUMENT_ROOT'] . "/my_dir/my_css_file.css"); ?>
or, if allow_url_include is enabled, as a URL:
<?php include('http://' . $_SERVER['HTTP_HOST'] . "/my_dir/my_css_file.css"); ?>
Update, in response to your comments:
$string = get_include_contents('somefile.php');

function get_include_contents($filename) {
    if (is_file($filename)) {
        ob_start();                    // capture everything the include prints
        include $filename;
        $contents = ob_get_contents(); // grab the buffered output
        ob_end_clean();
        return $contents;
    }
    return false;
}
This will capture the file's output into the variable $string. Note that, unlike file_get_contents(), include executes any PHP in the file rather than returning it as raw source.

Any chance you're on a Windows system?
There is a bug in the combination of Windows, file_get_contents() and localhost which is not going to be fixed. See Bug 38826 and Bug 40881.
Try using 127.0.0.1 instead of localhost, or set up a different domain name. Then you should get it working.
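For example, a minimal check, reusing the CSS path from the question:
// Same request as before, but via 127.0.0.1 to dodge the localhost bug
$css = file_get_contents('http://127.0.0.1/my_dir/my_css_file.css');
if ($css === false) {
    echo 'Request failed';
}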

Solved this by using cURL. Here's the code; it will work with remote files, e.g. http://yourdomain.com/file.ext:
$file_path_str = 'http://yourdomain.com/file.ext'; // URL to fetch
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $file_path_str);
curl_setopt($ch, CURLOPT_HTTPGET, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_USERAGENT, sprintf('Mozilla/%d.0', rand(4, 5)));
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$curl_response_res = curl_exec($ch);
curl_close($ch);
I could not use @James' solution because I'm using ob_start() and ob_flush() elsewhere in my code, so that would have messed things up for me.

I recently solved this problem. On my Windows machine it is acceptable to have spaces in folder names; however, PHP wasn't able to read this path.
I changed my folder name
From:
C:\Users\JasonPC\Desktop\Jasons Work\Project
To:
C:\Users\JasonPC\Desktop\JasonsWork\Project
Then PHP was able to read my files again.
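If you hit something similar, a quick diagnostic (the path below is just the example from above) is to ask PHP directly whether it can see the directory:
// Does PHP see, and have permission to read, the path?
var_dump(file_exists('C:\\Users\\JasonPC\\Desktop\\Jasons Work\\Project'));
var_dump(is_readable('C:\\Users\\JasonPC\\Desktop\\Jasons Work\\Project'));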

Related

cURL cannot download file with PHP

I am trying to download a file using PHP and cURL. Here is my code:
set_time_limit(0);
// Set the file to write to
$file = fopen($nameGenerated['path'], 'w+');
$ch = curl_init();
if ($ch === false) {
    die('Failed to create curl handle');
}
// Set cURL options
curl_setopt($ch, CURLOPT_URL, str_replace(' ', '%20', $uri));
curl_setopt($ch, CURLOPT_FILE, $file);
// Execute the request; the response body is written to $file
curl_exec($ch);
curl_close($ch);
fclose($file);
The file is empty, and curl_exec() always returns false. When I tried to get the errors using curl_error($ch), there was no error.
Note: I am using Yii2.
Can someone help me understand what's the matter?
I managed to fix the problem by using the machine's actual IPv4 address instead of the usual 127.0.0.1. Now I know the problem: cURL was expecting an IP address, not localhost or 127.0.0.1. Thanks to all the people who tried to help!
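A sketch of that fix; gethostbyname(gethostname()) is one way to look up the machine's LAN IPv4 address, and the path here is hypothetical:
// Use the machine's own IPv4 address instead of localhost/127.0.0.1
$ip  = gethostbyname(gethostname());
$uri = 'http://' . $ip . '/path/to/file.ext';
curl_setopt($ch, CURLOPT_URL, str_replace(' ', '%20', $uri));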

How to download file using PHP cURL and NOT delete file from remote server

Currently I am successfully downloading a file using a PHP cURL request. The issue is that the remote file seems to get deleted after each download. Is there a setting to tell cURL not to delete the file after the download? I'm not finding anything on this when researching online; I'm actually only finding questions about how to delete files after downloading, which obviously already works for me, though it's not my desired result.
Here is basically the code I am running currently:
$url = 'ftp://ftp.example.com/file.txt';
$username = 'username';
$password = 'password';
$filename = dirname(__FILE__) . '/file.csv';
$fp = fopen($filename, 'w');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_USERPWD, $username . ':' . $password);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FILE, $fp);
$output = curl_exec($ch);
curl_close($ch);
fclose($fp);
Would anybody be able to offer some suggestions as to how NOT to delete the remote files?
Thanks!
Well, I've contacted the remote source, and sure enough, they were archiving the files just to keep the directory clean. So cURL was never deleting anything; a plain download doesn't remove the remote file.
Thanks to all!

php curl or file_get_contents too slow on hosted site

I'm using cURL to fetch the contents of a page on the same website and write them to another file (to convert a dynamic PHP file into a static one, for menu caching purposes):
$dynamic = 'http://mysite.in/menu.php';
$static = '../menu-static.php';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
file_put_contents($static, $file);
die();
It works perfectly on localhost, but it takes too much time when running on the hosted website, and in the end the output file ($static = "../menu-static.php") is empty.
I can't determine where the problem is. Please help.
I've also tried file_get_contents() instead of cURL, with no luck.
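If the goal is just to snapshot menu.php into a static file, one workaround is to skip the HTTP round trip entirely and render it in-process, along the lines of the output-buffering answer above (a sketch; it assumes menu.php sits in the document root and only needs to be executed, not fetched over HTTP):
// Render menu.php locally instead of requesting it over HTTP
ob_start();
include $_SERVER['DOCUMENT_ROOT'] . '/menu.php';
file_put_contents('../menu-static.php', ob_get_clean());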

How to download a file from a server using FTP and save it in a local folder

This is my code, but it is not working. What's the problem with it? Do you have any other code? Please give me an idea of how to download a file from a server using FTP. I am trying this code on my localhost and also on my own server.
$curl = curl_init();
$file = fopen("ftpfile/file.csv", 'w');
curl_setopt($curl, CURLOPT_URL, "ftp://www.address.com/file.csv"); # input
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl, CURLOPT_FILE, $file); # output
curl_setopt($curl, CURLOPT_USERPWD, "myusername:mypassword");
curl_exec($curl);
I'm guessing one of two things: the fopen() call failed (and you're not checking whether it succeeded by testing $file !== false), or the double use of CURLOPT_RETURNTRANSFER and CURLOPT_FILE is not acting as you expect.
CURLOPT_RETURNTRANSFER tells cURL to return the retrieved data from the exec call instead of outputting it directly to the browser. I suspect that if you did $data = curl_exec($curl); file_put_contents('ftpfile/file.csv', $data); you'd end up with a properly populated file. So: either remove the CURLOPT_RETURNTRANSFER option, or eliminate the whole CURLOPT_FILE business and write the file yourself from what curl_exec() returns.
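Putting that together, a corrected sketch of the original code (drop CURLOPT_RETURNTRANSFER, keep CURLOPT_FILE, and check the fopen() result; the URL and credentials are the placeholders from the question):
$file = fopen("ftpfile/file.csv", 'w');
if ($file === false) {
    die('Could not open ftpfile/file.csv for writing');
}
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, "ftp://www.address.com/file.csv");
curl_setopt($curl, CURLOPT_USERPWD, "myusername:mypassword");
curl_setopt($curl, CURLOPT_FILE, $file); // write the download straight to $file
curl_exec($curl);
curl_close($curl);
fclose($file);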

Is there any alternative for the function file_get_contents()?

On my web hosting server, the file_get_contents() function is disabled. I am looking for an alternative. Please help.
file_get_contents() pretty much does the following:
$filename = "/usr/local/something.txt";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
Since file_get_contents() is disabled, I'm pretty convinced the above won't work either though.
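If fopen()/fread() do still work, stream_get_contents() is a convenient one-call alternative, assuming it isn't on the same disable_functions list:
$handle = fopen('/usr/local/something.txt', 'r');
$contents = stream_get_contents($handle); // reads the remainder of the stream
fclose($handle);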
Depending on what you are trying to read, you might have other options; in my experience, hosts usually disable remote file reading specifically. If you are trying to read remote files (over the network, i.e. http etc.), you could look into the cURL library functions.
You can open the file with fopen(), get the contents of the file and use them. And maybe cURL is useful to you? http://php.net/manual/en/book.curl.php
A bit of everything.
function ff_get($f) {
    // Try file_get_contents(), then fopen()/fread(), then cURL,
    // suppressing errors with @ at each step.
    if (!file_exists($f)) { return false; }
    $result = @file_get_contents($f);
    if ($result) { return $result; }
    $handle = @fopen($f, "r");
    $contents = @fread($handle, @filesize($f));
    @fclose($handle);
    if ($contents) { return $contents; }
    if (!function_exists('curl_init')) { return false; }
    $ch = @curl_init();
    @curl_setopt($ch, CURLOPT_URL, $f);
    @curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $output = @curl_exec($ch);
    @curl_close($ch);
    return $output ? $output : false;
}
The most obvious reason why file_get_contents() is disabled is that it loads the whole file into main memory first. The code from code_burgar could pose problems if your host has assigned you a very low memory limit.
As a general rule, use file_get_contents() (or a replacement) only when you are sure the file to be loaded is small. With SplFileObject you can walk through a file line by line with a convenient interface; use that when your file is big.
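For example, a minimal line-by-line read with SplFileObject (the file name is hypothetical):
$file = new SplFileObject('big-file.txt');
foreach ($file as $line) {
    // only the current line is held in memory here
    echo $line;
}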
Try this code:
$ch = curl_init();
$timeout = 5; // set to zero for no timeout
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$content = curl_exec($ch);
curl_close($ch);
I assume you are trying to access a file remotely through http:// or ftp://.
In theory, there are alternatives like fread() and, if all else fails, fsockopen().
But if the provider is any good at what they do, those will be disabled too.
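For completeness, here is what a bare-bones HTTP GET over fsockopen() looks like (a sketch; it assumes plain HTTP on port 80 and a hypothetical host/path):
$fp = fsockopen('www.example.com', 80, $errno, $errstr, 5);
if ($fp) {
    fwrite($fp, "GET /file.txt HTTP/1.0\r\nHost: www.example.com\r\nConnection: close\r\n\r\n");
    $response = '';
    while (!feof($fp)) {
        $response .= fgets($fp, 1024);
    }
    fclose($fp);
    // $response still includes the HTTP headers; split on "\r\n\r\n" for the body
}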
Use the PEAR package Compat. It is like an official replacement of native PHP functions with PHP-coded solutions.
require_once 'PHP/Compat.php';
PHP_Compat::loadFunction('file_get_contents');
Or, if you don't wish to use the class, you can load the function file manually:
require_once 'PHP/Compat/Function/file_get_contents.php';
All Compat functions are wrapped in if (!function_exists()), so it is really fail-safe if your web host upgrades the server features later.
All functions can be used exactly the same as the native PHP ones, and the related constants are available as well!
List of all available functions
If all you are trying to do is trigger a hit on a given URL and don't need to read the output, you can use cURL, provided your web host has it enabled on your server.
The documentation gives an example of calling a URL using cURL.
If all else fails, there's always cURL. There's a good chance it's installed.
