On my web hosting server, the file_get_contents() function is disabled. I am looking for an alternative. Please help.
file_get_contents() pretty much does the following:
$filename = "/usr/local/something.txt";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
Since file_get_contents() is disabled, though, I'm fairly sure the above won't work either.
Depending on what you are trying to read, you might have other options; in my experience, hosts usually disable remote file reading specifically. If you are trying to read remote files (over the network, i.e. HTTP etc.), you could look into the cURL library functions.
You can open the file with fopen(), get the contents of the file and use them. And maybe cURL is useful to you? http://php.net/manual/en/book.curl.php
A bit of everything.
function ff_get($f) {
    // Note: file_exists() returns false for http:// URLs, so remote
    // files will only ever reach the cURL branch below if this check
    // is relaxed.
    if (!file_exists($f)) { return false; }
    // First choice: file_get_contents(); @ suppresses the warning if
    // the function is disabled or the read fails.
    $result = @file_get_contents($f);
    if ($result) { return $result; }
    // Second choice: fopen()/fread().
    $handle = @fopen($f, "r");
    $contents = @fread($handle, @filesize($f));
    @fclose($handle);
    if ($contents) { return $contents; }
    // Last resort: cURL, if the extension is available.
    if (!function_exists('curl_init')) { return false; }
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $f);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $output = curl_exec($ch);
    curl_close($ch);
    return $output ? $output : false;
}
The most obvious reason why file_get_contents() is disabled is that it loads the whole file into main memory first. The code from code_burgar could pose problems if your host has assigned you a very low memory limit.
As a general rule, use file_get_contents() (or a replacement) only when you are sure the file to be loaded is small. With SplFileObject you can walk through a file line by line with a convenient interface; use this in case your file is big.
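For example, a minimal sketch of line-by-line reading with SplFileObject (large.log is a hypothetical file name):
$file = new SplFileObject('large.log');
foreach ($file as $line) {
    // Only the current line is held in memory at a time.
    echo $line;
}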
Try this code:
$ch = curl_init();
$timeout = 5; // set to zero for no timeout
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$content = curl_exec($ch);
curl_close($ch);
I assume you are trying to access a file remotely through http:// or ftp://.
In theory, there are alternatives like fread() and, if all else fails, fsockopen().
But if the provider is any good at what they do, those will be disabled too.
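If fsockopen() does happen to be available, here is a minimal sketch of fetching a page over plain HTTP with it (www.example.com stands in for the real host):
$fp = fsockopen('www.example.com', 80, $errno, $errstr, 5);
if ($fp) {
    // Send a bare HTTP/1.0 GET request by hand.
    fwrite($fp, "GET / HTTP/1.0\r\nHost: www.example.com\r\nConnection: close\r\n\r\n");
    $response = '';
    while (!feof($fp)) {
        $response .= fgets($fp, 1024);
    }
    fclose($fp);
    // $response holds the headers plus body, separated by a blank line.
    list($headers, $body) = explode("\r\n\r\n", $response, 2);
}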
Use the PEAR package Compat. It is an official replacement of native PHP functions with PHP-coded solutions.
require_once 'PHP/Compat.php';
PHP_Compat::loadFunction('file_get_contents');
Or, if you don't wish to use the class, you can load it manually.
require_once 'PHP/Compat/Function/file_get_contents.php';
All Compat functions are wrapped in if (!function_exists()), so it is fail-safe if your web host upgrades the server's features later.
All functions can be used exactly the same as the native PHP versions, and the related constants are available as well!
List of all available functions
If all you are trying to do is trigger a hit on a given URL and don't need to read the output, you can use cURL, provided your web host has it enabled on your server.
The documentation here gives an example of calling a URL using cURL.
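For instance, a minimal sketch of such a fire-and-forget request (the ping URL is a hypothetical example):
$ch = curl_init('http://example.com/ping.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the output instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_exec($ch); // the return value is simply discarded
curl_close($ch);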
If all else fails, there's always cURL. There's a good chance it's installed.
I am trying to download a file using PHP and cURL; here is my code:
set_time_limit(0);
// Set file to write to
$file = fopen($nameGenerated['path'], 'w+');
$ch = curl_init();
if ($ch == false) {
die('Failed to create curl handle');
}
// Set Curl options
curl_setopt($ch, CURLOPT_URL, str_replace(" ", "%20", $uri));
curl_setopt($ch, CURLOPT_FILE, $file);
// get curl response
curl_exec($ch);
curl_close($ch);
fclose($file);
The file is empty and curl_exec() always returns false. When I tried to get the error using curl_error($ch), there was no error.
Note: I am using Yii2.
Can someone help me understand what the matter is?
I managed to fix the problem by using the machine's IPv4 address instead of the usual 127.0.0.1. Now I know the problem: cURL was expecting an IP address, not localhost or 127.0.0.1. Thanks to all the people who tried to help!
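Presumably something along these lines, where 192.168.1.10 is a hypothetical stand-in for the machine's real IPv4 address:
// instead of http://localhost/... or http://127.0.0.1/...
curl_setopt($ch, CURLOPT_URL, 'http://192.168.1.10/path/to/file.zip');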
I'm using cURL to get the contents of a page on the same website and write them to another file (to convert a dynamic PHP file into a static PHP file, for menu-caching purposes):
$dynamic = 'http://mysite.in/menu.php';
$static = "../menu-static.php" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
file_put_contents($static, $file);
die();
It works perfectly on localhost, but it takes too much time when running on the hosted website, and in the end even the output file ($static = "../menu-static.php") is empty.
I can't determine where the problem is. Please help.
I've also tried file_get_contents instead of cURL, with no luck.
I need to include the output/result of a PHP file in another file.
I found info online about using cURL to do this, but it doesn't seem to work very well, or very efficiently.
This is my current code:
function curl_load($url) {
    curl_setopt($ch = curl_init(), CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
$url = "http://domain.com/file.php";
$output = curl_load($url);
echo "output={$output}";
Any recommendations on what I can use to make this more efficient/work better?
I'm looking for whichever method would be the fastest and most efficient, since I have a bunch of connections/users that will be using this file constantly to get updated information.
Thanks!
file_get_contents() may be suitable for you:
$homepage = file_get_contents('http://www.example.com/');
echo $homepage;
You can also use file() or fopen():
$homepage = file('http://www.example.com/');        // returns an array of lines
$homepage = fopen("http://www.example.com/", "r");  // returns a stream handle, not the contents
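Note that fopen() only opens a stream, so you still have to read from it yourself; a minimal sketch:
$fp = fopen('http://www.example.com/', 'r');
$homepage = '';
while (!feof($fp)) {
    $homepage .= fgets($fp, 4096); // read up to 4 KB per iteration
}
fclose($fp);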
I ended up going with a PHP include statement, to include the other file.
I had forgotten to mention that the file was local, and this seems to make the most sense at this point: instead of echoing the result in the other PHP file, I'm just setting the result as a variable, and then pulling that variable into my other file.
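Presumably something along these lines, where file.php and $result are hypothetical stand-ins:
// file.php: set a variable instead of echoing
$result = 'updated information'; // however this actually gets computed

// caller script: include the file, then use the variable directly
include 'file.php';
echo "output={$result}";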
I'm having trouble downloading a remote file via PHP.
I've tried using cURL and streaming, neither of which produces an error.
Here's my current code for streaming.
$url = "http://commissiongeek.com/files/text.txt";
$path = "/files/cb.txt";
file_put_contents($path, file_get_contents($url));
I'll be downloading a zip file when I get this working, but in theory this should work just fine...
The folder's permissions are set to 777, and as said before, no errors are being thrown.
What could cause this?
Split this up into multiple sections, so you can verify that each stage is working:
$url = 'http://...';
$txt = file_get_contents($url);
var_dump($txt);
var_dump(file_put_contents('/files/cb.txt', $txt));
The first dump SHOULD show you whatever text that URL returns. The second dump should show the number of bytes written, or boolean false if the file_put_contents() call failed.
It seems you are trying to save to an absolute path. I believe you want the path changed to "files/cb.txt" instead, since you most likely do not have any access to /files/ at the root of the filesystem.
If you have allow_url_fopen set to true:
$url = 'http://example.com/image.php';
$img = '/my/folder/flower.gif';
file_put_contents($img, file_get_contents($url));
Otherwise, use cURL:
$ch = curl_init('http://example.com/image.php');
$fp = fopen('/my/folder/flower.gif', 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
I recently upgraded my XAMPP from PHP 5.2 to 5.3.1
I seem to be having a problem with file_get_contents().
I can use the function to get something like "http://www.google.com", but it times out when I use it on a domain I have set up locally, e.g. "http://localhost/my_dir/my_css_file.css".
I'm not really sure what the problem is. If it's a bug, is there a viable alternative?
Kindly advise.
Try to use include() instead of file_get_contents().
<?php include($_SERVER['HTTP_HOST'] . "/my_dir/my_css_file.css"); ?>
or
<?php include($_SERVER['DOCUMENT_ROOT'] . "/my_dir/my_css_file.css"); ?>
Update in response to your comments:
$string = get_include_contents('somefile.php');

function get_include_contents($filename) {
    if (is_file($filename)) {
        // Execute the file and capture everything it prints.
        ob_start();
        include $filename;
        $contents = ob_get_contents();
        ob_end_clean();
        return $contents;
    }
    return false;
}
This will execute the file and capture its output into the variable $string.
Any chance you're on a Windows system?
There is a bug involving the combination of Windows, file_get_contents() and localhost which is not going to be fixed; see Bug 38826 and Bug 40881.
Try using 127.0.0.1 instead of localhost or set up any different domain name. Then you should get it working.
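For example, a sketch using the URL from the question:
// localhost triggers the bug on Windows; the loopback IP does not
$css = file_get_contents('http://127.0.0.1/my_dir/my_css_file.css');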
Solved this by using cURL. Here's the code. It will work with remote files, e.g.
http://yourdomain.com/file.ext
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $file_path_str);
curl_setopt($ch, CURLOPT_HTTPGET, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_USERAGENT, sprintf("Mozilla/%d.0", rand(4, 5)));
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$curl_response_res = curl_exec($ch);
curl_close($ch);
I could not use @James's solution because I'm using ob_start and ob_flush elsewhere in my code, so that would have messed things up for me.
I recently solved this problem. On my Windows machine it is acceptable to have spaces in your folder names; however, PHP wasn't able to read this path.
I changed my folder name
From:
C:\Users\JasonPC\Desktop\Jasons Work\Project
To:
C:\Users\JasonPC\Desktop\JasonsWork\Project
Then PHP was able to read my files again.