I have this PHP code that loads a local file:
$filename = "fille.txt";
$fp = fopen($filename, "rb");
$content = fread($fp, 25699);
fclose($fp);
print_r($content);
With this code I can see the full contents of the file. But when I change $filename to an external link, like:
$filename = "https:/.../texts/fille.txt";
I can't see all the contents of the file; it appears cut off. What's the problem?
The fread() function can be used for network operations. But network connections work differently than file system operations. A network stream cannot deliver a large file in a single read attempt; that is not how typical networks work. Instead they are packet based, so data arrives in chunks.
And if you look at the documentation of the function you are using, you will see:
Reading stops as soon as one of the following conditions is met:
[...]
a packet becomes available or the socket timeout occurs (for network streams)
[...]
So what you observe is documented behavior. You need to keep reading packets in a loop, until you reach EOF, to get the whole file.
Take a look yourself: https://www.php.net/manual/en/function.fread.php
And further down in that documentation you will see this example:
Example #3 Remote fread() examples
<?php
$handle = fopen("http://www.example.com/", "rb");
if (FALSE === $handle) {
    exit("Failed to open stream to URL");
}

$contents = '';
while (!feof($handle)) {
    $contents .= fread($handle, 8192);
}
fclose($handle);
?>
Related
I have a simple PHP script to read a remote file line-by-line and then JSON-decode it. On the production server everything works fine, but on my local machine (MAMP stack, OSX) PHP hangs. It is very slow and takes more than 2 minutes to produce the JSON file. I think it's json_decode() that is freezing. Why only on MAMP?
I think it's stuck in the while loop, because I can't output the final $str variable that holds the result of all the lines.
In case you are wondering why I need to read the file line-by-line: in the real scenario, the remote JSON file is a 40MB text file. Reading it like this is the only way I got acceptable performance, but does anyone have a better suggestion?
Is there a configuration in php.ini to help solve this?
// The path to the JSON file
$fileName = 'http://www.xxxx.xxx/response-single.json';

// Open the file in read-only mode.
$fileHandle = fopen($fileName, "r");

// If we failed to get a file handle, throw an Exception.
if ($fileHandle === false) {
    error_log("error handle");
    throw new Exception('Could not get file handle for: ' . $fileName);
}

// While we haven't reached the end of the file.
$str = "";
while (!feof($fileHandle)) {
    // Read the current line in.
    $line = fgets($fileHandle);
    $str .= $line;
}

// Finally, close the file handle.
fclose($fileHandle);

$json = json_decode($str, true); // decode the JSON into an associative array
Thanks for your time.
I found the cause. It is the URL protocol.
With
$filename = 'http://www.yyy/response.json';
It freezes the server for 1 to 2 minutes.
I moved the file to another server with the https protocol, and used
$filename = 'https://www.yyy/response.json';
and it works.
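For anyone who cannot simply switch servers: a stream context with a read timeout should at least stop the request from hanging for minutes. This is only a sketch, assuming the hang comes from a stalled HTTP stream; the 10-second timeout is an arbitrary placeholder value:

$context = stream_context_create([
    'http' => [
        'timeout' => 10, // give up after 10 seconds instead of hanging
    ],
]);
$str = file_get_contents('http://www.yyy/response.json', false, $context);
if ($str === false) {
    error_log('Could not fetch the remote JSON within the timeout');
} else {
    $json = json_decode($str, true);
}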
I'm trying to learn about creating web bots and I'm working my way through a book called Webbots, Spiders, and Screen Scrapers by Michael Schrenk. In the book he gives example code for a basic bot that downloads a webpage. I have copied the code exactly as it is in the book (sans comments):
<?
$target = "http://www.schrenk.com/nostarch/webbots/hello_world.html";
$downloaded_page_array = file($target);
for($xx=0; $xx<count($downloaded_page_array); $xx++)
echo $downloaded_page_array[$xx];
?>
I put this code in a PHP file and uploaded it to my site. When I navigate to it in the browser, however, nothing happens. It just loads a blank page. No content.
Earlier I tried another snippet that the author provided, again copied EXACTLY from the book. With this one I didn't get a blank page; the page just tried to load until it eventually timed out. I never got the correct content back:
$target = "http://www.schrenk.com/nostarch/webbots/hello_world.html";
$file_handle = fopen($target, "r");
while (!feof($file_handle))
echo fgets($file_handle, 4096);
fclose($file_handle);
I have checked the URL to make sure the file exists, and it does. I have no idea why this wouldn't work. I've read through how to use the file() and fopen() functions in PHP, and from what I can tell they are both being used correctly. What am I doing wrong here?
Accessing URLs via fopen() is a bad idea. It requires you to have allow_url_fopen enabled in your PHP config, which opens the door to a vast number of exploits (hosting providers disable it for a reason).
Try using the cURL functions instead: they will give you much more flexibility and control. The PHP documentation gives you some great examples to start with.
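For example, here is a minimal sketch of fetching the same page with the cURL extension (error handling kept deliberately simple):

<?php
// Minimal cURL fetch; a sketch, not production code.
$target = "http://www.schrenk.com/nostarch/webbots/hello_world.html";

$ch = curl_init($target);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // do not hang forever

$contents = curl_exec($ch);
if ($contents === false) {
    exit("cURL error: " . curl_error($ch));
}
curl_close($ch);

echo $contents;
?>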
Not fgets($file_handle, 4096) but fread($file_handle, 4096):
$target = "http://www.schrenk.com/nostarch/webbots/hello_world.html";
$file_handle = fopen($target, "r");
while (!feof($file_handle))
echo fread($file_handle, 4096);
fclose($file_handle);
Then later, if you want to create a new file from the extracted text:
// text extraction
$target = "http://www.schrenk.com/nostarch/webbots/hello_world.html";
$file_handle = fopen($target, "r");
$getText = fread($file_handle, 4096); // note: this reads at most the first 4096 bytes
fclose($file_handle);

// file writing
$writeHandle = fopen("folder/text.txt", "w"); // the file will be created if it does not exist
$writeFile = fwrite($writeHandle, $getText);
fclose($writeHandle);
First, you should add error_reporting(E_ALL); ini_set('display_errors', '1'); to the top of your script to enable error display, as AbraCadaver mentioned in his comment.
A reason could be that allow_url_fopen is disabled on your hosting.
This option enables the URL-aware fopen wrappers that enable accessing URL object like files. Default wrappers are provided for the access of remote files using the ftp or http protocol, some extensions like zlib may register additional wrappers.
See: http://php.net/manual/en/filesystem.configuration.php#ini.allow-url-fopen
You can check that via:
var_dump(ini_get('allow_url_fopen'));
Your script requires true to run correctly.
If allow_url_fopen is enabled, you can also simply use file_get_contents() to load a URL in a single call:
<?php
$homepage = file_get_contents('http://www.example.com/');
echo $homepage;
?>
See: http://php.net/manual/en/function.file-get-contents.php
I have a problem with the fopen function and the opening mode argument.
Code
function writeOnFile($url, $text)
{
    $fp = fopen($url, "r");
    echo "<br>read: " . fgets($fp);
    //fwrite($fp, $text);
    fclose($fp);
}
If I use "r" as the opening mode the echo line works... but if I change this argument for any other (using the same url file) it stops working and only see "read:" and nothing else.
I tried "w+", "r+", "a+"... but none of them work.
What I am trying to read is a txt file. I changed the permissions of the file and they are now 777...
What am I doing wrong?
Given your variable naming, $url suggests you're trying to write to a http://example.com/... address. This is not possible. You cannot "write" to a URL, because PHP has absolutely NO idea what protocol the remote server is expecting. E.g. even if PHP by some miracle decided to let this URL through and used an HTTP POST, the server might be expecting an HTTP PUT. Oops, that won't work.
As well, never EVER assume an operation on an external resource succeeded. Always assume failure, and treat success as a pleasant surprise:
function writeOnFile($url, $text)
{
    if (!is_writeable($url)) {
        die("$url is not writeable");
    }
    $fp = fopen($url, "w");
    if (!$fp) {
        die("Unable to open $url for writing");
    }
    // etc...
}
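If the goal really is to push $text to a remote server, the server has to provide an endpoint that accepts it. Here is a hedged sketch with cURL, assuming the server accepts an HTTP PUT at that URL (writeToUrl is a hypothetical helper, not part of the original code):

function writeToUrl($url, $text)
{
    // Assumes the remote endpoint accepts HTTP PUT; adjust to what the server expects.
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "PUT"); // send a PUT request
    curl_setopt($ch, CURLOPT_POSTFIELDS, $text);    // request body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    $response = curl_exec($ch);
    if ($response === false) {
        die("Unable to send data to $url: " . curl_error($ch));
    }
    curl_close($ch);
    return $response;
}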
In PHP, is there a way to pipe or stream the contents of one "file" to another?
Here's what I'm doing now,
$rfp = fopen('php://input', 'r');
$wfp = fopen($this->file_path, 'x');
while (!feof($rfp)) {
    fwrite($wfp, fread($rfp, 1048576));
}
fclose($wfp);
fclose($rfp);
This works fine, but it seems clumsy to read an arbitrary chunk from the input stream before writing it to the output stream. I'd imagine the file system or OS could do it a bit more efficiently if I could just tell it to read from one place and send it straight to another.
In Node you can 'pipe' one file stream to another. Is there a function for this in PHP?
Try:
copy("php://input", $this->file_path);
Or a second method:
$objStream = fopen("php://input", "r");
$dest = fopen($this->file_path, 'x');
stream_copy_to_stream($objStream, $dest);
fclose($objStream);
fclose($dest);
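Of the two, stream_copy_to_stream() is the closest analogue to Node's pipe(): the copy loop happens inside PHP's streams layer rather than in userland code, and it returns the number of bytes copied.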
I am trying to find an alternative to reading php://input. I use this for getting XML data from a cURL PUT.
I usually do this with:
$xml = file_get_contents('php://input');
However, I'm having a few issues with file_get_contents() on Windows.
Is there an alternative, perhaps using fopen() or fread()?
Yes, you can do:
$f = fopen('php://input', 'r');
if (!$f) die("Couldn't open input stream\n");
$data = '';
while ($buffer = fread($f, 8192)) $data .= $buffer;
fclose($f);
But the question you have to ask yourself is: why isn't file_get_contents() working on Windows? Because if it's not working, I doubt fopen() would work for the same stream...
Ok. I think I've found a solution.
$f = @fopen("php://input", "r");
$file_data_str = stream_get_contents($f);
fclose($f);
Plus, with this I don't have to specify a length to read.