Use php://temp wrapper with XMLWriter - php

Is it possible to use the php://temp wrapper to generate an XML file with XMLWriter? I like the features it provides (memory for small files, transparent temporary file for larger output) but I can't get the syntax (if it's even possible):
<?php
header('Content-type: text/xml; charset=UTF-8');
$oXMLWriter = new XMLWriter;
$oXMLWriter->openURI('php://temp');
$oXMLWriter->startDocument('1.0', 'UTF-8');
$oXMLWriter->startElement('test');
$oXMLWriter->text('Hello, World!');
$oXMLWriter->endElement();
$oXMLWriter->endDocument();
// And now? *******
$oXMLWriter->flush();

I don't understand the purpose of writing to a temp file. Perhaps you want:
$oXMLWriter->openURI('php://output');
I haven't ever used XMLWriter, but it doesn't seem to accept an existing file pointer, which I think is what you really want.
For giggles, here's something that wraps the temp interface:
class WeirdStream
{
    public static $files = array();

    private $fp;

    public function stream_open($path, $mode, $options, &$opened_path)
    {
        $url = parse_url($path);
        // Back each wrapper stream with a php://temp stream we keep a handle to
        self::$files[$url['host']] = fopen('php://temp', 'w+');
        $this->fp = &self::$files[$url['host']];
        return true;
    }

    public function stream_write($data)
    {
        return fwrite($this->fp, $data);
    }
}
stream_wrapper_register('weird', 'WeirdStream');
$oXMLWriter = new XMLWriter;
$oXMLWriter->openURI('weird://a');
// .. do stuff
$oXMLWriter->flush();
Now you can get at the file pointer:
$fp = WeirdStream::$files['a'];
It may be purely in memory, or it may be a temporary file on disk.
You could then loop through the data line by line:
fseek($fp, 0, SEEK_SET);
while (!feof($fp)) $line = fgets($fp);
But this is all very odd to me.

What do you need to do with the contents of php://temp eventually? If you just need temporary, memory-only storage, you can use openMemory():
$oXMLWriter = new XMLWriter;
$oXMLWriter->openMemory();
$oXMLWriter->startDocument('1.0', 'UTF-8');
$oXMLWriter->startElement('test');
$oXMLWriter->text('Hello, World!');
$oXMLWriter->endElement();
$oXMLWriter->endDocument();
echo $oXMLWriter->outputMemory();
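If you specifically want php://temp's spill-to-disk behaviour, one option is to keep XMLWriter in memory and spool its flushed output into a php://temp handle that you open yourself. A rough sketch of that idea (the periodic flush() calls and the final fpassthru() are only illustrative):
<?php
// Sketch: XMLWriter stays in memory, but we spool its output into a
// php://temp stream we control (memory for small output, temp file for large).
$fp = fopen('php://temp', 'w+');

$oXMLWriter = new XMLWriter;
$oXMLWriter->openMemory();
$oXMLWriter->startDocument('1.0', 'UTF-8');
$oXMLWriter->startElement('test');
$oXMLWriter->text('Hello, World!');
fwrite($fp, $oXMLWriter->flush()); // flush() returns the buffer and clears it
$oXMLWriter->endElement();
$oXMLWriter->endDocument();
fwrite($fp, $oXMLWriter->flush());

rewind($fp);
fpassthru($fp); // or read it back line by line with fgets()
fclose($fp);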

Related

Transfer a file of any type in 1k chunks over HTTP

I need to transfer files of any type or size over HTTP/GET in ~1k chunks. The resulting file hash needs to match the source file. This needs to be done in native PHP without any special tools. I have a basic strategy but I'm getting odd results. This proof of concept just copies the file locally.
CODE
<?php
$input = "/home/lm1/Music/Ellise - Feeling Something Bad.mp3";
$a = pathinfo($input);
$output = $a["basename"];
echo "\n> " . md5_file($input);
$fp = fopen($input, 'rb');
if ($fp) {
    while (!feof($fp)) {
        $buffer = base64_encode(fread($fp, 1024));
        // echo "\n\n".md5($buffer);
        write($output, $buffer);
    }
    fclose($fp);
    echo "\n> " . md5_file($output);
    echo "\n";
}
function write($file, $buffer) {
    // echo "\n".md5($buffer);
    $fp = fopen($file, 'ab');
    fwrite($fp, base64_decode($buffer));
    fclose($fp);
}
?>
OUTPUT
> d31e102b1cae9c73bbf5a12615a8ea36
> 9f03f6c88ed61c07cb534922d6d31864
Thanks in advance.
fread already advances the file pointer position, so there's no need to keep track of it. Same with fwrite, so consecutive calls continue writing where the previous one stopped. Thus, you could simplify your approach to (code adapted from this answer on how to efficiently write a large input stream to a file):
$src = "a.test";
$dest = "b.test";
$fp_src = fopen($src, 'rb');
if ($fp_src) {
$fp_dest = fopen($dest, 'wb');
$buffer_size = 1024;
while(!feof($fp_src)) {
fwrite($fp_dest, fread($fp_src, $buffer_size));
}
fclose($fp_src);
fclose($fp_dest);
echo md5_file($src)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
echo md5_file($dest)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
}
If you want to keep both processes separated, you'd do:
$src = "a.test";
$dest = "b.test";
if (file_exists($dest)) {
unlink($dest); // So we don't append to an existing file
}
$fp = fopen($src,'rb');
if ($fp) {
while(!feof($fp)){
$buffer = base64_encode(fread($fp, 1024));
write($dest, $buffer);
}
fclose($fp);
}
function write($file, $buffer) {
$fp = fopen($file, 'ab');
fwrite($fp, base64_decode($buffer));
fclose($fp);
}
echo md5_file($src)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
echo md5_file($dest)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
As for how to stream files over HTTP, you might want to have a look at:
Streaming a large file using PHP
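Here's a minimal sketch of what such chunked output over HTTP can look like (the file name and headers are placeholders, not taken from the linked answer):
<?php
// Sketch: stream a file to the client in ~1 KB chunks so the whole
// file never sits in memory at once.
$path = 'a.test'; // placeholder file name

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . basename($path) . '"');

$fp = fopen($path, 'rb');
while (!feof($fp)) {
    echo fread($fp, 1024); // send one chunk
    flush();               // push it out to the client
}
fclose($fp);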

Read and iterate .txt in PHP

I sense that I am almost there.
Here is a .txt file, which is about 60 Kbytes and full of German words. Every word is on a new line.
I want to iterate through it with this code:
<?php
$file = "GermanWords.txt";
$f = fopen($file, "r");
$parts = explode("\n", $f);
foreach ($parts as &$v)
{
    echo $v;
}
?>
When I execute this code, I get: Resource id #2
The word "Resource" is not in the .txt file; I do not know where it comes from.
How can I manage to display all the words in the .txt file?
No need for fopen; just use file_get_contents():
$file = "GermanWords.txt";
$contents = file_get_contents($file);
$lines = explode("\n", $contents); // this is your array of words
foreach ($lines as $word) {
    echo $word;
}
fopen() just opens the file, it doesn't read it -- In your code, $f contains a file handle, not the file contents. This is where the word "Resource" comes from; it's PHP's internal name for the file handle.
One answer would be to replace fopen() with file_get_contents(). This opens and reads the file in one action. This would solve the problem, but if the file is big, you probably don't want to read the whole thing into memory in one go.
So I would suggest instead using SplFileObject(). The code would look like this:
<?php
$file = "GermanWords.txt";
$parts = new SplFileObject($file);
foreach ($parts as $line) {
    echo $line;
}
?>
It only reads into memory one line at a time, so you don't have to worry about the size of the file.
Hope that helps.
See the PHP manual for more info: http://php.net/manual/en/splfileobject.construct.php
$f, the result of fopen, is a resource, not the contents of the file. If you just want an array of the lines contained in the file, you can use file:
$parts = file('GermanWords.txt');
foreach ($parts as $v) {
    echo $v;
}
Alternatively, if you want to stick with fopen you can use fread to read the content:
$f = fopen('GermanWords.txt', 'r');
// read the entire file into $contents
$contents = fread($f, filesize('GermanWords.txt'));
fclose($f);
$parts = explode("\n", $contents);
SplFileObject provides a way to do that:
$file = new SplFileObject("file.txt");
while (!$file->eof()) {
    echo $file->fgets();
}
And if you prefer the foreach loop, you can create a generator function for that:
function lines($filename) {
    $file = new SplFileObject($filename);
    while (!$file->eof()) {
        yield $file->fgets();
    }
}

foreach (lines('German.txt') as $line) {
    echo $line;
}
Reading the entire content of the file (with file_get_contents) before processing it can be memory-consuming.
If you want to process a file line by line, this class might help you.
It implements an Iterator (see the PHP docs on it) that can be walked through in a foreach loop. Only the last line read is stored in memory.
class TxtFileIterator implements \Iterator {
    protected $fileHandler;
    protected $key;
    protected $current;
    protected $fileName;

    function __construct($fileName) {
        $this->fileHandler = fopen($fileName, "r") or die("Unable to open file!");
        $this->fileName = $fileName;
        $this->key = 0;
        $this->current = fgets($this->fileHandler); // prime the first line
    }

    function __destruct() {
        fclose($this->fileHandler);
    }

    // Iterator interface
    public function current() {
        return $this->current;
    }

    public function key() {
        return $this->key;
    }

    public function next() {
        $this->current = fgets($this->fileHandler);
        $this->key++;
    }

    public function rewind() {
        $this->__destruct();
        $this->__construct($this->fileName);
    }

    public function valid() {
        return $this->current !== false; // fgets() returns false at end of file
    }
}
Usage:
$iterator = new TxtFileIterator("German.txt");
foreach ($iterator as $line) {
    echo $line; // or do whatever you want with the line
}

What is the best way to write a large file to disk in PHP?

I have a PHP script that occasionally needs to write large files to disk. Using file_put_contents(), if the file is large enough (in this case around 2 MB), the PHP script runs out of memory (PHP Fatal error: Allowed memory size of ######## bytes exhausted). I know I could just increase the memory limit, but that doesn't seem like a full solution to me; there has to be a better way, right?
What is the best way to write a large file to disk in PHP?
You'll need a temporary file in which you put bits of the source file plus what's to be appended:
$sp = fopen('source', 'r');
$op = fopen('tempfile', 'w');

while (!feof($sp)) {
    $buffer = fread($sp, 512); // use a buffer of 512 bytes
    fwrite($op, $buffer);
}

// append new data
fwrite($op, $new_data);

// close handles
fclose($op);
fclose($sp);

// make temporary file the new source
rename('tempfile', 'source');
That way, the whole contents of source aren't read into memory. When using cURL, you might omit setting CURLOPT_RETURNTRANSFER and instead add an output buffer that writes to a temporary file:
function write_temp($buffer) {
    global $handle;
    fwrite($handle, $buffer);
    return ''; // return EMPTY string, so nothing's internally buffered
}

$handle = fopen('tempfile', 'w');
ob_start('write_temp', 4096); // hand the buffer to write_temp() every 4 KB

$curl_handle = curl_init('http://example.com/');
curl_setopt($curl_handle, CURLOPT_BUFFERSIZE, 512);
curl_exec($curl_handle);

ob_end_clean();
fclose($handle);
It seems as though I always miss the obvious. As pointed out by Marc, there's CURLOPT_FILE to directly write the response to disk.
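For reference, a minimal sketch of the CURLOPT_FILE approach (URL and file name are placeholders):
<?php
// Sketch: let cURL write the response body straight to a file handle,
// so PHP never holds the whole body in memory.
$fh = fopen('tempfile', 'wb');

$curl_handle = curl_init('http://example.com/');
curl_setopt($curl_handle, CURLOPT_FILE, $fh); // write the response to $fh
curl_exec($curl_handle);
curl_close($curl_handle);

fclose($fh);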
Writing line by line (or packet by packet in case of binary files) using functions like fwrite()
Try this answer:
$file = fopen("file.json", "w");
$pieces = str_split($content, 1024 * 4);
foreach ($pieces as $piece) {
fwrite($file, $piece, strlen($piece));
}
fclose($file);

Write to a file using PHP

Basically, what I want to do is open an XML file with PHP and edit it; this I can do using the fopen() function.
Yet my issue is that I want to append text to the middle of the document. So let's say the XML file has 10 lines and I want to append something before the last line (10), so now it will be 11 lines. Is this possible? Thanks
Depending on how large that file is, you might do:
$lines = array();
$fp = fopen('file.xml', 'r');
while (!feof($fp))
    $lines[] = trim(fgets($fp));
fclose($fp);

array_splice($lines, 9, 0, array('newline1', 'newline2', ...));
$new_content = implode("\n", $lines);
Still, you'll need to re-validate the XML syntax afterwards...
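A quick way to check the result is at least well-formed, as a sketch (it assumes the $new_content variable from the snippet above and only checks well-formedness, not a schema):
<?php
libxml_use_internal_errors(true); // collect parse errors instead of emitting warnings
$dom = new DOMDocument;
if ($dom->loadXML($new_content)) {
    file_put_contents('file.xml', $new_content); // looks well-formed, write it back
} else {
    foreach (libxml_get_errors() as $error) {
        echo trim($error->message) . "\n";
    }
    libxml_clear_errors();
}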
If you want to be able to modify a file from the middle, use the c+ open mode:
$fp = fopen('test.txt', 'c+');

for ($i = 0; $i < 5; $i++) {
    fgets($fp);
}

fwrite($fp, "foo\n");
fclose($fp);
The above writes "foo" right after the first five lines (at the start of the sixth line), without having to read the file entirely; note that fwrite overwrites the bytes at that position rather than inserting new content.
However, if you are modifying a XML document, it's probably better to use a DOM parser:
$dom = new DOMDocument;
$dom->load('myfile.xml');

$linenum = 5;
$newNode = $dom->createElement('hello', 'world');

$element = $dom->firstChild->firstChild; // skips the root node
while ($element) {
    if ($element->getLineNo() == $linenum) {
        $element->parentNode->insertBefore($newNode, $element);
        break;
    }
    $element = $element->nextSibling;
}

echo $dom->saveXML();
Of course, the above code depends on the actual XML document structure. But, the $element->getLineNo() is the key here.

Unpack large files with gzip in PHP

I'm using a simple unzip function (as seen below) for my files so I don't have to unzip files manually before they are processed further.
function uncompress($srcName, $dstName) {
    $string = implode("", gzfile($srcName));
    $fp = fopen($dstName, "w");
    fwrite($fp, $string, strlen($string));
    fclose($fp);
}
The problem is that if the gzip file is large (e.g. 50 MB), the unzipping takes a large amount of RAM to process.
The question: can I parse a gzipped file in chunks and still get the correct result? Or is there another, better way to handle extracting large gzip files (even if it takes a few seconds more)?
gzfile() is a convenience function that calls gzopen, gzread, and gzclose.
So, yes, you can manually gzopen the file and gzread it in chunks.
This will uncompress the file in 4 KB chunks:
function uncompress($srcName, $dstName) {
    $sfp = gzopen($srcName, "rb");
    $fp = fopen($dstName, "w");

    while (!gzeof($sfp)) {
        $string = gzread($sfp, 4096);
        fwrite($fp, $string, strlen($string));
    }

    gzclose($sfp);
    fclose($fp);
}
Try with:
function uncompress($srcName, $dstName) {
    $fp = fopen($dstName, "w");
    fwrite($fp, implode("", gzfile($srcName)));
    fclose($fp);
}
The $length parameter of fwrite() is optional here.
If you are on a Linux host, have the required privileges to run commands, and the gzip command is installed, you could try calling it with something like shell_exec().
Something a bit like this, I guess, would do:
shell_exec('gzip -d your_file.gz');
This way, the file wouldn't be unzipped by PHP.
As a side note:
Take care where the command is run from (or use a switch to control where the output ends up).
You might want to take a look at escapeshellarg() too ;-)
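Something like this sketch, for instance (the file name is a placeholder; -d simply decompresses in place next to the .gz file):
// Sketch: decompress via the system gzip binary, escaping the (possibly
// user-supplied) file name before putting it on the command line.
$srcName = 'your_file.gz'; // placeholder
shell_exec('gzip -d ' . escapeshellarg($srcName));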
As maliayas mentioned, it may lead to a bug. I experienced an unexpected exit from the while loop, although the gz file had been decompressed successfully. The whole code looks like this and works better for me:
function gzDecompressFile($srcName, $dstName) {
    $error = false;
    if ($file = gzopen($srcName, 'rb')) { // open gz file
        $out_file = fopen($dstName, 'wb'); // open destination file
        while (($string = gzread($file, 4096)) != '') { // read 4 KB at a time
            if (!fwrite($out_file, $string)) { // check if writing was successful
                $error = true;
            }
        }
        // close files
        fclose($out_file);
        gzclose($file);
    } else {
        $error = true;
    }
    if ($error)
        return false;
    else
        return true;
}
