I'm trying to extend the PHP mailer class from Worx by adding a method which allows me to add attachments using string data rather than a path to a file.
I came up with something like this:
public function addAttachmentString($string, $name = '', $encoding = 'base64', $type = 'application/octet-stream')
{
    $path = 'php://memory/' . md5(microtime());
    $file = fopen($path, 'w');
    fwrite($file, $string);
    fclose($file);
    $this->AddAttachment($path, $name, $encoding, $type);
}
However, all I get is a PHP warning:
PHP Warning: fopen() [function.fopen]: Invalid php:// URL specified
There aren't any decent examples with the original documentation, but I've found a couple around the internet (including one here on SO), and my usage appears correct according to them.
Has anyone had any success with using this?
My alternative is to create a temporary file and clean up afterwards - but that means writing to disk, and this function will be used as part of a large batch process, so I want to avoid slow disk operations (old server) where possible. The attachment is only a short file, but it contains different information for each person the script emails.
It's just php://memory. For example,
<?php
$path = 'php://memory';
$h = fopen($path, "r+");
fwrite($h, "bugabuga");
fseek($h, 0);
echo stream_get_contents($h);
yields "bugabuga".
Quickly looking at http://php.net/manual/en/wrappers.php.php and the source code, I don't see any support for appending a path (the '/' . md5(microtime()) bit).
Sample Code:
<?php
print "Trying with md5\n";
$path = 'php://memory/' . md5(microtime());
$file = fopen($path, 'w');
if ($file)
{
    fwrite($file, "blah");
    fclose($file);
}
print "done - with md5\n";
print "Trying without md5\n";
$path = 'php://memory';
$file = fopen($path, 'w');
if ($file)
{
    fwrite($file, "blah");
    fclose($file);
}
print "done - no md5\n";
Output:
buzzbee ~$ php test.php
Trying with md5
Warning: fopen(): Invalid php:// URL specified in test.php on line 4
Warning: fopen(php://memory/d2a0eef34dff2b8cc40bca14a761a8eb): failed to open stream: operation failed in test.php on line 4
done - with md5
Trying without md5
done - no md5
The problem here is simply the type and the syntax:
php://memory and php://temp are read-write streams that allow temporary data to be stored in a file-like wrapper. The only difference between the two is that php://memory will always store its data in memory, whereas php://temp will use a temporary file once the amount of data stored hits a predefined limit (the default is 2 MB). The location of this temporary file is determined in the same way as the sys_get_temp_dir() function.
In short, the type you want is temp instead and the syntax you want is:
php://temp/maxmemory:$limit
The $limit is in bytes. If you compute it from string lengths, make sure you use byte-safe functions (e.g. strlen(), which counts bytes, rather than a multibyte character count).
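As a minimal sketch of that syntax (the 5 MB limit and the placeholder data are just illustrative values):

<?php
$limit  = 5 * 1024 * 1024;                   // limit in bytes
$path   = 'php://temp/maxmemory:' . $limit;
$string = str_repeat('x', 1024);             // placeholder attachment data

$fp = fopen($path, 'r+');
fwrite($fp, $string);                        // stays in memory below the limit, spills to a temp file above it
rewind($fp);
var_dump(strlen(stream_get_contents($fp)));  // int(1024), read back through the same handle
fclose($fp);

One caveat for the original attachment use case: every fopen() of php://temp (or php://memory) gives a fresh, empty stream, so the data is only visible through the handle you wrote it with; passing the path to code that reopens it will not see your data.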
Related
I have the contents of a file in a string. I need to pass this file to a function where the function is expecting the parameter to be the name of the file, not the contents. The obvious and probably simplest way to do this would be to write the contents to a temp file, then pass that file name to the function, and unlink the file once I'm finished.
However, I'm looking for a solution that doesn't involve writing the file out to the file system and then reading it back in. I've had a need for this in multiple cases, so I'm not looking for a work-around to a specific function, but more of a generic method that will work for any function expecting a file name (like file_get_contents(), for instance).
Here are some thoughts, but I'm not sure how to pursue these yet:

1. Is it possible to write the contents somewhere in memory, and then pass that to the function as a filename? Perhaps something using php://memory.

2. Is it possible to write the contents to a pipe, then pass the name of the pipe to the function?
I did a short proof of concept with php://memory as follows, but no luck:
$data = "This is some file data.\n";
file_put_contents( 'php://memory', $data );
echo file_get_contents( 'php://memory' );
Would be interested in knowing of good ways to address this. Googling hasn't come up with anything for me.
It mainly depends on what the target function does with the file name. If you're lucky, you can register your own stream wrapper:
stream_wrapper_register('demo', 'DemoStream');
$data = "This is some file data.\n";
$filename = 'demo://foo';
file_put_contents($filename, $data );
echo file_get_contents($filename);
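The DemoStream class isn't shown above; here is a minimal sketch of what such a wrapper could look like, keeping the "file" contents in a static array so nothing touches disk (method names follow PHP's streamWrapper prototype):

<?php
class DemoStream
{
    private static $data = array();    // in-memory "files", keyed by path
    private $path;
    private $position = 0;

    public function stream_open($path, $mode, $options, &$opened_path)
    {
        $this->path = $path;
        if (strpos($mode, 'r') !== false && !isset(self::$data[$path])) {
            return false;              // reading a path that was never written
        }
        if (strpos($mode, 'w') !== false) {
            self::$data[$path] = '';   // truncate when opened for writing
        }
        $this->position = 0;
        return true;
    }

    public function stream_write($bytes)
    {
        self::$data[$this->path] .= $bytes;
        $this->position += strlen($bytes);
        return strlen($bytes);
    }

    public function stream_read($count)
    {
        $chunk = (string) substr(self::$data[$this->path], $this->position, $count);
        $this->position += strlen($chunk);
        return $chunk;
    }

    public function stream_eof()
    {
        return $this->position >= strlen(self::$data[$this->path]);
    }

    public function stream_tell()
    {
        return $this->position;
    }

    public function stream_stat()
    {
        $size = isset(self::$data[$this->path]) ? strlen(self::$data[$this->path]) : 0;
        return array('dev' => 0, 'ino' => 0, 'mode' => 0100644, 'nlink' => 1,
                     'uid' => 0, 'gid' => 0, 'rdev' => 0, 'size' => $size,
                     'atime' => 0, 'mtime' => 0, 'ctime' => 0,
                     'blksize' => -1, 'blocks' => -1);
    }
}

With a class along these lines defined, file_put_contents('demo://foo', ...) and file_get_contents('demo://foo') work as in the snippet above. The trick only helps, though, if the target function ultimately goes through PHP's stream layer (fopen() and friends) rather than handing the name to something like an external program.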
Why not use a file in the /tmp/ directory? Like this:
<?php
$filename = '/tmp/mytmpfile';
$data = "This is some file.\n";
file_put_contents($filename, $data);
$result = file_get_contents($filename);
var_dump($result);
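If concurrent requests might collide on the same name, a variant using tempnam() avoids that and cleans up after itself (the 'mytmp' prefix is arbitrary):

<?php
$filename = tempnam(sys_get_temp_dir(), 'mytmp');   // unique file in the temp directory
$data = "This is some file.\n";
file_put_contents($filename, $data);
$result = file_get_contents($filename);
unlink($filename);                                   // remove the temporary file
var_dump($result);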
Well, since you say you don't want to use a file, you shouldn't use file_get_contents().
But you can achieve the same result by using stream_get_contents(), like this:
<?php
$data = "This is some file data.\n";
$handle = fopen('php://memory', 'r+'); // open an r/w handle to memory
fputs($handle, $data); // write the data
rewind($handle); // rewind the pointer
echo stream_get_contents($handle); // retrieve the contents
One of our systems is running on PHP 4 and no, I can't change that.
The fgetcsv function seems to return null no matter what file I upload.
Very simply put:
$handle = fopen($file,"r");
var_dump(fgetcsv($handle));
fclose($handle);
This will print out "NULL".
Doing a var_dump() on $handle does give me a resource:
resource(33) of type (stream)
But I just get NULL when using fgetcsv().
I can get the contents of the file using file_get_contents(), but then it's more awkward to parse it as a CSV.
As I say, I can't really do anything about it being on PHP 4. Does anyone know what might be causing this, or shall I find another way?
Thanks
Your original issue may be related to the use of the temporary uploaded file.
Try opening it after move_uploaded_file().
Also, fseek($handle, 0) might help in theory, in case the file has already been read somewhere else.
I can get the contents of the file using file_get_contents
You can try to use tmpfile then:
$csv = file_get_contents($file);
$temp = tmpfile();
fwrite($temp, $csv);
fseek($temp, 0); // prepare for read at start
$data = fgetcsv($temp);
fclose($temp); // file autoremoved here
I was curious to run a test. My question is whether it is possible to open a file for both reading and writing, so that if I have several read-write operations to do on one file, I do not need to close it after reading, reopen it for writing, write, and so on in a loop.
$filename = "test.txt";
$handle = fopen($filename, "rwb");
fseek( $handle , 15360 );
$contents = fread($handle, 51200);
$start = microtime (true);
fseek( $handle , 1 );
fwrite ( $handle , $contents );
fclose($handle);
This test does not work. I expected that I could read the data, move the file pointer back to the beginning of the file (position 1 or 0) and then write the data. But this failed for some reason, with fwrite() returning 0 (int) bytes written. Hence my question: is it possible to do this, or do I need to close the file for reading first?
As a related sub-question - is it possible for multiple users to read or write the same file simultaneously from different positions? This should simulate database read/write operations. You know how MySQL works - multiple users can write to the same table, i.e. the same file, at any time. I know this is not a problem in C/C++, but is it possible in PHP?
You can create multiple file handles on the same file. Just fopen() it twice, once read-only, the other read/write. Although I'm not sure why you'd want to do so unless you're reading and writing from two different points in the file.
$filename = "test.txt";
$rw_handle = fopen($filename, "c+"); //open for read/write, allow fseek
$r_handle = fopen($filename, "r");
If you want to have multiple processes reading and writing a file from different locations, you'll want to use file locking with flock(), as sketched below.
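As a rough sketch of that locking idea (the filename and record are placeholders):

<?php
$handle = fopen('test.txt', 'c+');        // read/write, create if missing, don't truncate

if (flock($handle, LOCK_EX)) {            // block until we hold an exclusive lock
    fseek($handle, 0, SEEK_END);          // e.g. append at the end of the file
    fwrite($handle, "new record\n");
    fflush($handle);                      // flush output before releasing the lock
    flock($handle, LOCK_UN);              // release the lock
}
fclose($handle);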
What would be the best way to download a file from another domain in PHP?
e.g. a zip file.
The easiest way is file_get_contents(); a more advanced approach would be cURL, for example. You can store the data to your hard drive with file_put_contents().
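For example (the URL and local filename are placeholders):

<?php
$url = 'http://www.example.com/file.zip';

// Simplest: read the whole remote file into memory, then write it out.
file_put_contents('file.zip', file_get_contents($url));

// With cURL, streaming the response straight into a local file handle:
$fp = fopen('file.zip', 'wb');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);              // write the body into $fp
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow redirects
curl_exec($ch);
curl_close($ch);
fclose($fp);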
Normally, the fopen() functions work for remote files too, so you could do the following to avoid hitting the memory limit (but it's slower than file_get_contents()):
<?php
$remote = fopen("http://www.example.com/file.zip", "rb");
$local = fopen("local_name_of_file.zip", 'wb');
while (!feof($remote)) {
$content = fread($remote, 8192);
fwrite($local, $content);
}
fclose($local);
fclose($remote);
?>
copied from here: http://www.php.net/fread
You can do this with a single line of code:
copy($url, $destination);
This function returns TRUE on success and FALSE on failure.
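For example, checking the return value (the URL and destination are placeholders):

<?php
if (!copy('http://www.example.com/file.zip', 'local_name_of_file.zip')) {
    echo "Download failed\n";
}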
The code works fine, except that there is a problem here:
//Log Events
function logEvent($newinput) {
    if ($newinput !== NULL) {
        // Add a timestamp to the start of $newinput
        $newinput = date("Y/m/d H:i:s").': '.$newinput;
        $fp = fopen('log.txt', 'w');
        fwrite($fp, $newinput."\n");
        fclose($fp);
    }
}
//Problem writing these two lines to log.txt?
//The bad: the two lines below are not in log.txt
logEvent('Selection'.$selections[$selection]);
logEvent('Change' . $change. 'cents.');
//This line is written to a text file (log.txt), okay that's good.
logEvent('Input' . $newinput);
I think you're not appending to the file, you're rewriting it. Try fopen() with 'a' instead of 'w'.
You need to use the append mode when opening the file. You've used
fopen('log.txt', 'w');
which means that every time you call that function, the log file gets truncated and recreated. If you instead used
fopen('log.txt', 'a');
then your new log entries would be appended to the file.
You could also look into keeping the file open for later writes, but there may be issues with concurrent updates from other requests.
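As a sketch, the logging function from the question with append mode (file_put_contents() with the FILE_APPEND flag would work just as well):

<?php
function logEvent($newinput) {
    if ($newinput !== NULL) {
        // Add a timestamp to the start of $newinput
        $newinput = date("Y/m/d H:i:s") . ': ' . $newinput;
        $fp = fopen('log.txt', 'a');        // 'a' appends instead of truncating
        fwrite($fp, $newinput . "\n");
        fclose($fp);
    }
}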