Question about this helper http://codeigniter.com/user_guide/helpers/download_helper.html
If, for example, program.exe weighs 4 GB, will it take a lot of PHP memory to read and deliver that file?
$data = file_get_contents("/path/to/program.exe"); // Read the file's contents
$name = 'software.exe';
force_download($name, $data);
The force_download function just sets the proper HTTP headers to make the client's browser treat the response as a download, then echoes the data you pass it; it never opens the file itself.
Check the helper source code, if you need: https://bitbucket.org/ellislab/codeigniter-reactor/src/31b5c1dcf2ed/system/helpers/download_helper.php
Edit: I'd suggest creating your own version of the helper and, instead of using strlen to get the file size, using the PHP function filesize, which takes only the file name as an argument and returns the size in bytes.
More info at http://www.php.net/manual/en/function.filesize.php
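For the memory question above, the relevant change is just the size header; a minimal sketch, assuming the same hypothetical path as the question:
// filesize() asks the filesystem for the size in bytes, so a 4 GB file
// never has to be loaded into a PHP string just to compute strlen($data)
$path = '/path/to/program.exe';
header('Content-Length: ' . filesize($path));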
Yea... that could get... bad...
file_get_contents reads the entire contents of a file into a string. For large files, that can get, well, bad. I would look into readfile. Remember too: since CI automatically buffers the output when you load a view, there will be no discernible benefit to readfile if it is used inside a CI view. It would almost be better to handle this with an external script, or by outputting directly from the controller and not calling a view at all.
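A minimal sketch of that last idea, streaming straight from a controller method with no view (the method name and file path here are hypothetical):
// Hypothetical controller action; $path points at the real file on disk
public function download()
{
    $path = '/path/to/program.exe';
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="software.exe"');
    header('Content-Length: ' . filesize($path)); // size without reading the file
    readfile($path); // streams the file to the client in chunks, not one big string
    exit;            // no view is loaded, so nothing gets buffered
}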
Related
I am trying to generate an archive on-the-fly in PHP and send it to the user immediately (without saving it). I figured there would be no need to create a file on disk, as the data I'm sending isn't persistent anyway; however, upon searching the web, I couldn't find out how. I also don't care about the file format.
So, the question is:
Is it possible to create and manipulate a file archive in memory within a PHP script, without creating a temp file along the way?
I had the same problem but finally found a somewhat obscure solution and decided to share it here.
I came across the great zip.lib.php/unzip.lib.php scripts that ship with phpMyAdmin and are located in the "libraries" directory.
Using zip.lib.php worked like a charm for me:
require_once(LIBS_DIR . 'zip.lib.php');
...
//create the zip
$zip = new zipfile();
//add files to the zip, passing file contents, not actual files
$zip->addFile($file_content, $file_name);
...
//prepare the proper content type
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=my_archive.zip");
header("Content-Description: Files of an applicant");
//get the zip content and send it back to the browser
echo $zip->file();
This allows downloading a zip without the added files having to exist as real files, and without saving the zip itself as a file.
It is a shame that this functionality is not part of a more generic PHP library.
Here is a link to the zip.lib.php file from the phpmyadmin source:
https://github.com/phpmyadmin/phpmyadmin/blob/RELEASE_4_5_5_1/libraries/zip.lib.php
UPDATE:
Make sure you remove the following check from the beginning of zip.lib.php as otherwise the script just terminates:
if (! defined('PHPMYADMIN')) {
    exit;
}
UPDATE:
This code is available on the CodeIgniter project as well:
https://github.com/patricksavalle/CodeIgniter/blob/439ac3a87a448ae6c2cbae0890c9f672efcae32d/system/helpers/zip_helper.php
What are you using to generate the archive? You might be able to use the php://temp or php://memory streams to read and write to/from the archive.
See http://php.net/manual/en/wrappers.php.php
Regarding your comment that php://temp works for you except when you close it: try keeping it open, flushing the output, then rewinding it back to 0 and reading it.
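A minimal sketch of that rewind-and-read idea, assuming the archive bytes have been written to a php://temp handle by whatever is building the archive:
$fp = fopen('php://temp', 'r+');
// ... the archive data gets written to $fp here ...
rewind($fp); // back to offset 0 instead of closing the stream
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="archive.zip"');
fpassthru($fp); // send everything from the current position to the client
fclose($fp);    // only close once the data has gone out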
Look here for more examples: http://us.php.net/manual/en/function.tmpfile.php
Also research output buffering and capturing: http://us.php.net/manual/en/function.ob-start.php
You need to use ZipArchive::addFromString - if you use addFile(), the file is not actually added until you close the archive. (Horrible bug IMHO - what if you are trying to move files into a zip and you delete them before you close the zip?)
The addFromString() method adds it to the archive immediately.
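A minimal sketch with ZipArchive (note it still needs a path to open, so this isn't fully in-memory; the temp path and entry name below are assumptions):
$tmp = tempnam(sys_get_temp_dir(), 'zip'); // ZipArchive still wants a file path to open
$zip = new ZipArchive();
$zip->open($tmp, ZipArchive::OVERWRITE);
// addFromString() takes the entry data from a string right away,
// so the source never has to exist as a separate file on disk
$zip->addFromString('readme.txt', $file_content);
$zip->close();
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="my_archive.zip"');
readfile($tmp);
unlink($tmp); // clean up the temp file after sending it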
Is there really a performance issue here, or does it just offend your sense of rightness? A lot of processes write temporary files and delete them, and often they never hit the disk due to caching.
A tempfile is automatically deleted when closed. That's its nature.
There are only two ways I can think of to create a zip file in memory and serve it, and both are probably more trouble than they are worth:
Use a RAM disk.
Modify the ZipArchive class to add a method that does everything the close() method does, except actually close the file. (Or add a leave-open parameter to close().)
This might not even be possible depending on the underlying C libraries.
I have a script called "image.php" that is used to count impressions and then print the image.
This script is called in this way:
<img src="path/image.php?id=12345" />
And it's used very often by my users; I see thousands of requests per day.
So I am looking to understand what is the best way to output the image at the end of this script:
Method 1 (currently in use):
header("Content-type: $mime"); //$mime is found with getimagesize function
readfile("$image_url");
exit;
Method 2 (pretty sure this is the slowest):
header("Content-type: $mime");
echo file_get_contents("$image_url");
exit;
Method 3:
header('Location: '.$image_url);
exit();
Is method 3 better / faster than method 1?
OK, first of all, Method 3 is way faster because it redirects the client to the original file.
The first two methods need file access to read the file, and they also don't use the browser cache!
Also, when you store the rendered images, you are better off letting Apache handle your static files.
Apache is way faster than PHP and applies proper browser caching (3 or 4 times faster wouldn't be a surprise).
What happens is that when you request a static file, Apache sends the Last-Modified header.
If your client requests the same image again, it sends an If-Modified-Since header with that same date. If the file hasn't changed, your server responds with a 304 Not Modified header without any data, which saves you a lot of IO operations (besides the ETag header, which is also used).
For your impression count of the image, you could create a cron job that parses your Apache access logs, so the end user won't even notice it. But in your case it's easier to count the impressions in your script and then redirect.
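If you do keep serving the bytes from image.php (Method 1), a minimal sketch of honouring those conditional requests yourself; the count_impression() call is a hypothetical placeholder for whatever counting you already do:
$path  = $image_url; // assumes a local path to the image on disk
$mtime = filemtime($path);
count_impression($_GET['id']); // hypothetical: your existing impression counting
// answer 304 Not Modified when the browser already has this version cached
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    && strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}
header('Content-Type: ' . $mime);
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
readfile($path);
exit;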
Essentially, readfile reads the file directly into the output buffer, while file_get_contents loads the file into memory as a string. So when you output the result, the data is copied from memory into the output buffer, making it roughly twice as slow as readfile.
I am trying to fetch the meta information from URL results passed after a search. I have been using the OpenGraph library and also PHP's native get_meta_tags function to retrieve the meta tags.
My problem is when I am reading through the contents of a URL that happens to have a .m4v extension. The program tries to read the contents of that file, but it is way too large (and, not to mention, completely useless, as it is all junk) and my program refuses to let it go. Therefore, I am stuck until the program throws a timeout error and moves on.
Is there any way to stop reading the contents of the file if it is way too large? I tried file_get_contents() with the maxlen parameter, but it still seems to read through the entire page. How can I quickly determine if a file is structured with tags before I dive in to farm it for meta?
get_headers() is what you need; there are Content-Type and Content-Length entries in the response that you might be interested in.
You might want to:
$headers=get_headers($url,1);
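A minimal sketch of using those headers to decide whether the URL is worth fetching at all (the 1 MB cutoff is an arbitrary assumption):
$headers = get_headers($url, 1); // 1 = return an associative array keyed by header name
$type   = isset($headers['Content-Type'])   ? $headers['Content-Type']   : '';
$length = isset($headers['Content-Length']) ? $headers['Content-Length'] : 0;
if (is_array($type))   { $type   = end($type); }   // redirects can turn these into arrays
if (is_array($length)) { $length = end($length); }
// only download and parse things that look like HTML and are reasonably small
if (stripos($type, 'text/html') !== false && (int) $length < 1048576) {
    $meta = get_meta_tags($url);
}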
Use PHP's filesize($yourFile) to find the file size in bytes:
$size = filesize($yourFile);
if ($size < 1000) {
    $string = file_get_contents($yourFile);
}
I've got a script, largely based on an example PHP upload script from the jQuery Uploader. It gets the file type with the following code (falling back to this $_FILES component)...
$fileType = (isset($_SERVER['HTTP_X_FILE_TYPE']) ? $_SERVER['HTTP_X_FILE_TYPE'] : $upload['type']);
Note: $upload['type'] comes from $_FILES['files']['type'].
Now, this is fine - except for the fact that some files seem to have no file type information from this. I can get more accurate results using PHP's fileinfo and MIME type functions, but they don't work on $_FILES entries directly, and I'm trying to do this check before I transfer the file to S3, so I don't really want to load it locally.
Can anyone advise whether there's something I can do to get a more accurately reported type from $_FILES, or is the file going to have to be loaded locally in order to run these alternative PHP functions?
finfo is the only way to do this. You cannot rely on information the client sends you, it is far too easy to fake from the client side.
There is no reason it won't work with $_FILES; you would simply pass $_FILES['files']['tmp_name'] as the file path. This is still a valid file path, and you don't need to call move_uploaded_file() to access the data. Leaving the file in the temp location also has the advantage that it will be destroyed when the script finishes if you haven't done anything with it.
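A minimal sketch of that, reading the MIME type from the uploaded temp file before pushing it to S3 (the field names follow the question's $_FILES['files'] layout):
$tmpPath  = $_FILES['files']['tmp_name'];  // still a normal readable path while the script runs
$finfo    = new finfo(FILEINFO_MIME_TYPE); // requires the fileinfo extension
$fileType = $finfo->file($tmpPath);        // e.g. "image/png", based on the actual file contents
// $fileType can now be sent along with the upload to S3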