the relevant code for the download script:
$fp = @fopen($file, 'rb');
while (!feof($fp) && ($p = ftell($fp)) <= $end) {
    if ($p + $buffer > $end) {
        // last chunk: shrink the read so output stops exactly at $end
        $buffer = $end - $p + 1;
    }
    echo fread($fp, $buffer);
    flush();
    if ($limits["Max_DL"]) sleep(1);
}
fclose($fp);
While a download is in progress, any other pages for the same site don't load, but they do still load in another browser. I'm thinking this has something to do with the fact that the download page is continuously "loading", which stops other pages from loading?
Any suggestions on how I can fix this? For large files I don't want the user to be unable to browse the site while a download is in progress.
If your download script calls session_start() at any point, you will need to call session_write_close() before you stream your file to the user.
This closes out the PHP session file and allows users to load another page which presumably is calling session_start (and thus is waiting for a lock on the session file).
NOTE: You can still read $_SESSION after calling session_write_close(), but any modification will be thrown away - you just told PHP you're done making changes to the session.
More info:
http://us2.php.net/manual/en/function.session-write-close.php
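For example, a minimal sketch of how the download script could be structured; the permission check is a placeholder for whatever your script actually reads from the session:
<?php
session_start();

// read whatever you need from the session while it is still open
if (empty($_SESSION['user_id'])) {     // hypothetical permission check
    header('HTTP/1.1 403 Forbidden');
    exit;
}

session_write_close();                 // release the session lock before the long download loop

// ... open the file and run the fread()/echo/flush() loop shown above ...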
PHP is not the best solution for delivering large files as you'll occupy a process on the server for each user for the entire length of their download; if you are using Apache or nginx you should look into using mod_xsendfile or XSendfile to serve files.
If that's not possible, you could always try streamlining the process of delivering the file a little by using the readfile() function.
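If you do stay in PHP, a minimal delivery sketch could look like the following; the path and filename are placeholders, and you should run your own access checks first:
<?php
$path = '/srv/files/archive.zip';   // hypothetical path outside the web root

session_write_close();              // release the session lock if a session is open
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="archive.zip"');
header('Content-Length: ' . filesize($path));

// readfile() streams the file to the client; drop any output buffers first
// so the whole file is not collected in memory before being sent
while (ob_get_level() > 0) {
    ob_end_clean();
}
readfile($path);
exit;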
<?php
// cache - will work online - not locally

// location and prefix for cache files
define('CACHE_PATH', "siteCache/");
// how long to keep the cache files (hours)
define('CACHE_TIME', 12);

// return location and name for cache file
function cache_file()
{
    return CACHE_PATH . md5($_SERVER['REQUEST_URI']);
}

// display cached file if present and not expired
function cache_display()
{
    $file = cache_file();
    // check that cache file exists and is not too old
    if (!file_exists($file)) return;
    if (filemtime($file) < time() - CACHE_TIME * 3600) return;
    // if so, display cache file and stop processing
    readfile($file);
    exit;
}

// write to cache file
function cache_page($content)
{
    if (false !== ($f = @fopen(cache_file(), 'w'))) {
        fwrite($f, $content);
        fclose($f);
    }
    return $content;
}

// execution stops here if valid cache file found
cache_display();
// enable output buffering and create cache file
ob_start('cache_page');
?>
This is the cache code I am using on a dynamic website; it sits in the db file, and every page contains this code at the top:
<?php session_start();
include("db.php"); ?>
Pages are being cached and it works, but on form submission, user login, or passing variables between pages, nothing happens; the old cached pages are displayed. How do I use this caching code so that it works but the site remains functional as well?
I wonder how WordPress plugins do it. WP Super Cache and W3 Total Cache cache everything, yet the blog remains functional. Should I use caching selectively on parts of the website?
Like this:
<?php
// TOP of your script
$cachefile = 'cache/'.basename($_SERVER['SCRIPT_URI']);
$cachetime = 120 * 60; // 2 hours
// Serve from the cache if it is younger than $cachetime
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile))) {
    include($cachefile);
    echo "<!-- Cached ".date('jS F Y H:i', filemtime($cachefile))." -->";
    exit;
}
ob_start(); // start the output buffer
// Your normal PHP script and HTML content here
// BOTTOM of your script
$fp = fopen($cachefile, 'w'); // open the cache file for writing
fwrite($fp, ob_get_contents()); // save the contents of output buffer to the file
fclose($fp); // close the file
ob_end_flush(); // Send the output to the browser
?>
But it will not work either, because it caches by page URL (whole-page caching), not selective content within a page.
Please advise. Is there any easy script to do this? Pear::Cache_Lite seems good, but it looks difficult to implement.
Update: I have used Cache_Lite. It's the same: it caches everything, or the included PHP file. There are a few configuration options to play with, but if used for the whole page it will also ignore GET, POST and session data updates, and will show previously cached pages until they are deleted.
I think you could separate display from logic.
I mean, change the action attribute of the form and point it to a PHP script that does not have the cache logic (and you must check the referer or another parameter, using tokens, sessions, or something similar, to avoid security issues like CSRF).
Another thing I want to point out is that you should look to cache only the most visited pages (e.g. the homepage); generally there is no "one size fits all" with caching, and it is better not to worry about pages that don't have speed/load issues. It may also be better to cache the data itself if your speed issues come from a database query (you should profile your application before implementing caching).
Another approach that might work is checking the request method and disabling the cache if it is a POST (given that all your forms use the POST method), using $_SERVER['REQUEST_METHOD'] == 'POST', as sketched below.
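For example, a check like this at the top of the db file would skip the cache for form submissions entirely; a sketch reusing the cache_display()/cache_page() functions from the question:
<?php
// serve and build the cache only for non-POST requests,
// so form submissions always reach the real PHP code
if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
    cache_display();            // stops here if a valid cache file exists
    ob_start('cache_page');     // otherwise buffer the page and write it to the cache
}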
I'm having the following problem with my VPS server.
I have a long-running PHP script that sends big files to the browser. It does something like this:
<?php
header("Content-type: application/octet-stream");
readfile("really-big-file.zip");
exit();
?>
This basically reads the file from the server's file system and sends it to the browser. I can't just use direct links (and let Apache serve the file) because there is business logic in the application that needs to be applied.
The problem is that while such download is running, the site doesn't respond to other requests.
The problem you are experiencing is related to the fact that you are using sessions. When a script has a running session, it locks the session file to prevent concurrent writes which may corrupt the session data. This means that multiple requests from the same client - using the same session ID - will not be executed concurrently, they will be queued and can only execute one at a time.
Multiple users will not experience this issue, as they will use different session IDs. This does not mean that you don't have a problem, because you may conceivably want to access the site whilst a file is downloading, or set multiple files downloading at once.
The solution is actually very simple: call session_write_close() before you start to output the file. This will close the session file, release the lock and allow further concurrent requests to execute.
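Applied to the example above, that is simply something like:
<?php
session_start();                 // wherever your business logic starts the session

// ... business logic that needs $_SESSION ...

session_write_close();           // release the session lock before streaming

header("Content-type: application/octet-stream");
readfile("really-big-file.zip");
exit();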
Your server setup is probably not the only place you should be checking.
Try doing a request from your browser as usual and then do another from some other client.
Either wget from the same machine or another browser on a different machine.
In what way doesn't the server respond to other requests? Is it "Waiting for example.com..." or does it give an error of any kind?
I do something similar, but I serve the file chunked, which gives the file system a break while the client accepts and downloads a chunk. That is better than offering up the entire thing at once, which is pretty demanding on the file system and the entire server.
EDIT: While not the answer to this question, the asker asked about reading a file in chunks. Here's the function that I use. Supply it with the full path to the file.
function readfile_chunked($file_path, $retbytes = true)
{
    $buffer = '';
    $cnt = 0;
    $chunksize = 1 * (1024 * 1024); // 1 = 1MB chunk size
    $handle = fopen($file_path, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
I have tried different approaches (reading and sending the files in small chunks [see the comments on readfile in the PHP docs], using PEAR's HTTP_Download), but I always ran into performance problems when the files got big.
There is an Apache module, mod_xsendfile, that lets you do your business logic in PHP and then delegate the download to Apache; the file does not even have to be publicly accessible. I think this is the most elegant solution to the problem (see the sketch after the links below).
More Info:
http://tn123.org/mod_xsendfile/
http://www.brighterlamp.com/2010/10/send-files-faster-better-with-php-mod_xsendfile/
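With mod_xsendfile installed and enabled (XSendFile On) for your host, the PHP side reduces to something like this sketch; the file path is a placeholder:
<?php
// ... business logic / permission checks ...

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="really-big-file.zip"');
// Apache picks the file up from disk and handles the actual transfer
header('X-Sendfile: /data/files/really-big-file.zip');
exit;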
The same thing happens to me, and I'm not using sessions.
session.auto_start is set to 0.
My example script only runs sleep(5), and adding session_write_close() at the beginning doesn't solve the problem.
Check your httpd.conf file. Maybe you have "KeepAlive On" and that is why your second request hangs until the first is completed. In general, your PHP script should not make visitors wait for a long time. If you need to download something big, do it in a separate internal request that the user has no direct control over. Until it's done, return some "executing" status to the end user, and when it's done, process the actual results.
Here's my code:
$cachefile = "cache/ttcache.php";
if (file_exists($cachefile) && ((time() - filemtime($cachefile)) < 900))
{
    include($cachefile);
}
else
{
    ob_start();
    /* resource-intensive loop that outputs
       a listing of the top tags used on the website */
    $fp = fopen($cachefile, 'w');
    fwrite($fp, ob_get_contents());
    fflush($fp);
    fclose($fp);
    ob_end_flush();
}
This code seemed like it worked fine at first sight, but I found a bug, and I can't figure out how to solve it. Basically, it seems that after I leave the page alone for a period of time, the cache file empties (either that, or when I refresh the page, it clears the cache file, rendering it blank). Then the conditional sees the now-blank cache file, sees its age as less than 900 seconds, and pulls the blank cache file's contents in place of re-running the loop and refilling the cache.
I catted the cache file in the command line and saw that it is indeed blank when this problem exists.
I tried setting it to 60 seconds to replicate this problem more often and hopefully get to the bottom of it, but it doesn't seem to replicate if I am looking for it, only when I leave the page and come back after a while.
Any help?
In the caching routines that I write, I almost always check the filesize, as I want to make sure I'm not spewing blank data, because I rely on a bash script to clear out the cache.
if(file_exists($cachefile) && (filesize($cachefile) > 1024) && ((time() - filemtime($cachefile)) < 900))
This assumes that your outputted cache file is > 1024 bytes, which it usually will be if it's anything relatively large. Adding a lock file would be useful as well, as noted in the comments above, to avoid multiple processes trying to write to the same cache file; one way to make the write itself safe is sketched below.
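One way to avoid ever serving a half-written or empty cache file is to write to a temporary file first and rename() it into place only when the content is non-empty; rename() is atomic on the same filesystem. A sketch, reusing $cachefile from the question:
ob_start();
/* resource-intensive loop that outputs the top-tags listing */
$content = ob_get_contents();

if ($content !== '') {
    $tmpfile = $cachefile . '.' . getmypid() . '.tmp';   // per-process temp name
    if (file_put_contents($tmpfile, $content) !== false) {
        rename($tmpfile, $cachefile);                    // atomic swap; readers never see a partial file
    }
}
ob_end_flush();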
You can double-check the file size with the filesize() function; if it's too small, act as if the cache were stale.
If there's no PHP in the file, you may want to use readfile() for performance reasons to just send the file straight back to the end user.
This first script gets called several times for each user via an AJAX request. It calls another script on a different server to get the last line of a text file. It works fine, but I think there is a lot of room for improvement. I am not a very good PHP coder, so I am hoping that with the help of the community I can optimize this for speed and efficiency:
AJAX POST Request made to this script
<?php session_start();
$fileName = $_POST['textFile'];
$result = file_get_contents($_SESSION['serverURL']."fileReader.php?textFile=$fileName");
echo $result;
?>
It makes a GET request to this external script which reads a text file
<?php
$fileName = $_GET['textFile'];
if (file_exists('text/'.$fileName.'.txt')) {
    $lines = file('text/'.$fileName.'.txt');
    echo $lines[sizeof($lines)-1];
} else {
    echo 0;
}
?>
I would appreciate any help. I think there is more improvement that can be made in the first script. It makes an expensive function call (file_get_contents), or at least I think it's expensive!
This script should limit the locations and file types that it's going to return.
Think of somebody trying this:
http://www.yoursite.com/yourscript.php?textFile=../../../etc/passwd (or something similar)
Try to find out where the delays occur: does the HTTP request take long, or is the file so large that reading it takes long?
If the request is slow, try caching results locally.
If the file is huge, then you could set up a cron job that extracts the last line of the file at regular intervals (or at every change), and save that to a file that your other script can access directly.
readfile() is your friend here: it reads a file on disk and streams it to the client.
script 1:
<?php
session_start();
// added basic argument filtering
$fileName = preg_replace('/[^A-Za-z0-9_]/', '', $_POST['textFile']);
$fileName = $_SESSION['serverURL'].'text/'.$fileName.'.txt';
if (file_exists($fileName)) {
    // script 2 could be pasted here
    // for the entire file
    //readfile($fileName);
    // for just the last line
    $lines = file($fileName);
    echo $lines[count($lines)-1];
    exit(0);
}
echo 0;
?>
This script could further be improved by adding caching to it, but that is more complicated.
Very basic caching could look like this:
script 2:
<?php
$lastModifiedTimeStamp = filemtime($fileName);
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])) {
    $browserCachedCopyTimestamp = strtotime(preg_replace('/;.*$/', '', $_SERVER['HTTP_IF_MODIFIED_SINCE']));
    if ($browserCachedCopyTimestamp >= $lastModifiedTimeStamp) {
        header("HTTP/1.0 304 Not Modified");
        exit(0);
    }
}
header('Content-Length: '.filesize($fileName));
header('Expires: '.gmdate('D, d M Y H:i:s \G\M\T', time() + 604800)); // (3600 * 24 * 7)
header('Last-Modified: '.gmdate('D, d M Y H:i:s \G\M\T', $lastModifiedTimeStamp));
?>
First things first: Do you really need to optimize that? Is that the slowest part in your use case? Have you used xdebug to verify that? If you've done that, read on:
You cannot really optimize the first script usefully: if you need an HTTP request, you need an HTTP request. Skipping the HTTP request could be a performance gain, though, if it is possible (i.e. if the first script can access the same files the second script would operate on).
As for the second script: reading the whole file into memory does look like some overhead, but that is negligible if the files are small. The code looks very readable; I would leave it as is in that case.
If your files are big, however, you might want to use fopen() and its friends fseek() and fread():
# Do not forget to sanitize the file name here!
# An attacker could demand the last line of your password
# file or similar! ($fileName = '../../passwords.txt')
$filePointer = fopen($fileName, 'r');
$i = 1;
$chunkSize = 200;
# Read 200 byte chunks from the file and check if the chunk
# contains a newline
do {
    fseek($filePointer, -($i * $chunkSize), SEEK_END);
    $line = fread($filePointer, $i++ * $chunkSize);
} while (($pos = strrpos($line, "\n")) === false);
return substr($line, $pos + 1);
If the files are unchanging, you should cache the last line.
If the files are changing and you control the way they are produced, it might or might not be an improvement to reverse the order lines are written, depending on how often a line is read over its lifetime.
Edit:
Your server could figure out what it wants to write to its log, put it in memcache, and then write it to the log. The request for the last line could then be fulfilled from memcache instead of a file read, as sketched below.
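A rough sketch of that idea with the Memcached extension; the server address, key names and expiry are assumptions:
<?php
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

// writer: whenever a line is appended to the log, remember it in memcache too
function append_log_line($logFile, $line, Memcached $memcached)
{
    file_put_contents($logFile, $line . "\n", FILE_APPEND);
    $memcached->set('lastline:' . basename($logFile), $line, 3600);
}

// reader: the AJAX endpoint answers from memcache and only
// falls back to reading the file when the key is missing
function last_log_line($logFile, Memcached $memcached)
{
    $line = $memcached->get('lastline:' . basename($logFile));
    if ($line !== false) {
        return $line;
    }
    $lines = file($logFile);
    return rtrim(end($lines));
}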
The most probable source of delay is that cross-server HTTP request. If the files are small, the cost of fopen/fread/fclose is nothing compared to the whole HTTP request.
(Not long ago I used HTTP to retrieve images to dynamically generate image-based menus. Replacing the HTTP request with a local file read reduced the delay from seconds to tenths of a second.)
I assume that the obvious solution of accessing the file server filesystem directly is out of the question. If not, then it's the best and simplest option.
If it is out of the question, you could use caching: instead of getting the whole file, just issue a HEAD request and compare the timestamp to a local copy, as in the sketch below.
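A rough sketch of the HEAD-request idea with cURL; the cache location is an assumption, and it only works if the remote script sends a Last-Modified header:
<?php
$url       = $_SESSION['serverURL'] . 'fileReader.php?textFile=' . urlencode($fileName);
$cacheFile = sys_get_temp_dir() . '/lastline_' . md5($url);

// ask for the headers only
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_FILETIME, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$remoteTime = curl_getinfo($ch, CURLINFO_FILETIME);   // -1 if no Last-Modified was sent
curl_close($ch);

if ($remoteTime > 0 && is_file($cacheFile) && filemtime($cacheFile) >= $remoteTime) {
    echo file_get_contents($cacheFile);               // local copy is still current
} else {
    $result = file_get_contents($url);                // full GET, then refresh the local copy
    file_put_contents($cacheFile, $result);
    echo $result;
}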
Also, if you are ajax-updating a lot of clients based on the same files, you might consider looking at using comet (meteor, for example). It's used for things like chats, where a single change has to be broadcasted to several clients.
I'm trying to write a script that will create a file on the server then use header() to redirect the user to that file. Then, after about 10 seconds I want to delete the file. I've tried this:
header('Location: '.$url);
flush();
sleep(10);
unlink($url);
But the browser just waits for the script to complete and then gets redirected, and by that time the file has been deleted. Is there some way to tell the browser "end of file", then keep computing? Or maybe have PHP start another script, but not wait for that script to finish?
You might be better off having the PHP page serve the file. There is no need to create a temporary file and delete it in this case; just send out the data you intended to write to the temporary file. You will need to set the headers correctly so the browser can identify the type of file you are sending, e.g. Content-Type: text/xml for XML or image/jpeg for JPEGs.
This method also handles slow clients that take longer to download the file.
The only way I've discovered to do this so far is to provide the content length in the header. Try adding this:
header("Content-Length: 0");
before your flush();
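Combined with the original code, the idea looks roughly like this; whether the browser really treats the response as finished at that point depends on the server and SAPI:
<?php
header('Location: ' . $url);
header('Content-Length: 0');
header('Connection: close');
flush();            // try to push the (empty) response out to the browser now

sleep(10);
unlink($file);      // $file: filesystem path of the generated file; unlink() does not take a URL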
http://us2.php.net/ignore_user_abort
Be very careful using this; you can pretty quickly kill a server by abusing it.
Alternatively... instead of messing with dynamically generating files on the fly, why not make a handler like so:
tempFile.php?key={md5 hash}
tempFile.php then either queries a DB, memcache (with an additional prepended key), or APC for the content.
You could try doing something like this:
<iframe src="<?=$url?>"></iframe>
....
<?
sleep(10);
unlink($url);
?>
Another option is to use cURL: you fetch the file in the request and display it to the user.
A question: do you want to delete the file so that the user cannot have it? I'm afraid that's impossible; when the user loads the file it is temporarily in his browser, so he can save it.
Next option: if you know the type of the file, you can send a Content-Type header so the user will download the file, and then you delete it.
These are just simple ideas; I don't know which (if any) will work for you.
If you want to implement your original design, read this question about running a command in PHP that is "fire and forget": Asynchronous shell exec in PHP
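A minimal fire-and-forget sketch along those lines; the delete_later.php helper is hypothetical and would just sleep(10) and unlink($argv[1]):
<?php
header('Location: ' . $url);

// hand the delayed cleanup to a detached background process and finish this request
$cmd = 'php ' . escapeshellarg(__DIR__ . '/delete_later.php') . ' ' . escapeshellarg($file);
exec($cmd . ' > /dev/null 2>&1 &');
exit;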
As seen in \Symfony\Component\HttpFoundation\Response::send():
/**
 * Sends HTTP headers and content.
 *
 * @return Response
 *
 * @api
 */
public function send()
{
    $this->sendHeaders();
    $this->sendContent();

    if (function_exists('fastcgi_finish_request')) {
        fastcgi_finish_request();
    } elseif ('cli' !== PHP_SAPI) {
        // ob_get_level() never returns 0 on some Windows configurations, so if
        // the level is the same two times in a row, the loop should be stopped.
        $previous = null;
        $obStatus = ob_get_status(1);
        while (($level = ob_get_level()) > 0 && $level !== $previous) {
            $previous = $level;
            if ($obStatus[$level - 1]) {
                if (version_compare(PHP_VERSION, '5.4', '>=')) {
                    if (isset($obStatus[$level - 1]['flags']) && ($obStatus[$level - 1]['flags'] & PHP_OUTPUT_HANDLER_REMOVABLE)) {
                        ob_end_flush();
                    }
                } else {
                    if (isset($obStatus[$level - 1]['del']) && $obStatus[$level - 1]['del']) {
                        ob_end_flush();
                    }
                }
            }
        }

        flush();
    }

    return $this;
}
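If you run PHP-FPM, the same trick applies directly to the original problem; fastcgi_finish_request() only exists under that SAPI, hence the function_exists() guard above. A sketch:
<?php
header('Location: ' . $url);

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();   // the redirect is delivered to the browser here
}

// the script keeps running server-side after the client has its response
sleep(10);
unlink($file);                  // $file: filesystem path of the temporary file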
You're going about this the wrong way. You can create the file and serve it to them, and delete it in one step.
<?php
$file_contents = 'these are the contents of your file';
$random_filename = md5(time()+rand(0,10000)).'.txt';
$public_directory = '/www';
$the_file = $public_directory.'/'.$random_filename;
file_put_contents($the_file, $file_contents);
echo file_get_contents($the_file);
unlink($the_file);
?>
If you do it that way, the files get deleted immediately after the user sees them. Of course, this means that the file need not exist in the first place. So you could shorten the code to this:
<?php
$file_contents = 'these are the contents of your file';
echo $file_contents;
?>
It all depends on where you're getting the content you want to show them. If it's from a file, try:
<?php
$file_contents = file_get_contents($filename_or_url);
echo $file_contents;
?>
As for deleting files automatically, just set up a cron job that runs frequently and deletes all the files in your temp folder where time() - filemtime($filename) is greater than, say, 5 minutes, as sketched below.
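A cleanup script run from cron could look roughly like this; the directory, file pattern and age threshold are assumptions:
<?php
// delete generated files in the temp folder that are older than 5 minutes
$dir    = __DIR__ . '/temp';   // hypothetical temp directory
$maxAge = 5 * 60;              // seconds

foreach (glob($dir . '/*.txt') as $file) {
    if (is_file($file) && (time() - filemtime($file)) > $maxAge) {
        unlink($file);
    }
}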