I have a ton of data to send to the browser, maybe 100 MB or so. I've chunked it up into smaller files so I can simulate streaming. Let's say I have 200 files of 500 KB each. I build an array of the 200 files in JavaScript, then loop over it and make an Ajax call for each one. It works fine. Then I wanted to improve it, so I gzipped everything on the server and the chunks went down to about 20% of their original size. My Ajax call requests the following file:
fileserver.php?file=/temp/media_C46_20110719_113332_ori-0.js.gz
In fileserver.php, I have, very simply:
$filepath = isset($_GET['file']) ? $_GET['file'] : '';
if ($filepath != '') {
    if (substr($filepath, -2, 2) == 'gz') {
        header("Content-Type: text/plain");
        header("Content-Length: " . filesize($filepath));
        header("Content-Encoding: gzip");
        readfile($filepath);
    } else {
        header("Location: " . $filepath);
    }
}
Again, this totally works. The problem is that it takes forever! Looking at the Network tab in Chrome, it's taking 15 seconds or so to get a 100 KB chunk. I can download that file directly in less than a second, and the PHP script above should take virtually no time to run. I know the client (browser) needs a bit of time to inflate the content, but that's got to be under a second. So what's taking 15 seconds? Are there any other tools I can use to check this out?
I know I could set the header variables in Apache, but I don't have access to that, and doing it in PHP is functionally equivalent, right? Are those the correct headers to set?
I just figured out the problem. The filesize() function wasn't getting the correct path, so it was returning blank and the Content-Length header was empty. I fixed it to send the correct path and it works much, much faster now.
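For anyone hitting the same thing, a minimal sketch of the fix; resolving the path against the document root is an assumption about this particular setup:

// Resolve the path once so filesize() and readfile() see the same file.
$filepath = $_SERVER['DOCUMENT_ROOT'] . $_GET['file']; // assumed base path
if (is_file($filepath) && substr($filepath, -2) == 'gz') {
    header("Content-Type: text/plain");
    header("Content-Length: " . filesize($filepath)); // no longer blank
    header("Content-Encoding: gzip");
    readfile($filepath);
}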
I have a script called "image.php" that is used to count impressions and then print the image.
This script is called in this way:
<img src="path/image.php?id=12345" />
And it's used very often by my users; I see thousands of requests per day.
So I am looking to understand what is the best way to output the image at the end of this script:
Method 1 (actually in use):
header("Content-type: $mime"); //$mime is found with getimagesize function
readfile("$image_url");
exit;
Method 2 (pretty sure this is the slowest):
header("Content-type: $mime");
echo file_get_contents("$image_url");
exit;
Method 3:
header('Location: '.$image_url);
exit();
Is method 3 better / faster than method 1?
OK, first of all: Method 3 is way faster when it redirects to the original file.
The first two methods need file access to read the file, and they also don't use the browser cache!
Also, when you store the rendered images, you should let Apache handle your static files.
Apache is way faster than PHP, and it uses the right browser caching (3 or 4 times faster wouldn't be a surprise).
What happens is that when you request a static file, Apache sends a Last-Modified header.
If your client requests the same image again, it sends an If-Modified-Since header with that same date. If the file hasn't changed, your server responds with a 304 Not Modified header and no body, which saves you a lot of I/O operations (besides the ETag header, which is also used).
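If you do end up serving the image from PHP anyway, here is a minimal sketch of that conditional-GET handshake, with $image_path standing in for the file being served:

// Send Last-Modified, and answer a repeat request with 304 if the file
// hasn't changed since the client's cached copy. $image_path is assumed.
$mtime = filemtime($image_path);
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified'); // no body; the cache is still valid
    exit;
}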
For the impression count of the image, you could create a cronjob that parses your Apache access logs, so the end user won't even notice it. But in your case it's easier to count the impressions in your script and then redirect, as sketched below.
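A minimal sketch of that count-then-redirect version, where increment_impressions() is a hypothetical helper standing in for your actual stats storage:

$id = isset($_GET['id']) ? (int)$_GET['id'] : 0;
increment_impressions($id); // hypothetical helper; writes to your stats storage
// Redirect so Apache serves the static file with proper caching headers.
header('Location: ' . $image_url);
exit;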
Essentially, what readfile() does is read the file directly into the output buffer, while file_get_contents() loads the file into memory as a string. So when you output the result, the data is copied from memory into the output buffer, making it roughly twice as slow as readfile().
I have a networked camera that generates a video snapshot upon hitting http://192.0.0.8/image/jpeg.cgi. The problem is that by accessing the root (i.e. 192.0.0.8) directly, users can access a live video feed, so I hope to hide the address altogether.
My proposed solution is to use PHP to retrieve the image and display it at http://intranet/monitor/view.php. Although users could create motion by hitting this new address repeatedly, I see that as unlikely.
I have tried using include() and readfile() in various ways, but I don't use PHP often enough to know whether I'm going in the right direction. My best attempt to date output the raw JPEG contents, but I didn't save the code long enough to share it.
Any advice would be appreciated.
If you want to limit requests per user, then use this:
session_start(); // needed before $_SESSION is available
$timelimit = 30; // limit in seconds
if (!isset($_SESSION['last_request_time'])) {
    $_SESSION['last_request_time'] = time();
}
if (time() > $_SESSION['last_request_time'] + $timelimit) {
    // prepare and serve a new image
    $_SESSION['last_request_time'] = time(); // restart the window
} else {
    // serve an old image
}
If you want to limit the image refresh time for everyone, use the same script but save last_request_time in a place shared by all users (DB, file, cache), as sketched below.
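A minimal file-based sketch of that shared variant (the stamp-file path is an assumption):

$timelimit = 30; // limit in seconds
$stamp_file = '/tmp/last_snapshot_time'; // assumed shared location
$last = is_file($stamp_file) ? (int)file_get_contents($stamp_file) : 0;
if (time() > $last + $timelimit) {
    file_put_contents($stamp_file, time()); // restart the shared window
    // prepare and serve a new image
} else {
    // serve an old image
}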
A succinct way to do this is as follows:
header('Content-Type: image/jpeg');
readfile('http://192.0.0.8/image/jpeg.cgi'); // needs allow_url_fopen enabled for remote URLs
The content of the jpeg is then streamed back to the browser as a file, directly from http://intranet/monitor/view.php.
I'm currently looking into a way of showing the file download status on a page.
I know this isn't needed, since the user usually has a download status in the browser, but I would like to keep the user on the page they are downloading from for as long as the download lasts. To do that, the download status should match the status the file actually has (not a fake progress bar). Maybe it could also display the speed the user is downloading at and estimate the time remaining, based on the current download rate.
Can this be done using PHP and JavaScript, or does it really require Flash or Java?
Shouldn't there be information somewhere on the server about who is downloading what, at what speed, and how much?
Thank you for your help in advance.
Not really possible cross-browser, but have a look into http://markmail.org/message/kmrpk7w3h56tidxs#query:jquery%20ajax%20download%20progress+page:1+mid:kmrpk7w3h56tidxs+state:results for a pretty close effort. IE (as usual) is the main culprit for not playing ball.
You can do it with two separate PHP files: the first file handles the download process.
Like this:
$strtTime = time();
$download_rate = 120; // download rate in KB/s
$totalDw = 0; // bytes sent so far

$fp = fopen($real, "r"); // $real is the path of the file being served
flush(); // flush headers
while (!feof($fp)) {
    $downloaded = round($download_rate * 1024);
    echo fread($fp, $downloaded);
    ob_flush();
    flush();
    if (connection_aborted()) {
        // unlink("yourtempFile.txt");
        exit;
    }
    $totalDw += $downloaded;
    // file_put_contents("yourtempFile.txt", "downloaded: $totalDw ; StartTime: $strtTime");
    sleep(1);
}
fclose($fp);
// unlink("yourtempFile.txt");
The second file would then be read continuously via Ajax to report the contents of yourtempFile.txt. Sessions and cookies can't be used for this, because output has already started by the time they would be set.
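A minimal sketch of that second file, assuming the download script writes the progress line shown in the commented-out file_put_contents() above:

header('Content-Type: text/plain');
// Report whatever the download script last wrote.
if (is_readable('yourtempFile.txt')) {
    echo file_get_contents('yourtempFile.txt');
} else {
    echo 'no download in progress';
}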
My PHP script is outputting the contents of a .sql file, after it has been called by a POST request from my Delphi Desktop Client.
Here is what is happening:
My Desktop Client sends a POST request to my PHP Script.
The Script then calls mysqldump and generates a file - xdb_backup.sql
The Script then runs include "xdb_backup.sql";, which prints the dump and returns it to the Desktop Client, after which it deletes the SQL file.
The problem is that the size of the SQL file can vary (for testing, I generated one that is 6 MB). I would like my Desktop Client to be able to show the progress, but the PHP script does not expose its size, so I have no Progressbar.Max value to assign.
How can I make my PHP script let the Client know how big it is before the whole thing is over ?
Note: Downloading the SQL file is not an option, as the script has to destroy it. :)
You would do
$fsize = filesize($file_path);
where $file_path is the path to the generated xdb_backup.sql, to get the file size on the server, and then return it in the headers with the following line:
header("Content-Length: " . $fsize);
Take a look at http://www.hotscripts.com/forums/php/47774-download-script-not-sending-file-size-header-corrupt-files-since-using-remote-file-server.html which explains a PHP download script.
You have to send a Content-Length header using header function. Something like this:
header('Content-Length: '.filesize('yourfile.sql'));
You may want to send the file using readfile instead of include.
You can set the Content-Length header with the size of xdb_backup.sql.
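Putting the pieces of these answers together, a minimal sketch (the dump file name comes from the question; the MIME type is an assumption):

$backup = 'xdb_backup.sql';
header('Content-Type: application/octet-stream'); // assumed MIME type
header('Content-Length: ' . filesize($backup)); // lets the client size its progress bar
readfile($backup); // streams the file without loading it all into memory
unlink($backup); // destroy the dump once it has been sent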
I'm trying to find a way to have PHP indicate to the browser that all page output is complete. After the page is done, we run some statistics code that usually doesn't take too long, but in case it does I don't want the user's browser waiting for more data. This can't be done via JavaScript because it needs to work with mobile phones.
I'm already starting output buffering using
mb_http_output("UTF8");
ob_start("mb_output_handler");
to ensure I don't have issues with my site's multibyte text (Japanese). I was hoping that ob_end_flush() would do the trick, but if I place sleep(10); after the ob_end_flush(), the browser waits an additional 10 seconds. Does anyone have any ideas about this?
UPDATE:
Using jitter's approach below, I added "ob_gzhandler" to get it working with gzip. Does anyone see any possible issues here?
// maybe also add headers for cache-control and expires (IE)
header("Connection: close"); // tells browser that connection will be closed
ob_start();                  // outer buffer: collects the compressed output
ob_start("ob_gzhandler");    // inner buffer: gzips the page content
// page content
ob_end_flush();              // flush the gzip buffer into the outer buffer
$size = ob_get_length();     // size of the compressed output
header("Content-Length: $size");
ob_end_flush();              // flush the outer buffer to the browser
flush();
UPDATE AGAIN:
Please take another look at the code above. You need to do an ob_start(); before the ob_start("ob_gzhandler"); and then call ob_end_flush(); prior to calling ob_get_length(), so that you get the correct gzip-compressed size.
Use something along these lines
//may be also add headers for cache-control and expires (IE)
header("Connection: close"); //tells browser that connection will be closed
ob_start();
//generate your output
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
//continue statistic processing
I don't think there is a way to notify the browser that the output is complete, at least not from the script that sends the output. If you use some other script to monitor the output of the first one, maybe in an iframe, you might be able to do it.
As far as the browser is concerned, the output is complete when the page is considered loaded.
You could fork a new php process in the background and let that take care of the stats. Something like:
shell_exec('php stats.php &');
The & at the end makes sure that it's run in the background, so even if the stats.php takes 20 seconds, the visitor won't notice it.
You would probably need to pass data to the stats script, which you can do by passing in parameters, like this:
shell_exec('php stats.php -b '. escapeshellarg($_SERVER['HTTP_USER_AGENT']) .' &');
In stats.php, you'd use the $argv variable to get that data.
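For example, a minimal stats.php sketch reading that argument (the storage step is left as a stub):

// stats.php: $argv[0] is the script name, "-b" is $argv[1],
// and the user-agent string follows as $argv[2].
$user_agent = isset($argv[2]) ? $argv[2] : 'unknown';
// ... record $user_agent in your stats storage here ...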
But I wouldn't do this if the statistics code doesn't take that long to run, since forking a new process for every page load has some overhead. I don't know what makes the stats code take so long, but another solution might be to insert the raw data into a database and let a background job work on that data to create usable statistics. That could be done either by a cron job, or by having a screen session run in an infinite loop that processes the queue.
Try moving your statistics code to a separate function and calling that function with an Ajax call in the dom.ready or onload event in the JavaScript code on your rendered page, like in this meta code:
<html>
<script type="text/javascript">
dom.onready = Ajax.call(location.href + '?do_stats');
</script>
<body>...
</html>
The dom.ready event can be provided by the jQuery or Prototype libraries. The downside is that it will only work with JS enabled.
Alternatively, you could just record all the needed information for the stats in a database and dispatch a script that collects the queued data from there and works on it in the background, e.g. by using cron, as sketched below.
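A minimal sketch of that queue idea: the page request only does a cheap INSERT, and a cron job aggregates later (the DSN, credentials, and table layout are assumptions):

// Cheap write on the page request; a cron job aggregates raw_hits later.
$pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO raw_hits (page, user_agent, hit_time) VALUES (?, ?, NOW())');
$stmt->execute(array($_SERVER['REQUEST_URI'], $_SERVER['HTTP_USER_AGENT']));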