PHP connection_aborted() does not work when the page changes

I'm trying to serve an image while adding a MySQL row for each second the image was viewed.
I'm serving it in chunks of 1024 bytes (the total size of the image is 20 KB).
The problem is that if I load the page where the image is displayed and then close the window, or click a link that takes me to a different page, the script keeps running and does not die as it should.
ignore_user_abort(false);
$file = 'a.jpg';
header('Content-Type: image/jpeg');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Content-Length: ' . filesize($file));
ob_clean();
flush();
$conn = mysql_connect("localhost", "user", "pass");
mysql_select_db("mydb", $conn);
$fp = fopen($file, "rb");
$i = 0;
while (!feof($fp)) {
    print(fread($fp, 1024));
    sleep(1);
    mysql_query("INSERT INTO table (VIEWTIME) VALUES ('$i')");
    $i++;
    flush();
    ob_flush();
    if (connection_aborted()) {
        die();
    }
}
I'm trying to find a 'server-side' only solution, since I have technical restrictions that prevent me from using JavaScript or any other client-side code.

Perhaps
ignore_user_abort(false);
should be
ignore_user_abort(true);
With ignore_user_abort(true), PHP keeps the script alive after the client disconnects, so execution reliably reaches your connection_aborted() check and your die() decides when to stop, instead of relying on PHP's own (often delayed) abort handling.
Docs
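For reference, here is a minimal sketch of the corrected loop (the per-second database insert is omitted; a.jpg is the question's own file). With ignore_user_abort(true) the script survives the disconnect, so the connection_aborted() check becomes the single exit point:
<?php
// Keep running after the client disconnects so the abort check below decides when to stop.
ignore_user_abort(true);

$file = 'a.jpg';
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($file));

// Drop any output buffering so each chunk really reaches the client.
while (ob_get_level() > 0) {
    ob_end_flush();
}

$fp = fopen($file, 'rb');
$i = 0;
while (!feof($fp)) {
    echo fread($fp, 1024);
    flush(); // a failed write here is what makes connection_aborted() return true
    if (connection_aborted()) {
        break; // client is gone: stop streaming and stop logging seconds
    }
    sleep(1);
    $i++; // the INSERT from the question would go here, one row per second
}
fclose($fp);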

Related

Downloading File From S3 Fails

I am having some major problems with a script I use to download files from S3.
The problem I'm encountering is that every time I try to download a file, the download starts perfectly, but about halfway through the download the file just stops. Every time. On every file. These are video files, so many of them are quite large. Not sure what to do or how to approach this. Here's my script:
<?php
// other code exists; this is the main download logic
set_time_limit(0);
ignore_user_abort(false);
ini_set('output_buffering', 0);
ini_set('zlib.output_compression', 0);

$chunk = 10 * 1024 * 1024; // bytes per chunk (10 MB)
$fh = fopen($video->getMp4Source(), "rb");
if ($fh === false) {
    echo "Unable to open file";
}

header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header('Expires: 0');
header('Pragma: public');
header('Content-Description: File Transfer');
header('Content-type: MP4');
header('Content-length: ' . $video->getFileSize());
header('Content-Disposition: attachment; filename="' . $video->getMp4Source() . '"');

while (!feof($fh)) {
    echo fread($fh, $chunk);
    ob_flush(); // flush output
    flush();
}
exit;
I'm not sure what is wrong, or why it keeps happening. There are no errors being logged, and nothing goes wrong apart from the file failing to complete its download. I have tried this with readfile(), but it used too many resources and still didn't complete.
Any help would be great.
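For comparison, here is a more defensive version of the same loop. This is only a sketch: getMp4Source() and getFileSize() are the question's own accessors, and it assumes the source is readable via PHP's fopen wrappers. It sends a real MIME type, stops if the open fails, and bails out cleanly if the remote read or the client connection drops:
<?php
set_time_limit(0);
ini_set('zlib.output_compression', 'Off');

$chunk = 1024 * 1024; // 1 MB per read keeps memory use predictable
$source = $video->getMp4Source();

$fh = fopen($source, 'rb');
if ($fh === false) {
    die('Unable to open file'); // stop here instead of sending headers anyway
}

header('Content-Type: video/mp4'); // "MP4" is not a valid MIME type
header('Content-Length: ' . $video->getFileSize());
header('Content-Disposition: attachment; filename="' . basename($source) . '"');

while (!feof($fh)) {
    $data = fread($fh, $chunk);
    if ($data === false) {
        break; // the remote read failed mid-stream
    }
    echo $data;
    flush();
    if (connection_aborted()) {
        break; // client gave up; stop pulling data from S3
    }
}
fclose($fh);
exit;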

PHP download does not show progress in browser

I have a problem with a PHP-managed file download where the browser does not show the progress of the download. In fact, the browser appears to wait and wait, until the file is completely downloaded. The file then appears in the download list (in Chrome and Firefox). I cannot download the file at all with IE8. I would like the browser to show the actual file size and the progress of the download.
Strangely, the download is not even visible in Firebug (no line appears in the Network tab if you paste the download URL).
I suspected a problem with compression/zlib, so I disabled both: no change. I disabled output buffering with the same result.
Live example can be found here: http://vps-1108994-11856.manage.myhosting.com/download.php
Phpinfo: http://vps-1108994-11856.manage.myhosting.com/phpinfo.php
The code is below; your help is appreciated.
<?php
$name = "bac.epub";
$publicname = "bac.epub";
#apache_setenv('no-gzip', 1);
ini_set("zlib.output_compression", "Off");

header("Content-Type: application/epub+zip");
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($name));
header("Content-disposition: attachment; filename=" . $publicname);

ob_end_flush();
flush();

// dump the file and stop the script
$chunksize = 1 * (128 * 1024); // how many bytes per chunk
$size = filesize($name);
if ($size > $chunksize) {
    $handle = fopen($name, 'rb');
    $buffer = '';
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        sleep(1);
    }
    fclose($handle);
} else {
    readfile($name);
}
exit;
The sleep in the code was there to make the download take long enough to see the progress.
Keep it, really, really, simple.
<?php
header("Content-Type: application/epub+zip");
header("Content-disposition: attachment; filename=" . $publicname);

if (!readfile($name))
    echo 'Error!';
?>
It is all you really need.
header("Content-Type: application/epub+zip");
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($file_path));
header("Content-disposition: attachment; filename=" . $local_file_name);
// dump the file and stop the script
$chunksize = 128 * 1024; // how many bytes per chunk (128 KB)
$size = filesize($file_path);
if ($size > $chunksize)
{
$handle = fopen($file_path, 'rb');
$buffer = '';
while (!feof($handle))
{
$buffer = fread($handle, $chunksize);
echo $buffer;
flush();
sleep(1);
}
fclose($handle);
}
else
{
readfile($file_path);
}
I have modified your code, Francis, and now it works :)
This is likely caused by a firewall or some sort of proxy between you and the remote site. I was wrestling with the same problem - disabling gzip, flushing buffers etc. until I tried it under a web VPN and the progress indicator re-appeared.
I don't think that the progress indicator is buggy - it's just that the content is being embargoed before it gets to you, which appears as a waiting state in the download. Then when the content is scanned or approved, it may come down very quickly relative to the normal download speed of your site. For large enough files, maybe you could see a progress indicator at this stage.
There's nothing you can do about it, except to determine whether this really is the reason for the behaviour.

Serve file to user over HTTP via PHP

If I go to http://site.com/uploads/file.pdf I can retrieve a file.
However, if I have a script such as:
<?php
ini_set('display_errors', 1);
error_reporting(E_ALL | E_STRICT);

//require global definitions
require_once("includes/globals.php");

//validate the user before continuing
isValidUser();

$subTitle = "Attachment";
$attachmentPath = "/var/www/html/DEVELOPMENT/serviceNow/selfService/uploads/";

if (isset($_GET['id']) and !empty($_GET['id'])) {
    //first lookup attachment meta information
    $a = new Attachment();
    $attachment = $a->get($_GET['id']);

    //filename will be original file name with user name.n prepended
    $fileName = $attachmentPath . $_SESSION['nameN'] . '-' . $attachment->file_name;

    //instantiate new attachmentDownload and query for attachment chunks
    $a = new AttachmentDownload();
    $chunks = $a->getRecords(array('sys_attachment' => $_GET['id'], '__order_by' => 'position'));

    //base64-decode each chunk and write the result out as a .gz file
    $fh = fopen($fileName . '.gz', 'w');
    foreach ($chunks as $chunk) {
        fwrite($fh, base64_decode($chunk->data));
    }
    fclose($fh);

    //open up filename for writing
    $fh = fopen($fileName, 'w');
    //open up filename.gz for extraction
    $zd = gzopen($fileName . '.gz', "r");

    //iterate over file and write contents
    while (!feof($zd)) {
        fwrite($fh, gzread($zd, 60 * 57));
    }
    fclose($fh);
    gzclose($zd);
    unlink($fileName . '.gz');

    $info = pathinfo($fileName);

    header('Content-Description: File Transfer');
    header('Content-Type: ' . Mimetypes::get($info['extension']));
    header('Content-Disposition: attachment; filename=' . basename($fileName));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($fileName));

    ob_clean();
    flush();
    readfile($fileName);
    exit();
} else {
    header("location: " . $links['status'] . "?" . urlencode("item=incident&action=view&status=-1&place=" . $links['home']));
}
?>
This results in sending me the file, but when I open it I receive an error saying:
"File type plain text document (text/plain) is not supported"
First off, I'd start by checking the HTTP headers. You can do this in Firefox easily using the "Live HTTP headers" extension; not sure about equivalents in other browsers offhand. This will let you verify if the header is actually getting set to "application/pdf" and whether your other headers are getting set as well.
If none of the headers are getting set, you might be inadvertently sending output before the calls to header(). Is there any whitespace before the <?php tag?
Are you sure application/pdf is the header your browser is actually seeing?
You can check that out with various HTTP dev tools, for instance HTTP Client for the Mac or Firebug for Firefox.
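If you'd rather check from PHP itself, get_headers() will dump the response headers for a URL (a quick diagnostic sketch; the URL is a placeholder for your attachment endpoint):
<?php
// Fetch and print the raw response headers for the download URL.
$headers = get_headers('http://site.com/attachment.php?id=123');
foreach ($headers as $h) {
    echo $h, "\n"; // verify Content-Type is application/pdf, not text/plain
}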
I use this one and it works.
if (file_exists($file_serverfullpath))
{
    header("Pragma: public");
    header("Expires: 0");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Cache-Control: private", false);

    //sending download file
    //application/octet-stream is more generic; it works because nowadays browsers can detect the file type anyway
    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"" . basename($file_serverfullpath) . "\""); //ok
    header("Content-Transfer-Encoding: binary");
    header("Content-Length: " . filesize($file_serverfullpath)); //ok
    readfile($file_serverfullpath);
}
Try prepending "error_reporting(0);". I found this in the comments at http://php.net/readfile (where you took this example from).
Another thing that could be a problem is your file size. There have been issues reported in the past with PHP 5 (we're talking 2005 here, so I hope this is fixed by now) having trouble reading files larger than 2 MB. If your file exceeds this, you may want to verify that the whole file is actually being read.
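If size does turn out to be the issue, the usual workaround (along the lines of the readfile_chunked() examples in those same php.net comments) is to stream the file in fixed chunks so it never has to fit in memory all at once:
<?php
// Stream a file in fixed-size chunks instead of loading it whole.
function readfile_chunked($filename, $chunkSize = 1048576)
{
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        echo fread($handle, $chunkSize);
        flush();
    }
    return fclose($handle);
}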

Temporary download links (with CodeIgniter)

I was wondering how I could generate temporary download links for files in a protected directory (e.g. /downloads/). Each link should stay valid until it has been used about 5 times or until roughly a week has passed; after that the link shouldn't be accessible anymore.
Any help would be appreciated.
One clever solution I've stumbled upon lately, if you're using Apache (or lighttpd), is mod_xsendfile (http://tn123.ath.cx/mod_xsendfile/), an Apache module that uses a response header to determine which file to deliver to the user.
It's very simple to install (see link above), and afterward, just include these lines in your .htaccess file:
XSendFile on
XSendFileAllowAbove on
Then in your php code, do something like this when you want the user to receive the file:
header('X-Sendfile: /var/notwebroot/files/secretfile.zip');
Apache will intercept any response with an X-Sendfile header, and instead of sending whatever content you output (you may as well return a blank page), Apache will deliver the file.
This takes out all the pain of dealing with mimetypes, chunking, and miscellaneous headers.
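On the PHP side this reduces to a few header calls (a sketch; the path and filename are placeholders):
<?php
// Tell Apache (via mod_xsendfile) which file to stream; PHP itself sends no body.
header('X-Sendfile: /var/notwebroot/files/secretfile.zip');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="secretfile.zip"');
exit; // whatever you output here is replaced by the file Apache delivers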
Use a database. Every time a file is downloaded the database is updated; as soon as a certain file has reached its limit, it can either be removed or have access to it denied. For example:
$data = $this->some_model->get_file_info($id_of_current_file);
if ($data->max_downloads <= 5)
{
    // Allow access to the file
}
I generally keep files outside of the website directory structure for security, and serve them like so:
function retrive_file($file_hash)
{
    $this->_redirect();
    $this->db->where('file_hash', $file_hash);
    $query = $this->db->get('file_uploads');
    if ($query->num_rows() > 0)
    {
        $file_info = $query->row();
        if ($file_info->protect == 1) {
            $this->_checklogin();
        }
        $filesize = filesize($file_info->file_path . $file_info->file_name);
        $file = fopen($file_info->file_path . $file_info->file_name, "r");

        // Generate the server headers
        if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE"))
        {
            header('Content-Type: "application/octet-stream"');
            header('Content-Disposition: attachment; filename="' . $file_info->file_name . '"');
            header('Expires: 0');
            header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
            header("Content-Transfer-Encoding: binary");
            header('Pragma: public');
            header("Content-Length: " . $filesize);
        }
        else
        {
            header('Content-Type: "application/octet-stream"');
            header('Content-Disposition: attachment; filename="' . $file_info->file_name . '"');
            header("Content-Transfer-Encoding: binary");
            header('Expires: 0');
            header('Pragma: no-cache');
            header("Content-Length: " . $filesize);
        }

        if ($file)
        {
            while (!feof($file)) {
                set_time_limit(0);
                echo fread($file, $filesize);
                flush();
                ob_flush();
            }
        }
        fclose($file);
    }
}
It would be pretty trivial to add byte/request counting to this.
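For example, the counting might look something like this (a sketch only; it assumes hypothetical downloads and created_at columns on the same file_uploads table):
// Before streaming: refuse links that are expired or used up.
$this->db->where('file_hash', $file_hash);
$file_info = $this->db->get('file_uploads')->row();

$expired   = strtotime($file_info->created_at) < strtotime('-1 week');
$exhausted = $file_info->downloads >= 5;
if ($expired || $exhausted) {
    show_404(); // the link is no longer valid
    return;
}

// ... stream the file as above, then record this download:
$this->db->where('file_hash', $file_hash);
$this->db->set('downloads', 'downloads + 1', FALSE); // FALSE = raw expression, not escaped
$this->db->update('file_uploads');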

Sudden database disconnection in the following code

I am recording the time at which the user downloaded a specific file using the following code. Initially the download time is recorded correctly, but later the database connection between client and server gets dropped. If I remove the exit (as shown), everything works, but the downloaded file can end up corrupted or damaged.
Can anyone check this code and explain what is wrong with it? I think the problem is with the exit, but what can I use instead of exit?
<?php
$f_name = $_POST["fn"];
$file = "../mt/sites/default/files/ourfiles/$f_name";
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    //ob_clean();
    // flush();
    readfile($file);
    // exit;
}

$con = mysql_connect("localhost", "mt", "mt");
if (!$con) {
    die('Could not connect: ' . mysql_error());
} else {
    echo "Connected";
}

// Record the download
mysql_select_db("mt", $con);
mysql_query("INSERT INTO down_time (FileName,DateTime) VALUES ('" . $f_name . "',NOW())");
mysql_close($con);
?>
If that happens on large files and/or slow connections, try tweaking max_execution_time in php.ini, or from the script using the ini_set() function.
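For example (a minimal sketch):
<?php
// Lift the execution time limit for this request only.
ini_set('max_execution_time', '0'); // or equivalently: set_time_limit(0);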
Well, if you include the exit, your code simply never reaches the point where it inserts the filename into the DB.
If you don't include the exit, you send the file contents and then append "Connected" to them, so the file has to end up corrupted.
Maybe you can try ob_start() and ob_end_clean() around your DB stuff: http://php.net/manual/en/function.ob-start.php
This prevents anything from being sent to the output, so you don't have to use exit; nothing gets sent to the output after your file, so it doesn't get corrupted.
something like:
readfile($file);
}
ob_start();
$con = mysql_connect("localhost","mt","mt");
//all the DB stuff
mysql_close($con);
ob_end_clean();
?>
You can include an exit after the ob_end_clean() just to be sure, but this should work just fine.
Try:
<?php
$f_name = $_POST["fn"];
$file = "../mt/sites/default/files/ourfiles/$f_name";
if (!file_exists($file)) { die('File not found'); }
if (!$con = mysql_connect("localhost","mt","mt")) { die(mysql_error()); }
if (!mysql_select_db("mt")) { die(mysql_error()); }
$q = "INSERT INTO `down_time` (`FileName`, `DateTime`) VALUES ('"
. mysql_real_escape_string($f_name) . "',NOW())";
if (!mysql_query($q)) { die(mysql_error()); }
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($file));
header('Content-Transfer-Encoding: binary');
header('Content-Length: ' . filesize($file));
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
readfile($file);
