PHP PDF download script is executing twice - php

I have a web application using PHP and PDO with SQLSRV prepared statements to display links to files for users to download. The back-end PHP script 'download.php' checks various items before serving the PDF to the user to download. The download.php file then should update a few SQL tables, and serve the PDF file to the user.
Please read my previous question and the troubleshooting completed there if you need more information.
After troubleshooting, it turns out the error I thought was occurring (and which my previous question was about) was not the real problem: my download script is being executed more than once for every file download.
I have searched the server logs and, while debugging with Firebug, I can see my download.php script making multiple GET requests to the server. Sometimes the script completes only once, as expected. Other times it executes three or four requests for a single click of the download link.
Now that I more fully understand what error is occurring, I need a bit of help fixing it.
I need to prevent the script from running multiple times, and thus updating the SQL table with records that are within a few milliseconds of each other.
The view page checks the SQL database for files the current user is allowed access to, and displays a list of links:
<a href='download.php?f={$item['name']}&t={$type}' target='_blank'>{$item['name']}</a>
Because these values are needed for the download.php script to work, I cannot change the request from $_GET to $_POST.
What I have tried:
Checking/setting a session variable for a 'downloading' state before getfile(), which is unset right before the exit(0)
Putting the SQL statements in a separate PHP file and require'ing that
Adding a sleep(1) after the getfile()
Commenting out the header/PDF information
The first three measures did not work to prevent the double/triple execution of the PHP download script. However, the last measure DOES prevent the double/triple execution of the PHP script, but of course the PDF is never delivered to the client browser!
Question: How can I ensure that only ONE insert/update PER DOWNLOAD is inserted into the database, or at the least, how can I prevent the PHP script from being executed multiple times?
UPDATE
Firebug screenshots of the issue (not reproduced here): one capture shows a single request for the download, another shows two requests for the same click.
download.php script
<?php
session_start();
require("cgi-bin/auth.php");
// Don't timeout when downloading large files
#ignore_user_abort(1);
#set_time_limit(0);
//error_reporting(E_ALL);
//ini_set('display_errors',1);

function getfile() {
    if (!isset($_GET['f']) || !isset($_GET['t'])) {
        echo "Nothing to do!";
        exit(0);
    }
    require('cgi-bin/connect_db_pdf.php');

    // Update variables
    $vuname = strtolower(trim($_SESSION['uname']));
    $file   = trim(basename($_GET['f'])); // Filename we're looking for
    $type   = trim($_GET['t']);           // Filetype

    // Type pattern is grouped so the ^ and $ anchors apply to every alternative
    if (!preg_match('/^[a-zA-Z0-9_\-\.]{1,60}$/', $file) || !preg_match('/^(av|ds|cr|dp)$/', $type)) {
        header('Location: error.php');
        exit(0);
    }

    try {
        $sQuery = "SELECT TOP 1 * FROM pdf_info WHERE PDF_name=:sfile AND type=:stype";
        $statm  = $conn->prepare($sQuery);
        $statm->execute(array(':sfile' => $file, ':stype' => $type));
        $result = $statm->fetchAll();
        $count  = count($result);
        $sQuery = null;
        $statm  = null;

        if ($count == 1) { // File was found in the database, so let them download it. Update the time as well.
            $result = $result[0];
            $sQuery = "INSERT INTO access (PDF_name,PDF_type,PDF_time,PDF_access) VALUES (:ac_file, :ac_type, GetDate(), :ac_vuname); UPDATE pdf_info SET last_view=GetDate(),viewed_uname=:vuname WHERE PDF_name=:file AND PDF_type=:type";
            $statm  = $conn->prepare($sQuery);
            $statm->execute(array(':ac_vuname' => $vuname, ':ac_file' => $file, ':ac_type' => $type, ':vuname' => $vuname, ':file' => $file, ':type' => $type));
            $count  = $statm->rowCount();
            $sQuery = null;
            $statm  = null;

            // $result is the first element from the SELECT query outside the 'if' scope.
            $file_loc  = $result['floc'];
            $file_name = $result['PDF_name'];

            // Commenting from this line to right after the exit(0) updates the database only ONCE,
            // but then the PDF file is never sent to the browser!
            header("Content-Type: application/pdf");
            header("Pragma: no-cache");
            header("Cache-Control: no-cache");
            header("Content-Length: " . filesize($file_loc));
            header("Accept-Ranges: bytes");
            header("Content-Disposition: inline; filename={$file_name}");
            ob_clean();
            flush();
            readfile($file_loc);
            exit(0);
        } else { // We did not find a file in the database. Redirect the user to the view page.
            header("Location: view.php");
            exit(0);
        }
    } catch (PDOException $err) { // PDO SQL error.
        //echo $err;
        header('Location: error.php');
        exit(0);
    }
}

getfile();
?>

If you really need to make sure that a link only creates an event once, then you need to implement a token system: when a hyperlink (or a form post target) is generated, a use-once token is generated and stored (in the session or wherever), and then checked in the script the link calls.
So your hyperlink may look like this:
<a href='download.php?token={some-token}&f={$item['name']}&t={$type}' target='_blank'>{$item['name']}</a>
On the php side this is a really simplified idea of what you might do:
<?php
session_start();
if (!isset($_REQUEST['token'])) die();                               // or fail better
if (!isset($_SESSION['oneTimeTokens'][$_REQUEST['token']])) die();   // or fail better
if ($_SESSION['oneTimeTokens'][$_REQUEST['token']] == 'used') die(); // or fail better
$_SESSION['oneTimeTokens'][$_REQUEST['token']] = 'used';
// we're good from this point
This would solve the effects of your problem, though not the double running itself. However, since you want to make sure a link fires its event only once NO MATTER WHAT, you will probably need to implement this in some form or another, as it's the only way I can think of to guarantee that any generated link has exactly one real use.
When generating the link you would do something like this in your code:
<?php
$tokenID = {random id generation here};
$_SESSION['oneTimeTokens'][$tokenID] = 'not used';
I'd also put a cleanup routine somewhere to remove all used tokens. It's also not a bad idea to expire tokens beyond a certain age, but I think this explains the idea.
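To make that concrete, here is a minimal sketch of the generation and cleanup side (the bin2hex(random_bytes(16)) call is my own choice of random id generation; the original answer leaves that open):

<?php
session_start();

// Issue a use-once token for a download link.
function issue_token() {
    $tokenID = bin2hex(random_bytes(16)); // random id generation (PHP 7+)
    $_SESSION['oneTimeTokens'][$tokenID] = 'not used';
    return $tokenID;
}

// Cleanup pass: drop every token that has already been used.
// (To also expire old tokens, store time() instead of 'not used' and compare ages here.)
function cleanup_tokens() {
    if (empty($_SESSION['oneTimeTokens'])) {
        return;
    }
    foreach ($_SESSION['oneTimeTokens'] as $id => $status) {
        if ($status === 'used') {
            unset($_SESSION['oneTimeTokens'][$id]);
        }
    }
}

issue_token() would replace the {random id generation here} placeholder when building each link, and cleanup_tokens() could run at the top of the page that generates them.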

Related

How can I update session data by SID after headers are sent?

Sometimes, after a user request, it is necessary to keep the current script running in the background to process some update events. Specific events must update some keys in the user's session.
My code:
<?php
ini_set('session.save_path', __DIR__ . '/../!PHPSESSID');
session_start();

// ...... page content - db queries and other stuff; some events can be generated here

// Close the user connection and keep the script running in the background
ignore_user_abort(true);
//fastcgi_finish_request(); -- no need in my case
header('Connection: close');
header('Content-Length: ' . ob_get_length());
session_write_close();
ob_end_flush();
flush();

// OK, the user connection is closed. This script is now running in the background.
set_time_limit(120); // for me this is more than enough - OR set 0

// ... my database stores all SIDs for each user id, one per connection (e.g. the same user,
// id 123, has 2 connections from a PC (different browsers) and 1 from mobile ...).
// I always know roughly how many connections a user has and how to find him to send events.
if ($haveSomeSpecificEvents) {
    foreach ($specificEvents as $item) {
        if (!file_exists(session_save_path() . '/sess_' . $item['sid'])) {
            continue; // skip
        }
        session_id($item['sid']); // generates warning: headers already sent
        session_start();          // generates warning: headers already sent
        // Check whether this session is the one I am looking for, e.g. $_SESSION['id'] == $item['user_id']; if not, just skip it.
        // Place some updates (a flag) in $_SESSION, but it is always NULL.
        session_write_close();    // save changes
    }
}
Of course, I could make a request to my own domain using cURL, but I don't think that is a good idea.
P.S.: English is not my first language.
Well, I waited for someone to answer my question, but I had to answer it myself. Maybe this will help someone. My solution: turn on output buffering. Just add ob_start(); at the top of the script.
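For clarity, a minimal sketch of where that call goes in the script above (everything else stays the same):

<?php
ob_start(); // buffer the page output so the later header() and session calls are not
            // preceded by any output that has already been sent to the client

ini_set('session.save_path', __DIR__ . '/../!PHPSESSID');
session_start();

// ... page content - db queries and other stuff ...

// The "close connection and keep running" block and the loop that re-opens
// other sessions by SID then follow exactly as in the script above.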

Need help changing code for successful file upload, and cancel file upload

I am having a problem with my cancel button when trying to cancel my upload, and I need some help in probably changing the design of my code a little.
The problem:
The "cancelaudio.php" page does not have the information it needs until after the "audioupload.php" script has been run. It's a cart and horse situation. If the client clicks the cancel button during the HTTP request, the "audioupload.php" script (on the server side) never gets executed. However the client-side activities in jQuery would still get run.
The solution I want to achieve:
Client fills in the form and submits it, resulting in a POST request accompanied by the file. The POST request may take several seconds to complete, depending on the size of the file, the speed of the connection, etc.
Only after the HTTP upload has completed for all of the files will PHP gain control. The PHP "action" script on the server gets control via a POST-method request. If any errors occurred during the upload, $_FILES['error'] will be loaded with the right code. At this point you can check for values in $_FILES, call move_uploaded_file(), load the file name into the $_SESSION array, etc.
So to sum up, if the human client clicks a "cancel button" while the POST request is in process (or before starting the upload), and this causes cancellation of the file upload, the PHP "action" script that handles the uploads never gets control. The server never has an opportunity to move the uploaded file and load the variables into the database or session array.
I just need help coding this so I can reach that solution. Can anybody help? Below is the relevant code:
AUDIOUPLOAD.PHP
<?php
ini_set('display_errors', 1);
error_reporting(E_ALL);
session_start(); // needed: $_SESSION is written to below

// connect to the database
include('connect.php');

/* check connection */
if (mysqli_connect_errno()) {
    printf("Connect failed: %s\n", mysqli_connect_error());
    die();
}

$result = 0;
if (file_exists("AudioFiles/" . $_FILES['fileAudio']['name'])) {
    // A file with this name already exists: append _2, _3, ... to the base name
    $parts = explode(".", $_FILES['fileAudio']['name']);
    $ext   = array_pop($parts);
    $base  = implode(".", $parts);
    $n = 2;
    while (file_exists("AudioFiles/" . $base . "_" . $n . "." . $ext)) {
        $n++;
    }
    $_FILES['fileAudio']['name'] = $base . "_" . $n . "." . $ext;
    move_uploaded_file($_FILES["fileAudio"]["tmp_name"],
        "AudioFiles/" . $_FILES["fileAudio"]["name"]);
    $result = 1;
} else {
    move_uploaded_file($_FILES["fileAudio"]["tmp_name"],
        "AudioFiles/" . $_FILES["fileAudio"]["name"]);
    $result = 1;
}

$audiosql = "INSERT INTO Audio (AudioFile) VALUES (?)";
if (!$insert = $mysqli->prepare($audiosql)) {
    // Handle errors with prepare operation here
}
// Don't pass data directly to bind_param; store it in a variable
$insert->bind_param("s", $aud);
// Assign the variable
$aud = 'AudioFiles/' . $_FILES['fileAudio']['name'];
$insert->execute();
if ($insert->errno) {
    // Handle query error here
}
$insert->close();

$lastAudioID = $mysqli->insert_id;
$_SESSION['lastAudioID'] = $lastAudioID;
$_SESSION['AudioFile']   = $_FILES["fileAudio"]["name"];

$audioquestionsql = "INSERT INTO Audio_Question (AudioId, QuestionId) VALUES (?, ?)";
if (!$insertaudioquestion = $mysqli->prepare($audioquestionsql)) {
    // Handle errors with prepare operation here
    echo "Prepare statement err audioquestion";
}
$qnum = (int)$_POST['numaudio'];
$insertaudioquestion->bind_param("ii", $lastAudioID, $qnum);
$insertaudioquestion->execute();
if ($insertaudioquestion->errno) {
    // Handle query error here
}
$insertaudioquestion->close();
?>
CANCELAUDIO.PHP
<?php
session_start(); // needed: $_SESSION is read below

// connect to the database
include('connect.php');

/* check connection */
if (mysqli_connect_errno()) {
    printf("Connect failed: %s\n", mysqli_connect_error());
    die();
}

unlink("AudioFiles/" . $_SESSION['AudioFile']);

$delete = $mysqli->prepare('DELETE FROM Audio WHERE AudioId = ?');
$delete->bind_param("i", $_SESSION['lastAudioID']);
$delete->execute();

$deleteaud = $mysqli->prepare('DELETE FROM Audio_Question WHERE AudioId = ?');
$deleteaud->bind_param("i", $_SESSION['lastAudioID']);
$deleteaud->execute();
?>
HTML FORM CODE:
<form action='audioupload.php' method='post' enctype='multipart/form-data' target='upload_target_audio' onsubmit='return audioClickHandler(this);' class='audiouploadform' >
Audio File: <input name='fileAudio' type='file' class='fileAudio' /></label><br/><br/><label class='audiolbl'>
<input type='submit' name='submitAudioBtn' class='sbtnaudio' value='Upload' /></label>
<input type='hidden' class='numaudio' name='numaudio' value='" + GetFormAudioCount() + "' />
<input type='reset' name='audioCancel' class='audioCancel' value='Cancel' /></label>
<iframe class='upload_target_audio' name='upload_target_audio' src='#' style='width:300px;height:300px;border:0px;solid;#fff;'></iframe></form>
JQUERY CODE:
function startAudioUpload(audiouploadform){
    $(audiouploadform).find('.audiof1_upload_process').css('visibility','visible');
    $(audiouploadform).find('.audiof1_cancel').css('visibility','visible');
    $(audiouploadform).find('.audiof1_upload_form').css('visibility','hidden');
    sourceAudioForm = audiouploadform;
    $(audiouploadform).find(".audioCancel").on("click", function(event) {
        $('.upload_target_audio').get(0).contentwindow // note: this expression has no effect as written (the DOM property is contentWindow)
        $("iframe[name='upload_target_audio']").attr("src", "cancelaudio.php");
        return stopAudioUpload(2);
    });
    return true;
}
First, you could try visiting this link about Session Upload Progress; it should show you how to stop the progress of the upload without killing the session :)
PHP: Session Upload Progress
You could try killing the MySQL connection and then re-opening it; I assume this would be one way to stop the connection. If you use that in conjunction with a cancel button (e.g. an onclick handler), you could cancel the connection and then restore it once it has been terminated.
here's a link -> PHP: MySQL_close
After this, just write a simple line of code to start running the action scripts (point to the function that you need to run); this should let you continue running the scripts and not lose the session that you need.
P.S. I'm not entirely sure this will work... just thinking logically :P
OK, I'll try something based on what I understood.
The user submits a file using the form. This could take some time, so when the user clicks "Cancel", two cases are possible:
The form (and its data, i.e. the file content) is still being sent by the browser and has not finished uploading. In that case, all you have to do is cancel the form submission: ask the browser to stop it.
That way, audioupload.php will never execute, and you won't need to undo its work by calling cancelaudio.php.
The browser has completed the file upload / form submission, and there is a short time frame in which the server is still processing the data and has not responded yet. In that case, the uploaded file may or may not have been saved (we don't know). You need to call your cancelaudio.php page to delete it from the hard drive.
Only one of the two cases will be true, but you can do both actions since neither will conflict with the other.
The answer to the first case is here (since you're using an <iframe> to submit the form): Is it possible to cancel file upload that uses hidden iframe?
The answer to the second case is to always make an Ajax call to cancelaudio.php, just in case the file has been processed and saved but we haven't been notified (the time frame is short, but it can still happen).
try this with jquery:
$(".audioCancel").live("click", function(){
var file = $(".fileAudio").val();
var canc = "Cancel";
$.post("CANCELAUDIO.PHP", {file:file,canc:canc}, function(result){
alert(result);
window.location.reload();
});
});
then in your php add this:
$file = $_POST["file"];
$canc = $_POST["canc"];
if ($canc) {
    // delete the uploaded file by the name posted from the form; basename() strips any path components
    unlink("AudioFiles/" . basename($file));
} else {
    unlink("AudioFiles/" . $_SESSION['AudioFile']);
}
try this also:
When the session.upload_progress.enabled INI option is enabled, PHP will be able to track the upload progress of individual files being uploaded. This information isn't particularly useful for the actual upload request itself, but during the file upload an application can send a POST request to a separate endpoint (via XHR for example) to check the status.
The upload progress will be available in the $_SESSION superglobal when an upload is in progress, and when POSTing a variable of the same name as the session.upload_progress.name INI setting is set to. When PHP detects such POST requests, it will populate an array in the $_SESSION, where the index is a concatenated value of the session.upload_progress.prefix and session.upload_progress.name INI options. The key is typically retrieved by reading these INI settings, i.e.
<?php
$key = ini_get("session.upload_progress.prefix") . $_POST[ini_get("session.upload_progress.name")];
var_dump($_SESSION[$key]);
?>
It is also possible to cancel the currently in-progress file upload, by setting the $_SESSION[$key]["cancel_upload"] key to TRUE. When uploading multiple files in the same request, this will only cancel the currently in-progress file upload, and pending file uploads, but will not remove successfully completed uploads. When an upload is cancelled like this, the error key in $_FILES array will be set to UPLOAD_ERR_EXTENSION.
http://php.net/manual/en/session.upload-progress.php
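Building on that manual excerpt, here is a minimal sketch of a separate endpoint that cancels the in-progress upload (the file name cancel_upload.php is my own; the XHR would POST the same session.upload_progress.name field that the upload form includes, and session.upload_progress.enabled must be on):

<?php
// cancel_upload.php -- hypothetical endpoint, called via XHR while the upload is running.
session_start();

$name = ini_get("session.upload_progress.name");
if (!isset($_POST[$name])) {
    exit('no progress key supplied');
}

// Rebuild the same $_SESSION key that the upload request populates.
$key = ini_get("session.upload_progress.prefix") . $_POST[$name];

if (isset($_SESSION[$key])) {
    // Ask PHP to abort the in-progress file upload; the upload request then sees
    // UPLOAD_ERR_EXTENSION in $_FILES['fileAudio']['error'].
    $_SESSION[$key]["cancel_upload"] = true;
    echo "cancelling";
} else {
    echo "no upload in progress";
}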

Email Open Length Tracking with PHP

I have been tracking emails for years using a "beacon" image, and for those clients that allow images to be downloaded it has worked great for tracking how many people have opened the email.
I came across the service "DidTheyReadIt", which shows how long the client actually read the email. I tested it with their free service and it is actually pretty close to the times I opened the email.
I am very curious about how they achieve this. I am certain that whatever solution is chosen will put a lot of load on the server/database, and that many in the community will reply with "Stop, No and Don't", but I do want to investigate this and try it out, even if it's just enough for me to run a test on the server and say "hell no".
I did some googling and found this article, which has a basic solution: http://www.re-cycledair.com/tracking-email-open-time-with-php
I made a test using sleep() within the beacon image page:
<?php
set_time_limit(300); // 300 seconds
ignore_user_abort(false);

$hostname_api = "*";
$database_api = "*";
$username_api = "*";
$password_api = "*";

$api = mysql_pconnect($hostname_api, $username_api, $password_api) or trigger_error(mysql_error(), E_USER_ERROR);
mysql_select_db($database_api, $api);

$fileName = "logo.png";

$InsertSQL = "INSERT INTO tracker (FileName,Time_Start,Time_End) VALUES ('$fileName',Now(),Now()+1)";
mysql_select_db($database_api, $api);
$Result1 = mysql_query($InsertSQL, $api) or die(mysql_error());
$TRID = mysql_insert_id();

// Open the file, and send to user.
$fp = fopen($fileName, "r");
header("Content-type: image/png");
header('Content-Length: ' . filesize($fileName));
readfile($fileName);

set_time_limit(60);
$start = time();
for ($i = 0; $i < 59; ++$i) {
    // Update Read Time
    $UpdateSQL = "UPDATE tracker SET Time_End = Now() WHERE TRID = '$TRID'";
    mysql_select_db($database_api, $api);
    $Result1 = mysql_query($UpdateSQL, $api) or die(mysql_error());
    time_sleep_until($start + $i + 1);
}
?>
The problem with the code above (other than updating the database every second) is that once the script runs, it continues to run even if the user disconnects (or moves to another email, in this case).
I added ignore_user_abort(false);, but as there is no connection to the mail client and the headers are already written, I don't think the ignore_user_abort(false) can take effect.
I looked at the post Track mass email campaigns and one up from the bottom "Haragashi" says:
"You can simply build a tracking handler which returns the tracking image byte by byte. After every byte flush the response and sleep for a period of time.
If you encounter a stream closed exception the client has closed the e-mail (deleted or changed to another e-mail who knows).
At the time of the exception you know how long the client 'read' the e-mail."
Does anyone know how I could "simply build a tracking handler" like this or know of a solution I can implement into my code that will force the code to stop running when the user disconnects?
I think the problem is that you aren't doing a header redirect every so often. The reason it is necessary is that once a script starts executing in PHP+Apache, it basically disregards the client until it finishes. If you force a redirect every X seconds, it makes the server re-evaluate whether the client is still connected. If the client isn't connected, it can't force the redirect, and therefore stops tracking the time.
When I played around with this stuff, my code looked like:
header("Content-type: image/gif");
while(!feof($fp)) {
sleep(2);
if(isset($_GET['clientID'])) {
$redirect = $_SERVER['REQUEST_URI'];
} else {
$redirect = $_SERVER['REQUEST_URI'] . "&clientID=" . $clientID;
}
header("Location: $redirect");
exit;
}
If the client id was set, then above this block of code I would log this attempt at reading the beacon in the database. It was easy to simply increment the time-on-email column by 2 seconds every time the server forced a redirect.
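In other words, the logging step above the redirect loop could look roughly like this (a sketch only; the $mysqli connection, the email_opens table, and the time_on_email column are my assumptions, not names from the original post):

// Sketch: add 2 seconds to the running total each time the beacon redirects back to itself.
if (isset($_GET['clientID'])) {
    $clientID = (int)$_GET['clientID'];
    $stmt = $mysqli->prepare("UPDATE email_opens SET time_on_email = time_on_email + 2 WHERE client_id = ?");
    $stmt->bind_param("i", $clientID);
    $stmt->execute();
    $stmt->close();
}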
Would you not do something more like this:
<?php
// Time the request
$time = time();

// Ignore user aborts and allow the script
// to run forever
ignore_user_abort(true);
set_time_limit(0);

// Run a pointless loop that ends only when the
// client clicks away from the page or hits the
// "Stop" button, closing the connection.
while (1) {
    // Did the connection fail?
    if (connection_status() != CONNECTION_NORMAL) {
        break;
    }
    // Sleep for 1 second
    sleep(1);
}

// Connection is now terminated, so record the number of seconds since the start
$duration = time() - $time;
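Combining that idea with the quoted "tracking handler" suggestion and the asker's beacon, a complete handler might look roughly like this (a sketch only: PDO is used instead of the old mysql_* API, the connection credentials are placeholders, and the tracker table/columns are the ones from the question's INSERT):

<?php
// beacon.php -- sketch: send the image a little at a time, flush after every write,
// and stop updating the database as soon as the client disconnects.
ignore_user_abort(false);   // let PHP stop the script when the client goes away
set_time_limit(300);
if (ob_get_level()) {
    ob_end_flush();         // make sure nothing buffers the bytes we trickle out
}

$pdo = new PDO('mysql:host=localhost;dbname=tracking', 'user', 'pass');

$fileName = "logo.png";
$pdo->prepare("INSERT INTO tracker (FileName, Time_Start, Time_End) VALUES (?, NOW(), NOW())")
    ->execute(array($fileName));
$trid   = $pdo->lastInsertId();
$update = $pdo->prepare("UPDATE tracker SET Time_End = NOW() WHERE TRID = ?");

$data = file_get_contents($fileName);
header("Content-Type: image/png");
header("Content-Length: " . strlen($data));

$sent = 0;
$len  = strlen($data);
while ($sent < $len && !connection_aborted()) {
    echo $data[$sent++];            // one byte per second (a real handler would send a small chunk)
    flush();                        // actually pushing output is what lets PHP detect a closed connection
    $update->execute(array($trid)); // extend Time_End: the email is still open
    sleep(1);
}
// Time_End - Time_Start in the tracker row now approximates how long the email stayed open.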

How to protect processing files

So I have a PHP form-processing file, say process.php, with code such as:
<?php
$value = $_POST['string']; // Assume string is safe for database insertion
$result = mysql_query("INSERT into table values {$value}");
if ($result) {
    return true;
} else {
    return false;
}
?>
Ideally, only someone who is logged in to my website should be allowed to send the POST request that performs this insertion. But here, anyone who knows this processing file's path and the shape of the request being sent can send a spoofed POST request from any domain (if I'm not wrong). This will lead to unwanted data being inserted into the database.
One thing I did is check, before the insertion, whether a user is logged in or not. If not, I ignore the POST request. But how exactly should I secure my processing files from exploits?
As it stands this is vulnerable to SQL Injection. Make sure you use a parametrized query library like PDO for inserting the file and the mysql "blob" or "long blob" type. You should never use mysql_query().
You should also keep track of the user's id for user access control. It doesn't look like you have taken this into consideration.
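To make that advice concrete, here is a minimal sketch of process.php with a login check and a parameterized insert (the session key user_id, the table and column names, and the connection credentials are assumptions for illustration):

<?php
session_start();

// Reject the request outright unless the user is logged in.
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Not authorized');
}

if (!isset($_POST['string'])) {
    http_response_code(400);
    exit('Missing value');
}

// PDO with a bound parameter instead of interpolating $_POST into the SQL.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
$stmt = $pdo->prepare("INSERT INTO some_table (some_column, user_id) VALUES (?, ?)");
$ok   = $stmt->execute(array($_POST['string'], $_SESSION['user_id']));

echo $ok ? 'true' : 'false';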

Describe how a php page template can write another php page as html

Describe, in PHP-relevant terms, the mechanics of a PHP/MySQL page (A.php) that will 1) use one template to write itself (simple), 2) take input from the user to update a database (simple), and 3) on command, parse another PHP page (B.php) (???) and save that page as static HTML (B.html) (???).
UPDATE: I found a post here at SO helpfully suggesting (to another, GROAN, non-Uber Geek with a completely pedestrian question) that he could capture HTML from a PHP page using an output buffer. Will this work for a different PHP file?
There are more complex and better answers to each question, but I'm going to jot down the most simple ones.
PHP is a template language, so a PHP file with your template is your answer. This question is a bit vague.
Access the user-provided data using the $_GET or $_POST superglobals, with the choice depending on your HTTP request method. Basically, GET is for URL data, POST for form data. Once you have the data, validate it. Then use PDO to connect to a database and execute an insertion query.
You can use an output buffer, like so:
ob_start(); // Start output buffer
require 'B.php'; // Execute B.php, storing its output to the buffer
file_put_contents('B.html', ob_get_clean()); // Clean the buffer, retrieve its contents and write them to B.html
It saddened me to get reamed on this question. To show my question was asked in good faith, I'm answering it myself with what turned out to be a simple solution. I created generate.php to run when a change is made to the content. No cache needed.
// the switch...
$update_live = isset($_GET['update_live']) ? TRUE : FALSE;

// $adminPath, $livePath, $adminUrl are set in an include and contain site config data...
$tempfile      = $adminPath . 'tempindex.html';   // a temp file...
$livefile      = $livePath  . 'index.html';       // the static live file...
$this_template = $adminUrl  . 'main-index.php';   // the php template file...
$username = "php_admin";
$password = "123xyz456";

if (!($update_live)) {
    $errors[] = "Did not submit from an edit page. You can only access this page by referral!";
} else {
    if (file_exists($tempfile)) {
        unlink($tempfile);
    }
    /* =3, $html = file_get_contents($this_template, false, $context); */
    $html = file_get_contents($this_template);
    if ($html === false) {
        $errors[] = "Unable to load template. Static page update aborted!";
        exit();
    }
    if (!file_put_contents($tempfile, $html)) {
        $errors[] = "Unable to write $tempfile. Static page update aborted!";
        exit();
    }
    if (!copy($tempfile, $livefile)) {
        $errors[] = "Unable to overwrite index file. Static page update aborted!";
        exit();
    }
    if (!unlink($tempfile)) {
        $errors[] = "Unable to delete $tempfile. Static page update aborted!";
        exit();
    }
}
