I have been tracking emails for years using a "beacon" image, and for those clients that allow images to be downloaded it has worked great for counting how many people have opened the email.
I came across the service "DidTheyReadIt", which shows how long the recipient actually read the email. I tested it with their free service, and it is actually pretty close to the times I had the email open.
I am very curious about how they achieve this. I am certain that whatever solution is chosen will put a lot of load on the server/database, and that many in the community will reply with "Stop, no, and don't", but I do want to investigate this and try it out, even if it's just enough for me to run a test on the server and say "hell no".
I did some googling and found this article, which has a basic solution: http://www.re-cycledair.com/tracking-email-open-time-with-php
I made a test using sleep() within the beacon image page:
<?php
set_time_limit(300); // 300 seconds (5 minutes)
ignore_user_abort(false);

$hostname_api = "*";
$database_api = "*";
$username_api = "*";
$password_api = "*";
$api = mysql_pconnect($hostname_api, $username_api, $password_api) or trigger_error(mysql_error(), E_USER_ERROR);
mysql_select_db($database_api, $api);

// Log the open, with the end time starting one second after the start time
$fileName = "logo.png";
$InsertSQL = "INSERT INTO tracker (FileName, Time_Start, Time_End)
              VALUES ('$fileName', Now(), DATE_ADD(Now(), INTERVAL 1 SECOND))";
$Result1 = mysql_query($InsertSQL, $api) or die(mysql_error());
$TRID = mysql_insert_id();

// Send the image to the user.
header("Content-type: image/png");
header('Content-Length: ' . filesize($fileName));
readfile($fileName);

// Keep the script alive and push the end time forward once a second
set_time_limit(60);
$start = time();
for ($i = 0; $i < 59; ++$i) {
    $UpdateSQL = "UPDATE tracker SET Time_End = Now() WHERE TRID = '$TRID'";
    $Result1 = mysql_query($UpdateSQL, $api) or die(mysql_error());
    time_sleep_until($start + $i + 1);
}
?>
The problem with the code above (other than updating the database every second) is that once the script runs, it continues to run even if the user disconnects (or moves to another email, in this case).
I added ignore_user_abort(false); however, as there is no connection to the mail client and the headers are already written, I don't think ignore_user_abort(false) can fire.
I looked at the post "Track mass email campaigns", where, one answer up from the bottom, "Haragashi" says:
"You can simply build a tracking handler which returns the tracking image byte by byte. After every byte flush the response and sleep for a period of time.
If you encounter a stream closed exception the client has closed the e-mail (deleted or changed to another e-mail who knows).
At the time of the exception you know how long the client 'read' the e-mail."
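My rough attempt at what I think that handler would look like in PHP (untested, and the names are just placeholders) is:
<?php
// Hypothetical byte-by-byte tracking handler, based on Haragashi's description
$fileName = "logo.png";
$bytes = file_get_contents($fileName);
$start = time();

ignore_user_abort(true); // keep running after the client disconnects so we can record the time
set_time_limit(0);

header("Content-type: image/png");
header("Content-Length: " . strlen($bytes));

for ($i = 0; $i < strlen($bytes); $i++) {
    echo $bytes[$i]; // send a single byte
    flush();         // push it to the client now
    if (connection_aborted()) {
        break;       // the client closed the e-mail
    }
    sleep(1);        // stretch the transfer out
}
$readTime = time() - $start; // roughly how long the e-mail was open
?>
But I don't know whether this is what is meant, or how it would behave with real mail clients.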
Does anyone know how I could "simply build a tracking handler" like this, or know of a solution I can implement in my code that will force it to stop running when the user disconnects?
I think the problem is that you aren't doing a header redirect every so often. This is necessary because once a script starts executing in PHP + Apache, it basically disregards the client until it finishes. If you force a redirect every X seconds, the server has to re-evaluate whether the client is still connected. If the client isn't connected, the redirect can't be followed, and the time tracking stops.
When I played around with this stuff, my code looked like:
header("Content-type: image/gif");
while(!feof($fp)) {
sleep(2);
if(isset($_GET['clientID'])) {
$redirect = $_SERVER['REQUEST_URI'];
} else {
$redirect = $_SERVER['REQUEST_URI'] . "&clientID=" . $clientID;
}
header("Location: $redirect");
exit;
}
If the client id was set, then above this block of code I would log that attempt at reading the beacon in the database. It was easy to simply increment the time-on-email column by 2 seconds every time the server forced a redirect.
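The logging step itself can be a single query. Something like this, assuming a PDO connection in $pdo and a hypothetical tracker table with a seconds_on_email column:
<?php
// Hypothetical logging step, run before issuing the redirect
if (isset($_GET['clientID'])) {
    $stmt = $pdo->prepare(
        "UPDATE tracker SET seconds_on_email = seconds_on_email + 2 WHERE client_id = ?");
    $stmt->execute(array($_GET['clientID']));
}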
Would you not do something more like this:
<?php
// Time the request
$time = time();

// Ignore user aborts and allow the script
// to run until we detect the disconnect
ignore_user_abort(true);
set_time_limit(0);

// Loop until the client drops the connection
// (closes the email, clicks away, hits "Stop", etc.)
while (1)
{
    // connection_status() is only updated when we try to send output,
    // so push a harmless byte to the client on each pass
    echo "\n";
    flush();

    // Did the connection fail?
    if (connection_status() != CONNECTION_NORMAL)
    {
        break;
    }

    // Sleep for 1 second
    sleep(1);
}

// Connection is now terminated, so record the number of seconds since the start
$duration = time() - $time;
// ...and insert $duration into your tracking table here
Hello, I am trying to add a broadcast command to my Telegram bot, which broadcasts a specific message to all of my bot's subscribers, whose IDs are saved in a MySQL database, but the loop never seems to end and restarts after a random number of sent messages.
For example: the bot starts messaging, then stops at 987 users and restarts the loop, over and over, sometimes at a different number of users.
This is the code that I am using:
<?php
http_response_code(200);

$admin_id = ''; // my id
$bot_token = ''; // my bot token
// $con is the mysqli connection; startsWith(), sendmessage() and
// editmessage() are my own helper functions, defined elsewhere

$message_object = json_decode(file_get_contents("php://input"), true);
$message_text = $message_object['message']['text'];

if (startsWith($message_text, '/broadcast') !== false) {
    $text_to_send = trim(str_replace('/broadcast', '', $message_text));
    $start_message = sendmessage($admin_id, 'Broadcasting Started', $bot_token, 'markdown');
    $start_message_id = $start_message['result']['message_id'];

    $query = mysqli_query($con, "SELECT * FROM users");
    if ($query and mysqli_num_rows($query) >= 1) {
        $all_users = mysqli_fetch_all($query, MYSQLI_ASSOC);
        $sent = 0;
        foreach ($all_users as $user) {
            $user_id = $user['userid'];
            $sent += 1;
            sendmessage($user_id, $text_to_send, $bot_token, 'markdown');
            sleep(1);
            editmessage($admin_id, "Messages Sent : $sent", $bot_token, $start_message_id);
        }
        sendmessage($admin_id, 'finished broadcasting ' . $sent . ' messages', $bot_token, 'markdown');
    }
}
?>
I never manage to get to the end of the loop and receive the "finished broadcasting" message; it is stuck in an infinite loop.
The same issue happens when I try to import more than 50 items into the MySQL database using the same method as the broadcast.
I think that's a problem of PHP's maximum execution time: by default, PHP has a max execution time of 30s. After 30 seconds the script is terminated, and an error is reported to the client that made the initial request (in this case the Telegram API). Telegram sees the error thrown by your script and repeats the same request, so the script is executed again and again every 30 seconds. A possible solution may be the following:
Add the following code before $admin_id = '';
set_time_limit(100);     // raise the max execution time
ignore_user_abort(true); // keep running even if the connection drops
header('Connection: close');
flush();
fastcgi_finish_request(); // respond to Telegram now, keep working in the background
This code will immediately close the connection with Telegram, so it doesn't have to wait until the script terminates and it doesn't call the script again if an error occurs. With the set_time_limit(100) function you can increase the execution limit (for example to 100 seconds) so PHP doesn't kill everything before you have sent the broadcast message to everyone. If this operation takes more than 100 seconds, just increase the limit or set it to 0 so it is never enforced, because according to the set_time_limit docs: "If set to zero, no time limit is imposed."
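One caveat: fastcgi_finish_request() is only defined when PHP runs under PHP-FPM, so if you are not sure about your setup it is worth guarding the call, for example:
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // PHP-FPM: close the connection and keep running
} else {
    // Other SAPIs: the Connection: close header and flush above are the best we can do
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    flush();
}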
I have a web application using PHP and PDO with SQLSRV prepared statements to display links to files for users to download. The back-end PHP script 'download.php' checks various items before serving the PDF to the user. The download.php file should then update a few SQL tables and serve the PDF file to the user.
Please read my Previous Question, and the troubleshooting completed there, if you need more information.
After troubleshooting, the error I thought was occurring (and thus the previous question I had asked) was incorrect. My download script is getting executed more than once for every file download.
I have searched the server logs and, while debugging with Firebug, I can see my download.php script making multiple GET requests to the server. Sometimes the script completes only once, as expected. Other times it executes three or four requests for the one click of the download link.
Now that I more fully understand what error is occurring, I need a bit of help fixing it.
I need to prevent the script from running multiple times, and thus updating the SQL table with records that are within a few milliseconds of each other.
The view page checks the SQL database for files the current user is allowed access to, and displays a list of links:
<a href='download.php?f={$item['name']}&t={$type}' target='_blank'>{$item['name']}</a>
Because the values are needed for the download.php script to work, I cannot change the request to a $_POST instead of $_GET.
What I have tried:
Checking/setting a session variable for a 'downloading' state before the getfile() call, which is unset right before the exit(0)
Putting the SQL statements in a separate PHP file and require'ing that
Adding a sleep(1) after the getfile()
Commenting out the header/PDF information
The first three measures did not work to prevent the double/triple execution of the PHP download script. However, the last measure DOES prevent the double/triple execution of the PHP script, but of course the PDF is never delivered to the client browser!
Question: How can I ensure that only ONE insert/update PER DOWNLOAD is inserted into the database, or at the least, how can I prevent the PHP script from being executed multiple times?
UPDATE
[Firebug screenshots omitted: one shows a single request for one click; the other shows two requests for the same click]
download.php script
<?php
session_start();
require("cgi-bin/auth.php");

// Don't timeout when downloading large files
#ignore_user_abort(1);
#set_time_limit(0);
//error_reporting(E_ALL);
//ini_set('display_errors',1);

function getfile() {
    if (!isset($_GET['f']) || !isset($_GET['t'])) {
        echo "Nothing to do!";
        exit(0);
    }
    require('cgi-bin/connect_db_pdf.php');

    //Update variables
    $vuname = strtolower(trim($_SESSION['uname']));
    $file = trim(basename($_GET['f'])); //Filename we're looking for
    $type = trim($_GET['t']); //Filetype

    if (!preg_match('/^[a-zA-Z0-9_\-\.]{1,60}$/', $file) || !preg_match('/^(av|ds|cr|dp)$/', $type)) {
        header('Location: error.php');
        exit(0);
    }
    try {
        $sQuery = "SELECT TOP 1 * FROM pdf_info WHERE PDF_name=:sfile AND type=:stype";
        $statm = $conn->prepare($sQuery);
        $statm->execute(array(':sfile'=>$file, ':stype'=>$type));
        $result = $statm->fetchAll();
        $count = count($result);
        $sQuery = null;
        $statm = null;
        if ($count == 1) { //File was found in the database so let them download it. Update the time as well
            $result = $result[0];
            $sQuery = "INSERT INTO access (PDF_name,PDF_type,PDF_time,PDF_access) VALUES (:ac_file, :ac_type, GetDate(), :ac_vuname); UPDATE pdf_info SET last_view=GetDate(),viewed_uname=:vuname WHERE PDF_name=:file AND PDF_type=:type";
            $statm = $conn->prepare($sQuery);
            $statm->execute(array(':ac_vuname'=>$vuname, ':ac_file'=>$file, ':ac_type'=>$type, ':vuname'=>$vuname, ':file'=>$file, ':type'=>$type));
            $count = $statm->rowCount();
            $sQuery = null;
            $statm = null;
            //$result is the first element from the SELECT query outside the 'if' scope.
            $file_loc = $result['floc'];
            $file_name = $result['PDF_name'];
            // Commenting from this line to right after the exit(0) updates the database only ONCE, but then the PDF file is never sent to the browser!
            header("Content-Type: application/pdf");
            header("Pragma: no-cache");
            header("Cache-Control: no-cache");
            header("Content-Length: " . filesize($file_loc));
            header("Accept-Ranges: bytes");
            header("Content-Disposition: inline; filename={$file_name}");
            ob_clean();
            flush();
            readfile($file_loc);
            exit(0);
        } else { //We did not find a file in the database. Redirect the user to the view page.
            header("Location: view.php");
            exit(0);
        }
    } catch (PDOException $err) { //PDO SQL error.
        //echo $err;
        header('Location: error.php');
        exit(0);
    }
}
getfile();
?>
If you really need to make sure that a link only creates an event once, then you need to implement a token system: when a hyperlink (or a form post target) is generated, a use-once token is generated and stored (in the session or wherever), and then checked in the called script.
So your hyperlink may look like this:
<a href='download.php?token={some-token}&f={$item['name']}&t={$type}' target='_blank'>{$item['name']}</a>
On the php side this is a really simplified idea of what you might do:
<?php
session_start();
if (!isset($_REQUEST['token'])) die(); // or fail better
if (!isset($_SESSION['oneTimeTokens'][$_REQUEST['token']])) die(); // or fail better
if ($_SESSION['oneTimeTokens'][$_REQUEST['token']] == 'used') die(); // or fail better
$_SESSION['oneTimeTokens'][$_REQUEST['token']] = 'used';
// we're good from this point
This would solve the effects of your problem, though not the double running itself. However, since you want to make sure a link fires an event only once NO MATTER WHAT, you will probably end up implementing this in some form or another, as it's the only way I can think of to guarantee that any generated link has a one-use life.
When generating the link you would do something like this in your code:
<?php
$tokenID = bin2hex(random_bytes(16)); // or any other sufficiently random id
$_SESSION['oneTimeTokens'][$tokenID] = 'not used';
I'd also put a cleanup routine somewhere to remove all used tokens. Also, it's not a bad idea to expire tokens beyond a certain age, but I think this explains the idea.
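A sketch of such a cleanup, assuming you also record a creation time per token (the tokenCreated array here is hypothetical):
<?php
// Sweep used and stale tokens (older than an hour)
foreach ($_SESSION['oneTimeTokens'] as $token => $state) {
    $stale = isset($_SESSION['tokenCreated'][$token])
        && $_SESSION['tokenCreated'][$token] < time() - 3600;
    if ($state == 'used' || $stale) {
        unset($_SESSION['oneTimeTokens'][$token], $_SESSION['tokenCreated'][$token]);
    }
}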
Hey guys, I'm making a website where you can submit a server for advertising. When a user goes to the index page of my website, it grabs the IPs of all the submitted servers and then tests whether each one is online using fsockopen(), like so:
while ($row = mysql_fetch_assoc($rs)) {
    $ip = $row['ip'];
    $name = $row['name']; // server name, used in the tooltips below
    $info = @fsockopen($ip, 25565, $errno, $errstr, 0.5);
    if ($info) {
        $status = "<div><img width='32px' height='32px'
            title='$name is online!' src='images/online.png'/></div>";
        $online = true;
    } else {
        $status = "<div><img width='32px' height='32px'
            title='$name is offline!' src='images/offline.png'/></div>";
        $online = false;
    }
}
This works fine, but the downside is that when you load the site it takes a good 2-4 seconds to start loading due to the fsockopen() calls. I want to know if there is a better way to do this that will reduce the wait before the website loads.
Any information will be appreciated, thanks.
Store the online status and the last check time in a database; if the last check was longer than, say, 15 minutes ago, update it. I am pretty sure you don't need to get the status on EVERY page load? It's the time it takes to connect to each server that slows down the website.
Then again, you would probably want to move the update process to a cron job instead of relying on someone visiting your website to update the server statuses.
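A rough sketch of such a cron script (the servers table with ip, online and last_check columns is just an assumed schema; adjust to yours):
<?php
// update_status.php - run from cron, e.g. every 15 minutes
$con = mysqli_connect('localhost', 'user', 'pass', 'mydb');

$rs = mysqli_query($con, "SELECT id, ip FROM servers");
while ($row = mysqli_fetch_assoc($rs)) {
    $info = @fsockopen($row['ip'], 25565, $errno, $errstr, 0.5);
    $online = $info ? 1 : 0;
    if ($info) {
        fclose($info);
    }
    $stmt = mysqli_prepare($con, "UPDATE servers SET online = ?, last_check = NOW() WHERE id = ?");
    mysqli_stmt_bind_param($stmt, 'ii', $online, $row['id']);
    mysqli_stmt_execute($stmt);
}
The index page then just reads the cached online column, which costs a single query instead of one socket connection per server.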
Looking at your example, I'd make all the $status bits be JavaScript calls to another PHP page that checks that individual server.
However, the idea of moving the status checks to a cron job, or using some kind of status caching, is very good too. Maybe store statuses in a database and only check the ones that have expired (time limit set by you).
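The per-server endpoint for those JavaScript calls could be as small as this (check_server.php is a hypothetical name):
<?php
// check_server.php?ip=... - returns "online" or "offline" for a single server
$ip = isset($_GET['ip']) ? $_GET['ip'] : '';
if (!filter_var($ip, FILTER_VALIDATE_IP)) {
    http_response_code(400);
    exit('bad ip');
}
$info = @fsockopen($ip, 25565, $errno, $errstr, 0.5);
if ($info) {
    fclose($info);
    echo 'online';
} else {
    echo 'offline';
}
That way the page itself renders instantly and each status icon fills in as its check completes.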
I am writing a script that checks for an updated version on an external server. I use this code in the config.php file to check for the latest version.
$data = get_theme_data('http://externalhost.com/style.css');
$latest_version = $data['Version'];
define('LATEST_VERSION', $latest_version);
This is fine and I can fetch the latest version (get_theme_data is a WordPress function), but the problem is that it will be executed on every single load, which I do not want. I also do not want to check only when a form is submitted, for example. Alternatively, I was looking into some method to cache the result, or maybe check the version every set number of hours? Is such a thing possible, and how?
Here, gonna make it easy for you. Store the time you last checked for the update in a file.
function checkForUpdate() {
    $file = @file_get_contents('./check.cfg');
    if ($file === false) {
        // First run: create the file holding the next check time
        $fp = fopen('./check.cfg', 'w+');
        fwrite($fp, time() + 86400);
        fclose($fp);
        $file = (string)(time() + 86400);
    }
    if ((int)$file > time()) {
        echo "Do not update";
    } else {
        echo "Update";
        // Schedule the next check for 24 hours from now
        $fp = fopen('./check.cfg', 'w+');
        fwrite($fp, time() + 86400);
        fclose($fp);
    }
}
You can obviously make this much more secure/efficient if you want to.
Edit: This function will check for an update once per day.
A scheduled task like this should be set up as a separate cron or at job. You can still write everything in PHP; just make a script that runs from the command line and does the updating. Check out "man crontab" for details, and/or check which scheduling services your server is running.
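For example, a crontab entry that runs the check nightly at 3am might look like this (the script path is hypothetical):
0 3 * * * php /path/to/update_check.php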
Greetings, Overflowers,
I am asked to code an email tracker using PHP.
Email clients request my PHP script thinking it is an image (the BODY's background).
However, the client (namely Outlook 2007) then hangs (showing nothing) until my PHP script reaches its timeout.
<?php
define("DB_FILE", "sqlite:C:\wamp\www\database.sdb");
define("QUERY", "INSERT INTO Receipt (counter_id, reader_id, start_time, end_time) VALUES (%s, \"%s\", %d, %d)");
define("TIME_OUT", "10");

function track() {
    global $counter_id;
    global $reader_id;
    global $start_time;
    $end_time = time();
    $db = new PDO(DB_FILE);
    $db->exec(sprintf(QUERY, $counter_id, $reader_id, $start_time, $end_time));
}

$counter_id = $_GET["counter_id"];
$reader_id = $_SERVER["REMOTE_ADDR"];
$start_time = time();

set_time_limit(TIME_OUT);
register_shutdown_function("track");

while (!connection_aborted()) {
    echo "\n";
    ob_flush();
    flush();
    sleep(1); // avoid a busy loop that burns CPU
}
?>
It is this loop which should keep the HTTP connection alive while the client is reading the tracked email.
Any thoughts on how to solve this?
Regards
I'm afraid there's no sensible way of telling how long somebody has spent looking at an email. Mail clients simply aren't designed with that kind of data gathering in mind, and trying to force a network connection to stay open will merely cause the kind of problems you've run into.
There are people (companies) doing some simple tracking of how long an email is open by exploiting the fact that browsers will continue to request more frames of a GIF (see the technique here) until you tell them that the last frame has been sent.
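A rough sketch of that trick in PHP (untested; it slices the standard 42-byte 1x1 transparent GIF into its header, single frame and trailer, and keeps resending the frame so the client waits for a "last" frame that never comes):
<?php
// 19-byte header + global color table, 22-byte frame, 1-byte trailer (0x3B)
$gif = base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
$header = substr($gif, 0, 19);
$frame = substr($gif, 19, strlen($gif) - 20); // everything except header and trailer

ignore_user_abort(true); // keep running after disconnect so we can record the duration
set_time_limit(0);

$start = time();
header('Content-Type: image/gif');
echo $header;
flush();

// Keep appending the same frame; the trailer byte is never sent
while (!connection_aborted()) {
    echo $frame;
    flush();
    sleep(1);
}

$open_seconds = time() - $start; // roughly how long the email stayed open
// store $open_seconds in your tracking table here
?>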