I have a PHP program connected to a MySQL database on a website.
Upon clicking a link to download a file, the program reads an integer field from the database, increments it, and writes the number back, to count the number of downloads. That program works. Over time, however, the download counts seem to be moderately inflated.
Could the download counts be incremented by web robots following the links to download the files? If so, would telling the web robots to ignore the download page on the website, using the robots.txt file, solve the inflated count problem?
Here is the PHP code:
function updateDownloadCounter($downloadPath, $tableName, $fileNameField, $downloadCountField, $idField)
{
    require("v_config.php");

    if (isset($_REQUEST["file_id"]) && is_numeric($_REQUEST["file_id"])) {
        try {
            $sql = "SELECT * FROM " . $tableName . " WHERE file_id = " . $_REQUEST[$idField];
            $connection = new PDO($dsn, $username, $password, $options);
            $statement = $connection->prepare($sql);
            $statement->execute();

            $result = $statement->fetchAll();
            if ($result && $statement->rowCount() == 1) {
                foreach ($result as $row) {
                    if (is_file($_SERVER['DOCUMENT_ROOT'] . $downloadPath . $row[$fileNameField])) {
                        $count = $row[$downloadCountField] + 1;
                        $sql = "UPDATE " . $tableName . " SET " . $downloadCountField . " = " . $count . " WHERE file_id = " . $_REQUEST[$idField];
                        $statement = $connection->prepare($sql);
                        $statement->execute();

                        $documentLocationAndName = $downloadPath . $row[$fileNameField];
                        header('Location: ' . $documentLocationAndName);
                    }
                }
            }
        } catch (PDOException $error) {
            echo $sql . "<br>" . $error->getMessage();
        }
    }
}
The answer to both of your questions is yes.
When a crawler indexes your website, it also looks for related content, akin to creating a sitemap. The first place it looks for related content on a page is its direct links. If you're linking to your files directly on your download page, the crawler will also attempt to index those links.
Preventing the crawlers from seeing your download page with robots.txt would solve the problem, but you'd be losing potential SEO. And what if a third party links to your downloads directly? If they have their downloads page indexed, your links will still be visible to crawlers.
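For reference, if you did want to go the robots.txt route, a minimal rule would look like the following (assuming your download script lives under /downloads; adjust the path to your site):

User-agent: *
Disallow: /downloads

Bear in mind that robots.txt is purely advisory: well-behaved crawlers honour it, but badly behaved ones ignore it, which is another reason to prefer the approaches below.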
Fortunately, you can disable this behaviour. Simply tell the crawlers that every parameterised URL on the download page is just a variant of one canonical page, by adding the following to the <head> section of the downloads page:
<link rel="canonical" href="http://www.example.com/downloads" />
Since URLs with different parameters are treated as essentially different 'pages', crawlers will think that /downloads?file_id=1 is different to /downloads. Adding the above line informs them that it is the same page, and that they don't need to bother.
Assuming that you have actual files that are being indexed (such as PDFs), you can prevent crawlers from indexing them in your .htaccess or httpd.conf:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
As a fallback, you could always check who is attempting to download the file in the PHP itself! It depends on how thorough you want to be (as there are a lot of different crawlers), but this function works pretty well:
function bot_detected() {
    return (
        isset($_SERVER['HTTP_USER_AGENT'])
        && preg_match('/bot|crawl|slurp|spider|mediapartners/i', $_SERVER['HTTP_USER_AGENT'])
    );
}
Then simply call it as a conditional before running your try:
if (!bot_detected()) {
    try { } // Will only get executed for real visitors
}
Also, as an aside, I'd recommend using $_GET["file_id"] over $_REQUEST["file_id"]. $_REQUEST combines $_GET with both $_POST and $_COOKIE, which tend to be used in rather different ways. While that may be harmless if you're only retrieving data, it's far safer to limit the lookup to a simple $_GET.
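To sketch that aside in code (hypothetical table and column names; $connection is the PDO instance from your function), you could also bind the id as a parameter instead of concatenating it into the SQL, and let the database do the increment in one statement:

// Sketch only: assumes a `downloads` table with `download_count` and `file_id` columns.
if (!bot_detected() && isset($_GET['file_id']) && is_numeric($_GET['file_id'])) {
    $statement = $connection->prepare(
        "UPDATE downloads SET download_count = download_count + 1 WHERE file_id = :id"
    );
    $statement->execute([':id' => (int)$_GET['file_id']]);
}

Incrementing in SQL also avoids the read-modify-write round trip, so two simultaneous downloads can't overwrite each other's count.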
Hope this helps! :)
Related
I have a first site, https://www.mydomain1.com, on which I use PHP sessions. No problem, everything works fine; as I go from page to page, I can access my session variables.
I have a second site, https://www.mydomain2.com, in which I display part of my 1st site via an iframe:
<iframe src="https://www.mydomain1.com" width="100%" frameborder="0" style="border: 0" allowfullscreen="allowfullscreen" id="frameLeonard"></iframe>
And there, strangely, the session variables are no longer recognized. I'm not even trying to get my 1st site to access the session variables of the 2nd site (that's not the goal, and it's normal that it doesn't work); I just want to run the 1st site inside the 2nd.
Strangely, it was still working a year ago.
Has there been any upgrade that would explain the problem?
Thank you in advance for your insights!
Now I found the reason: Chrome shows this behaviour. Since version 80 (Feb. 2020) it has its "SameSite by default cookies" setting enabled by default, which means that including external pages (from a different domain) inside an iframe will kill their sessions.
To prevent this, you can disable "SameSite by default cookies" in chrome://flags.
Beware: this might be a security issue (but it solved my problem for now).
Otherwise, if you're using PHP 7.3 or newer, you could add one (or both) of the following calls in your PHP before session_start():
ini_set('session.cookie_samesite', 'None');
session_set_cookie_params(['samesite' => 'None']);
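Note that browsers only accept SameSite=None cookies over HTTPS, so the session cookie needs the Secure flag as well. A minimal sketch putting it together (PHP 7.3+):

// Mark the session cookie SameSite=None so it survives cross-site iframes.
// SameSite=None is only accepted over HTTPS, hence 'secure' => true.
session_set_cookie_params([
    'samesite' => 'None',
    'secure'   => true,
]);
session_start();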
You can find further details here:
https://blog.heroku.com/chrome-changes-samesite-cookie#prepare-for-chrome-80-updates
I recommend you use MySQL for that:
// To add a captcha record via the img file.
$time = time();
$deltime = time() - 1500;
$ip = $_SERVER['REMOTE_ADDR'];

$result = $conn->query("SELECT * FROM `captcha` WHERE `ip` = '" . $ip . "'");
if (($result) && ($result->num_rows >= 1))
{
    $conn->query("UPDATE `captcha` SET `captcha` = '" . $_SESSION["captcha"] . "' WHERE `ip` = '" . $ip . "'");
}
else
{
    // Purge stale records before inserting a new one.
    $conn->query("DELETE FROM `captcha` WHERE `time` < '" . $deltime . "'");
    $sql = "INSERT INTO `captcha` (captcha, ip, time) VALUES ('" . $_SESSION["captcha"] . "', '" . $ip . "', '" . $time . "')";
    if ($conn->query($sql) === TRUE) {
        //echo "New record created successfully";
    } else {
        //echo "Error: " . $sql . "<br>" . $conn->error;
    }
}
// On the processing file, to match the captcha code.
$ip = $_SERVER['REMOTE_ADDR'];
$captcha = ''; // Default, in case no record exists for this IP.
$result = $conn->query("SELECT * FROM `captcha` WHERE `ip` = '" . $ip . "'");
while ($row = $result->fetch_assoc())
{
    $captcha = $row['captcha'];
}
if ($captcha == $_POST["access_token"]) { /* do anything */ }
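The snippets above assume a captcha table with three columns; a minimal schema sketch (names taken from the queries above, column types are my assumption) might be:

// One-off setup: the table the snippets above read and write.
$conn->query(
    "CREATE TABLE IF NOT EXISTS `captcha` (
        `captcha` VARCHAR(32)  NOT NULL,
        `ip`      VARCHAR(45)  NOT NULL, -- long enough for IPv6
        `time`    INT UNSIGNED NOT NULL
    )"
);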
Having the same problem here, but no solution yet.
I made several tests. It seems only to occur when the iframe-loaded content is served over SSL. If not, it works perfectly.
Maybe this is helpful. Or did you find a solution yet?
What is happening, I think, is that my code selects the data first (basically the old data) and then updates it, but what I want is for it to update first and then select the (new) data. How can I do this?
I am going to post the part where it goes wrong; if you need the full code, just ask:
$select_links = $db->query("SELECT pid, added_by, link_title, lid, link_order FROM " . TABLE_PREFIX . "homepage_links WHERE pid='$pid'
                            ORDER BY link_order DESC LIMIT $start,$show");
$check_link_count_rows = $db->num_rows($select_links);

echo "<b> You Current Have " . $check_link_count_rows . " Links On Your Page: </b><br>";
echo "<form action='' method='POST'>
";

while ($select_links_array = $db->fetch_array($select_links)) {
    $link_title_display = $select_links_array['link_title'];
    $link_id_display = $select_links_array['lid'];

    if (!$mybb->input["order_edit_$link_id_display"]) {
        $link_order_display = $select_links_array['link_order'];
    } else {
        $link_order_display = $mybb->input["order_edit_$link_id_display"];
    }

    $order_edit_value1 = $mybb->input["order_edit_$link_id_display"];
    $order_edit_value = $db->escape_string($order_edit_value1);

    echo "<br>" . $link_title_display . " <a href='?operation=edit_links&link=$link_id_display'> (edit) </a>
        <input type='number' name='order_edit_$link_id_display' value='$link_order_display' style='width:40px;'>
        <input type='hidden' name='get_link_id_display_value_$link_id_display' value='$link_id_display'><br>
    ";

    $get_link_id_display_value1 = $mybb->input["get_link_id_display_value_$link_id_display"];
    $get_link_id_display_value = $db->escape_string($get_link_id_display_value1);

    $update_quick_edit_query = $db->query("UPDATE spud_homepage_links SET link_order='$order_edit_value'
                                           WHERE lid='$get_link_id_display_value'");
}
I cannot find a solution, as everything seems to be in the right place for it to work apart from this bug.
After a discussion in the comments, I determined that you were attempting to render a page after a POST form submission that amends the database. It is perfectly possible to re-read your new database state and render it in a POST operation, but it is inadvisable, since browsers cannot refresh such a page without asking whether you wish to run the operation again. This does not make for a good user experience, especially in relation to using the back/forward buttons.
The reason for this behaviour is that POST operations generally modify the database. They are used, for example, in credit card purchases or profile amendments, where some change in the state of the server is expected. Thus, it is good practice to execute a new round trip to the server, after the write operation, to change the page method from POST to GET.
The header() call I linked to will do this, and will resolve your rendering problem too.
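A minimal sketch of that Post/Redirect/Get round trip (the update logic is a placeholder for your own queries):

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // 1. Apply the write operation first (your UPDATE queries go here).
    // ...

    // 2. Redirect so the browser re-requests the same page with GET.
    header('Location: ' . $_SERVER['REQUEST_URI']);
    exit;
}

// 3. On the GET request, read the fresh state and render it.

With this in place, refreshing or navigating back never re-runs the write, and the SELECT always sees the new data.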
I created a simple PHP MVC framework and I'm familiar with PHP. I think I understand the basics of JavaScript, but I have no idea how to use it with my MVC framework. Right now I have a folder in my root directory called scripts, a file inside of it called javascript.js, and I put the appropriate script source tag in my template. All I want to do right now is show a simple confirm box in the admin panel before accepting/deleting an application to join my site. Obviously there are two buttons (accept/delete) and I use onclick to call a function (AdminModel::acceptApplication). This is the AdminModel::acceptApplication function up to this point:
public function acceptApplication($id) {
    $confirm = AdminModel::confirm();
    if ($confirm) {
        $mysqli = BaseModel::dbConnect();

        $sql = "SELECT * FROM applications WHERE id=" . $id;
        $result = mysqli_query($mysqli, $sql);
        $row = mysqli_fetch_array($result);

        $sql = "INSERT INTO users (fname, lname, email, password) VALUES (" . $row['fname'] . ", " . $row['lname'] . ", " . $row['email'] . ", " . $row['password'] . ")";
        mysqli_query($mysqli, $sql);

        $sql = "DELETE FROM applications WHERE id=" . $id;
        mysqli_query($mysqli, $sql);

        header('Location: http://www.canforce.org/' . $_SESSION['language'] . '/admin/applications');
    }
}

public function confirm() {
    $confirm = echo '<script> areYouSure(); </script>';
    return($confirm);
}
The JavaScript areYouSure() function returns true if you click yes:
function areYouSure() {
    if (<?php echo $_SESSION['language'] ?> == "fr") {
        confirm("Êtes-vous sûr");
    }
    else {
        confirm("Are you Sure?");
    }
}
I'm guessing there's a lot wrong with what I've done here, simply because of the whole server-side/client-side thing, but then I have no idea how to use JavaScript properly within my website. I want this to work, but if anybody has any tips or links to tutorials on how I can incorporate JavaScript into my PHP MVC framework, that would be appreciated as well. Thanks
PHP runs on the server. JavaScript, for the purpose of this conversation, runs in the browser. You are trying to get the result of a browser-level call on the server, which is another computer; it will not work. Your model code exists on the server.
You need to have the js file included in your html file, in this case whatever passes for your view.
PS. The purpose of prepared statements is to prevent someone from being able to run arbitrary queries against your database, including deleting rows and reading all of your user info.
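To sketch that PS with the mysqli calls from your own code (same table and column names as the question), the SELECT could be parameterised like this:

// Bind $id as a parameter instead of concatenating it into the SQL string.
$stmt = mysqli_prepare($mysqli, "SELECT * FROM applications WHERE id = ?");
mysqli_stmt_bind_param($stmt, "i", $id);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);
$row = mysqli_fetch_array($result);

The INSERT and DELETE queries should be rewritten the same way.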
Good day,
I need some help here. My delete button works, but the page is not automatically refreshing after I click it; I still need to manually retrieve the data from the db before it reflects that the data has been deleted...
Here is my delete PHP code. How can I make it refresh the page automatically?
<?php
require 'include/DB_Open.php';

$id = $_POST['id'];
$idtodelete = "'" . implode("','", $id) . "'";
$query = "DELETE FROM tbl WHERE ticket in (" . $idtodelete . ")";
$myData = mysql_query($query);

echo "DATA DELETED";
if ($myData)
{
    header("Location: delete.php");
}

include 'include/DB_Close.php';
?>
I suggest fetching the data after your delete logic, so the delete is executed before the tickets are fetched.
Then a redirect to the same page isn't even necessary.
//
// DELETE
//
if (isset($_POST['delete']) && isset($_POST['id'])) {
    // Do delete stuff.
    // Notice the 'delete' variable, which would be the name of the delete form button.
    // If you like, you can still echo "Data deleted" here, e.g. in a notification window.
}
//
// FETCH data
//
$query = "SELECT * FROM tbl";
...
If you use the POST method, it's better to do it like this:
if ($_SERVER["REQUEST_METHOD"] == "POST")
{
    $id = $_POST['id'];
    $idtodelete = "'" . implode("','", $id) . "'";
    $query = "DELETE FROM tbl WHERE ticket in (" . $idtodelete . ")";

    if (mysql_query($query))
    {
        header("Location: delete.php");
    } else {
        echo "Can not delete";
    }
}
As suggested in one of the comments, and in the PHP documentation (http://it2.php.net/manual/en/function.header.php):
Remember that header() must be called before any actual output is sent, either by normal HTML tags, blank lines in a file, or from PHP. It is a very common error to read code with include, or require, functions, or another file access function, and have spaces or empty lines that are output before header() is called. The same problem exists when using a single PHP/HTML file.
Basically, you have to take out the:
echo "DATA DELETED";
What's the point of echoing that string if the page is going to be redirected anyway?
If you want to make it fancy you could use Ajax to delete it, and trigger a setTimeout() in JavaScript x seconds after showing the message.
Or if you really really really really, REALLY, want to do it this way, you could disable error reporting/display (using error_reporting(0) and ini_set('display_errors', 'Off')). From experience I know that it will work, but it's nasty and extremely, ultra highly not recommended.
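For completeness, a minimal corrected version of the script (same queries as the question; the deleted=1 query parameter is just one illustrative way to show the message after the redirect):

<?php
require 'include/DB_Open.php';

if ($_SERVER["REQUEST_METHOD"] == "POST") {
    $id = $_POST['id'];
    $idtodelete = "'" . implode("','", $id) . "'";
    $query = "DELETE FROM tbl WHERE ticket in (" . $idtodelete . ")";

    // No output before header(), so the redirect can still be sent.
    if (mysql_query($query)) {
        header("Location: delete.php?deleted=1");
        exit;
    }
}

if (isset($_GET['deleted'])) {
    echo "DATA DELETED"; // Shown on the follow-up GET request instead.
}

include 'include/DB_Close.php';
?>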
I have a small problem with the PHP Content-Disposition header. I roughly understand where the problem lies, but I have no idea how to solve it (I'm new to using databases). Calling this PHP page results in none of the echos being shown, only the download box, which I intended for the "cv" field only (and I'm not sure it's working even for that, because the downloaded file cannot be opened).
Removing the header(Content... line results in the echos showing, but then I'm not able to download the specified file. I want it to show as a link which would download its contents when clicked.
$newEmployeeName = $_POST['name'];
$newEmployeeArea = $_POST['area'];
$newEmployeeCV = $_POST['cv'];

include('databaseConnection.php');

$result = mysql_query("SELECT * FROM participants");
while ($row = mysql_fetch_array($result))
{
    $download_me = $row['cv'];
    header("Content-Disposition: attachment; filename=$download_me");
    echo $row['name'] . " " . $row['area_of_exp'] . " " . $download_me;
    echo "<br />";
}
The Content-Disposition header forces the browser to treat anything echoed after it as a download. You would normally use this when reading a file from the file system, so you can offer it as a download. In your case, if you're storing CVs on your server then you may offer them as a download as follows:
<?php
$sql = "SELECT * FROM table WHERE id = :id LIMIT 1";
$stmt = $db->prepare($sql);
$stmt->bindParam(':id', $id, PDO::PARAM_INT);
$stmt->execute();
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if ($row) {
    header('Content-Disposition: attachment; filename=' . $row['filename']);
    readfile($uploads_dir . $row['filename']);
    exit;
}
else {
    die('Invalid CV requested.');
}
Obviously the above is a simplified version of the process and you will need to tweak it to fit your application, but that’s the gist of it.
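One extra hint: browsers cope better if you also send a Content-Type (and optionally a Content-Length) alongside Content-Disposition. For example, assuming the stored CVs are PDFs (that MIME type is my assumption, not something your code states):

header('Content-Type: application/pdf'); // assumes the stored CVs are PDF files
header('Content-Disposition: attachment; filename=' . $row['filename']);
header('Content-Length: ' . filesize($uploads_dir . $row['filename']));
readfile($uploads_dir . $row['filename']);
exit;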
Also, don’t use the mysql_ functions. They’re deprecated (as per the warning on this page). Use either PDO or the new MySQLi (MySQL improved) extension.