I built a website with PHP and HTML. I track user visits with code like this:
$qry = "SELECT * FROM visit WHERE ip='" . $ip . "' AND date(quando)=CURRENT_DATE";
$arr = $dbcls->query_arr($qry);
if (count($arr) == 0)
{
    $data = get_location2($ip); // look up the geolocation for this IP
    if ($data)
    {
        $country = $dbcls->escape_string($data->country);
        $sub     = $dbcls->escape_string($data->subdivision);
        $city    = $dbcls->escape_string($data->city);
        $qry = "INSERT INTO visit(ip,n,country,region,city)
                VALUES('" . $ip . "',1,'" . $country . "','" . $sub . "','" . $city . "');";
    }
    else
        $qry = "INSERT INTO visit(ip,n) VALUES('" . $ip . "',1);";
}
else
    $qry = "UPDATE visit SET n=n+1 WHERE ip='" . $ip . "' AND date(quando)=CURRENT_DATE;";
$dbcls->query_command($qry);
This lets me record all the users who visit my site. The next step is to record how many users download my program.
The question is: how can I detect when a user performs a download with PHP? And if I have to write JavaScript code for this, how can I access my database from JavaScript?
Your URL downloads/Treebase.zip is not served by PHP; it is a static file.
The download is handled by the web server, not by PHP.
To manage it with PHP, you'll need to create a route and have PHP send the file.
Something like downloads/download.php?file_name=Treebase.zip.
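To show the idea, here is a minimal sketch of such a download.php. The file whitelist, the `downloads` table, and the PDO connection details are all assumptions, not part of the original setup:

```php
<?php
// download.php — hypothetical routing script that counts and streams a download.
$allowed = ['Treebase.zip']; // whitelist to avoid path-traversal attacks
$file = isset($_GET['file_name']) ? $_GET['file_name'] : '';

if (!in_array($file, $allowed, true)) {
    http_response_code(404);
    exit;
}

// Count the download (assumes a `downloads` table with a unique key on file_name).
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT INTO downloads (file_name, n) VALUES (?, 1)
     ON DUPLICATE KEY UPDATE n = n + 1'
);
$stmt->execute([$file]);

// Stream the static file to the browser.
$path = __DIR__ . '/downloads/' . $file;
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;
```

Because every download now passes through PHP, no JavaScript or direct database access from the browser is needed.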
I have a first site, https://www.mydomain1.com, on which I use PHP sessions. No problem, everything works fine: when I go from page to page, I can access my session variables.
I have a second site, https://www.mydomain2.com, in which I display part of my 1st site via an iframe:
<iframe src="https://www.mydomain1.com" width="100%" frameborder="0" style="border: 0" allowfullscreen="allowfullscreen" id="frameLeonard"></iframe>
And there, strangely, the session variables are no longer recognized. I'm not even trying to get my 1st site to access the session variables of the 2nd site (that's not the goal, and it's normal that it doesn't work); I just want to run the 2nd site inside the 1st site.
Strangely, it was still working a year ago.
Has there been an upgrade that would explain the problem?
Thank you in advance for your insights!
Now I found the reason: Chrome shows this behaviour. Since version 80 (Feb. 2020) it has "SameSite by default cookies" enabled by default, which means that including external pages (from a different domain) inside an iframe will kill their sessions.
To prevent this, you can disable "SameSite by default cookies" in chrome://flags.
Beware: this might be a security issue (but it solved my problem for now).
Otherwise, if using PHP 7.3 or newer, you could add one (or both) of the following calls in your PHP before session_start():
ini_set('session.cookie_samesite', 'None');
session_set_cookie_params(['samesite' => 'None']);
Here you get further details:
https://blog.heroku.com/chrome-changes-samesite-cookie#prepare-for-chrome-80-updates
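Putting it together, a minimal sketch of the session bootstrap. Note that browsers reject `SameSite=None` cookies that are not also `Secure`, so the embedded site must be served over HTTPS:

```php
<?php
// Mark the session cookie SameSite=None so it survives inside a
// cross-site iframe. Requires PHP 7.3+ and an HTTPS connection.
session_set_cookie_params([
    'lifetime' => 0,
    'path'     => '/',
    'secure'   => true,   // SameSite=None is rejected without Secure
    'httponly' => true,
    'samesite' => 'None',
]);
session_start();
```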
I recommend you use a MySQL table for that.
// To add a captcha record when serving the captcha image:
$time = time();
$deltime = time() - 1500; // records older than 25 minutes get purged
$ip = $_SERVER['REMOTE_ADDR'];
$result = $conn->query("SELECT * FROM `captcha` WHERE `ip` = '" . $ip . "'");
if (($result) && ($result->num_rows >= 1))
{
    $conn->query("UPDATE `captcha` SET `captcha` = '" . $_SESSION["captcha"] . "' WHERE `ip` = '" . $ip . "'");
}
else
{
    $conn->query("DELETE FROM `captcha` WHERE `time` < '" . $deltime . "'");
    $sql = "INSERT INTO `captcha` (captcha, ip, time) VALUES ('" . $_SESSION["captcha"] . "', '" . $ip . "', '" . $time . "')";
    if ($conn->query($sql) === TRUE) {
        //echo "New record created successfully";
    } else {
        //echo "Error: " . $sql . "<br>" . $conn->error;
    }
}
// On the processing page, to match the captcha code:
$ip = $_SERVER['REMOTE_ADDR'];
$captcha = null; // avoid an undefined variable when no row matches
$result = $conn->query("SELECT * FROM `captcha` WHERE `ip` = '" . $ip . "'");
while ($row = $result->fetch_assoc())
{
    $captcha = $row['captcha'];
}
if ($captcha !== null && $captcha == $_POST["access_token"]) { /* do anything */ }
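As a side note, the insert-or-update pair can be collapsed into one mysqli prepared statement, which also avoids concatenating values into the SQL. This sketch assumes a unique key exists on the `ip` column:

```php
<?php
// Upsert the captcha for this IP in one statement (assumes UNIQUE KEY on ip).
$stmt = $conn->prepare(
    'INSERT INTO captcha (captcha, ip, time) VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE captcha = VALUES(captcha), time = VALUES(time)'
);
$stmt->bind_param('ssi', $_SESSION['captcha'], $ip, $time);
$stmt->execute();
```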
Having the same problem here, but no solution yet.
I made several tests. It seems only to occur when the iframe-loaded content is served over SSL; if not, it works perfectly.
Maybe this is helpful. Or did you find a solution yet?
I have a PHP program connected to a MySQL database on a website.
Upon clicking a link to download a file, the program reads an integer field from the database, increments it, then writes the number back, to count the number of downloads. That program works. The download counts, however, seem to become moderately inflated over time.
Could the download counts be incremented by web robots following the links to download the files? If so, would telling the web robots to ignore the download page on the website, using the robots.txt file, solve the inflated count problem?
Here is the PHP code:
function updateDownloadCounter($downloadPath, $tableName, $fileNameField, $downloadCountField, $idField)
{
    require("v_config.php");
    if (isset($_REQUEST[$idField]) && is_numeric($_REQUEST[$idField])) {
        try
        {
            // Bind the id as a parameter instead of concatenating it into the SQL
            $sql = "SELECT * FROM " . $tableName . " WHERE file_id = :id";
            $connection = new PDO($dsn, $username, $password, $options);
            $statement = $connection->prepare($sql);
            $statement->execute([':id' => $_REQUEST[$idField]]);
            $result = $statement->fetchAll();
            // rowCount() is unreliable for SELECT on some drivers; count the rows instead
            if ($result && count($result) == 1)
            {
                foreach ($result as $row)
                {
                    if (is_file($_SERVER['DOCUMENT_ROOT'] . $downloadPath . $row[$fileNameField]))
                    {
                        $count = $row[$downloadCountField] + 1;
                        $sql = "UPDATE " . $tableName . " SET " . $downloadCountField . " = :count WHERE file_id = :id";
                        $statement = $connection->prepare($sql);
                        $statement->execute([':count' => $count, ':id' => $_REQUEST[$idField]]);
                        $documentLocationAndName = $downloadPath . $row[$fileNameField];
                        header('Location: ' . $documentLocationAndName);
                        exit; // stop the script once the redirect is sent
                    }
                }
            }
        }
        catch (PDOException $error)
        {
            echo $sql . "<br>" . $error->getMessage();
        }
    }
}
The answer to both of your questions is yes.
When a crawler indexes your website, it also looks for related content, akin to creating a sitemap. The first place it looks for related content on a page is the direct links. If you're linking to your files directly on your download page, the crawler will also attempt to index those links.
Preventing the crawlers from seeing your download page with robots.txt would prevent this problem, but then you'd be losing potential SEO. And what if a third party links to your downloads directly? If they have their downloads page indexed, your links will still be visible to crawlers.
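If you do accept that trade-off, a minimal robots.txt rule would look like this (the path is illustrative):

```
# Ask well-behaved crawlers to skip the download route entirely
User-agent: *
Disallow: /downloads
```

Note that robots.txt is only advisory; misbehaving bots ignore it, which is why the checks below are still worth having.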
Fortunately, you can disable this behaviour. Simply tell the crawlers that the links on the download page are all canonical ones, by adding the following to the <head> section of the downloads page:
<link rel="canonical" href="http://www.example.com/downloads" />
Considering the parameters are essentially different 'pages', crawlers will think that /downloads?file_id=1 is different to /downloads. Adding the above line will inform them that it is the same page, and that they don't need to bother.
Assuming that you have actual files that are being indexed (such as PDFs), you can prevent crawlers from indexing them in your .htaccess or httpd.conf:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
As a fallback, you could always check who is attempting to download the file in the PHP itself! It depends how pedantic you want to be (as there are a lot of different crawlers), but this function works pretty well:
function bot_detected() {
return (
isset($_SERVER['HTTP_USER_AGENT'])
&& preg_match('/bot|crawl|slurp|spider|mediapartners/i', $_SERVER['HTTP_USER_AGENT'])
);
}
Then simply call it as a conditional before running your try:
if (!bot_detected()) {
try { } // Will only get executed for real visitors
}
Also, as an aside, I'd recommend using $_GET["file_id"] over $_REQUEST["file_id"]. $_REQUEST combines $_GET with both $_POST and $_COOKIE, which tend to be used in rather different ways. While this is technically secure if you're only retrieving data, it's far safer to limit the request to a simple $_GET.
Hope this helps! :)
I have a simple system to upload files and keep track of them for each particular user, using a database.
My problem is that I connect to the database in the file checklogin.php, which is responsible for handling the $_POST from main_login.php.
In file 'checklogin.php':
$current_user_name = NULL;
which is a global variable for all files. Now in file signup.php, I try to include the checklogin.php to define that variable:
require_once '/checklogin.php';
...
mysql_query("INSERT INTO " . tbl_name . " (username, userpassword, userisadmin)
    VALUES ('" . $_POST['myusername'] . "',"
    . "'" . md5($_POST['mypassword']) . "',"
    . "0)");
$current_user_name = $_POST['myusername'];
header("location:login_success.php");
As you can see, I'm trying to set the value of the variable with $current_user_name = $_POST['myusername'];, but when the header redirects to login_success.php, which also has require_once '/checklogin.php';, the variable is set back to null.
How can I solve this problem? i.e. how can I store the current user so that it is accessible from all files?
You cannot store a variable like that: each request is a new execution on the server. In this kind of situation you have to use a session.
Another issue with your code is SQL injection; please read up on that too.
You cannot access the parameters received at checklogin.php from other pages.
What you can do is check the login status and store the current user in the session.
From the session variable you can then get and set the current user.
You can set a session variable for it, and on every page you can use it like this:
session_start();
if(isset($_SESSION['current_user_name']))
{
$current_user_name = $_SESSION['current_user_name'];
}
else
{
$current_user_name = NULL;
}
and set your session variable as follows
session_start();
require_once '/checklogin.php';
////...
mysql_query("INSERT INTO " . tbl_name . " (username, userpassword, userisadmin)
    VALUES ('" . $_POST['myusername'] . "',"
    . "'" . md5($_POST['mypassword']) . "',"
    . "0)");
$current_user_name = $_POST['myusername'];
$_SESSION['current_user_name'] = $current_user_name; // set your session here
header("Location: login_success.php");
exit; // stop execution after the redirect
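On the receiving end, login_success.php can then read the user back out of the session. A minimal sketch (the redirect target and greeting are illustrative):

```php
<?php
// login_success.php — minimal sketch reading the current user from the session.
session_start();

if (isset($_SESSION['current_user_name'])) {
    $current_user_name = $_SESSION['current_user_name'];
} else {
    // Not logged in: send the visitor back to the login form.
    header("Location: main_login.php");
    exit;
}

echo 'Welcome, ' . htmlspecialchars($current_user_name);
```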
I created a simple PHP MVC framework and I'm familiar with PHP. I think I understand the basics of JavaScript, but I have no idea how to use it with my MVC framework. Right now I have a folder in my root directory called scripts, a file inside of it called javascript.js, and the appropriate <script src> tag in my template. All I want to do right now is show a simple confirm box in the admin panel before accepting/deleting an application to join my site. There are two buttons (accept/delete), and I use onclick to call a function (AdminModel::acceptApplication). This is the AdminModel::acceptApplication function up to this point:
public function acceptApplication($id) {
$confirm=AdminModel::confirm();
if($confirm) {
$mysqli = BaseModel::dbConnect();
$sql = "SELECT * FROM applications WHERE id=" . $id;
$result = mysqli_query($mysqli, $sql);
$row = mysqli_fetch_array($result);
$sql = "INSERT INTO users (fname, lname, email, password) VALUES (" . $row['fname'] . ", " . $row['lname'] . ", " . $row['email'] . ", " . $row['password'] . ")";
mysqli_query($mysqli, $sql);
$sql = "DELETE FROM applications WHERE id=" . $id;
mysqli_query($mysqli, $sql);
header('Location: http://www.canforce.org/' . $_SESSION['language'] . '/admin/applications');
}
public function confirm() {
$confirm = echo '<script> areYouSure(); </script>';
return($confirm);
}
The JavaScript areYouSure() function returns true if you click yes:
function areYouSure() {
if(<?php echo $_SESSION['language'] ?> == "fr") {
confirm("Êtes-vous sûr");
}
else {
confirm("Are you Sure?");
}
}
I'm guessing there's a lot wrong with what I've done here, simply because of the whole server-side/client-side thing, but I have no idea how to use JavaScript properly within my website. I want this to work, but if anybody has any tips or links to tutorials on how I can incorporate JavaScript into my PHP MVC framework, that would be appreciated as well. Thanks.
PHP runs on the server. JavaScript, for the purpose of this conversation, runs in the browser. You are trying to get the result of a browser-level call on the server, another computer; it will not work. Your model code exists on the server.
You need to include the js file in your HTML file, in this case whatever passes for your view.
PS: the purpose of prepared statements is to prevent someone from being able to run arbitrary queries against your database, including deletes and dumping all of your user info.
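To illustrate the split: the confirm() has to run in the browser, before the request is ever sent to the controller. A minimal sketch (the action URL and field names are illustrative, not taken from the original framework):

```html
<!-- The browser asks for confirmation; only if the user agrees does the
     request reach the PHP controller, which then does the DB work. -->
<form method="post" action="/admin/applications/accept"
      onsubmit="return confirm('Are you sure?');">
  <input type="hidden" name="id" value="42">
  <button type="submit">Accept</button>
</form>
```

Returning false from onsubmit cancels the submission, so the PHP side never needs to know the dialog existed.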
This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
how to count the site current visitors using java script or php
I have an embedded stream on my website, but I want to pull the number of live viewers on the page. Is there a way to do this with PHP / AJAX, show the number of people currently viewing one of my webpages?
DISCLAIMER: I did something like this a LONG time ago, so here is the ugly old code. (I'm not going to put effort into making it look nicer; it's from when I first started programming, and it's just to give you an idea of how it can be done, not to spoon-feed any specific code.)
$timeout = time() - (20);
$sessid_exist = mysql_query("SELECT sessid FROM bw_sessions WHERE sessid='" . session_id() . "'") or die (mysql_error());
$sessid_check = mysql_num_rows($sessid_exist);
if ($_SESSION['bw_username']) {
$sql = mysql_query("UPDATE bw_sessions SET timestamp='" . time() . "', username='" . $_SESSION['bw_username'] . "' WHERE sessid='" . session_id() . "'");
} else {
if($sessid_check > 0){
$sql = mysql_query("UPDATE bw_sessions SET timestamp='" . time() . "' WHERE sessid='" . session_id() . "'");
} else {
$sql = mysql_query("INSERT INTO bw_sessions (id, username, sessid, timestamp, ip)
VALUES(null, '', '" . session_id() . "', '" . time() . "', '" . $_SERVER['REMOTE_ADDR'] . "')") or die (mysql_error());
}
}
$sql = mysql_query("SELECT distinct sessid FROM bw_sessions WHERE username='' AND timestamp >= '$timeout' ORDER BY timestamp DESC") or die (mysql_error());
$sql2 = mysql_query("SELECT distinct sessid,username FROM bw_sessions WHERE username!='' AND timestamp >= '$timeout' ORDER BY username DESC") or die (mysql_error());
$num_guests = mysql_num_rows($sql);
$num_reg = mysql_num_rows($sql2);
?>
<font size='1'>Currently Online: <br>
<?=$num_guests;?> Guests<br>
<?=$num_reg;?> Registered users
You just need to make a table and hold session_ids, then query that table for any "recent" activity. If you want real-time updates, put the code above (modified to your table design) in "online.php" and call it via jQuery every x seconds, or however you decide to do it.
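A minimal sketch of the polling half, assuming jQuery is already loaded and the code above lives in online.php:

```html
<div id="online"></div>
<script>
// Re-fetch online.php every 10 seconds and drop its output into #online.
setInterval(function () {
    $('#online').load('online.php');
}, 10000);
</script>
```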
Something like Clicky should work for you.
If you've seen it somewhere, it probably IS possible; this sort of thing is built into most forum software and used on a lot of websites, so it's probably not that hard.
The usual way of doing this gets the IP of a connected visitor from $_SERVER['REMOTE_ADDR'], writes that to a file, and adds up all the unique IPs to find out how many people are connected.
This will need some sort of cleaning function to remove any IP's that are no longer connected.
The script runs on page load and counts visitors in a file, so if using AJAX you run a PHP script polling that file every so often to update the count dynamically. You could also do it on page load, but then AJAX isn't necessary, as you could just do it with PHP.
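A rough sketch of that file-based approach (the file name and the 60-second activity window are assumptions):

```php
<?php
// Record this visitor's IP with a timestamp, drop stale entries,
// and report how many distinct IPs were seen recently.
$file = __DIR__ . '/online.json';
$now  = time();

$seen = is_file($file) ? json_decode(file_get_contents($file), true) : [];
if (!is_array($seen)) {
    $seen = [];
}
$seen[$_SERVER['REMOTE_ADDR']] = $now;

// Keep only IPs seen within the last 60 seconds.
$seen = array_filter($seen, function ($t) use ($now) {
    return $now - $t < 60;
});

file_put_contents($file, json_encode($seen), LOCK_EX);
echo count($seen) . ' visitors online';
```

A database table (as in the answer above) scales better than a flat file, since concurrent writes to the file need locking.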
If you don't already know how, figuring out how to run a PHP script in $.ajax is the first thing to do, then writing a function that counts visitors is probably the next.
I got 244 million hits on a search for such a script, and there's one here and here.