Running concurrent database queries in PHP per session?

I'm having some difficulty running multiple database calls at once, especially when large datasets are being returned. It appears that PHP only lets you have one database call running at a time, per session. Normally this isn't an issue, since most calls are small enough that nothing locks up, but the large ones cause this waiting problem.
I discovered this while fixing an unrelated issue: if you click a button that queries the database via an AJAX call and then try to refresh the website, the page won't start loading until that database call is done, since the page itself also makes a database call internally. Conversely, if I start the database query and then load a pure HTML page that just says "Hello World", it loads instantly. Based on this, Apache isn't having an issue serving; it's something to do with the database connections.
To that end, I've isolated the code that seems relevant, as I can't figure out why I'm only able to have one active call at a time. In short, is there a way to have multiple database calls running per user at a time, or will a user have to wait?
db_connect.php:
<?php
$user = 'TEST';
include_once 'config.php'; //Initialize constants for the connection
$conn = oci_connect(USER, PASSWORD, '//'.HOST.':1630/'.DATABASE);
oci_set_client_identifier($conn, $user); //Identify who's making these calls.
?>
events.php: (If I refresh this page after clicking the AJAX button that performs the same fetch, it won't load until that AJAX call is finished. It doesn't matter whether I have code to abort the call; the database is still running that query.)
<?php
session_start();
include 'db_connect.php';
include 'database/event_defs.php';
?>
<html>
<!-- boilerplate nonsense -->
<body>
<table>
<?php
$dataset = get_event_list($conn, $_SESSION['username']); //Returns 1000 records, could take a while to fully retrieve it.
foreach($dataset as $key => $val) {
//Make multiple rows happen here.
}
?>
</table>
<button onclick="do_ajax_call('get_event_list');">Make DB Call</button>
</body>
</html>
database/event_defs.php: (Probably the most relevant part).
<?php
function get_event_list($conn, $user) {
    $l_result = array();
    $sql = 'BEGIN ...(:c_usr, :c_rslt); END;'; //name is irrelevant.
    if($stmt = oci_parse($conn, $sql)) {
        $l_results = oci_new_cursor($conn);
        oci_bind_by_name($stmt, ':c_usr', $user);
        oci_bind_by_name($stmt, ':c_rslt', $l_results, -1, OCI_B_CURSOR);
        if(oci_execute($stmt)) {
            oci_execute($l_results); //Problem line, seems to stall out here for a while and won't let the user query again until this call finishes.
            while($r = oci_fetch_array($l_results, OCI_ASSOC)) {
                $l_result[] = $r;
            }
        } else {
            return 'bad statement';
        }
    } else {
        return 'unable to connect';
    }
    return $l_result;
}
?>
Version information:
PHP 5.4.45
Oracle 11g
Apache 2.2.15

As MonkeyZeus has already pointed out in the comments to your question, the second request is most likely only blocked by the session mechanism.
Since it looks like you don't need anything but the username from the session, just grab that value and finish the session mechanism.
<?php
session_start();
// check $_SESSION['username'] here if necessary
$username = $_SESSION['username'];
// no need to keep the session mechanism "alive"
session_abort(); // and since nothing has been written to $_SESSION, abort() should do.
require 'db_connect.php';
require 'database/event_defs.php';
?>
<html>
<!-- boilerplate nonsense -->
<body>
<table>
<?php
$dataset = get_event_list($conn, $username); //Returns 1000 records, could take a while to fully retrieve it.
foreach($dataset as $key => $val) {
//Make multiple rows happen here.
}
?>

It's PHP's session blocking (file locking) mechanism.
You need to call session_write_close() as soon as you don't need the session any more.
Maybe after this line:
$dataset = get_event_list($conn, $_SESSION['username']);
After calling session_write_close() you can't use $_SESSION.
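For example, a minimal sketch of how events.php could release the session lock before the slow fetch, assuming only the username is needed from the session:

<?php
session_start();
$username = $_SESSION['username']; // copy what you need first
session_write_close();             // releases the session file lock; other requests are no longer blocked

include 'db_connect.php';
include 'database/event_defs.php';

$dataset = get_event_list($conn, $username); // the long-running call no longer holds the lock
?>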

Related

How to stop PHP and SQL execution when user disconnects?

I have a quite heavy SQL search query that takes a few minutes to complete; it runs in a PHP script that's invoked by an AJAX request.
Problem is, users often click the search button many times from impatience, and each time it creates a new ajax request, and a new PHP script execution. But the old ones continue executing, even though the connection has been dropped. This causes the SQL server load to constantly be at 100% CPU usage.
So I tested my theory that script execution continues even after the browser tab is closed. I used 2 types of queries, an ad hoc query and a stored procedure execution, both methods do the same thing, inserting the numbers 1-9 into a table, with a 2 second delay between each number.
Ad hoc query:
for($i = 1; $i < 10; $i++){
    $sql = "INSERT INTO t (i) VALUES(?)";
    $res = pdoQuery($sql, array($i));
    if($res === false){
        echo $pdo->getErrorMessage();
        http_response_code(500);
        exit();
    }
    sleep(2);
}
SP call:
$sql = "EXEC sp_Slow";
$res = pdoQuery($sql);
if($res === false){
echo $pdo->getErrorMessage();
http_response_code(500);
exit();
}
How I tested: each script has a button that triggers an AJAX call to it. I clicked the button, immediately closed the tab, and then monitored the data in the table. Just as I suspected, new data kept being inserted every 2 seconds. (This also happens if I open the script directly in the browser and close the tab, instead of requesting it through AJAX.)
I need a way to completely kill both the PHP and the SQL execution whenever the user disconnects; transactions are not important because it's just a select operation.
You can change this behaviour with the ignore_user_abort php.ini directive, or at runtime with the ignore_user_abort() function.
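A minimal sketch of how that might look in the ad-hoc loop above (using the question's pdoQuery() helper). PHP only notices a dropped connection when it tries to send output, so flushing something to the client between steps and then checking connection_aborted() lets the script stop early; output buffering on the server can delay the detection:

<?php
ignore_user_abort(false); // the default: the script may be aborted when the user disconnects

for ($i = 1; $i < 10; $i++) {
    $res = pdoQuery("INSERT INTO t (i) VALUES(?)", array($i));
    if ($res === false) {
        http_response_code(500);
        exit();
    }

    // PHP only detects the dropped connection when it tries to send output
    echo " ";
    flush();
    if (connection_aborted()) {
        exit(); // stop inserting once the client has gone away
    }

    sleep(2);
}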
Here's what I did, from the comment by @Robbie Toyota, thanks!
if(!empty($_SESSION['SearchSPID'])){
    $sql = "KILL " . $_SESSION['SearchSPID'];
    $res = pdoQuery($sql);
    if($res === false){
        exit('Query error: failed to kill existing spid:' . $_SESSION['SearchSPID']);
    }
    $_SESSION['SearchSPID'] = null;
}

$sql = "SELECT @@SPID AS SPID";
$res = pdoQuery($sql);
$spid = $res->row_assoc()['SPID'];
$_SESSION['SearchSPID'] = $spid;
// do long query here
$_SESSION['SearchSPID'] = null;
Of course, when using this method you have to be careful about session file locking; if the session stays locked, the whole thing is pointless, because the requests will then run sequentially instead of in parallel.
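To connect that caveat to the session-locking point, here is a rough sketch (again using the question's pdoQuery()/row_assoc() helpers) of recording the SPID and then releasing the session lock before the long query runs:

<?php
session_start();

// record the SPID of this request's connection so a later request can KILL it
$res  = pdoQuery("SELECT @@SPID AS SPID");
$spid = $res->row_assoc()['SPID'];
$_SESSION['SearchSPID'] = $spid;

// release the session lock so the next click is not serialized behind this request
session_write_close();

// do the long query here...

// briefly re-open the session to clear the SPID marker
session_start();
$_SESSION['SearchSPID'] = null;
session_write_close();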

How to optimize a database connection and query, to prevent MySQL run out of memory?

I have a class for my DB connection, which I try to use in order to limit the number of connections to the server.
If I have a lot of users, the SQL server crashes because it runs out of memory. Could this be the server setup, or could it be related to the code? I would appreciate any suggestions here.
The __destruct kills the connection, but does it free the result so the server can take on more queries?
I'm leaving the example code here:
class db{
    public $db_connection;

    public function __construct(){
        $this->db_connection = new mysqli("127.0.0.1", "myuser", "passwd", "dbname");
        $this->db_connection->set_charset("utf8");
        if($this->db_connection->connect_errno) {
            echo "Failed to connect to database: " . $this->db_connection->connect_error;
        }
    }

    public function __destruct(){
        return $this->db_connection->close();
    }
}
Initiate the DB connection; usually I do this in the header, so I can re-use the connection throughout the page.
$db = new db();
Typical query on a random page (there can be more than one per page):
<div class="container">
    <div class="row">
        <div class="col-md-12">
            <?php
            $result = $db->db_connection->query("SELECT * FROM news");
            if ($result) {
                while ($obj = $result->fetch_object()) {
                    //some html code here
                }
            }
            ?>
        </div>
    </div>
</div>
Every connection to your page is going to try and open a connection to MySQL. Every page load is going to be retrieving everything from the 'news' table.
There's a limit to how many connections MySQL can handle - once you hit that the page is going to break.
A PHP class will not limit the number of connections to the MySQL server. What will limit the number of connections is how many web server processes you run, and of those processes how many are executing PHP.
You should look into using some sort of local file based cache (e.g. a scheduled task writes the 'news' data out into a static html file, and each page load then just includes 'news.html'). This would allow you to remove the database connection and query from the rendering of the page.
Alternatively, use something like CloudFlare (assuming your pages are fairly static and are setting cache friendly headers)
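As a rough sketch of the file-based cache idea (the file names, cache lifetime, and the 'title' column are assumptions for illustration, not anything from the original post):

<?php
// news_cache.php - minimal file-based cache for the 'news' table
$cacheFile = __DIR__ . '/cache/news.html'; // hypothetical cache location
$maxAge    = 300;                          // regenerate at most every 5 minutes

if (!is_file($cacheFile) || (time() - filemtime($cacheFile)) > $maxAge) {
    // only now do we need a database connection at all
    $db   = new db();
    $html = '';
    if ($result = $db->db_connection->query("SELECT * FROM news")) {
        while ($obj = $result->fetch_object()) {
            $html .= '<p>' . htmlspecialchars($obj->title) . '</p>'; // assumes a 'title' column
        }
        $result->free();
    }
    file_put_contents($cacheFile, $html, LOCK_EX);
}

// every page load just outputs the static file; no MySQL connection needed
readfile($cacheFile);
?>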

Improve code that periodically query SQLite database

I have an sqlite database that I query from PHP periodically. The query is always the same and it returns me a string. Once the string changes in the database the loop ends.
The following code is working, but I am pretty sure this is not the optimal way to do this...
class MyDB extends SQLite3
{
    function __construct()
    {
        $this->open('db.sqlite');
    }
}

$loop = True;
while ($loop == True) {
    sleep(10);
    $db = new MyDB();
    if (!$db) {
        echo $db->lastErrorMsg();
    } else {
        echo "Opened database successfully\n";
    }
    $sql = 'SELECT status from t_jobs WHERE name=' . $file_name;
    $ret = $db->query($sql);
    $state = $ret->fetchArray(SQLITE3_ASSOC);
    $output = (string)$state['status'];
    if (strcmp($output, 'FINISHED') == 0) {
        $loop = False;
    }
    echo $output;
    $db->close();
}
If you want output immediately and a kind of interface, I think the best solution for your problem might be HTTP long polling. This way, it will not hold the connection open for hours if the job is not done:
you will need to code a javascript snippet (in another html or php page) that runs an ajax call to your current php code.
Your web server (and so, your php code) will keep the connection open for a while, until the job is done or a time limit is reached (say 20-30 seconds)
if the job is not done, the javascript will make another ajax call and everything will start again, keeping the connection, etc... until you get the expected output status...
BEWARE: this solution may not work with every hosting provider.
You will need to set max_execution_time to a higher value than the default one; see the PHP documentation for this.
I think you can find many things on http long polling with php/javascript on google / stack overflow...
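A minimal sketch of what the long-polling endpoint could look like, reusing the MyDB class from the question (the 20-second budget, the request parameter, and the JSON response shape are assumptions):

<?php
// poll_status.php - hypothetical long-polling endpoint
set_time_limit(40);             // allow a bit more than the polling budget below
$deadline  = time() + 20;       // hold the request open for up to ~20 seconds
$file_name = $_GET['name'];     // however the job is identified in your setup

$db     = new MyDB();
$status = '';
while (time() < $deadline) {
    // querySingle() returns the first column of the first matching row
    $status = (string)$db->querySingle(
        "SELECT status FROM t_jobs WHERE name='" . SQLite3::escapeString($file_name) . "'"
    );
    if ($status === 'FINISHED') {
        break;                  // job done, answer immediately
    }
    sleep(2);                   // otherwise wait a bit and check again
}
$db->close();

header('Content-Type: application/json');
echo json_encode(array('status' => $status)); // the JS caller re-polls unless it sees FINISHED
?>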

Security vulnerabilities in code to insert and return last inserted row number

I have the following code, which is supposed to insert a row into a DB table "clicks" (consisting of a primary auto-increment column "id" and another column "user" which contains the user's session id) when the Like button is clicked. For each user, assuming they have a session id set from a login, I would like to return their most recently inserted id from the table. So the first time the button is clicked it will return 1, and so on.
I would like this to be accessible to multiple users through a login system. I was wondering if there are any major security vulnerabilities with my code, e.g. can the results be forged?
index.php:
<?php
include 'init.php';
include 'connect.php';
?>
<!doctype html>
<html>
<body>
<?php
$userid = $_SESSION['user_id'];
echo '<a class="like" href="#" onclick="like_add(', $userid,
');">Like</a>';
?>
<script type ="text/javascript" src="jquery-1.11.1.min.js"></script>
<script type ="text/javascript" src="like.js"></script>
</body>
</html>
connect.php:
<?php
$servername = "localhost";
$username = "root";
$password = "";
$dbname = "DB";
$conn = new mysqli($servername, $username, $password, $dbname);
if ($conn->connect_error) {
die("Connection failed: " . $conn->connect_error);
}
?>
init.php:
<?php
session_start();
$_SESSION['user_id']='1';
$userid = $_SESSION['user_id'];
include 'connect.php';
include 'like.php';
?>
like.js:
function like_add(userid) {
    $.post('like_add.php', {userid: userid}, function(data) {
        if (data == 'success') {
            add_like(userid);
        } else {
            alert(data);
        }
    });
}
like.php:
<?php
function add_like($userid){
    include 'connect.php';
    $stmt = $conn->prepare("INSERT INTO clicks (user) VALUES (?)");
    $stmt->bind_param("s", $userid);
    $stmt->execute();

    $stmt = $conn->prepare("SELECT max(id) FROM clicks WHERE user=?");
    $stmt->bind_param("s", $userid);
    $stmt->execute();
    $stmt->bind_result($click);
    $stmt->fetch();
    echo $click;
    $stmt->close();
}
?>
like_add.php:
<?php
include 'init.php';
if (isset($userid)) {
$userid = $userid;
add_like($userid);
}
?>
Your query might give incorrect results if the same user sends multiple requests at almost the same time, in which case it will not return the id that was just inserted. You can use MySQL's LAST_INSERT_ID() function, which gives you the last auto-increment value generated on your own connection, regardless of whether other requests have updated the table in the meantime.
Also, you don't need to pass the user_id parameter with the AJAX request, as you can obtain it from the session anyway. Passing the user_id can be considered a security hole, as anyone can modify the onclick handler and trigger clicks for other users. I'd also recommend avoiding, as much as possible, sending user ids in plain text in the response.
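A rough sketch of how add_like() could be adjusted along those lines, using mysqli's insert_id property (which exposes LAST_INSERT_ID()) and the session's user id instead of the POSTed one; treat it as an illustration rather than a drop-in replacement:

<?php
function add_like($conn, $userid) {
    $stmt = $conn->prepare("INSERT INTO clicks (user) VALUES (?)");
    $stmt->bind_param("s", $userid);
    $stmt->execute();

    // id generated by THIS connection's insert, unaffected by other users' requests
    $click = $conn->insert_id;
    $stmt->close();

    return $click;
}

// like_add.php, roughly: take the user from the session, not from the request
session_start();
include 'connect.php'; // provides $conn
if (isset($_SESSION['user_id'])) {
    echo add_like($conn, $_SESSION['user_id']);
}
?>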
To add more security in your connection script, change $servername, $username, etc. to constants. These don't need to change, and you don't want them to be changed.
Do you have any type of checks for your sessions? Sessions are more secure than cookies, but they can be hijacked in transit when a user logs in. To add some security to your session, use the session_regenerate_id() function when the user logs in; this will generate a new session id, so if the user's id has been hijacked it is of no use, as it will have changed. There are other checks that can be carried out on sessions to secure them, but this is a good, quick way of adding an extra level.
@nomistic also makes some good suggestions, especially regarding the protection of passwords and other sensitive information. Using the crypt() function or PHP's password hashing API (http://php.net/manual/en/book.password.php) is also a good way to do this.
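A small sketch of those two suggestions together; the constant names and the login flow shown are assumptions for illustration:

<?php
// connect.php - credentials as constants instead of plain variables
define('DB_HOST', 'localhost');
define('DB_USER', 'app_user');   // a limited account rather than root
define('DB_PASS', 'secret');
define('DB_NAME', 'DB');

$conn = new mysqli(DB_HOST, DB_USER, DB_PASS, DB_NAME);
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}

// in the login handler, after the credentials have been verified
// (e.g. with password_verify() against a stored password_hash() value):
session_start();
session_regenerate_id(true);     // issue a fresh session id, invalidating any hijacked one
$_SESSION['user_id'] = '1';      // whatever id the login lookup produced
?>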
This looks pretty good on the PHP side. You are using session ids for user verification, and you are using prepared statements for your SQL. (One question: why are you setting $_SESSION['user_id']='1'? Do you plan on only having one user? That doesn't seem necessary to me.)
However, you might want to tighten up your database-side security. It's probably a good idea to set up a different user for public database access, and limit the actions on the database side. For instance, if all they are going to do is select or insert, that user should only have access to do so. I wouldn't use your root account for that. Though it's probably not a huge risk (you are doing pretty well against SQL injection, at least in the queries shown), adding another layer is always a good idea.
When dealing with security, it's helpful to think of a "use-case" scenario. What sort of data are you storing? Is it something that somebody really would want? (e.g. is it financial?) It's always a good idea to look at the human element. Would someone want to spend more than a day trying to hack your data (is it worth it for them?).
Also, though it's not evident here, you probably want to make sure you have a good form of encrypting passwords.
Another thought: even if it is minor risk, it's not a bad idea to run daily backups, so you can recover your data in a worst-case scenario.
Edit:
Since it was asked, here's how to setup security at the database side:
First create a new user (following this pattern):
CREATE USER 'newuser'@'localhost' IDENTIFIED BY 'password';
Granting permissions work like this:
GRANT [type of permission] ON [database name].[table name] TO '[username]'@'localhost';
Types of privileges include ALL PRIVILEGES, CREATE, DROP, DELETE, INSERT, SELECT, UPDATE, GRANT OPTION.
If you want to read up on this more, here's the documentation: https://dev.mysql.com/doc/refman/5.1/en/adding-users.html

Mysql PHP Arbitrary Increment

I'm trying to increment +1 impression every time an ad is displayed on my site, however the variable increments +2 to +3 arbitrarily. I've removed everything that's working correctly and I made a page with only this code in it:
<?php
require "connect_to_mydb.php";
echo 'Hello***** '.$testVariable=$testVariable+1;
mysql_query("UPDATE `imageAds` SET `test`=`test`+1 WHERE `id`='1'");
?>
Every time the page is refreshed, the test column increments arbitrarily by either +2 or +3, and my page displays Hello***** 1 (just to show it's not looping). Access to this page is restricted, so it's not other users refreshing it.
Also, id and test are int(11) in the DB.
My required DB connection script has nothing in it that would interfere.
Edit
Here is an updated code:
<?php
require "connect_to_mydb.php";
mysql_query("UPDATE `imageAds` SET `test`=`test`+1 WHERE `id`='1'");

$sql = mysql_query("SELECT * FROM imageAds WHERE id='1' LIMIT 1");
$check = mysql_num_rows($sql);
if($check > 0){
    $row = mysql_fetch_array($sql);
    echo $row['test'];
}
?>
Increments by +2 every time.
Edit
This is what's in connect_to_mydb.php:
<?php
$db_host = "*************************";
$db_username = "*********";
$db_pass = "**********";
$db_name = "**************";
mysql_connect("$db_host","$db_username","$db_pass") or die ("could not connect to mysql");
mysql_select_db("$db_name") or die ("no database");
?>
Either there's a bug in MySQL's implementation of UPDATE, or you're doing something wrong in some code you haven't posted.
Hint: It's very unlikely to be a bug in MySQL. Other people would have noticed it.
From what you've shown, it looks like your page is being loaded multiple times.
This attempt to prove that the code is only being called once doesn't prove anything:
echo 'Hello***** '.$testVariable=$testVariable+1;
This will always print the same thing (Hello***** 1) even if you open this page multiple times, because the value of $testVariable is not preserved across separate requests.
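If you want a test that actually distinguishes one request from two, appending to a file works where $testVariable does not, since the file survives across requests (a sketch under that assumption, not part of the original answer):

<?php
// hits.log gains one line per request, so two lines per refresh
// would confirm the double-request theory
file_put_contents(
    __DIR__ . '/hits.log',
    date('c') . ' ' . $_SERVER['REQUEST_URI'] . "\n",
    FILE_APPEND
);
?>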
This +2/+3 error occurs only in Chrome and my mobile Android browser, and the code is solid. I looked into whether Chrome can send more than one HTTP request per page load (thanks user1058351), and it can, which is documented here:
http://code.google.com/p/chromium/issues/detail?id=39402
Since this approach was unreliable, I put together a workaround that is solid. Instead of including a PHP file that updates the number of ad impressions on reload, I now have the page send an AJAX request on load to a separate PHP file, which updates the ad stats and returns the appropriate data. The key, I think, is to send it through the JS code so that only one HTTP request is sent to increment the data.
Thank you to all who responded, especially user1058351 and Mark Byers (not a bug in MySQL, but possibly a bug in Chrome).
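A minimal sketch of that workaround: the counting moves into a separate endpoint that the page calls once via AJAX (the file name and response are assumptions; it keeps the question's mysql_* style for consistency):

<?php
// count_impression.php - called once per page view, e.g. from the page's JS with:
//   $.post('count_impression.php', {id: 1});
require "connect_to_mydb.php";

$id = isset($_POST['id']) ? (int)$_POST['id'] : 0;

mysql_query("UPDATE `imageAds` SET `test`=`test`+1 WHERE `id`='" . $id . "'");

// return the new count so the caller can display it if needed
$res = mysql_query("SELECT `test` FROM `imageAds` WHERE `id`='" . $id . "' LIMIT 1");
if ($res && mysql_num_rows($res) > 0) {
    $row = mysql_fetch_array($res);
    echo $row['test'];
}
?>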
