Improve code that periodically queries an SQLite database - PHP

I have an SQLite database that I query from PHP periodically. The query is always the same and returns a string; once that string changes in the database, the loop should end.
The following code works, but I am pretty sure this is not the optimal way to do it...
class MyDB extends SQLite3
{
    function __construct()
    {
        $this->open('db.sqlite');
    }
}

$loop = true;
while ($loop == true) {
    sleep(10);
    $db = new MyDB();
    if (!$db) {
        echo $db->lastErrorMsg();
    } else {
        echo "Opened database successfully\n";
    }
    // $file_name holds a string, so it must be quoted (and escaped) inside the SQL
    $sql = "SELECT status FROM t_jobs WHERE name='" . SQLite3::escapeString($file_name) . "'";
    $ret = $db->query($sql);
    $state = $ret->fetchArray(SQLITE3_ASSOC);
    $output = (string)$state['status'];
    if (strcmp($output, 'FINISHED') == 0) {
        $loop = false;
    }
    echo $output;
    $db->close();
}

If you want output immediately, plus a kind of interface, I think the best solution to your problem is HTTP long polling. Done this way, no connection is held open for hours if the job is not done:
- You code a JavaScript snippet (in another HTML or PHP page) that makes an AJAX call to your current PHP code.
- Your web server (and so your PHP code) keeps the connection open for a while, until the job is done or a time limit is reached (say 20-30 seconds).
- If the job is not done, the JavaScript makes another AJAX call and everything starts again, holding the connection, and so on, until you get the expected output status.
BEWARE: this solution will not work on every hosting provider.
You will need to set max_execution_time to a higher value than the default one; see the PHP documentation for this.
You can find plenty on HTTP long polling with PHP/JavaScript on Google and Stack Overflow; a server-side sketch follows below.
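As an illustration only, here is a minimal long-polling sketch of that loop, reusing the table and status value from the question; the endpoint name, the 30-second window, and the 2-second poll interval are all assumptions:

<?php
// longpoll.php (hypothetical endpoint): hold the request open for up to ~30 s
set_time_limit(40); // must exceed the polling window (see the max_execution_time note above)

$db = new SQLite3('db.sqlite');
$stmt = $db->prepare('SELECT status FROM t_jobs WHERE name = :name');
$stmt->bindValue(':name', $_GET['name'], SQLITE3_TEXT);

$deadline = time() + 30;
do {
    $row = $stmt->execute()->fetchArray(SQLITE3_ASSOC);
    if ($row && $row['status'] === 'FINISHED') {
        echo $row['status']; // job done: answer immediately
        break;
    }
    $stmt->reset(); // allow the prepared statement to run again
    sleep(2);       // poll the database, not the client
} while (time() < $deadline);
$db->close();
// if the deadline passed, the response stays empty and the client-side
// JavaScript simply issues the next AJAX request
?>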

Related

How to stop PHP and SQL execution when user disconnects?

I have a quite heavy SQL search query that takes a few minutes to complete, run by a PHP script that is invoked by an AJAX request.
The problem is that impatient users often click the search button many times. Each click creates a new AJAX request and a new PHP script execution, but the old ones keep executing even though the connection has been dropped, so the SQL server sits at 100% CPU usage.
So I tested my theory that script execution continues even after the browser tab is closed. I used two kinds of queries, an ad hoc query and a stored procedure call; both do the same thing, inserting the numbers 1-9 into a table with a 2-second delay between each number.
Ad hoc query:
for ($i = 1; $i < 10; $i++) {
    $sql = "INSERT INTO t (i) VALUES(?)";
    $res = pdoQuery($sql, array($i));
    if ($res === false) {
        echo $pdo->getErrorMessage();
        http_response_code(500);
        exit();
    }
    sleep(2);
}
SP call:
$sql = "EXEC sp_Slow";
$res = pdoQuery($sql);
if($res === false){
echo $pdo->getErrorMessage();
http_response_code(500);
exit();
}
How I tested: each script is triggered by a button that makes an AJAX call to it. I clicked the button, immediately closed the tab, and then monitored the data in the table. Just as I suspected, new data kept being inserted every 2 seconds. (The same happens if I open the script directly in the browser and close the tab, instead of requesting it through AJAX.)
I need a way to completely kill both the PHP and the SQL execution whenever the user disconnects. Transactions are not important, because it is just a select operation.
You can change this behaviour with a php.ini directive, or at runtime with the ignore_user_abort() function.
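By default ignore_user_abort is off, but PHP only notices the disconnect the next time it tries to send output. A sketch of making the ad hoc loop abort-aware (pdoQuery() is the question's own helper; the flush-then-check step is the point):

ignore_user_abort(false); // the default: allow PHP to abort on client disconnect
for ($i = 1; $i < 10; $i++) {
    $res = pdoQuery("INSERT INTO t (i) VALUES(?)", array($i));
    // a disconnect is only detected on output, so write something and check
    echo " ";
    flush();
    if (connection_aborted()) {
        // the user is gone: stop instead of finishing the remaining inserts
        exit;
    }
    sleep(2);
}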
Here's what I did, based on the comment by @Robbie Toyota, thanks!
if (!empty($_SESSION['SearchSPID'])) {
    // a previous search is still running: kill its SQL Server session first
    $sql = "KILL " . $_SESSION['SearchSPID'];
    $res = pdoQuery($sql);
    if ($res === false) {
        exit('Query error: failed to kill existing spid:' . $_SESSION['SearchSPID']);
    }
    $_SESSION['SearchSPID'] = null;
}

$sql = "SELECT @@spid AS SPID";
$res = pdoQuery($sql);
$spid = $res->row_assoc()['SPID'];
$_SESSION['SearchSPID'] = $spid;
// do long query here
$_SESSION['SearchSPID'] = null;
Of course, when using this method you have to be careful about session file locking: if the session file stays locked, the whole thing is pointless, because the requests will run sequentially instead of in parallel. A sketch of the fix follows below.
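A sketch of that fix: store the SPID, then release the session lock before the long query runs (pdoQuery() and row_assoc() are the question's helpers; reopening the session at the end is an assumption):

session_start();
$res = pdoQuery("SELECT @@spid AS SPID");
$_SESSION['SearchSPID'] = $res->row_assoc()['SPID'];
session_write_close(); // lock released: a second request can now run in parallel

// do long query here

session_start();       // reopen the session to clear the SPID
$_SESSION['SearchSPID'] = null;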

Running concurrent database queries in PHP per session?

I appear to be having some difficulty handling multiple database calls, especially where large result sets are returned. It appears that PHP only lets one database call run at a time per session. This normally isn't an issue, as most calls are so small they don't lock anything up, but the large ones cause this waiting problem.
I discovered the issue while fixing an unrelated one: if you click a button that queries the database via an AJAX call and then try to refresh the website, the page will not start loading until that database call is done, since the page itself also makes a database call internally. Conversely, if I start the database query and then load a pure HTML page that says "Hello World", it loads instantly. So Apache has no trouble serving; it is something to do with database connections.
To that point, I've isolated the code that is probably relevant, as I can't figure out why I can only have one active call at a time. In short: is there a way to have multiple database calls running per user at a time, or will a user simply have to wait?
db_connect.php:
<?php
$user = 'TEST';
include_once 'config.php'; // Initialize constants for the connection
$conn = oci_connect(USER, PASSWORD, '//'.HOST.':1630/'.DATABASE);
oci_set_client_identifier($conn, $user); // Identify who's making these calls.
?>
events.php (if I refresh this page after clicking the AJAX button that runs the same fetch, it won't load until that AJAX call is finished; it doesn't matter if I have code to abort the call, the database keeps running the query):
<?php
session_start();
include 'db_connect.php';
include 'database/event_defs.php';
?>
<html>
<!-- boilerplate nonsense -->
<body>
<table>
<?php
$dataset = get_event_list($conn, $_SESSION['username']); // Returns 1000 records, could take a while to fully retrieve.
foreach ($dataset as $key => $val) {
    // Make multiple rows happen here.
}
?>
</table>
<button onclick="do_ajax_call('get_event_list');">Make DB Call</button>
</body>
</html>
database/event_defs.php: (Probably the most relevant part).
<?php
function get_event_list($conn, $user) {
    $l_result = array();
    $sql = 'BEGIN ...(:c_usr, :c_rslt); END;'; // name is irrelevant.
    if ($stmt = oci_parse($conn, $sql)) {
        $l_results = oci_new_cursor($conn);
        oci_bind_by_name($stmt, ':c_usr', $user);
        oci_bind_by_name($stmt, ':c_rslt', $l_results, -1, OCI_B_CURSOR);
        if (oci_execute($stmt)) {
            oci_execute($l_results); // Problem line, seems to stall out here for a while and won't let the user query again until this call finishes.
            while ($r = oci_fetch_array($l_results, OCI_ASSOC)) {
                $l_result[] = $r;
            }
        } else {
            return 'bad statement';
        }
    } else {
        return 'unable to connect';
    }
    return $l_result;
}
?>
Version information:
PHP 5.4.45
Oracle 11g
Apache 2.2.15
As MonkeyZeus has already pointed out in the comments on your question, the second request is most likely only blocked by the session mechanism.
Since it looks like you don't need anything but the username from the session, just grab that value and then finish with the session mechanism.
<?php
session_start();
// check $_SESSION['username'] here if necessary
$username = $_SESSION['username'];
// no need to keep the session mechanism "alive"
session_abort(); // and since nothing has been written to $_SESSION, abort() should do
require 'db_connect.php';
require 'database/event_defs.php';
?>
<html>
<!-- boilerplate nonsense -->
<body>
<table>
<?php
$dataset = get_event_list($conn, $username); // Returns 1000 records, could take a while to fully retrieve.
foreach ($dataset as $key => $val) {
    // Make multiple rows happen here.
}
?>
It's PHP's session locking mechanism.
You need to call session_write_close() as soon as you no longer need the session.
Perhaps after this line:
$dataset = get_event_list($conn, $_SESSION['username']);
After calling session_write_close() you can't use $_SESSION any more.
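For clarity, a sketch of that placement (note that copying the username out and closing the session before the long get_event_list() call, as in the accepted answer above, is what actually lets a second request proceed in parallel):

<?php
session_start();
$username = $_SESSION['username'];
session_write_close(); // session lock released; $_SESSION writes are no longer saved
$dataset = get_event_list($conn, $username); // the long call now runs without holding the lock
?>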

How do I prevent Joomla's cache from caching this PHP script?

I have this script
<?php
$db = JFactory::getDBO();
if (!$db) {
    echo "";
} elseif (!$htmlVideoDetails->id) {
    echo "";
} else {
    $query = "UPDATE __hdflv_upload SET times_viewed=1+times_viewed WHERE id={$htmlVideoDetails->id}";
    $db->setQuery($query);
    if (!$db->query()) {
        echo "";
    } else {
        echo "";
    }
}
?>
With Joomla's cache disabled this script works well, but with Joomla's cache enabled it stops working.
This is probably because Joomla's cache saves the page as HTML; I checked a cached file, and there is no reference to this script in it.
What should I do to make this script work even on the HTML-converted page, or to keep a reference to it there? I need this script to run even in the cached (HTML) version... thank you
Use AJAX. Please check this guide on how to do it: http://www.itoctopus.com/hit-tracking-not-working-when-joomlas-caching-is-enabled-how-to-solve (the guide covers hit tracking on Joomla, but you can easily switch the query to update the hits for your component).
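The idea, roughly: move the counter update into a small standalone PHP file that bootstraps Joomla, and call that file from the cached page via AJAX so it runs on every view. A sketch, assuming a Joomla 2.x/3.x layout and a hypothetical hitcount.php in the site root (the bootstrap lines and the id parameter are assumptions):

<?php
// hitcount.php: requested via AJAX, so it executes even when the page itself is cached
define('_JEXEC', 1);
define('JPATH_BASE', __DIR__); // assumption: this file lives in the Joomla root
require_once JPATH_BASE . '/includes/defines.php';
require_once JPATH_BASE . '/includes/framework.php';

$id = (int) $_GET['id']; // cast to int to keep the query safe
$db = JFactory::getDBO();
$db->setQuery("UPDATE __hdflv_upload SET times_viewed = times_viewed + 1 WHERE id = {$id}");
$db->query();
?>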

Notifications of new messages. Long polling

Please help me implement notifications of new messages for users.
This is my current client code:
function getmess(){
    $.ajax({
        url: "notif.php",
        data: {"id": id},
        type: "GET",
        success: function(result){
            $("#count").html(result);
            setTimeout(getmess, 10000); // pass the function itself, not the string 'getmess'
        }
    });
}
and this server code:
$mysqli = new mysqli('localhost', 'root', '', 'test');
if (mysqli_connect_errno()) {
    printf("error: %s\n", mysqli_connect_error());
    exit;
}
session_start();
$MY_ID = $_SESSION['id'];

while (true) {
    // COUNT(*) always returns exactly one row, so test the count itself,
    // not the number of result rows
    $result = $mysqli->query("SELECT COUNT(*) FROM messages WHERE user_get='$MY_ID'");
    $row = $result->fetch_row();
    if ($row[0] > 0) {
        echo $row[0];
        flush();
        exit;
    }
    sleep(5);
}
The problem is that the script does not update in real time when a new message is added to the database. But if I press a button with onclick="getmess();", it works.
First, you check your database every 5 seconds, so you can't achieve real time: there is up to 5 seconds of delay built in.
And second, there is no way to achieve real time by polling.
The way to deliver notifications in near real time is to have the same code that inserts the message into the database send the notification. In other words, do not query the database for new records; when a new record is added, push the data to the client, even if long polling is the transport protocol.
How to achieve this? Unfortunately, PHP is not a good choice. You need a non-blocking server to hold the connections, you need to know which connection is waiting for which data, and you need a way for PHP (your backend) to notify those connections.
You can use the Tornado web server, node.js, or nginx to handle the connections. You assign an identifier to each connection (you probably already have one: the user id), and when a new record is added, the PHP script performs an HTTP request to the notification server (Tornado, node.js, nginx) telling it which data goes to which user.
For nginx, take a look at the nginx push stream module; a sketch of the PHP side follows below.
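As an illustration, with the nginx push stream module the PHP code that saves a new message could publish a notification roughly like this (the publish URL, port, and payload shape are assumptions that depend on your push_stream configuration; $recipientId and $unreadCount are placeholders):

<?php
// runs right after the INSERT INTO messages ... succeeds
$ch = curl_init('http://localhost:9080/pub?id=' . urlencode($recipientId));
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode(array('unread' => $unreadCount)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 2); // never let a notification failure stall the insert path
curl_exec($ch);
curl_close($ch);
// the browser keeps a long-poll connection to the push server's subscriber
// location and receives this payload the moment it is published
?>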

Where's the best place to close the DB connection?

In this code I think it might be best to close after both the if and the else, but it seems off to close it twice.
<?php
$member_id = "";
require("connect.php");
if (isset($_POST['member_id'])) {
    $member_id = fix_string($_POST['member_id']);
}
$sql = "DELETE FROM members WHERE member_id = '$member_id'";
$res = mysqli_query($con, $sql);
if (mysqli_affected_rows($con) == 1) {
    echo "member with ID of " . $member_id . " has been removed from members table";
} else {
    echo "member was not deleted";
}

function fix_string($string) {
    if (get_magic_quotes_gpc()) {
        $string = stripslashes($string);
    }
    return htmlentities($string);
}
?>
It is very common practice to open the DB connection at the beginning and close it once at the end; you don't need to do it in the middle of your code.
Closing database connections isn't absolutely required, as the PHP manual page for mysqli_close() notes, but many consider it good practice.
There is a rare exception to this. If your program is going to do some heavy processing for several minutes, you might want to close the DB connection before it starts and open it again afterwards if you need it. The reason is that the MySQL connection will eventually time out, which can lead to further problems in your program.
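A minimal sketch of that pattern (the credentials and the heavy_processing() function are placeholders):

<?php
$con = mysqli_connect('localhost', 'user', 'password', 'db'); // placeholder credentials

// ... normal queries here ...

mysqli_close($con);  // release the connection before the long-running work
heavy_processing();  // placeholder: several minutes of non-database work

$con = mysqli_connect('localhost', 'user', 'password', 'db'); // reopen only when needed again
?>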
Using mysqli_close() isn't usually necessary, as non-persistent open links are automatically closed at the end of the script's execution.
Straight out of the PHP manual.
