Notifications of new messages. Long polling - PHP

Please help me implement notifications of new messages for users.
This is my current client code:
function getmess() {
    $.ajax({
        url: "notif.php",
        data: {"id": id},   // `id` is assumed to be defined in the enclosing scope
        type: "GET",
        success: function(result) {
            $("#count").html(result);
            // Pass the function reference, not the string 'getmess':
            // evaluating the string never actually calls the function,
            // so polling stops after the first request.
            setTimeout(getmess, 10000);
        }
    });
}
and this server code:
session_start();

$mysqli = new mysqli('localhost', 'root', '', 'test');
if (mysqli_connect_errno()) {
    printf("error: %s\n", mysqli_connect_error());
    exit;
}

$MY_ID = (int)$_SESSION['id']; // cast to an int to avoid SQL injection

while (true) {
    $result = $mysqli->query("SELECT COUNT(*) FROM messages WHERE user_get='$MY_ID'");
    // COUNT(*) always returns exactly one row, so check the count itself
    // rather than the number of rows in the result set
    $row = $result->fetch_row();
    if ($row[0] > 0) {
        echo $row[0];
        flush();
        exit;
    }
    sleep(5);
}
My problem is that this script does not update in real time when a new message is added to the database. But if I press a button with onclick="getmess();", it works.

First, you check your database every 5 seconds, so you can't achieve real time - you have at least a 5-second delay.
And second, there is no way to achieve real time by polling the database.
The way to deliver notifications in near real time is to have the same code that inserts into the database send the message. That is, you should not query the database for new records; instead, when a new record appears, push the data to the client, even with long polling as the transport protocol.
How to achieve this? Unfortunately, PHP is not a good choice. You need a non-blocking server to hold the connections open, you need to know which connection is waiting for which data, and you need a way for PHP (your backend) to notify that connection.
You can use the Tornado web server, Node.js, or nginx to handle the connections. You assign an identifier to each connection (you probably already have one - the user id), and when a new record is added, the PHP script performs an HTTP request to the notification server (Tornado, Node.js, nginx) telling it which data goes to which user.
For nginx, take a look at the nginx push stream module.
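To make this concrete, here is a minimal sketch (not from the original answer) of the publishing side in PHP, assuming the nginx push stream module exposes a publisher endpoint at /pub?id=<channel>; the URL and the per-user channel naming are assumptions about your setup:

<?php
// Called right after the INSERT into `messages` succeeds.
// Pushes the new-message count to the recipient's channel so the
// held long-polling connection is answered immediately.
function notify_user($userId, $count)
{
    // Assumed publisher endpoint of the nginx push stream module.
    $url = 'http://127.0.0.1/pub?id=user-' . (int)$userId;

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode(['count' => $count]));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 2); // don't let the notification block the insert
    curl_exec($ch);
    curl_close($ch);
}
?>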

Related

MySQL server has gone away after a certain number of script runs

I have a PHP Symfony application with a feature for downloading registries. With it, it is normal for users to download Excel files of >5000 registries with around 20 personalized columns, so it's a heavy process for the server. We decided we needed to move this process off the application server into a serverless function using DigitalOcean Functions, sending the 5k registries in batches of 50, so we call that function around 100 times for one view (file download).
This script needs to connect to the database to gather data and send the end result asynchronously, but sometimes, when a view is too large (let's say 130 calls), MySQL returns "MySQL server has gone away" when connecting after a certain number of calls. The error always seems to happen around the same number of calls (always 100-103 of 130 total calls), but the database never shuts down.
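For context, the calling side presumably looks something like this hypothetical sketch (the function URL, the payload shape, and the batch size of 50 are assumptions based on the description above):

<?php
// Hypothetical caller: splits the registry ids into batches of 50 and
// invokes the DigitalOcean Function once per batch (~100 calls per view).
$batches = array_chunk($registryIds, 50);

foreach ($batches as $batch) {
    $ch = curl_init('https://faas.example.com/download-batch'); // assumed URL
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_HTTPHEADER, ['Content-Type: application/json']);
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode([
        'downloadProcessId' => $downloadProcessId,
        'registryIds'       => $batch,
    ]));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch); // each call opens its own DB connection on the function side
    curl_close($ch);
}
?>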
This is the main structure of the serverless function's script:
$databaseConnection = new mysqli($databaseHost, $username, $password, $dbName);
if ($databaseConnection->connect_error) {
    echo("Connection failed: " . $databaseConnection->connect_error);
    return ["body" => "not ok"];
}
echo "Connected successfully";

$resultArr = getFilterArrFromFilters($objectTypeInstanceIds, $selectedCustomFieldsArrIds,
    $addedCustomFieldsArrIds, $transitionRegistryFieldsIds, $addressObjectTypeId,
    $addressCustomFieldTypeId, $invoiceVendorCustomFieldId, $vendorGeneralCustomFieldId,
    $databaseConnection); // A lot of queries to the DB to check values, columns, data types, etc.

$sql = "INSERT INTO download_process_result (download_process_id, data, creation_date)
    VALUES (" . $args['downloadProcessId'] . ", '"
    . mysqli_real_escape_string($databaseConnection, json_encode($resultArr, JSON_UNESCAPED_UNICODE))
    . "', '" . (new DateTime())->format('Y-m-d H:i:s') . "')";

if ($databaseConnection->query($sql) === TRUE) {
    echo "New record created successfully";
} else {
    echo "Error: " . $databaseConnection->error;
}

$databaseConnection->close();
return ["body" => "ok"];
The database currently runs on my local computer in Docker (mariadb:10.5.9), with ngrok doing the port forwarding so we can test the script.
I've tried playing with the database settings (max_connections, timeouts, packet size, etc.), but nothing seems to change the outcome.
Any help or leads toward a solution will be greatly appreciated.

IFTTT: Trying to run a PHP script using Maker webhooks

Good day,
I have created an IFTTT recipe so that when a "myfox" alarm system is armed, a PHP script is executed on my NAS (192.165.x.x). The PHP script is supposed to trigger a stored procedure in my MySQL database.
The following PHP script has been tested by other means, and I'm sure that it works:
<?php
/*
php_update_mode_armed.php
*****************************************************************************************
* This script updates the value of the components in the table tbl_eedomus_current_mode
* It calls the stored procedure 'sp_tbl_eedomus_current_mode_armed'
*****************************************************************************************
Version 1.00, 09.06.2017, Initial version of the script
*/
mainProcess();

function mainProcess()
{
    $ServerIP = "192.165.x.x";
    $sqlUser = "domoos";
    $sqlDatabase = "domoos";
    $pw = "myPass";

    // Connect: check connect_errno, since `new mysqli` always returns an object
    $mysqli = new mysqli($ServerIP, $sqlUser, $pw, $sqlDatabase);
    if ($mysqli->connect_errno) {
        header('Location: error.php?error=DbConnectionFailure');
        die();
    }

    // Call stored procedure sp_tbl_eedomus_current_mode_armed
    if (!$mysqli->query("CALL sp_tbl_eedomus_current_mode_armed ()")) {
        //header('Location: error.php?error=QueryFailure');
        $mysqli->close(); // Close DB connection
        die();
    }

    // Report OK on success (the original echoed "OK" in the failure branch)
    echo "OK";
    $mysqli->close(); // Close DB connection
}
?>
Below is also a screenshot of the "then" part of my IFTTT recipe.
Am I doing something wrong, or is IFTTT not fit for the purpose I'm trying to achieve here?
Many thanks for your help on this matter, and have a great day.
Most likely IFTTT can't access your NAS from the internet due to your home router's firewall, etc. Also, the IP address used in IFTTT shouldn't be a local IP like 192.168.* but your public IP address. You can figure this out by googling "what's my IP".
The best way to test your setup with your laptop is to disconnect from the local Wi-Fi network, tether your phone to your laptop, and try visiting the NAS IP URL to see if it still works. You can use the Chrome Postman app to send out POST requests.
If that's all fine, get PHP to log incoming connections by writing to a file: file_put_contents("log.txt", print_r($_REQUEST, true)); (note the superglobal is $_REQUEST, not $_REQUESTS). A slightly fuller sketch is below.
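A minimal sketch of that logging idea, expanded a little (the log path is an assumption):

<?php
// Append each incoming request to a log file so you can see
// whether IFTTT ever reaches the NAS at all.
$entry = date('Y-m-d H:i:s') . ' '
       . $_SERVER['REQUEST_METHOD'] . ' '
       . $_SERVER['REQUEST_URI'] . "\n"
       . print_r($_REQUEST, true) . "\n";
file_put_contents('/tmp/ifttt_log.txt', $entry, FILE_APPEND);
?>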
I would suggest trying a GET request first with the default content type. Good luck!

Improve code that periodically queries an SQLite database

I have an SQLite database that I query from PHP periodically. The query is always the same, and it returns a string. Once the string changes in the database, the loop ends.
The following code works, but I am pretty sure this is not the optimal way to do it...
class MyDB extends SQLite3
{
    function __construct()
    {
        $this->open('db.sqlite');
    }
}

$loop = true;
while ($loop == true) {
    sleep(10);
    $db = new MyDB();
    if (!$db) {
        echo $db->lastErrorMsg();
    } else {
        echo "Opened database successfully\n";
    }
    // Quote and escape the value: the original concatenated $file_name
    // without quotes, which is a syntax error for string values.
    $sql = "SELECT status FROM t_jobs WHERE name='" . SQLite3::escapeString($file_name) . "'";
    $ret = $db->query($sql);
    $state = $ret->fetchArray(SQLITE3_ASSOC);
    $output = (string)$state['status'];
    if (strcmp($output, 'FINISHED') == 0) {
        $loop = false;
    }
    echo $output;
    $db->close();
}
If you want output immediately and a kind of interface, I think the best solution to your problem might be HTTP long polling. This way, it will not hold the connection for hours if the job is not done:
you will need to write a JavaScript snippet (in another HTML or PHP page) that makes an ajax call to your current PHP code.
Your web server (and thus your PHP code) will keep the connection open for a while, until the job is done or a time limit is reached (say 20-30 seconds).
If the job is not done, the JavaScript will make another ajax call and everything will start again, keeping the connection open, etc., until you get the expected output status (see the sketch after this answer).
BEWARE: this solution will not work on every hosting provider.
You will need to set max_execution_time to a higher value than the default one; see the PHP documentation for this.
I think you can find many things on HTTP long polling with PHP/JavaScript on Google / Stack Overflow...
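Here is a minimal sketch of what that long-polling endpoint could look like, reusing the t_jobs table and FINISHED status from the question; the 25-second window, the name query parameter, and the response strings are assumptions:

<?php
// long_poll.php - answers as soon as the job finishes, or after ~25 s.
set_time_limit(40); // must exceed the polling window (see max_execution_time)

$db = new SQLite3('db.sqlite');
$fileName = $_GET['name'] ?? '';
$deadline = time() + 25;

while (time() < $deadline) {
    $stmt = $db->prepare('SELECT status FROM t_jobs WHERE name = :name');
    $stmt->bindValue(':name', $fileName, SQLITE3_TEXT);
    $state = $stmt->execute()->fetchArray(SQLITE3_ASSOC);

    if ($state && $state['status'] === 'FINISHED') {
        echo 'FINISHED';   // the JavaScript caller can stop polling now
        $db->close();
        exit;
    }
    sleep(1); // check once per second inside the held request
}

$db->close();
echo 'PENDING'; // timeout reached; the client should issue another ajax call
?>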

Sending two POST Ajax requests, server handling them at once

I am trying to send two values to my server to be inserted into a database in the same row. The problem I have is that it isn't possible to send both values in one request. So what I want to do is send both values in separate requests but handle them on the server at once, so I can add the values to the database as one entry. My PHP isn't very strong, and I have no idea how to go about doing this. Is it possible? How would I do it?
Here's what I have so far:
<?php
$user = "user";
$pass = "pass";
$table = "database";

$userID = null;
$regid = null;

// Read the values from the request (the original assigned the literal
// strings 'currentUser' and 'e.regid' instead of the POSTed values)
if (isset($_POST['currentUser'])) {
    $userID = $_POST['currentUser'];
}
if (isset($_POST['e.regid'])) {
    $regid = $_POST['e.regid'];
}

if ($regid !== null && $userID !== null) {
    $con = mysqli_connect("localhost", $user, $pass);
    if (mysqli_connect_errno()) {
        echo "Error connecting to the DB: " . mysqli_connect_error();
    } else {
        mysqli_select_db($con, $table);
        // Prepare and actually execute the INSERT (the original built the
        // SQL string but never ran it, and quoted the column names wrongly)
        $stmt = mysqli_prepare($con, "INSERT INTO gek_devices (regid, pin) VALUES (?, ?)");
        mysqli_stmt_bind_param($stmt, "ss", $regid, $userID);
        mysqli_stmt_execute($stmt);
    }
}
No. Due to network latency and unreliability, there's not even a guarantee that both requests will ever arrive at the server, let alone within a minute of each other, let alone that you could run code once to handle both. In practice, the chances are over 90% that the two requests will not even be handled by the same Apache process on the server, given that a default Apache install on *nix preforks 10 'spare' instances.
If you need to process the data 'simultaneously', you need to send it in the same request; that's the only way to guarantee atomicity.
Your intended solution is simply impossible, and it is also a glaring XY problem: figure out why you can't send the values together in one request instead of focusing on hacky workarounds.
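For what it's worth, sending both values in one request is straightforward once the client is fixed; here is a sketch using PHP's curl as a stand-in client (the endpoint URL is an assumption):

<?php
// One POST carrying both fields; the server-side script above can then
// read $_POST['currentUser'] and $_POST['e.regid'] from a single request.
$ch = curl_init('http://example.com/register_device.php'); // assumed URL
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query([
    'currentUser' => $userID,
    'e.regid'     => $regid,
]));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
?>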

How to decrease the connection count to a MySQL DB on a remote server? My system must send data every second or two

This is my first post, because I haven't found an answer to this problem anywhere! And I looked way beyond Google.. :)
DESCRIPTION:
So I have a set-up where an Arduino device is connected to a laptop via a USB serial cable, and the laptop is connected to the internet.
Like this: http://postimg.org/image/cz1g0q2ib/
arduino ---USB---> laptop (transit.py) ---WWW---> server (insert.php) -> mysql DB
There is a Python script (transit.py) on the PC running continuously, listening to the COM port, analyzing the received data, and forwarding it to a file (insert.php) on a remote server (a free hosting site).
See the code below to learn how that works...
Then there is the insert.php script that receives this data (still almost every second), analyzes it, and stores it in the MySQL database.
This, however, is not the only file that requires a MySQL connection, therefore I include connect.php at the beginning of every such file.
PROBLEM:
Warning: mysqli::mysqli() [mysqli.mysqli]: (42000/1226): User 'user' has exceeded the 'max_connections_per_hour' resource (current value: 1500) in /server/connect.php on line 8
As a result of all this data travel and its frequency (and the cheapness of the hosting), I run into a "maximum connections per hour exceeded" error. The limit is 1500 per hour and I can't change it (it's a remote server). And no, I don't want to pay for hosting to get a bigger allowance - that's not the point - the issue is the inefficiency of my code. Can I have one persistent connection? Like a service?
Sending data from the Python script straight to the remote MySQL server is not an option, because I don't have access to that feature.
CODE:
transit.py:
import serial
import requests

try:
    ser = serial.Serial('COM4', 9600, timeout=4)
except serial.SerialException:
    print('=== COULD NOT CONNECT TO BOARD ===')
    raise SystemExit

# Runs continuously, as described above (the loop is implied by the question)
while True:
    value = ser.readline()
    strValue = value.decode("utf-8")
    if strValue:
        mylist = strValue.split(',')
        print(mylist[0] + '\t\t' + mylist[1] + '\t\t' + mylist[2])
        path = 'http://a-free-server.com/insert.php'
        dataLine = {"table": mylist[0], "data": mylist[1], "value": mylist[2]}
        toServer = requests.post(path, params=dataLine, timeout=2)
insert.php:
<?php
include 'connect.php';
//some irrelevant code here...

// transit.py sends the fields as query parameters (requests.post(..., params=...)),
// which is why they arrive in $_GET rather than $_POST
if (empty($_GET['type']) && isset($_GET['data'])) {
    $table = $_GET['table']; // NB: interpolated into the SQL below - whitelist this in production
    $data = $_GET['data'];
    $value = $_GET['value'];

    if ($mysqli->connect_errno > 0) {
        die('Unable to connect to database [' . $mysqli->connect_error . ']');
    } else {
        date_default_timezone_set("Asia/Hong_Kong");
        $clock = date(DATE_W3C);
        if (isset($_GET['time'])) {
            $time = $_GET['time'];
        } else {
            $time = $clock;
        }

        echo "Received: ";
        echo $table . "," . $data . "," . $value . "," . $time;

        // Bind the time in the UPDATE clause too, instead of interpolating it
        if ($stmt = $mysqli->prepare("INSERT INTO " . $table . " (`id`, `data`, `value`, `time`) VALUES (NULL, ?, ?, ?) ON DUPLICATE KEY UPDATE time=?")) {
            $stmt->bind_param('ssss', $data, $value, $time, $time);
            $stmt->execute();
            $stmt->free_result();
            $stmt->close();
        } else {
            echo "Prepare failed: (" . $mysqli->errno . ") " . $mysqli->error;
        }
    }
} else {
    echo " | DATA NOT received!";
}
?>
connect.php:
<?php
define("HOST", "p:a-free-host.com"); // notice the p: prefix for a persistent connection
define("USER", "user");
define("PASSWORD", "strongpassword1"); // my password. don't look!
define("DATABASE", "databass");

$GLOBALS["mysqli"] = new mysqli(HOST, USER, PASSWORD, DATABASE, 3306);

// just something I added to monitor connections
$count = intval(file_get_contents('conns.txt'));
file_put_contents('conns.txt', ++$count);
?>
P.S. Everything works fine and all the data is handled in a rather desirable manner, except for exceeding the limit and perhaps some other hidden caveats.
Any suggestions on how to decrease the connection count while still receiving data every second or two?
If I have understood your issue correctly, your web host sucks. If you are limited to 1500 connections per hour, and each page requires a connection, that means you can never exceed 1500 page views per hour; that's not very much.
Many programming languages support connection pooling; in this model, the server opens one or more connections at start-up, and individual page requests get one of those connections when they need it. This reduces the overhead of opening and closing connections. See here for a discussion of connection pooling and PHP. You may be able to use one of the answers without too much trouble.
The alternative - and probably better - solution is to batch up the data in your Python script so you don't have to connect to the web server so often. The classic way to do this for applications that aren't time critical is to use a message bus. I'm not a Pythonist, but this should do the job... The receiving side of such a batch could look like the sketch below.
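As an illustration of the batching idea (not from the original answer), the server side could accept many readings in one request, so one connection covers many data points; the endpoint name, JSON payload shape, and `readings` table are assumptions:

<?php
// insert_batch.php - one HTTP request, one DB connection, many rows.
include 'connect.php';

$rows = json_decode(file_get_contents('php://input'), true);
if (!is_array($rows)) {
    die(' | DATA NOT received!');
}

if ($stmt = $mysqli->prepare("INSERT INTO readings (`data`, `value`, `time`) VALUES (?, ?, ?)")) {
    foreach ($rows as $row) {
        $stmt->bind_param('sss', $row['data'], $row['value'], $row['time']);
        $stmt->execute();
    }
    $stmt->close();
    echo "Received " . count($rows) . " rows";
}
?>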
Have you tried creating one script that stays alive the whole time and holds the connection (S1), and a second one (S2) for the rest?
In the script that does the operations, first check whether the connection is still alive, and reconnect if it is not.
Close the connection in S1 at the end of the script. A sketch of that check is below.
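A minimal sketch of that "check and reconnect" step, assuming the $GLOBALS["mysqli"] handle and the constants from connect.php above:

<?php
// Reuse the existing handle if it still responds; otherwise reconnect.
// mysqli::ping() returns false once the server has dropped the connection.
$mysqli = $GLOBALS["mysqli"];
if (!$mysqli->ping()) {
    $mysqli = new mysqli(HOST, USER, PASSWORD, DATABASE, 3306);
    $GLOBALS["mysqli"] = $mysqli;
}
?>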
