PHP script restarting while importing a big MySQL table - php

This is a problem I've been having for quite some time now.
I work for a banking institution: our service provider lets us access our data via ODBC through a proprietary DB engine.
Since I need almost a hundred tables for our internal procedures and whatnot, I set up some "replication" scripts, put them in a cron job, and basically reload the tables I need from scratch every morning.
When the number of records is small (approx. 50,000 records and 100 columns or so) everything goes smoothly, but whenever I hit a medium-to-big table (approx. 700,000 records), more often than not the script restarts itself (I watch my MySQL tables while the import scripts are running and see them go 400k, 500k... and then back to 1).
This is an example of one of my import scripts:
<?php
ini_set('max_execution_time', '0');

$connect = odbc_connect('XXXXX', '', '') or die('0');

$empty_query = "TRUNCATE TABLE SADAS." . $NOME_SCRIPT;
$SQL->query($empty_query);

$select_query = "SELECT ...
                 FROM ...";
$result = odbc_exec($connect, $select_query);

while ($dati = odbc_fetch_object($result)) {
    $insert_query = "INSERT INTO ...
                     VALUES ...";
    $SQL->query($insert_query);
}

// Close ODBC
odbc_close($connect);
?>
Any ideas?
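For reference, the row-by-row loop above can be collapsed into extended INSERTs, which cuts the statement count dramatically on large tables. A minimal sketch, assuming the table and column names are placeholders and values are pre-escaped (real code must escape each value, e.g. with mysqli_real_escape_string):

```php
<?php
// Sketch: build extended INSERT statements in batches instead of one
// INSERT per row. Table and column names here are placeholders.
function build_batch_inserts(string $table, array $columns, array $rows, int $batchSize = 500): array {
    $queries = [];
    foreach (array_chunk($rows, $batchSize) as $chunk) {
        $values = array_map(function (array $row) {
            // Assumes values are already escaped/sanitized.
            return "('" . implode("', '", $row) . "')";
        }, $chunk);
        $queries[] = "INSERT INTO $table (" . implode(", ", $columns)
                   . ") VALUES " . implode(", ", $values);
    }
    return $queries;
}

// Usage sketch: collect the ODBC rows into $rows first, then run each batch:
// foreach (build_batch_inserts('SADAS.MYTABLE', ['col_a', 'col_b'], $rows) as $q) {
//     $SQL->query($q);
// }
```

Batches of a few hundred rows usually stay well under MySQL's max_allowed_packet while still being far faster than one INSERT per record.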

Related

MYSQL server has gone away after certain amount of scripts run

I have a PHP Symfony application with a registry-download feature. It is normal for users to download Excel files of >5,000 registries with around 20 personalized columns, so it's a heavy process for the server. We decided to move this process out of the application server into a serverless function using DigitalOcean Functions and send the 5k registries in batches of 50, so we call that function around 100 times per view (file download).
This script needs to connect to the DB to gather data and send the end result asynchronously, but sometimes when a view is too large (let's say 130 calls), after X calls MySQL returns "MySQL server has gone away" when connecting. The error always seems to happen around the same number of calls (always 100-103 of 130 total calls), but the database never shuts down.
This is the main structure of the script:
$databaseConnection = new mysqli($databaseHost, $username, $password, $dbName);
if ($databaseConnection->connect_error) {
    echo("Connection failed: " . $databaseConnection->connect_error);
    return ["body" => "not ok"];
}
echo "Connected successfully";

// A lot of queries to the DB to check values, columns, data types, etc.
$resultArr = getFilterArrFromFilters($objectTypeInstanceIds, $selectedCustomFieldsArrIds,
    $addedCustomFieldsArrIds, $transitionRegistryFieldsIds, $addressObjectTypeId,
    $addressCustomFieldTypeId, $invoiceVendorCustomFieldId, $vendorGeneralCustomFieldId,
    $databaseConnection);

$sql = "INSERT INTO download_process_result (download_process_id, data, creation_date)
        VALUES (" . $args['downloadProcessId'] . ", '"
      . mysqli_real_escape_string($databaseConnection, json_encode($resultArr, JSON_UNESCAPED_UNICODE))
      . "', '" . (new DateTime())->format('Y-m-d H:i:s') . "')";

if ($databaseConnection->query($sql) === TRUE) {
    echo "New record created successfully";
} else {
    echo "Error: " . $databaseConnection->error;
}

$databaseConnection->close();
return ["body" => "ok"];
The database currently runs on my local computer in Docker (mariadb:10.5.9), using ngrok for port forwarding so we can test the script.
I've tried to play with the setting of the database (max_connections, timeouts, packet_size, etc) but nothing seems to change the outcome.
Any help or leads towards the solution will be greatly appreciated.
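One direction worth trying: since each serverless invocation opens a fresh connection, transient "gone away" errors at connect time can be absorbed with a small retry helper. A sketch (the helper is hypothetical, not part of the original script):

```php
<?php
// Hypothetical retry helper: re-run a connect/query callback a few times
// before giving up, pausing briefly between attempts.
function with_retries(callable $attempt, int $maxTries = 3, int $sleepSeconds = 1) {
    $lastError = null;
    for ($try = 1; $try <= $maxTries; $try++) {
        try {
            return $attempt();
        } catch (RuntimeException $e) {
            $lastError = $e;
            if ($try < $maxTries) {
                sleep($sleepSeconds);
            }
        }
    }
    throw $lastError;
}

// Usage sketch with the script's variables:
// $databaseConnection = with_retries(function () use ($databaseHost, $username, $password, $dbName) {
//     $conn = new mysqli($databaseHost, $username, $password, $dbName);
//     if ($conn->connect_error) {
//         throw new RuntimeException($conn->connect_error);
//     }
//     return $conn;
// });
```

This only masks the symptom; if the failures always cluster around call 100, it is also worth checking whether the ngrok tunnel or a connection cap is the real bottleneck.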

Laravel jobs/queue unclosed SQL Server database sessions

I noticed a large number of sessions running on the database; almost half of them have a query attached. In my project I use a queue worker to execute code in the background, with the database as the queue connection.
Here is the code I use:
Passing jobs to Batch:
$jobs = [];
foreach ($data as $d) {
    $jobs[] = new EstimateImportJob($d);
}
$batch = Bus::batch($jobs)->dispatch();
Job source code:
$current_date = Carbon::now();

// Using these because the tables don't have auto-increment traits
$last_po_id = \DB::connection("main")->table('PO')->latest('ID')->first()->ID;
$last_poline_id = \DB::connection("main")->table('POLINE')->latest('ID')->first()->ID;
$last_poline_poline = \DB::connection("main")->table('POLINE')->latest('POLINE')->first()->POLINE;

\DB::connection('main')->table('POLINE')->insert($d);
As far as I know, Laravel is supposed to close the DB connection after code execution is finished, but I can't find a reason why I have so many database sessions. Any ideas would be appreciated!
Normally, even with the queue worker running, the expected result is 3-4 database sessions.

PHP/MySQL Not updating DB from form anymore

I have been working on a program for an upcoming job interview, and I was nearing completion with everything running the way I needed it to... but then my computer crashed. When I opened the files again, everything was the same; all my changes had been saved before the crash. The only thing is, now my MySQL table doesn't receive the data from the INSERT code. Is there something I can do to make this work?
I have tried creating a new database, a different table, restarting my computer, restarting Chrome, everything... I can't figure it out and I'm desperate at this point.
Please see the code below...
// Connect to the database.
$link2 = mysqli_connect("localhost", "cl60-booking", "XXXXXXX", "cl60-booking");
if (mysqli_connect_error()) {
    die("There was an error connecting to the database");
}

// Update the bookings DB with the user's hotel room.
$query = "INSERT INTO booking
          (`beds`, `baths`, `booked`, `checkInDate`, `checkOutDate`)
          VALUES ('" . mysqli_real_escape_string($link2, $_POST['bedNumber']) . "',
                  '" . mysqli_real_escape_string($link2, $_POST['bathNumber']) . "',
                  'Yes',
                  '" . mysqli_real_escape_string($link2, $_POST['checkIn']) . "',
                  '" . mysqli_real_escape_string($link2, $_POST['checkOut']) . "')";
You build the query string but never execute it. Use the following code segment:
mysqli_query($link2, $query);
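As an aside (not part of the original answer): building the statement with placeholders and a prepared statement avoids the escaping boilerplate entirely. A sketch, with a hypothetical helper that generates the placeholder SQL:

```php
<?php
// Hypothetical helper: build a parameterized INSERT for use with
// mysqli prepared statements.
function build_insert(string $table, array $columns): string {
    $placeholders = implode(', ', array_fill(0, count($columns), '?'));
    return "INSERT INTO `$table` (`" . implode('`, `', $columns) . "`) VALUES ($placeholders)";
}

// Usage sketch against the booking table (needs a live connection):
// $sql = build_insert('booking', ['beds', 'baths', 'booked', 'checkInDate', 'checkOutDate']);
// $stmt = mysqli_prepare($link2, $sql);
// $booked = 'Yes';
// mysqli_stmt_bind_param($stmt, 'sssss',
//     $_POST['bedNumber'], $_POST['bathNumber'], $booked, $_POST['checkIn'], $_POST['checkOut']);
// mysqli_stmt_execute($stmt);
```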

Why do we have to close the MySQL database after a query command?

I'm a beginner.
I want to know what happens if we don't close the MySQL connection.
1- Is it possible to open more than one database connection if we don't close them? I mean, can we open more than one at the same time?
2- Does closing the connection increase speed?
3- Is it necessary to close the connection, or is it optional?
Look at this code. I don't use "mysql_close()", so I don't close the connection after each request. There are a lot of requests for this PHP page, maybe 50,000 per minute. I want to know whether closing the connection is necessary for this code or not.
<?php
// Include the file that lets us connect to the database.
include("database/connection.php");

// Call the "connect" function to connect to the database.
connect("database", "localhost", "root", "", "user");

// The GPRS module sends a string to this site by the GET method,
// in a variable named "variable".
$received_string = $_GET["variable"];

// Separate the data into an array.
$array_GPRS_data = explode(",", $received_string);

// We need to remove the first letter.
$array_GPRS_data[9] = substr($array_GPRS_data[9], 1);
$array_GPRS_data[13] = substr($array_GPRS_data[13], 4, 2)
                     . substr($array_GPRS_data[13], 2, 2)
                     . substr($array_GPRS_data[13], 0, 2);

// Query statement.
$query = "INSERT INTO $array_GPRS_data[17] (signal_quality, balance, satellite_derived_time,
          satellite_fix_status, latitude_decimal_degrees, latitude_hemisphere,
          longitude_decimal_degrees, longitude_hemisphere, speed, bearing, UTCdate, theChecksum)
          VALUES ('$array_GPRS_data[0]', '$array_GPRS_data[1]', '$array_GPRS_data[5]',
                  '$array_GPRS_data[6]', '$array_GPRS_data[7]', '$array_GPRS_data[8]',
                  '$array_GPRS_data[9]', '$array_GPRS_data[10]', '$array_GPRS_data[11]',
                  '$array_GPRS_data[12]', '$array_GPRS_data[13]', '$array_GPRS_data[16]')";

// Run the query.
$result = mysqli_query($query);

// Check that the data was inserted into the database correctly.
if ($result) {
    echo("*#01");
} else {
    echo("Error: 001");
    echo(mysqli_error());
}
?>
Yes, you can have multiple database connections. You are not opening a database, you are opening a database connection. The database is 'open' (i.e. running) all of the time, generally speaking, whether you are connected to it or not.
Depends... if you only have one open connection on a page, then you don't need to close it because it will automatically close when PHP is done. If you have many, then you could potentially make the database server slower, or make the database server run out of available connections (it can only have a certain number of connections open at the same time). That said, most modern database servers can handle hundreds of concurrent connections.
Optional, but recommended. It's not a big deal for small-medium projects (i.e. if you have less than 100 concurrent visitors at any given time, you probably won't have any issues regardless). Since you have many thousand visitors per minute, you should actively close the database connection as soon as you are done with it, to free it up as soon as possible.
Once you connect to the database, it is not necessary to close it: a non-persistent connection is automatically closed at the end of script execution.

Table Lock Fails

I'm having trouble getting a table to lock in MySQL. I've tried testing for concurrent requests by running two scripts at the same time on different browsers.
Script 1:
mysql_query("lock tables id_numbers write");

$sql = "select number from id_numbers";
$result = mysql_query($sql);
if ($record = mysql_fetch_array($result)) {
    $id = $record['number'];
}

sleep(30);

$id++;
$sql = "update id_numbers set number = '$id'";
$result = mysql_query($sql);

mysql_query("unlock tables");
Script 2:
$sql = "select number from id_numbers";
$result = mysql_query($sql);
if ($record = mysql_fetch_array($result)) {
    $id = $record['number'];
}
echo $id;
I start Script 1 first, then start Script 2. Theoretically, Script 2 should wait 30+ seconds until Script 1 unlocks the table, and then output the updated ID. But it immediately outputs the original ID, so the lock is obviously not taking effect. What could be going wrong?
BTW, I know I shouldn't still be using mysql_*, but I'm stuck with it for now.
EDIT: I've discovered that the lock does happen on the live site, but not on my dev site. They are both on shared hosts, so I'm assuming there's some setting that's different between the two. Any idea what that could be?
Using the code you have here, I get exactly the behavior you expect: script 2 will block until script 1 issues the unlock tables.
Your problem must lie elsewhere. Things to check:
Does the table actually exist on the database? (Check your mysql_query() return values and use mysql_error() if you get any FALSE returns.)
Are both scripts connecting to the same database?
Are you sure the table is not a temporary (thus connection-local) table?
Could script 1's mysql connection (or the script itself) be timing out during the sleep(30), thus releasing the lock earlier than you expect?
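The first check in that list can be made systematic with a small wrapper that fails loudly whenever a query returns FALSE, so a silently failing LOCK TABLES surfaces immediately. A sketch (the wrapper is hypothetical, not part of the original scripts; the query runner is injected so any API can be used):

```php
<?php
// Sketch: run a query through an injected callable and throw instead of
// silently ignoring a FALSE return value.
function query_or_fail(callable $runQuery, string $sql) {
    $result = $runQuery($sql);
    if ($result === false) {
        throw new RuntimeException("Query failed: $sql");
    }
    return $result;
}

// Usage sketch with the legacy API:
// query_or_fail('mysql_query', "lock tables id_numbers write");
```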
