Execute MySQL query and display results while reading - PHP

I have a MySQL database and I want to execute a query and display the data on the page while the query is still running.
For example, if the query returns 1,000 rows, I want to display each row as it is produced instead of waiting until the query finishes and then displaying them all at once.
here is my php code:
<?php
$con = mysql_connect("localhost", "root", "");
if (!$con) {
    die('Could not connect: ' . mysql_error());
}

// Database Name
mysql_select_db("dbname", $con);

$test_query = mysql_query("SELECT * FROM V_Posts WHERE _PID > 100 AND _PID < 10000");
while ($CHECK_PCID_R = mysql_fetch_array($test_query)) {
    echo $CHECK_PCID_R['_PID'] . "<br />";
}
?>
I tried
echo $CHECK_PCID_R['_PID'] . "<br />";
flush();
But it didn't work :(

One query produces one result set, and you'll receive all the data at once. If your query is slow, any latency in displaying the data will be small compared to the delay in receiving it. Using flush() might force the server to send parts of the page early, but you're really just tinkering at the edges.
If you want to break this down you'll have to run multiple queries, which will arguably be much slower, since you'll be running the same query repeatedly. This loads the database server unnecessarily and achieves only a minor cosmetic effect.
If you use an AJAX call to retrieve your data you can display a 'loading' message while you wait. You could use multiple AJAX calls to display the data bit by bit, but that is even worse than running multiple queries in the PHP script.
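If you still want to experiment with pushing rows out as they are fetched, here is a minimal sketch. It assumes mysqli (rather than the asker's mysql_* calls), the same hypothetical `V_Posts` view, and that output buffering and zlib.output_compression are off, so flush() can actually reach the browser:

```php
<?php
// Sketch only: stream rows to the browser as they are fetched.
// Assumes mysqli and that output buffering is disabled.
$mysqli = new mysqli("localhost", "root", "", "dbname");
if ($mysqli->connect_errno) {
    die('Could not connect: ' . $mysqli->connect_error);
}

// MYSQLI_USE_RESULT streams rows from the server
// instead of buffering the whole result set first.
$result = $mysqli->query(
    "SELECT _PID FROM V_Posts WHERE _PID > 100 AND _PID < 10000",
    MYSQLI_USE_RESULT
);

while ($row = $result->fetch_assoc()) {
    echo $row['_PID'] . "<br />\n";
    flush(); // push this row toward the client immediately
}

$result->free();
$mysqli->close();
```

Even then, proxies and browser-side buffering may hold the output back, which is why the AJAX approach is usually the more reliable way to show progress.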

Related

pg_close has to wait for the query to end

I'm writing a function that registers a big CSV data file, so I am searching for a way to send an asynchronous query to a PostgreSQL server and receive the query result later.
I tried to use pg_send_query to send SQL (that takes time to run) like below.
SQL
$sql = "SELECT * FROM BATCH_WORK_TBL WHERE COPR_ID='99999999test'";
PHP script
//-----------------------------------------------
// DB Connect
//-----------------------------------------------
$conn = pg_connect('user=username dbname=databasename');

//-----------------------------------------------
// Send asynchronous query
//-----------------------------------------------
$sql = "SELECT * FROM BATCH_WORK_TBL WHERE COPR_ID='99999999test'";
if (!pg_connection_busy($conn)) {
    $res = pg_send_query($conn, $sql); // pg_send_query returns true/false without waiting for results
}
pg_close($conn); // pg_close waits until the query returns its result
However, when I close the connection, pg_close() waits a long time until the query returns its result, so in the end my PHP script takes a long time to run.
If anybody has experience with asynchronous queries on PostgreSQL with PHP, could you please help me with this case?
That cannot work. Something has to keep the database connection open so that the query result can be received.
You can either keep the connection open, do some other work and come back later to check on the connection, or you write a co-process that does that waiting for you (and has the database connection).
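A rough sketch of the first option: keep the connection open, do the other work, and poll the connection before collecting the result (connection parameters and table name taken from the question):

```php
<?php
// Sketch: send the query asynchronously, do other work, collect the result later.
$conn = pg_connect('user=username dbname=databasename');

pg_send_query($conn, "SELECT * FROM BATCH_WORK_TBL WHERE COPR_ID='99999999test'");

// ... do the CSV registration work here while the server runs the query ...

// Come back and poll until the server has finished.
while (pg_connection_busy($conn)) {
    usleep(100000); // wait 100 ms between checks
}

$res = pg_get_result($conn); // returns immediately now
echo pg_num_rows($res) . " rows\n";

pg_close($conn); // safe to close: the result has been consumed
```

The key point is that pg_close() only appears slow because something has to drain the result; once pg_get_result() has consumed it, closing is instant.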

How to stop PHP and SQL execution when user disconnects?

I have a quite heavy SQL search query that takes a few minutes to complete, called from a PHP script that is invoked by an AJAX request.
The problem is that users often click the search button many times out of impatience, and each click creates a new AJAX request and a new PHP script execution. But the old ones continue executing even though the connection has been dropped. This keeps the SQL server pinned at 100% CPU usage.
So I tested my theory that script execution continues even after the browser tab is closed. I used two types of queries, an ad hoc query and a stored procedure call. Both do the same thing: insert the numbers 1-9 into a table, with a 2 second delay between each number.
Ad hoc query:
for ($i = 1; $i < 10; $i++) {
    $sql = "INSERT INTO t (i) VALUES(?)";
    $res = pdoQuery($sql, array($i));
    if ($res === false) {
        echo $pdo->getErrorMessage();
        http_response_code(500);
        exit();
    }
    sleep(2);
}
SP call:
$sql = "EXEC sp_Slow";
$res = pdoQuery($sql);
if ($res === false) {
    echo $pdo->getErrorMessage();
    http_response_code(500);
    exit();
}
How I tested: using buttons that trigger AJAX calls to each script, I clicked the button, immediately closed the tab, and then monitored the data in the table. Just as I suspected, new data gets inserted every 2 seconds. (The same happens if I open the script directly in the browser and close the tab, instead of requesting it through AJAX.)
I need a way to completely kill both the PHP and the SQL execution whenever the user disconnects. Transactions are not important because it's just a select operation.
You can change this behaviour with the ignore_user_abort php.ini directive, or at runtime with the ignore_user_abort() function.
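As a sketch, ignore_user_abort() can be combined with connection_aborted() so the loop stops once the client goes away. Note that PHP only notices the disconnect when it tries to send output, and pdoQuery() below is the asker's own helper, not a standard function:

```php
<?php
// Sketch: abort the slow loop when the client disconnects.
ignore_user_abort(false); // default behaviour: PHP may abort the script on disconnect

for ($i = 1; $i < 10; $i++) {
    // PHP only detects a dropped connection while writing output,
    // so emit something and flush before checking.
    echo " ";
    flush();
    if (connection_aborted()) {
        exit; // client is gone: stop instead of hammering the database
    }
    pdoQuery("INSERT INTO t (i) VALUES(?)", array($i)); // pdoQuery is the asker's helper
    sleep(2);
}
```

This stops the PHP side between statements; a query already running inside the database still has to be killed server-side, which is what the SPID approach below addresses.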
Here's what I did, based on the comment by @Robbie Toyota, thanks!
if (!empty($_SESSION['SearchSPID'])) {
    $sql = "KILL " . $_SESSION['SearchSPID'];
    $res = pdoQuery($sql);
    if ($res === false) {
        exit('Query error: failed to kill existing spid: ' . $_SESSION['SearchSPID']);
    }
    $_SESSION['SearchSPID'] = null;
}

$sql = "SELECT @@SPID AS SPID";
$res = pdoQuery($sql);
$spid = $res->row_assoc()['SPID'];
$_SESSION['SearchSPID'] = $spid;

// do long query here

$_SESSION['SearchSPID'] = null;
Of course, using this method you have to be careful about session file locking: if the session stays locked, the whole thing becomes pointless, because the requests will then run sequentially instead of in parallel.
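To sidestep that lock, the session can be released before the long query starts. A sketch based on the code above (pdoQuery() and row_assoc() are the asker's helpers):

```php
<?php
session_start();

// Record this request's SPID, then release the session file lock
// so other requests (including the KILL request) are not blocked.
$res  = pdoQuery("SELECT @@SPID AS SPID");
$_SESSION['SearchSPID'] = $res->row_assoc()['SPID'];
session_write_close();

// ... run the long search query here (no session writes possible now) ...

// Reacquire the session just long enough to clear the SPID.
session_start();
$_SESSION['SearchSPID'] = null;
session_write_close();
```

The window where the session is locked is now only a few milliseconds around the reads and writes, so parallel requests stay parallel.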

Wait between fetching values one by one from the database

<?php
$host = "localhost";
$user = "xxxxxx";
$pass = "xxxx";
$db   = "xxxxxxx";

$connect = mysql_connect($host, $user, $pass);
if (!$connect) die("Cannot connect!");
mysql_select_db($db, $connect);

// `values` is a reserved word in MySQL, so the table name needs backticks
$result = mysql_query("SELECT * FROM `values`");

if ($result) {
    while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
        $url = $row['value'];
        echo '<li><iframe src="http://xxxxxxx.xxxx/xxxx.php?value=' . $url . '" width="300" height="100" scrolling="no" frameBorder="0"></iframe></li>';
    }
}
?>
This is the PHP code I am using to get values from the database. I want to add a time delay between each value, like:
http://xxxxxxx.xxxx/xxxx.php?value='.$url.'
wait 5 sec
http://xxxxxxx.xxxx/xxxx.php?value='.$url.'
wait 5 sec
and so on.
Is there any way I can do that?
Thanks.
Wrong answer: use sleep(5) inside the while().
Right answer:
a) you should not use the mysql_* functions;
b) if you need a delay, get all the rows, then output them one by one using JS;
OR:
c) again using JS and AJAX, query for a new row every 5 seconds.
If you stick with the wrong answer, your script will die of a timeout at around the 6th row (on a default PHP install).
You should understand client/server architecture better: the MySQL query and your PHP code are server-side; they will not return any output to the browser (your visitor) until the script ends, and you don't want your visitors to wait 30 or 300 seconds for the server to reply.
So the only options you have are: query for a new image every 5 seconds, or query them all at once and iterate over them on the client.
There are many jQuery/JavaScript tutorials on that subject.
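For option (c), here is a sketch of the server side: a small endpoint the page could poll every 5 seconds, returning one row at a time. It uses PDO instead of the old mysql_* API; the file name `row.php` and the connection details are illustrative placeholders matching the question's masked values:

```php
<?php
// row.php - return the Nth row as JSON; poll with ?offset=0, ?offset=1, ...
$pdo = new PDO('mysql:host=localhost;dbname=xxxxxxx', 'xxxxxx', 'xxxx');
$offset = (int) ($_GET['offset'] ?? 0);

// `values` is a reserved word in MySQL, hence the backticks.
$stmt = $pdo->prepare('SELECT `value` FROM `values` LIMIT 1 OFFSET :o');
$stmt->bindValue(':o', $offset, PDO::PARAM_INT);
$stmt->execute();

header('Content-Type: application/json');
// Emits {"value": "..."} for a real row, or null once the offset runs past the end.
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC) ?: null);
```

On the page, a JS timer (e.g. setInterval with a 5000 ms period) would fetch this endpoint with an incrementing offset and append each iframe as the row arrives, stopping when it receives null.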

MySQL query fails in PHP but works in phpMyAdmin

I've done this a lot of times but now I can't :(
The insert always returns false, but if I execute the same SQL script (taken from the output) it inserts into the database without any problem. I know I'm connected to the database because some values are fetched from another table.
This is my code:
$query = "INSERT INTO normotensiones(fecha,macropera,pozo,equipo_pmx,equipo_compania,paciente,sexo,edad,id_compania,otra_compania,puesto,ta,tum,ove,coordinador)
          VALUES('$fecha','$macropera','$pozo','$equipo_pmx','$equipo_compania','$paciente','$sexo',$edad,$id_compania,'$otra_compania','$puesto','$ta','$tum','$ove','$coordinador')";
if (mysql_query($query, $connection)) {
    // OK
} else {
    $errno = mysql_errno();
    $error = mysql_error();
    mysql_close($connection);
    die("<br />$errno - $error<br /><br />$query");
}
The output is:
0 -
INSERT INTO normotensiones(fecha,macropera,pozo,equipo_pmx, equipo_compania,paciente,sexo,edad,id_compania, otra_compania,puesto,ta,tum,ove,coordinador)
VALUES('20111001','P. ALEMAN 1739','P. ALEMAN 1715','726', 'WDI 838','SERGIO AYALA','M',33,21, '','','110/70','ROBERTO ELIEL CAMARILLO','VICTOR HUGO RAMIREZ','LIC. PABLO GARCES')
It looks like there is no error, but the code in the else branch always executes. Any idea? Thanks in advance.
I think the issue might be that you are missing the mysql_select_db line after the connection.
After the connection with the database is established, you need to select a DB. Please make sure you have selected the database that your desired table resides in.
You can also use the following snippet to get useful information through mysql_error:
$connection = mysql_connect('localhost', 'root', 'password');
if (!$connection) {
    die('<br>Could not connect: ' . mysql_error());
}
if (!mysql_select_db('db_name')) {
    die('Could not select database: ' . mysql_error());
}
And try your insert query after these lines of code. All the best.
I agree with the others concerning the column types. INT is one of the few data types that do not require single quotes.
There are also two blank strings in your VALUES list. There is a possibility that those variables are not defined, which would give you a PHP notice (before anything even reaches MySQL), but that requires stricter-than-normal error settings. I would personally look into the $connection variable. Before the SQL query statement, put this and send us the (sanitized) results:
echo '<pre>';
var_dump($connection);
echo '</pre>';
Additionally, on your mysql_connect function call, append
or die('No connection')
afterwards. Do the same thing with the mysql_select_db function, changing the message to 'No DB Select', obviously.
Ultimately, we will need more information. But changing to mysqli is very desirable.
Oh! And make sure the permissions for the user you are connecting as have not changed. Sometimes I find people who connect to phpMyAdmin with one user account but use a different account in their PHP code. This is problematic and will eventually lead to trouble as you forget about the different accounts.
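Since switching to mysqli keeps coming up, here is a sketch of the same insert as a prepared statement, which also removes the quoting problem around INT versus string columns. Only a subset of the table's columns is shown for brevity; the variables are assumed to exist as in the question:

```php
<?php
$mysqli = new mysqli('localhost', 'root', 'password', 'db_name');
if ($mysqli->connect_errno) {
    die('Could not connect: ' . $mysqli->connect_error);
}

// Placeholders instead of interpolated variables: no quoting mistakes
// (and no SQL injection). Subset of columns shown for brevity.
$stmt = $mysqli->prepare(
    'INSERT INTO normotensiones (fecha, macropera, pozo, edad) VALUES (?, ?, ?, ?)'
);
// 'sssi' = three string parameters and one integer (edad)
$stmt->bind_param('sssi', $fecha, $macropera, $pozo, $edad);

if (!$stmt->execute()) {
    die($stmt->errno . ' - ' . $stmt->error);
}
```

With bound parameters the driver decides how each value is sent, so whether edad needs quotes stops being your problem.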

php Getting Data from Database slows down the Page

I am using WAMP server on Windows. Getting even a little bit of data from my database hangs my page badly. It's just a simple post with one image, one title and a short description, and when I run the query it hangs my page badly. Here is what my code looks like:
<?php
// 1. Create a connection
$connection = mysql_connect("localhost", "root", "");
if (!$connection) {
    die("Database Connection Failed: " . mysql_error());
}

// 2. Select a database to use
$db_select = mysql_select_db("gat", $connection);
if (!$db_select) {
    die("Database selection failed: " . mysql_error());
}
?>
<html>
<head>
<title>Database Check</title>
</head>
<body>
<?php
// 3. Perform database query
$result = mysql_query("SELECT * FROM recent_works", $connection);
if (!$result) {
    die("Database query failed: " . mysql_error());
}

// 4. Use returned data
while ($row = mysql_fetch_assoc($result)) {
    echo "<div class='work_item'>";
    echo "<img src='{$row['image']}' alt=''>";
    echo "<h2>{$row['title']}</h2>";
    echo "<p>{$row['short_discription']}</p>";
    echo "</div>";
}
?>
</body>
</html>
<?php
//5 close connection
mysql_close($connection);
?>
Fetching data from a database will always involve some level of blocking. The question is how much data you are fetching. Your example selects everything from the table and fetches all of the data to print out onto the page. So how many rows are in the table, how much data is stored in each column, and how much of that data gets transferred to the client are all factors in the speed here. Additionally, you have to consider that connecting to the database also has a cost.
Here are a few suggestions I can make to the above code:
Don't use the old mysql extension (mysql_* functions); consider the newer MySQLi extension instead, which can do things the old extension can't, like asynchronous queries. Using the old mysql extension in new development is also highly discouraged, since it's currently slated for deprecation. See MySQL: choosing an API in the PHP manual for more information.
Check phpinfo() to make sure you aren't using output buffering (which requires buffering up to a certain amount of data before it gets sent to the client). This could result in the client waiting around until there's data ready to be sent. Pushing some HTML content out to the client as soon as possible could help improve the user experience.
Don't use SELECT * FROM table in your queries, instead, consider explicitly selecting only the fields you need for each query: SELECT image,title,short_discription FROM recent_works
If there's a lot of data (more than say hundred rows maybe) consider using pagination and LIMIT the query to a certain number of rows per page view. This can greatly reduce the amount of traffic between your DBMs and PHP on a per request basis.
If it's a high load site consider using a persistent database connection.
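Putting suggestions 1, 3 and 4 together, here is a sketch of the query step with mysqli, explicit columns, and LIMIT-based pagination. The page size is an assumption, and get_result() requires the mysqlnd driver:

```php
<?php
$mysqli = new mysqli('localhost', 'root', '', 'gat');
if ($mysqli->connect_errno) {
    die('Database Connection Failed: ' . $mysqli->connect_error);
}

$perPage = 10; // assumption: 10 posts per page
$page    = max(1, (int) ($_GET['page'] ?? 1));
$offset  = ($page - 1) * $perPage;

// Only the columns the page actually renders, limited to one page of rows.
$stmt = $mysqli->prepare(
    'SELECT image, title, short_discription FROM recent_works LIMIT ? OFFSET ?'
);
$stmt->bind_param('ii', $perPage, $offset);
$stmt->execute();
$result = $stmt->get_result(); // requires the mysqlnd driver

while ($row = $result->fetch_assoc()) {
    echo "<div class='work_item'>";
    echo "<img src='{$row['image']}' alt=''>";
    echo "<h2>{$row['title']}</h2>";
    echo "<p>{$row['short_discription']}</p>";
    echo "</div>";
}
```

Even with only a handful of rows, bounding the query like this keeps the transfer between MySQL and PHP proportional to what one page actually shows.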
