I have a table with around 100,000 records; the structure is shown below (up to 56 fields):

id | name    | desc   | length | breadth | ... | remark
1  | FRT-100 | -desc- | 10     | 10      | ... | remarking
What I am doing is using a cron job (Gearman) to write all this data to a CSV file; my code is given below.
<?php
set_time_limit(0);
ini_set("memory_limit", "2048M");
// Get the total count of records, so that we can loop over it in small chunks
$query = "SELECT COUNT(*) AS cnt FROM datas WHERE group_company_id = $companyid";
$result = $link->query($query);
$count = 0;
while ($row = mysqli_fetch_array($result)) {
    $count = $row["cnt"];
}
if ($count > 1000) {
    $loop = ceil($count / 1000);
} else {
    $loop = 1;
}
// I'm going to write it in small chunks of 1000 each time to avoid a timeout
for ($ii = 1; $ii <= $loop; $ii++) {
    $s = ($ii - 1) * 1000; // OFFSET is zero-based; starting at 1 would skip the first row
    $q = "SELECT * FROM datas WHERE group_company_id = $companyid LIMIT 1000 OFFSET $s";
    $r = $link->query($q);
    while ($row2 = mysqli_fetch_array($r)) {
        // My CSV writing is done here; it works fine for up to
        // 10,000 ~ 12,000 records, after which memory exhaustion occurs
    }
}
?>
I strongly suspect MySQL's OFFSET handling can be optimised. Can someone show me a better way? I'm open to any suggestions (cron, third-party libraries, etc.).
Try to avoid storing everything in memory at once; instead, load each row and write it out one row at a time.
<?php
$q = "SELECT * FROM datas";
// MYSQLI_USE_RESULT streams rows from the server instead of buffering the whole result set in PHP
$r = $link->query($q, MYSQLI_USE_RESULT);
$fp = fopen("out.csv", "w+");
// Or you could just set the headers for content type, and echo the output
while ($row2 = mysqli_fetch_row($r)) { // fetch_row, since fetch_array would duplicate every field (numeric + named keys)
    fputcsv($fp, $row2); // handles quoting and escaping, unlike a bare implode()
}
fclose($fp);
This should solve the issue: with an unbuffered query, only one row is held in PHP memory at a time.
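On the OFFSET part of the question: large offsets get slower as the table grows, because MySQL still has to read and discard every skipped row. A common alternative is keyset (seek) pagination, which remembers the last id seen. A minimal sketch, assuming datas has an auto-increment primary key id and the same $link connection as above:
$lastId = 0;
do {
    $q = "SELECT * FROM datas WHERE group_company_id = $companyid AND id > $lastId ORDER BY id LIMIT 1000";
    $r = $link->query($q);
    $fetched = 0;
    while ($row2 = mysqli_fetch_assoc($r)) {
        $lastId = $row2['id']; // remember where this chunk ended
        $fetched++;
        // write the CSV line here
    }
} while ($fetched == 1000); // a short chunk means we reached the end
Each chunk then starts with an index seek on id, so the thousandth chunk costs the same as the first.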
Related
/* To sort the id and limit the post by 40 */
$sql = "SELECT * FROM requests";
$result = $conn->query($sql);
$sqlall = "SELECT * FROM requests";
$resultall = $conn->query($sqlall);
$i = 0;
if ($result->num_rows > 0) {
    // Output data of each row
    $idarray = array();
    while ($row = $result->fetch_assoc()) {
        echo "<br>";
        // Create an array to store the ids of the blogs
        array_push($idarray, $row['id']);
    }
} else {
    echo "0 results";
}
?>
<?php
for ($x = 1; $x < 40; $x++) {
    // This is the loop to display all the stored blog posts
    if (isset($x)) {
        $query = mysqli_query($conn, "SELECT * FROM `requests`");
        $res = mysqli_fetch_array($query);
        $email1 = $res['email1'];
        $msg1 = $res['msg1'];
        $subject1 = $res['subject1'];
        $name1 = $res['name1'];
        $id = $res['id'];
        // ... card output ...
    }
}
?>
The output is 40 cards, all reading data from the first row in my database. Can anyone help?
I'm using XAMPP.
This code is just to show the loop; if anyone wants it, the full code is here.
You are storing all the IDs in the array $idarray, but then you never really use them. You loop 40 times, but each pass just runs SELECT * FROM requests again and extracts the same first row; you never use an ID to change the query.
But it really makes no sense to run lots of separate queries anyway. If you just want the first 40 rows, use MySQL's LIMIT clause. It usually works best combined with ORDER BY. Something like this:
$sql = "SELECT * FROM requests ORDER BY id LIMIT 40";
$result = $conn->query($sql);
while ($res = $result->fetch_assoc()) {
$email1 = $res['email1'];
$msg1 = $res['msg1'];
$subject1 = $res['subject1'];
$name1 = $res['name1'];
$id = $res['id'];
//example output, just for demo:
echo $email1." ".$msg1." ".$subject1." ".$name1." ".$id;
}
Documentation: https://dev.mysql.com/doc/refman/8.0/en/limit-optimization.html
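If you later need the next page of 40, LIMIT also accepts an offset, e.g. SELECT * FROM requests ORDER BY id LIMIT 40 OFFSET 40 returns rows 41-80.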
I run a MySQL query to get some data and then (for debugging purposes) print it out. In this particular sample there are 5 rows of data, and each room_id in the database table has a value. However, the printout only shows the room_id of the first row.
$query_rooms = "SELECT room_id FROM lh_rooms WHERE hid = '$hid'";
$rooms = mysql_query($query_rooms, $MySQL) or die(mysql_error());
$row_rooms = mysql_fetch_assoc($rooms);
$numrows = mysql_num_rows($rooms);
$i = 0;
while ($i < $numrows) {
    $room_id = $row_rooms['room_id'][$i];
    echo $i." - ".$room_id."<br><br>";
    ++$i;
}
0 - 2
1 -
2 -
3 -
4 -
Can someone explain what is happening?
You are fetching multiple rows.
So you need to loop over the result set instead of fetching just once.
Corrected code:
$query_rooms = "SELECT room_id FROM lh_rooms WHERE hid = '$hid'";
$rooms = mysql_query($query_rooms, $MySQL) or die(mysql_error());
$i = 0;
while ($row_rooms = mysql_fetch_assoc($rooms)) {
    $room_id = $row_rooms['room_id'];
    echo $i." - ".$room_id."<br><br>";
    ++$i;
}
Note: Never use the mysql_* functions; they are deprecated and were removed entirely in PHP 7. Use mysqli_* or PDO instead.
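For illustration, a rough mysqli equivalent of the loop above, using a prepared statement for $hid (assumes a mysqli connection in $conn; names are placeholders):
$stmt = $conn->prepare("SELECT room_id FROM lh_rooms WHERE hid = ?");
$stmt->bind_param("s", $hid);  // bind instead of interpolating $hid into the SQL
$stmt->execute();
$result = $stmt->get_result(); // requires the mysqlnd driver
$i = 0;
while ($row_rooms = $result->fetch_assoc()) {
    echo $i." - ".$row_rooms['room_id']."<br><br>";
    $i++;
}
Binding the parameter also removes the SQL injection risk of quoting $hid by hand.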
Try it like this:
$query_rooms = "SELECT room_id FROM lh_rooms WHERE hid = '$hid'";
$rooms = mysql_query($query_rooms, $MySQL) or die(mysql_error());
$i = 0;
while ($row_rooms = mysql_fetch_assoc($rooms)) {
    $room_id = $row_rooms['room_id'];
    echo $i." - ".$room_id."<br><br>";
    $i++;
}
You were only incrementing $i; you need to loop on $row_rooms so each iteration fetches the next row.
I'm using PHP with MySQL.
I have a large MySQL database containing 365 tables: one table per day, each holding thousands of records per client.
My problem is that when I generate a report for multiple clients, nothing shows up.
On the other hand, when I check the MySQL log I can see the queries running on the back end, but even after they complete, nothing reaches the browser; it still shows a process running.
My current code looks like this:
//$ClientList contains 100 client ids
//$TableList contains 30 table names
$TotalCount = count($ClientList);
$CountTables = count($TableList);
for ($i = 0; $i < $TotalCount; $i++) {
    for ($j = 0; $j < $CountTables; $j++) {
        $sql = "INSERT INTO TABLEA SELECT * FROM ".$TableList[$j]." WHERE clientid = '".$ClientList[$i]."'";
        $rs = mysql_query($sql);
    }
}
for ($i = 0; $i < $TotalCount; $i++) {
    $sql = "SELECT * FROM TABLEA";
    // Store in array
    $rs = mysql_query($sql);
    while ($ds = mysql_fetch_assoc($rs)) {
        $aRRAY[$i] = $ds;
    }
}
for ($i = 0; $i < count($aRRAY); $i++) {
}
But nothing reaches the browser. I have also set the time limit to 0 and increased the session time, but no results. Any solution to this issue?
Do not store one million records in a PHP array:
//$ClientList contains 100 client ids
//$TableList contains 30 table names
$TotalCount = count($ClientList);
$CountTables = count($TableList);
for ($i = 0; $i < $TotalCount; $i++) {
    for ($j = 0; $j < $CountTables; $j++) {
        $sql = "INSERT INTO TABLEA SELECT * FROM ".$TableList[$j]." WHERE clientid = '".$ClientList[$i]."'";
        $rs = mysql_query($sql);
    }
}
//OUTPUT
$sql = "SELECT * FROM TABLEA";
$rs = mysql_query($sql);
while ($ds = mysql_fetch_assoc($rs)) {
    my_output($ds);
}
Also, if you are trusting the database to put all your records into TABLEA, why put them into separate tables at all?
//$ClientList contains 100 client ids
//$TableList contains 30 table names
$TotalCount = count($ClientList);
$CountTables = count($TableList);
for ($i = 0; $i < $TotalCount; $i++) {
    for ($j = 0; $j < $CountTables; $j++) {
        //OUTPUT
        $sql = "SELECT * FROM ".$TableList[$j]." WHERE clientid = '".$ClientList[$i]."'";
        $rs = mysql_query($sql);
        while ($ds = mysql_fetch_assoc($rs)) {
            my_output($ds);
        }
    }
}
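Going further, you could cut the 100 x 30 round trips down to one query per table with an IN list. A minimal sketch, using mysqli rather than the removed mysql_* API (assumes a mysqli connection $link, the same $ClientList/$TableList, and the my_output() placeholder from above):
// Escape each client id and build a quoted IN list
$ids = "'" . implode("','", array_map(
    function ($id) use ($link) { return $link->real_escape_string($id); },
    $ClientList
)) . "'";
foreach ($TableList as $table) {
    // One query per table instead of one per client per table;
    // MYSQLI_USE_RESULT streams rows rather than buffering them all
    $rs = $link->query("SELECT * FROM `$table` WHERE clientid IN ($ids)", MYSQLI_USE_RESULT);
    while ($ds = $rs->fetch_assoc()) {
        my_output($ds);
    }
    $rs->close();
}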
I am working on a project where I need to query a DB and write the result to a CSV file. The result is going to be over 15,000 entries (that's what the user wants). I am breaking up the results using LIMIT, because if I don't the DB will time out. I divide the query into what I call total_pages. Here is my code.
The big for loop loops 19 times. The problem is that the code goes through the nested loop once (only 500 entries) and then never enters it again. I tried setting $result to null, but no luck. Please help.
// Using this for my LIMIT
$start_from = 0;
$sql = "select * from someplace where gender = '".$gender."'";
$rs_result = mysql_query($sql);
$total_records = mysql_num_rows($rs_result);
$total_pages = ceil($total_records / 500);
// Open a file and write the header to it
$file = fopen("./DataFile/gdownload.csv", "w");
$x = "Last Name,First Name,Primary_Name, ........ etc...... \n";
fwrite($file, $x);
for ($count = 0; $count <= $total_pages; $count++)
{
    $query = "SELECT *
              FROM ptable
              JOIN person_name ON ptable.Primary_Name = person_name.Primary_Name
              WHERE gender = '$gender'
              ORDER BY person_name.Lname ASC
              LIMIT ".$start_from.", 500";
    $result = mysql_query($query) or die(mysql_error());
    $num_row = mysql_num_rows($result);
    // Print tables in rows
    while ($row = mysql_fetch_array($result))
    {
        $x = "";
        $x = $x.$row['Lname'].",";
        $x = $x.$row['Fname'].",";
        $x = $x.$row['Primary_Name'].",";
        $x = $x.$row['asdf#'].",";
        $x = $x.$row['qwer'].",";
        $x = $x.$row['hjkl'].",";
        $x = $x.$row['bnm,'].",";
        $x = $x.$row['yui'].",";
        $x = $x.$row['aaa'].",";
        .....
        fwrite($file, $x);
    } // end nested while
    $start_from += 500;
} // end for loop
fclose($file);
It could be a problem with your LIMIT condition. What you have seems OK to me, but the fact that the rows are only written on the first pass makes me think $result is empty after the first pass.
Try changing
LIMIT ".$start_from.", 500";
to
LIMIT 500 OFFSET ".$start_from;
Also make sure the query actually returns more than 500 results.
On a different note, it's odd that the request would time out on just 15,000 records.
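As an aside, building each CSV line by hand breaks as soon as a field contains a comma (note the 'bnm,' column name). PHP's fputcsv() quotes such fields for you; a sketch of the inner while loop under that assumption, keeping the column names from the question:
while ($row = mysql_fetch_array($result)) {
    // fputcsv adds the newline and quotes fields containing commas or quotes
    fputcsv($file, array(
        $row['Lname'], $row['Fname'], $row['Primary_Name'],
        // ... the remaining columns ...
    ));
}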
How can I use PHP to show five rows from a MySQL database, then create a new line and show another five, etc?
Use the LIMIT clause if you want to limit the number of results returned by the query.
If you want to print an <hr/> after every fifth record you can check it via the modulus operator:
$counter = 1;
while ($row = mysql_fetch_assoc($rst)) {
    // print $row stuff
    if ($counter % 5 == 0)
        print "<hr />";
    $counter++;
}
Basically, we have a variable counting how many records we've printed. Once that counter divides by five with no remainder, we print our horizontal rule.
Something like this may be helpful:
$i = 0;
$result = mysql_query($query);
if ($result) {
    while ($row = mysql_fetch_assoc($result)) {
        if (++$i % 5 == 0) { // ++$i makes $i always positive, so a separate "$i > 0" check is redundant
            /* do the extra thing... new line, some styles */
        }
    }
}
Err.. you mean something like:
SELECT * FROM `tablename` WHERE ... LIMIT 5
$total = 20; // get the total number here
$limit = 5;
for ($i = 0; $i < $total / $limit; $i++)
{
    $start = $limit * $i;
    $sql = "select * from m_table order by id desc limit $start,$limit";
    $result = mysql_query($sql); // query the $sql we just built, not an undefined $query
    while ($rows = mysql_fetch_assoc($result))
    {
        // print the result here;
    }
}
This way you fetch 5 rows at a time from MySQL, without fetching all the rows at once.
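For completeness, the same batching idea in mysqli, since the mysql_* functions were removed in PHP 7. A rough sketch, assuming a mysqli connection $conn and the m_table from the previous answer:
$limit = 5;
for ($start = 0; ; $start += $limit) {
    $result = $conn->query("SELECT * FROM m_table ORDER BY id DESC LIMIT $start, $limit");
    if ($result->num_rows == 0) {
        break; // no more rows left to page through
    }
    while ($row = $result->fetch_assoc()) {
        // print the row here
    }
    echo "<hr />"; // separator after each batch of five
}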