Looping twice as much - writing duplicate rows to MySQL table - PHP

In a nutshell, my code seems to be looping twice as often as it should, writing four rows when it should be writing two. This should be an easy fix, but I'm not having any luck.
Here is my PHP loop. There must be one very simple yet invisible reason why it isn't working, but no one has been able to locate it yet:
//query statement before the for loop
$stmt="INSERT INTO o70vm_invoices_invoices
(`id`, `created_by`, `user_id`, `added`, `to_name`, `to_address`, `invoice_num`, `real_invoice_num`, `from_name`, `from_address`, `from_num`, `invoice_date`, `publish`, `notes`, `template_id`, `taxes`, `start_publish`, `end_publish`, `currency_before`, `currency_after`, `status`, `to_email`, `to_company`, `from_phone`, `from_url`, `from_email`, `discount`, `invoice_duedate`, `admin_notes`, `to_city`, `to_state`, `to_country`, `to_vatid`, `to_zipcode`, `rec_year`, `rec_month`, `rec_day`, `rec_nextdate`, `is_recurrent`) VALUES ";
// loop through number of invoices user selected to create
for ($x = 0; $x < $invoiceCount; $x++)
{
    // add the user identified days to each invoice
    $date->modify("+7 days");
    $invoiceDateNew = $date->format('Y-m-d 00:00:00');
    $invoiceDueDateNew = $date->format('Y-m-d H:m:s');
    $startPubNew = $date->format('Y-m-d 00:00:00');
    // getting the values per row
    $ValuesAddToQuery[] = "(NULL, '792', '$userID', '$todayDate', '$parentName', 'unknown address', '0000', '0000', '', '', '', '".$invoiceDateNew."', '1', '', '2', '', '".$startPubNew."', '0000-00-00 00:00:00', '$', '', '', '$email', '$childName', '', '', '', '0.00', '".$invoiceDueDateNew."', '', '', '', '', '', '', '0', '0', '0', '0000-00-00', '0')";
}
$stmt .= implode(',',$ValuesAddToQuery);
mysql_query($stmt) or exit(mysql_error());
I store the number of invoices in:
$invoiceCount
I have echoed out the value of $invoiceCount and it always matches what the user inputs. I.e., the user selects 2 invoices to create, the variable shows 2, yet 4 invoices are created in the MySQL table.
Stranger still: when I check the rows affected with:
mysql_affected_rows()
It returns the user-selected number of invoices/rows, not the number of rows I can see were actually added to the MySQL table. For example, it will say 2 rows have been affected when four rows have been added.
Even wilder: when I echo out the MySQL query:
echo $stmt;
the query also shows just the two value sets the user selected to add, yet the code writes 4 actual rows.
Getting adventurous, I even tried slicing the array to see if I could alter the statement being sent:
//implode the values into the statement
$stmt .= implode(',',$ValuesAddToQuery);
//limit the length of the array
array_slice($ValuesAddToQuery,0,2);
mysql_query($stmt) or exit(mysql_error());
And, you guessed it, it changes absolutely nothing. I also tried putting the array_slice before the implode statement. Again, no change: 4 rows are inserted when I only want 2.
The more I look at this code, the less I can tell why it is doubling.
Any help, much appreciated.
For a detailed explanation of some of my input fields and what I'm doing, see below.
To start, I let the user select how many rows to copy, updating the invoice date as required. I get the FREQUENCY of the recurring invoices (7, 14, or 30 days) and the DURATION (number of invoices to create/copy) using these input fields:
<select name="freqOfInvoices">
<option value="7">Weekly</option>
<option value="14">Bi-Weekly</option>
<option value="30">Monthly</option>
</select>
<input type="number" title="numberOfInvoices" name="numberOfInvoices" size="2" id="numberOfInvoices" value="numberOfInvoices" />
I have similar input fields for the three dates I'm looking to ADD x number of days to:
// assigning variables
$freqOfInvoices = htmlentities($_POST['freqOfInvoices'], ENT_QUOTES);
$numberOfInvoices = htmlentities($_POST['numberOfInvoices'], ENT_QUOTES);
$invoiceDate = htmlentities($_POST['invoice_date'], ENT_QUOTES);
$invoiceDueDate = htmlentities($_POST['invoice_duedate'], ENT_QUOTES);
$startPub = htmlentities($_POST['start_publish'], ENT_QUOTES);
//assigning number of invoices
$countInvoices=$numberOfInvoices;

It seems you may only need 1 loop to construct the values.
//query statement before the foreach loop
$stmt="INSERT INTO o70vm_invoices_invoices (`id`, `.....`, etc) VALUES ";
$ValuesAddToQuery = [];
for ($x = 0; $x < $invoiceCount; $x++) {
    // add the user identified days to the date
    $date->modify("+7 days");
    $invoiceDateNew = $date->format('Y-m-d 00:00:00');
    $ValuesAddToQuery[] = "(NULL, '....', '".$invoiceDateNew."')";
}
$stmt .= implode(',',$ValuesAddToQuery);
mysql_query($stmt) or exit(mysql_error());

If you echo $stmt, does the query string look correct, or are your values getting doubled?
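For what it's worth, here is a minimal sketch of the same multi-row insert built on a prepared statement instead of the deprecated mysql_* functions; the reduced column list and the $pdo connection are assumptions for illustration, not the poster's actual code:
// Sketch: build one multi-row INSERT with placeholders and bind every value.
// Assumes an existing PDO connection in $pdo; only a few of the real columns are shown.
$rows = [];
$params = [];
for ($x = 0; $x < $invoiceCount; $x++) {
    $date->modify('+7 days');
    $rows[]   = '(NULL, ?, ?, ?)';
    $params[] = $userID;
    $params[] = $date->format('Y-m-d 00:00:00'); // invoice_date
    $params[] = $date->format('Y-m-d 00:00:00'); // start_publish
}
$sql = 'INSERT INTO o70vm_invoices_invoices (`id`, `user_id`, `invoice_date`, `start_publish`) VALUES ' . implode(',', $rows);
$stmt = $pdo->prepare($sql);
$stmt->execute($params);
echo $stmt->rowCount() . " row(s) inserted";
Binding the values also removes the need to paste variables straight into the SQL string.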

I'm answering my own question as I figured out a workaround that "actually" works. I'm keeping this question online for others; if I can save them four days of work, it is my pleasure.
Because I could not figure out why my code was creating twice as many invoices as the user requested, I simply added this code:
//////////////////////////
// Work around CODE as it keeps on doubling the records in DB
// Deleting the same amount of last entries as there are valid entries
//////////////////////////
$RecordsToDelete=$invoiceCount;
$DeleteQuery="DELETE FROM o70vm_invoices_invoices ORDER BY id DESC limit $RecordsToDelete";
mysql_query($DeleteQuery) or exit(mysql_error());
immediately after my original implode / execute query code:
$stmt .= implode(',',$ValuesAddToQuery);
mysql_query($stmt) or exit(mysql_error());
The workaround works because my "infected" code was writing the full series of invoices (with the user-selected dates) once, and then writing the same series again. That translated into the first set of invoices (2) being correct and the last set (2) being duplicates. So, presto: just delete as many of the last entries as there are valid invoices.
I admit, it is a workaround. Hopefully someday I will figure out why my original code produced duplicates. But, happy day regardless. :)

Related

Insert 60,000 rows into MySQL from PHP

I have a MySQL database with a PHP backend.
In one table I have to insert 60,000 rows.
We run a query in PHP that returns 1,000 rows, and for each of those rows we have to insert 60 rows. We thought about using a loop, but we don't know if that is the best practice.
The part of the code that inserts the data is:
$turnos = $db->query("SELECT * FROM turno t
WHERE t.canchaId = :cancha
AND t.fecha BETWEEN :fechaInicio AND :fechaFin
AND t.nombre LIKE :cadena
ORDER BY t.fecha,t.hora ASC",
array("cancha" => $cancha["idCancha"], "fechaInicio" => $fechaInicio, "fechaFin" => $fechaFin, "cadena" => "%turno fijo%"));
foreach ($turnos as $turno) {
    // $turnos has 1000 rows
    $fecha = date_create($turno["fecha"]);
    date_add($fecha, date_interval_create_from_date_string('7 days'));
    $anioAuxiliar = 2017;
    while (2017 == $anioAuxiliar) {
        // runs roughly 60 times per row
        $data1 = $turno["data1"];
        $data2 = $turno["data2"];
        ...
        $fechaAGuardar = (string) $fecha->format('Y-m-d');
        $result = $db->query("INSERT INTO turno(fechaAGuardar, data2, data3, data4, data5, data6, data7, data8) VALUES(:fechaAGuardar, :data2, :data3, :data4, :data5, :data6, :data7, :data8)",
            array("fechaAGuardar" => $fechaAGuardar, "data2" => $data2, "data3" => $data3, "data4" => $data4, "data5" => $data5, "data6" => $data6, "data7" => $data7, "data8" => $data8));
        date_add($fecha, date_interval_create_from_date_string('7 days'));
        $anioAuxiliar = (int) $fecha->format("Y");
    }
    $cantidad_turnos = $cantidad_turnos + 1;
}
This PHP runs on a hosting account with phpMyAdmin.
So my questions are:
Is this the best way to insert 60,000 rows?
Should we take any other constraints into account? (E.g., does phpMyAdmin not allow inserting that amount of rows?)
Thanks for helping me; any suggestions are welcome.
//EDIT//
The inserted data changes: we have to insert a datetime, and on each loop we have to add 7 days to the last date inserted, so we can't use INSERT with SELECT.
As several people described in the comments, INSERT ... SELECT is the way to go if this data is in the same server/database; there's no need to use PHP at all. Your year requirement can be handled with DATE_ADD.
Anyway, if there is some other requirement and you can't avoid PHP, consider using Bulk Data Loading.
Analysing your code, the MOST IMPORTANT TIP would be: don't use multiple INSERT INTO expressions. Each INSERT INTO causes a round trip to the database, and things get really slow. Instead, concatenate multiple value sets into one INSERT INTO (example from the link):
INSERT INTO yourtable VALUES (1,2), (5,5), ...;
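For illustration only (the question uses a custom $db wrapper; this sketch assumes a plain PDO connection in $pdo and the column names from the question), the 60 weekly rows per turno could be batched into a single prepared multi-row INSERT like this:
// Sketch: build one multi-row INSERT per source row instead of 60 single-row INSERTs.
function insertWeeklyTurnos(PDO $pdo, array $turno)
{
    $fecha = date_create($turno['fecha']);
    date_add($fecha, date_interval_create_from_date_string('7 days'));
    $placeholders = array();
    $params = array();
    while ((int) $fecha->format('Y') === 2017) {
        $placeholders[] = '(?, ?, ?, ?, ?, ?, ?, ?)';
        $params[] = $fecha->format('Y-m-d');
        foreach (array('data2', 'data3', 'data4', 'data5', 'data6', 'data7', 'data8') as $col) {
            $params[] = $turno[$col];
        }
        date_add($fecha, date_interval_create_from_date_string('7 days'));
    }
    if ($placeholders) {
        $sql = 'INSERT INTO turno (fechaAGuardar, data2, data3, data4, data5, data6, data7, data8) VALUES '
            . implode(', ', $placeholders);
        $pdo->prepare($sql)->execute($params);
    }
}
That reduces the round trips from roughly 60,000 to about 1,000 (one per row of the outer SELECT).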
Good luck!

How to transform a PHP loop to a MySQL query?

I am trying to make a PHP loop work for me in MySQL. Currently, all visits to a website via a specific URL parameter are logged into a table along with the date and time of the visit. I am rebuilding the logging procedure to only count the visits per parameter per day, but I'll have to convert the old data first.
So here's what I'm trying to do: the MySQL table (let's call it my_visits) has 3 columns: parameter, visit_id and time.
In my PHP code, I've created the following loop to gather the data I need (all visits made via one parameter on one day, for all parameters):
foreach (range(2008, 2014) as $year) {
    $visit_data = array();
    $date_ts = strtotime($year . '-01-01');
    while ($date_ts <= strtotime($year . '-12-31')) {
        $date = date('Y-m-d', $date_ts);
        $date_ts += 86400;
        // count visit data
        $sql = 'SELECT parameter, COUNT(parameter) AS total ' .
               'FROM my_visits ' .
               'WHERE time BETWEEN \''.$date.' 00:00\' AND \''.$date.' 23:59\' '.
               'GROUP BY parameter ORDER BY total DESC';
        $stmt = $db->prepare($sql);
        $stmt->execute(array($date));
        while ($row = $stmt->fetch()) {
            $visit_data[] = array(
                'param' => $row['parameter'],
                'visit_count' => $row['total'],
                'date' => $date);
        }
        $stmt->closeCursor();
    }
}
Later on, the gathered data is inserted into a new table (basically eliminating visit_id) using a multiple INSERT (thanks to SO! :)).
The above code works, but due to the size of the table (roughly 3.4 million rows) it is very slow. Using 7 * 365 SQL queries just to gather the data seems wrong to me, and I fear that just running the script will slow everything down substantially.
Is there a way to make this loop work in MySQL, like an equivalent query or something (on a yearly basis perhaps)? I've already tried a solution using GROUP BY, but since this eliminates either the specific dates or the parameters, I can't get it to work.
You can GROUP further.
SELECT `parameter`, COUNT(`parameter`) AS `total`, DATE(`time`) AS `date`
FROM `my_visits`
GROUP BY `parameter`, DATE(`time`)
You can then execute it once (instead of in a loop) and use $row['date'] instead of $date.
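For illustration, a minimal sketch of running that single grouped query with the same PDO-style $db handle the question uses (the variable names are assumptions):
// Sketch: one grouped query replaces the 7 * 365 per-day queries.
$sql = 'SELECT `parameter`, COUNT(`parameter`) AS `total`, DATE(`time`) AS `date` ' .
       'FROM `my_visits` ' .
       'GROUP BY `parameter`, DATE(`time`) ' .
       'ORDER BY `date`, `total` DESC';
$visit_data = array();
$stmt = $db->prepare($sql);
$stmt->execute();
while ($row = $stmt->fetch()) {
    $visit_data[] = array(
        'param' => $row['parameter'],
        'visit_count' => $row['total'],
        'date' => $row['date']);
}
$stmt->closeCursor();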
This also means you don't have to update your code when we reach 2015 ;)

MySQL data insert in PHP

Let me explain my problem. I have a table where patient reports get stored, and a patient can have more than one test, so the result for every report should be different on the printout. The result is being inserted differently for each test, but the fields remark and nor are getting the same value inserted for more than one test.
Here is an image of the report's input fields.
The number of field rows can grow according to how many tests the patient has taken.
Right now I am using this to insert into the table:
function save_report_content()
{
    $R = DIN_ALL($_REQUEST);
    $dt = time();
    foreach ($R as $k => $v)
    {
        $test_id = str_replace('rep_result_', '', $k);
        if (strstr($k, 'rep_result_'))
        {
            $content = $v;
            $SQL = "INSERT INTO report SET
                rep_te_id = '$test_id',
                rep_result = '$content',
                record_id = '$R[payment_id]',
                remark = '$R[remark]',
                nor = '$R[nor]',
                rep_date = '$dt'";
            // ... execute $SQL here (execution code not shown in the post) ...
        }
    }
}
The result goes into the table differently for each test, but remark and nor stay the same for more than one test.
I have spent a lot of time trying to fix this problem but did not succeed. If I have missed any relevant info regarding this question, feel free to ask me. Thanks in advance; any idea will be highly appreciated.
What's the structure of your form?
<input name='nor[]' />
It should be an array so that each row's value comes through; otherwise only the last row's value will come through.
Also, when you are in the foreach you shouldn't use $R[remark] since you are using $k => $v;
it should be $v['remark'].
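A minimal sketch of what this suggestion looks like end to end, assuming the form posts parallel arrays named rep_result[], remark[] and nor[] (the names are illustrative, not taken from the original form):
// Sketch: read the parallel arrays so each test row keeps its own remark and nor.
$results = $_POST['rep_result'];
$remarks = $_POST['remark'];
$nors    = $_POST['nor'];
foreach ($results as $i => $result) {
    $remark = $remarks[$i];
    $nor    = $nors[$i];
    // ... insert one report row per test here, ideally with a prepared statement ...
}
This way the remark and nor values line up with the result from the same row instead of repeating the single posted value.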

MySQL INSERTing rows into a database with datetime sanity checks

I am making a meeting room booking system in which no booking should overlap another, so the validation should check that no dates/times fall within an existing start/end time frame.
I have two tables. I can insert into them fine with both start and end dates, so the only columns I am interested in at the moment are these:
meetingrooms
| ... other columns ... | bookingtime | bookingend |
I understand the principle behind the sanity check and can describe the check I need in pseudocode. Here is the code I have got so far:
p4a_db::singleton()->query("INSERT INTO meetingrooms(location_id, bookingtime, bookingend, merono_id)
WHERE bookingtime < " . $date . " AND bookingend > " . $date . "
OR
bookingdate < " . $date . " AND bookingend > " . $dateend . "
VALUES(?,?,?,?)",
array($location, $date, $dateend, $merono));
I don't want to insert data directly into the statement, but until I understand how to do this I am stuck. So, the question:
How do I perform a sanity check before the data is inserted so that I don't get dates within already booked times?
Any help would be greatly appreciated.
Edit:
I've been overthinking my answer, and I realized that the old solution will not work in your case: since you need the time span, comparing only the start and end dates is not enough.
My way of processing this would be:
Save the dates as int, use 24h system (7:40am is 740, 9:50pm is 2150)
Check for stored dates where: (Start<NewStart<End)
Check for stored dates where: (Start<NewEnd<End)
When processing several rooms, just store room number + time as int. That way you can still use the method from 2 and 3.
Steps 2 and 3 can be done in a SQL query; check out this link.
Old answer (checking for duplicates)
This is an example of how to check for duplicates (in this case email) before inserting the text:
$emailexist = $mysqli->prepare("select email from users where email = ?");
$emailexist->bind_param('s', $email);
$emailexist->execute();
$emailexist->store_result();
if ($emailexist->num_rows > 0) {
    $emailexist->close();
    $mysqli->close();
    return true;
}
else {
    $emailexist->close();
    $mysqli->close();
    return false;
}
It checks whether there are rows which contain the string. If so (if the number of rows is higher than 0), it returns true, which means the value already exists.
You can just adapt this to your code.
However, you could also just set the columns to UNIQUE. Then you get an error when trying to insert a duplicate. It is easier, and you won't have problems with concurrent connections.
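For instance (the table and column names are taken from the email example above, and the error handling is illustrative), the UNIQUE route looks roughly like this:
// Sketch: let MySQL enforce uniqueness instead of checking first.
// Run once: ALTER TABLE users ADD UNIQUE KEY uq_email (email);
$insert = $mysqli->prepare("INSERT INTO users (email) VALUES (?)");
$insert->bind_param('s', $email);
if (!$insert->execute()) {
    if ($insert->errno === 1062) { // 1062 = ER_DUP_ENTRY
        // duplicate value: handle it here instead of inserting
    }
}
$insert->close();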
After a long and intensive search, I now have a working example of this method, along with a way of protecting against SQL injection. Here's the code:
if ($this->BookingValue == 1)
{
    $sql = "SELECT COUNT(*) as num FROM meeting_room_bookings
            WHERE
            (
                (? < start_at AND ? > start_at)
                OR
                (? > start_at AND ? < end_at)
            )
            AND
            meeting_room_id = ?";
    $result = p4a_db::singleton()->fetchRow($sql, array($date, $date, $date, $dateend, $merono));
    if (0 == $result["num"])
    {
        p4a_db::singleton()->query("INSERT INTO meeting_room_bookings (start_at, end_at, meeting_room_id)
            VALUES (?,?,?)", array($date, $dateend, $merono));
        return true;
    }
    else
    {
        return false;
    }
}
There isn't much to explain about this code, but in terms of differences (excluding the change in column and table names), the COUNT query is now prepared and run before the insert, so its result can be used in an if statement, which lets the validation filter out clashes between different dates.
Along with this, I have added validation to stop bookings for other meeting rooms from being included in the check, via the AND clause that limits the meeting room id to a single value.
Although now, and this will lead on to a separate question, another error is thrown from this statement. I know the insert is sound, but something in this prepared statement causes the error:
SQLSTATE[HY093]: Invalid parameter number: number of bound variables does not match number of tokens
File: Pdo.php, Line: 234
I am now looking into the error thrown from the prepared statement and will update this answer when there is a fix. Thanks for the help.

Optimizing code / DB for a 50,000-row table

I have a list of 300 RSS feeds of news articles stored in a database and every few minutes I grab the contents of every single feed. Each feed contains around 10 articles and I want to store each article in a DB.
The Problem: My DB table is over 50,000 rows and rapidly growing; each time I run my script to get new feeds, it's adding at least 100 more rows. It's to the point where my DB is hitting 100% CPU utilization.
The Question: How do I optimize my code / DB?
Note: I do not care about my server's CPU (which is <15% when running this). I greatly care about my DB's CPU.
Possible solutions I'm seeing:
Currently, every time the script runs, it calls $this->set_content_source_cache, which returns an array of the form array('link', 'link', 'link', etc.) built from all the rows in the table. This is used later to cross-reference and make sure there are no duplicate links. Would skipping this and simply changing the DB so the link column is unique speed things up? Possibly throw this array into memcached instead, so it only has to be built once an hour/day?
Add a break statement if the link is already set, so that it moves on to the next source?
Only check links that are less than a week old?
Here's what I'm doing:
//$this->set_content_source_cache goes through all 50,000 rows and adds each link to an array so that it's array('link', 'link', 'link', etc.)
$cache_source_array = $this->set_content_source_cache();
$qry = "select source, source_id, source_name, geography_id, industry_id from content_source";
foreach($this->sql->result($qry) as $row_source) {
    $feed = simplexml_load_file($row_source['source']);
    if(!empty($feed)) {
        for ($i=0; $i < 10 ; $i++) {
            // most often there are only 10 feeds per rss. Since we check every 2 minutes, if there are
            // a few more, then meh, we probably got it last time around
            if(!empty($feed->channel->item[$i])) {
                // make sure that the item is not blank
                $title = $feed->channel->item[$i]->title;
                $content = $feed->channel->item[$i]->description;
                $link = $feed->channel->item[$i]->link;
                $pubdate = $feed->channel->item[$i]->pubdate;
                $source_id = $row_source['source_id'];
                $source_name = $row_source['source_name'];
                $geography_id = $row_source['geography_id'];
                $industry_id = $row_source['industry_id'];
                // random stuff in here to each link / article to make it data-worthy
                if(!isset($cache_source_array[$link])) {
                    // start the transaction
                    $this->db->trans_start();
                    $qry = "insert into content (headline, content, link, article_date, status, source_id, source_name, ".
                        "industry_id, geography_id) VALUES ".
                        "(?, ?, ?, ?, 2, ?, ?, ?, ?)";
                    $this->db->query($qry, array($title, $content, $link, $pubdate, $source_id, $source_name, $industry_id, $geography_id));
                    // this is my framework's version of mysqli_insert_id()
                    $content_id = $this->db->insert_id();
                    $qry = "insert into content_ratings (content_id, comment_count, company_count, contact_count, report_count, read_count) VALUES ".
                        "($content_id, '0', '0', 0, '0', '0')";
                    $result2 = $this->db->query($qry);
                    $this->db->trans_complete();
                    if($this->db->trans_status() == TRUE) {
                        $cache_source_array[$link] = $content_id;
                        echo "Good!<br />";
                    } else {
                        echo "Bad!<br />";
                    }
                } else {
                    // link already exists
                    echo "link exists!";
                }
            }
        }
    } else {
        // feed is empty
    }
}
I think you answered your own question:
Currently, every time the script runs, it goes to $this->set_content_source_cache where it returns an array of array('link', 'link', 'link', etc.) from all the rows in the table. This is used to later cross-reference to make sure there are no duplicating links. Would not doing this and simply changing the DB so the link column is unique speed things up?
Yes, creating a primary key or unique index and allowing the DB to throw an error if there is a duplicate is a much better practice and should be much more efficient.
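As a rough sketch of that approach (the index name, the INSERT IGNORE usage, and the affected-rows check are illustrative, not taken from the original code):
// Run once so MySQL itself rejects duplicate links:
//   ALTER TABLE content ADD UNIQUE KEY uq_content_link (link);
// Then the 50,000-row cache array can go away; just attempt the insert.
$qry = "insert ignore into content (headline, content, link, article_date, status, source_id, source_name, ".
    "industry_id, geography_id) VALUES ".
    "(?, ?, ?, ?, 2, ?, ?, ?, ?)";
$this->db->query($qry, array($title, $content, $link, $pubdate, $source_id, $source_name, $industry_id, $geography_id));
// With INSERT IGNORE a duplicate link is silently skipped; the framework's
// affected-rows count tells you whether a new row was actually written.
if ($this->db->affected_rows() > 0) {
    $content_id = $this->db->insert_id();
    // insert the matching content_ratings row here, as in the original code
}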
REFERENCE EDIT:
mysql 5.0 indexes - Unique vs Non Unique
http://dev.mysql.com/doc/refman/5.0/en/create-index.html
