Random jump within while() loop in PHP for MySQL query

I am doing an example project for university and ran into a problem that I can't solve.
In general, the project is to create an automated pizza ordering system in PHP and MySQL on Apache. The system works through the following steps:
- Customer places order -> Baker receives order, proceeds -> Driver receives order at a certain state, proceeds
- Customer can view the order at all times through the session
Now I'm stuck at the last step: the driver sees a page with a table containing the information that the baker worked with and passed on (all changes happen on the database side). The driver may only see a complete package, i.e. an order whose pizzas have all reached a certain status (also saved in the DB).
For this, I have the following SQL statement:
SELECT PizzaID, BestellungID, Adresse, PizzaName, Preis, Status
FROM angebot, bestelltepizza, bestellung
WHERE bestellung.bestellungid = bestelltepizza.fbestellungid
  AND angebot.PizzaName = bestelltepizza.fPizzaName
  AND (SELECT MIN(status) FROM bestelltepizza WHERE bestellung.bestellungid = fbestellungid) > 2
ORDER BY Status, BestellungID
Now, when I use var_dump() on the mysqli_num_rows() output, I get no errors and the output int 26. Compared to the database rows, that is the correct number. I fetch the result set:
while ($row = mysqli_fetch_array($this->result)) {
    var_dump(mysqli_num_rows($this->result));
    var_dump($row);
    ...
}
The while() loop contains another query:
$this->query = "SELECT fPizzaName FROM bestelltepizza WHERE fBestellungID = '$BestellID'";
var_dump($this->query);
$tmpResult = $this->_database->query($this->query);
$count = mysqli_num_rows($tmpResult);
Now here is the problem: the while() loop skips a seemingly random $BestellID, which can contain any number of rows of data. Yet when I count the var_dump() output, the total is correct. Moreover, var_dump($this->query) never shows the query statement for the skipped ID either.
Any ideas what this could be? The full code is linked below.
To keep this question from getting too long, I uploaded the whole code to Pastebin here: http://pastebin.com/u888CPLw
Off-topic: I appreciate any help, thanks. If I failed to make my exact problem clear, or if any questions come up, please comment and I will clarify.

while ($row = mysqli_fetch_array($this->result)) {
    $count = mysqli_num_rows($tmpResult);
    for ($i = 0; $i < $count; $i++) {
        $tmpVar = mysqli_fetch_array($this->result);
I've snipped the code to show the problem.
$count is based on $tmpResult, but you are then doing a fetch array on $this->result; you should be doing it on $tmpResult. Each inner fetch consumes a row from the outer result set, which is exactly why the outer loop appears to skip a random $BestellID.
As Marc B says, it's simple to fold the inner query into the outer one with an INNER JOIN / LEFT JOIN. It would be better to use the join.
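A minimal sketch of the fix, assuming the table and column names from the question (only the fetch target changes):

while ($row = mysqli_fetch_array($this->result)) {
    $BestellID = $row['BestellungID'];
    $this->query = "SELECT fPizzaName FROM bestelltepizza WHERE fBestellungID = '$BestellID'";
    $tmpResult = $this->_database->query($this->query);
    // Fetch from the inner result set; fetching from $this->result here
    // consumes the outer rows and makes the outer loop skip orders.
    while ($tmpVar = mysqli_fetch_array($tmpResult)) {
        // ... work with $tmpVar['fPizzaName'] ...
    }
}

Alternatively, fPizzaName could be pulled in by the outer query with a join, which removes the inner query entirely.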

Related

Trying to delete all rows in database that do not match array

I am trying to figure out how to delete all IDs in the database that do not exist in an array. I have been trying to use NOT IN in my query, but I am not sure why it won't work when run from a script, even though it works when I enter it manually in MySQL. Here is an example.
mysqli_query($con, "DELETE FROM table WHERE id NOT IN ($array)");
$array is a list of IDs from a JSON API. I use cURL to fetch the IDs, and I am trying to delete all IDs in the database that do not match the IDs in $array.
First I use another simple cURL script to scrape the APIs and insert the IDs found into the database; what I am trying to do here is basically build a link/data checker.
If the IDs in the database are not found in the array when rechecking, I want them deleted.
I thought that the query above would work perfectly, but for some reason it doesn't. When the query is run from a script, the MySQL log shows the queries being run like this.
Example:
DELETE FROM table WHERE id NOT IN ('166')
or this when I am testing multiple values:
DELETE FROM table WHERE id NOT IN ('166', '253', '3324')
And what happens is it deletes every row in the table every time. I don't really understand, because if I copy/paste the same query from the log and run it manually, it works perfectly.
I have been trying various ways of processing the array data, such as array_column, array_map, and array_search, plus various functions I have found, but the end result is always the same.
Right now, just for testing, I am using these two bits of code against two different APIs, which give me the same SQL query log output as above. The functions used are just a couple of random ones that I found.
// $result is the result from cURL after json_decode
function implode_r($g, $p) {
    return is_array($p)
        ? implode($g, array_map(__FUNCTION__, array_fill(0, count($p), $g), $p))
        : $p;
}
foreach ($result['data'] as $info) {
    $ids = implode_r(',', $info['id']);
    mysqli_query($con, "DELETE FROM table WHERE id NOT IN ($ids)");
}
And
$arrayLength = count($result);
for ($i = 0; $i < $arrayLength; $i++) {
    mysqli_query($con, "DELETE FROM table WHERE id NOT IN ('".$result[$i]['id']."')");
}
If anyone knows what is going on, I'd appreciate the help or any suggestions on how to achieve the same result. I am using PHP 7 and MySQL 5.7 with InnoDB tables, if that helps.
It probably doesn't work because your IN value is something like this:
IN('1,2,3,4');
when what you want is this:
IN('1','2','3','4')
OR
IN(1,2,3,4)
To get this with implode, include the quotes like this:
$in = "'".implode("','", $array)."'";
NOTE: whenever you put variables directly into SQL, there are security implications to consider, such as SQL injection. If the IDs come from a canned source you're probably OK, but I like to sanitize them anyway.
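Since the question mentions PHP 7 with mysqli, a prepared statement with one placeholder per ID avoids both the quoting problem and the injection risk. A minimal sketch, assuming $con is the mysqli connection and $array is a plain PHP array of IDs:

// Build one ? placeholder per ID and bind the values instead of quoting by hand.
$placeholders = implode(',', array_fill(0, count($array), '?'));
$types = str_repeat('s', count($array));
$stmt = mysqli_prepare($con, "DELETE FROM table WHERE id NOT IN ($placeholders)");
mysqli_stmt_bind_param($stmt, $types, ...$array);
mysqli_stmt_execute($stmt);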
You can't mix an array and a string.
This works:
mysqli_query($con, "DELETE FROM table WHERE id NOT IN (".implode(',', $ids).")");

SQL grouping by date and host address

I'm trying to count the unique visitors each day on my website.
This is the script I made:
<?php
require_once 'database.php';

$dateQuery = $db->prepare("SELECT count(*) AS hits FROM tracked GROUP BY DATE(date), hostadrr");
$dateQuery->execute();
$rows = $dateQuery->fetchAll(PDO::FETCH_ASSOC);

$array = array();
foreach ($rows as $row) {
    $array[] = [(int)$row['hits']];
}
$json = json_encode($array);
echo $json;
?>
The array I get from json_encode is then this:
[[3],[1],[1],[2],[10],[3],[1],[2]]
It is correct that there are 8 arrays inside the array; they represent the days. But the numbers inside each array are just the total clicks on my website, not grouped by the host address of the user. What am I doing wrong here? The array should be: [[1],[1],[1],[1],[1],[1],[1],[1]] (because I'm testing it from my own computer ;) )
First of all, always try to strip your code down to the minimal problem. For example, don't mix JSON encoding, PHP and MySQL and then be astonished that the result isn't what you expect.
Start with the SQL query:
SELECT count(*) AS hits FROM tracked GROUP BY DATE(date), hostadrr
Did you try to run that on the console or in phpMyAdmin? Does the correct result show up?
I would guess not. Why? Because this query only returns a number of hits without any relation to a date or a host address. So, even though you say that the data is correct for the total hits in a day, you can't even be sure it is in the right order, or that days without any hits aren't left out.
Since you do not provide your SQL schema, I can only assume one. But your query should at least look something like this:
SELECT
    DATE(date) AS date,
    hostadrr AS host,
    count(*) AS hits
FROM
    tracked
GROUP BY
    DATE(date),
    hostadrr
ORDER BY
    date ASC
Now you'll get one row for each date/host combination, along with the respective number of hits, and you can assign that to an array, i.e.
$array[$row['date']][$row['host']] = (int) $row['hits'];
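If the goal is "unique visitors per day", counting distinct hosts per date may be even closer to what the asker wants. A sketch using the same PDO setup, with the column names taken from the question:

// One row per day, counting each host address only once.
$dateQuery = $db->prepare(
    "SELECT DATE(date) AS day, COUNT(DISTINCT hostadrr) AS visitors
     FROM tracked
     GROUP BY DATE(date)
     ORDER BY day ASC"
);
$dateQuery->execute();
$array = array();
foreach ($dateQuery->fetchAll(PDO::FETCH_ASSOC) as $row) {
    $array[] = [(int)$row['visitors']];
}
echo json_encode($array); // e.g. [[1],[1],[1],[1],[1],[1],[1],[1]] when testing alone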

PHP & SQL: update record issue

Having some difficulty pinpointing exactly what is wrong with this block of code. I am expecting it to run through a loop a set number of times and update some rows in the table tbl_games with some values received from the form.
I have tried running the code in phpMyAdmin without variables, which works fine (it updates the specified row). I assume the problem is something to do with the string in $insert_q.
gamecount will always be an int < 30; game_ID will be a unique primary-key integer value in tbl_games.
A little background: this code is part of a bigger project, which is centered around football games. An admin adds games to tbl_games (coded and finished); this current file displays to the admin the games which are unplayed (scores for team1 and team2 are NULL) and gives them a space to input scores for each team. This code takes those two scores and the game_ID and updates each row.
It's having no effect on the DB rows, though. Please point me in the right direction.
<?php
$lim = $_SESSION['gamecount'];
for ($i = 1; $i < $lim; $i++) {
    $game_ID = ${"_SESSION['game".$i."_ID']"};
    $score_team_1 = ${"_REQUEST['".$i."_team1-score']"};
    $score_team_2 = ${"_REQUEST['game".$i."_team2-score']"};
    $insert_q = "UPDATE tbl_games SET team1_score = '$score_team_1', team2_score = '$score_team_2' WHERE game_ID = '$game_ID';";
    mysql_query($insert_q);
}
session_destroy();
?>
I think the problem is with this line.
$game_ID = ${"POST['game".$i."_ID']"};
It should be something like this:
$game_ID = ${"_POST['game".$i."_ID']"};
or
$game_ID = $_POST['game'.$i.'_ID']; // much cleaner
You need to make use of MySQL's error reporting. Get it to output any errors, and check the number of affected rows. While you may think the affected rows will be none, they might not be (always good to check when debugging, just so you cover everything).
Does your PHP error log have any warnings or other notices that might point to your query being the issue?
What is the value you're updating with (echo out the var/session) and what is the DB value (look at it in phpMyAdmin or the MySQL command line)?
It could be that there's nothing to update.
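Putting the two answers together, a corrected sketch of the loop might look like this (the form-field names are guesses based on the question, and the legacy mysql_* API is kept because that is what the code uses):

<?php
session_start();
$lim = $_SESSION['gamecount'];
for ($i = 1; $i < $lim; $i++) {
    // Plain superglobal lookups instead of variable-variable syntax.
    $game_ID      = $_SESSION['game' . $i . '_ID'];
    $score_team_1 = $_REQUEST['game' . $i . '_team1-score'];
    $score_team_2 = $_REQUEST['game' . $i . '_team2-score'];

    $insert_q = "UPDATE tbl_games SET team1_score = '$score_team_1', team2_score = '$score_team_2' WHERE game_ID = '$game_ID'";
    // Surface SQL errors instead of failing silently.
    mysql_query($insert_q) or die(mysql_error());
}
session_destroy();
?>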

Timing out while updating MySQL with PHP from a CSV

I need to come up with a way to make a large task fast enough to beat the timeout.
I have very limited access to the server due to the restrictions of the hosting company.
I have a system set up where a cron job visits a PHP file that grabs a CSV containing data on some products. The CSV does not contain all of the fields that a product has, just a handful of essential ones.
I've read a fair number of articles on timeouts and handling CSVs, and currently (in an attempt to shave time) I have a table (let's call it csv_data) to hold the CSV data. A script truncates the csv_data table and then inserts the data from the CSV, so each night the latest recordset from the CSV is in that table (the CSV file gets updated nightly). So far, no timeout problems; that task only takes about 4-5 seconds.
The timeouts occur when I have to sift through the data to make updates to the products table. The steps it runs right now are like this:
1. Get the SKU from the csv_data table (which holds thousands of records).
2. SELECT * FROM products WHERE products.sku = csv.sku (the products table also holds thousands of records to loop through).
3. Get numrows:
   If numrows < 1 {no record in products, so skip}.
   If numrows > 1 {duplicate entries; don't change anything, but report the SKU later}.
   If numrows == 1 {update selected fields in the products table with the CSV data}.
4. Go to the next record in csv_data and do it all over again.
(I figured outlining the process is shorter and easier than dropping in the code; for comparison, a set-based alternative is sketched right below.)
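For comparison, steps 1-4 can often be collapsed into a single set-based statement, which removes the per-SKU round trips entirely. A sketch, not the asker's code; it assumes the shared sku column from step 2 and that the duplicate rule is enforced separately, since a plain join update would touch duplicated SKUs too:

// One statement instead of one query per csv_data row.
// NOTE: unlike the loop, this also updates products whose sku appears
// more than once; exclude those first if the duplicate rule matters.
$sql = "UPDATE products p
        JOIN csv_data c ON p.sku = c.sku
        SET p.price = c.price, p.stock = c.stock";
mysql_query($sql) or die(mysql_error());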
I looked into MySQL views and stored procedures, but I am not skilled enough with them to know whether they can handle the 'if' statement portion.
Is there anything I can do to make this faster to avoid the timeouts?
Edit:
I should mention that set_time_limit(0); isn't doing it. And if it helps, the server uses IIS7 and FastCGI.
Thanks for your help.
Update after using suggestions from Jakob and Shawn:
I'm doing something wrong. The speed is definitely faster and the CSV SKU is incrementing, but when I tried to implement Shawn's solution, the query gives me a "PHP Warning: mysql_result() expects parameter 1 to be resource, boolean given" error.
Can you help me spot what I am doing wrong?
Here is the section of code:
$csvdata = "SELECT * FROM csv_update";
$csvdata_result = mysql_query($csvdata);
mysql_query($csvdata);
$csvdata_num = mysql_num_rows($csvdata_result);
$i = 0;
while ($i < $csvdata_num) {
    $csv_code = @mysql_result($csvdata_result, $i, "skucode");
    $datacheck = NULL;
    $datacheck = substr($csv_code, 0, 1);
    if ($datacheck >= '0' && $datacheck <= '9') {
        $csv_price = @mysql_result($csvdata_result, $i, "price");
        $csv_retail = @mysql_result($csvdata_result, $i, "retail");
        $csv_stock = @mysql_result($csvdata_result, $i, "stock");
        $csv_weight = @mysql_result($csvdata_result, $i, "weight");
        $csv_manufacturer = @mysql_result($csvdata_result, $i, "manufacturer");
        $csv_misc1 = @mysql_result($csvdata_result, $i, "misc1");
        $csv_misc2 = @mysql_result($csvdata_result, $i, "misc2");
        $csv_selectlist = @mysql_result($csvdata_result, $i, "selectlist");
        $csv_level5 = @mysql_result($csvdata_result, $i, "level5");
        $csv_frontpage = @mysql_result($csvdata_result, $i, "frontpage");
        $csv_level3 = @mysql_result($csvdata_result, $i, "level3");
        $csv_minquantity = @mysql_result($csvdata_result, $i, "minquantity");
        $csv_quantity1 = @mysql_result($csvdata_result, $i, "quantity1");
        $csv_discount1 = @mysql_result($csvdata_result, $i, "discount1");
        $csv_quantity2 = @mysql_result($csvdata_result, $i, "quantity2");
        $csv_discount2 = @mysql_result($csvdata_result, $i, "discount2");
        $csv_quantity3 = @mysql_result($csvdata_result, $i, "quantity3");
        $csv_discount3 = @mysql_result($csvdata_result, $i, "discount3");

        $count_check = "SELECT COUNT(*) AS totalCount FROM products WHERE skucode = '$csv_code'";
        $count_result = mysql_query($count_check);
        mysql_query($count_check);
        $totalCount = @mysql_result($count_result, 0, 'totalCount');
        $loopCount = ceil($totalCount / 25);
        for ($j = 0; $j < $loopCount; $j++) {
            $prod_check = "SELECT skucode FROM products WHERE skucode = '$csv_code' LIMIT ($loopCount*25), 25;";
            $prodresult = mysql_query($prod_check);
            mysql_query($prod_check);
            $prodnum = @mysql_num_rows($prodresult);
            $prod_id = @mysql_result($prodresult, 0, "catalogid");
            if ($prodnum < 1) {
                echo "NOT FOUND:$csv_code<br>";
                $count_sku_not_found = $count_sku_not_found + 1;
                $list_sku_not_found = $list_sku_not_found . " $csv_code";
            }
            if ($prodnum > 1) {
                echo "DUPLICATE:$csv_code<br>";
                $count_duplicate_skus = $count_duplicate_skus + 1;
                $list_duplicate_skus = $list_duplicate_skus . " $csv_code";
            }
            if ($prodnum == 1) {
                // This prevents an overwrite from happening if the csv file doesn't produce properly
                if ($csv_price != "" OR $csv_price != NULL) { $sql_price = 'price="' . $csv_price . '"'; }
                if ($csv_retail != "" OR $csv_retail != NULL) { $sql_retail = ',retail="' . $csv_retail . '"'; }
                if ($csv_stock != "" OR $csv_stock != NULL) { $sql_stock = ',stock="' . $csv_stock . '"'; }
                if ($csv_weight != "" OR $csv_weight != NULL) { $sql_weight = ',weight="' . $csv_weight . '"'; }
                if ($csv_manufacturer != "" OR $csv_manufacturer != NULL) { $sql_manufacturer = ',manufacturer="' . $csv_manufacturer . '"'; }
                if ($csv_misc1 != "" OR $csv_misc1 != NULL) { $sql_misc1 = ',misc1="' . $csv_misc1 . '"'; }
                if ($csv_misc2 != "" OR $csv_misc2 != NULL) { $sql_pother2 = ',pother2="' . $csv_misc2 . '"'; }
                if ($csv_selectlist != "" OR $csv_selectlist != NULL) { $sql_selectlist = ',selectlist="' . $csv_selectlist . '"'; }
                if ($csv_level5 != "" OR $csv_level5 != NULL) { $sql_level5 = ',level5="' . $csv_level5 . '"'; }
                if ($csv_frontpage != "" OR $csv_frontpage != NULL) { $sql_frontpage = ',frontpage="' . $csv_frontpage . '"'; }

                $import = "UPDATE products SET $sql_price $sql_retail $sql_stock $sql_weight $sql_manufacturer $sql_misc1 $sql_misc2 $sql_selectlist $sql_level5 $sql_frontpage $sql_in_stock WHERE skucode='$csv_code'";
                mysql_query($import) or die(mysql_error("error updating in products table"));
                echo "Update " . $csv_code . " successful ($i)<br>";
                $count_success_update_skus = $count_success_update_skus + 1;
                $list_success_update_skus = $list_success_update_skus . " $csv_code";

                // empty out variables
                $sql_price = '';
                $sql_retail = '';
                $sql_stock = '';
                $sql_weight = '';
                $sql_manufacturer = '';
                $sql_misc1 = '';
                $sql_misc2 = '';
                $sql_selectlist = '';
                $sql_level5 = '';
                $sql_frontpage = '';
                $sql_in_stock = '';
                $prodnum = 0;
            }
        }
    }
    $i++;
}
Is it timing out before the first row is returned, or between rows during the read? One good practice would be to handle your query in chunks: do a count first to see how many records you are dealing with for the SKU, then loop through smaller chunks (the size of these chunks would depend on how many things you have to do with each row). Your updated workflow would look more like this (see the sketch after the list):
Get the next SKU from the CSV
Get a total count: SELECT COUNT(*) AS totalCount FROM products WHERE products.sku = csv.sku
Determine the chunk size (using 25 for this demo): loopCount = ceil(totalCount / 25)
Loop through all results using a loop like this: for ($i = 0; $i < loopCount; $i++)
Inside your loop, run a query like this: SELECT * FROM products WHERE products.sku = csv.sku LIMIT (i*25), 25
You will want to use a constant order for your SELECT chunks; your unique ID would probably be best.
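A minimal sketch of that chunked read, using the column names from the update above. The key detail is that the offset is computed in PHP from the loop index: "LIMIT ($loopCount*25), 25" is not valid MySQL syntax, so mysql_query() returns false, which produces exactly the "expects parameter 1 to be resource, boolean given" warning:

$chunkSize = 25;
$count_check  = "SELECT COUNT(*) AS totalCount FROM products WHERE skucode = '$csv_code'";
$count_result = mysql_query($count_check) or die(mysql_error());
$totalCount   = mysql_result($count_result, 0, 'totalCount');
$loopCount    = ceil($totalCount / $chunkSize);

for ($j = 0; $j < $loopCount; $j++) {
    // The offset must use the loop index, not the total loop count.
    $offset = $j * $chunkSize;
    $prod_check = "SELECT catalogid, skucode FROM products
                   WHERE skucode = '$csv_code'
                   ORDER BY catalogid
                   LIMIT $offset, $chunkSize";
    $prodresult = mysql_query($prod_check) or die(mysql_error());
    while ($prod = mysql_fetch_assoc($prodresult)) {
        // ... per-row update logic goes here ...
    }
}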
I think you can solve this problem with cron (http://en.wikipedia.org/wiki/Cron). A script started by cron from the command line has no web-server timeout.

Debugging a MySQL insert fail in PHP

I'm having problems debugging a failing MySQL 5.1 insert under PHP 5.3.4. I can't see anything in the MySQL error log or the PHP error logs.
Based on a Yahoo presentation on efficient pagination, I was adding order numbers to posters on my site (order rank, not order sales).
I wrote a quick test app and asked it to create the order numbers for one category. There are 32,233 rows in that category, and each and every time I run it I get 23,304 rows updated. Each and every time. I've increased memory usage, I've put ini settings in the script, and I've run it from the PHP CLI and under PHP-FPM. Each time it doesn't get past 23,304 rows updated.
Here's my script, to which I've added massive timeouts.
include 'common.inc'; // database connection stuff
ini_set("memory_limit", "300M");
ini_set("max_execution_time", "3600");
ini_set('mysql.connect_timeout', '3600');
ini_set('mysql.trace_mode', 'On');
ini_set('max_input_time', '3600');

$sql1 = "SELECT apcatnum FROM poster_categories_inno LIMIT 1";
$result1 = mysql_query($sql1);
while ($cats = mysql_fetch_array($result1)) {
    $sql2 = "SELECT poster_data_inno.apnumber, poster_data_inno.aptitle
             FROM poster_prodcat_inno, poster_data_inno
             WHERE poster_prodcat_inno.apcatnum = '$cats[apcatnum]'
               AND poster_data_inno.apnumber = poster_prodcat_inno.apnumber
             ORDER BY aptitle ASC";
    $result2 = mysql_query($sql2);
    $ordernum = 1;
    while ($order = mysql_fetch_array($result2)) {
        $sql3 = "UPDATE poster_prodcat_inno SET catorder='$ordernum' WHERE apnumber='$order[apnumber]' AND apcatnum='$cats[apcatnum]'";
        $result3 = mysql_query($sql3);
        $ordernum++;
    } // end of 2nd while
}
I'm at a head-scratching loss. I just did a test on a smaller category and only 13,199 out of 17,662 rows were updated. In both experiments only 72-74% of the rows are getting updated.
I'd say your problem lies with your 2nd query. Have you run an EXPLAIN on it? Because of the ORDER BY clause a filesort is required, and if you don't have appropriate indices that can slow things down further. Try this syntax, and sub in a valid integer for your apcatnum variable during testing:
SELECT d.apnumber, d.aptitle
FROM poster_prodcat_inno p
JOIN poster_data_inno d ON d.apnumber = p.apnumber
WHERE p.apcatnum = '{$cats['apcatnum']}'
ORDER BY d.aptitle ASC;
Secondly, since catorder is just an integer version of the combination of apcatnum and aptitle, it's a denormalization for convenience's sake. This isn't necessarily bad, but it does mean that you have to update it every time you add a new title or category. Perhaps it would be better to partition your poster_prodcat_inno table by apcatnum and just do the JOIN with poster_data_inno when you actually need the catorder.
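If missing indices are behind the filesort, composite indexes on the join and sort columns are the usual remedy. A sketch; the index names are invented and the exact column choice is an assumption based on the queries above:

// Hypothetical indexes to support the WHERE/JOIN and the ORDER BY aptitle.
mysql_query("ALTER TABLE poster_prodcat_inno ADD INDEX idx_cat_ap (apcatnum, apnumber)");
mysql_query("ALTER TABLE poster_data_inno ADD INDEX idx_ap_title (apnumber, aptitle)");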
Please escape your query input, even if it comes from your own database (quotes and other characters will get you every time). Your SQL statement is also fragile because you're not interpolating the variables correctly; use braces around the array lookups, for example:
while ($order = mysql_fetch_array($result2)) {
    // array_map applies the escaping to each value; array_filter would only drop values.
    $order = array_map('mysql_real_escape_string', $order);
    $sql3 = "UPDATE poster_prodcat_inno SET catorder='$ordernum' WHERE apnumber='{$order['apnumber']}' AND apcatnum='{$cats['apcatnum']}'";
    mysql_query($sql3);
    $ordernum++;
}
