I'm having problems debugging a failing MySQL 5.1 UPDATE under PHP 5.3.4. I can't see anything in the MySQL error log or the PHP error logs.
Based on a Yahoo presentation on efficient pagination, I was adding order numbers to posters on my site (order rank, not order sales).
I wrote a quick test app and asked it to create the order numbers on one category. There are 32,233 rows in that category, and each and every time I run it I get 23,304 rows updated. Each and every time. I've increased the memory limit, I've put ini settings in the script, and I've run it from the PHP CLI and PHP-FPM. Each time it doesn't get past 23,304 rows updated.
Here's my script, which I've added massive timeouts to.
include 'common.inc'; //database connection stuff

ini_set("memory_limit", "300M");
ini_set("max_execution_time", "3600");
ini_set('mysql.connect_timeout', '3600');
ini_set('mysql.trace_mode', 'On');
ini_set('max_input_time', '3600');

$sql1 = "SELECT apcatnum FROM poster_categories_inno LIMIT 1";
$result1 = mysql_query($sql1);
while ($cats = mysql_fetch_array($result1)) {
    $sql2 = "SELECT poster_data_inno.apnumber, poster_data_inno.aptitle
             FROM poster_prodcat_inno, poster_data_inno
             WHERE poster_prodcat_inno.apcatnum = '$cats[apcatnum]'
               AND poster_data_inno.apnumber = poster_prodcat_inno.apnumber
             ORDER BY aptitle ASC";
    $result2 = mysql_query($sql2);
    $ordernum = 1;
    while ($order = mysql_fetch_array($result2)) {
        $sql3 = "UPDATE poster_prodcat_inno SET catorder='$ordernum'
                 WHERE apnumber='$order[apnumber]' AND apcatnum='$cats[apcatnum]'";
        $result3 = mysql_query($sql3);
        $ordernum++;
    } // end of 2nd while
} // end of 1st while
I'm at a head-scratching loss. Just did a test on a smaller category and only 13,199 out of 17,662 rows were updated. For the two experiments only 72-74% of the rows are getting updated.
I'd say your problem lies with your 2nd query. Have you done an EXPLAIN on it? Because of the ORDER BY clause a filesort will be required, and if you don't have appropriate indices that can slow things down further. Try this syntax and sub in a valid integer for your apcatnum variable during testing.
SELECT d.apnumber, d.aptitle
FROM poster_prodcat_inno p JOIN poster_data_inno d
ON d.apnumber = p.apnumber
WHERE p.apcatnum ='{$cats['apcatnum']}'
ORDER BY aptitle ASC;
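To follow up on the EXPLAIN suggestion, a quick check might look something like this (a sketch using the same mysql_* API as the script in the question; 123 stands in for a real apcatnum):

$explain = mysql_query("EXPLAIN SELECT d.apnumber, d.aptitle
    FROM poster_prodcat_inno p JOIN poster_data_inno d ON d.apnumber = p.apnumber
    WHERE p.apcatnum = '123'
    ORDER BY d.aptitle ASC");
while ($exp = mysql_fetch_assoc($explain)) {
    print_r($exp); // check the 'key' column, and look for 'Using filesort' under 'Extra'
}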
Secondly, since catorder is just an integer version of the combination of apcatnum and aptitle, it's a denormalization for convenience's sake. This isn't necessarily bad, but it does mean that you have to update it every time you add a new title or category. Perhaps it might be better to partition your poster_prodcat_inno table by apcatnum and just do the JOIN with poster_data_inno when you actually need the catorder.
Please escape your query input, even if it does come from your own database (quotes and other characters will get you every time). Your SQL statement is also fragile because of the way the array variables are interpolated; wrap them in braces, like this:
while ($order = mysql_fetch_array($result2)) {
    $order = array_map('mysql_real_escape_string', $order);
    $sql3 = "UPDATE poster_prodcat_inno SET catorder='$ordernum' WHERE apnumber='{$order['apnumber']}' AND apcatnum='{$cats['apcatnum']}'";
}
Related
Function render makes the website 500% slow! Can anyone fix that, please?
Someone told me:
because it sends a database request on each iteration of the loop (it's not the only problem with this chunk of code but it's the most taxing one)
Yes I understand what that means. His way is:
you need to get all of the data before you start building the menu,
then you just insert the data instead of requesting more data on each
iteration
But I don't know how I'm supposed to do it!
<?php
$menu_html = '';

function render_menu($parent_id, $actmenuid)
{
    $obj = new Database();
    $con = $obj->dbconnectt();
    global $menu_html;
    $result = mysqli_query($con, "select * from tbl_menu where parent_id='$parent_id'");
    if (mysqli_num_rows($result) == 0) return;
    if ($parent_id == 0) {
        $menu_html .= '<ul class="topnav">';
    } else {
        $menu_html .= '<ul>';
    }
    while ($row = mysqli_fetch_array($result)) {
        $childnum = $obj->recordcount("SELECT * FROM tbl_menu WHERE parent_id='" . $row['id'] . "'");
        if ($childnum == 0) {
            $linkvalue = '/category/' . $row['id'] . '.html';
        } else {
            $linkvalue = '#';
        }
        if ($row['id'] == $actmenuid && $actmenuid != NULL) {
            $actv = 'class="active"';
        } else {
            $actv = '';
        }
        $menu_html .= '<li ' . $actv . '>' . $row['title'] . '';
        render_menu($row['id'], $actmenuid);
        $menu_html .= '</li>';
    }
    $menu_html .= '</ul>';
    return $menu_html;
}

if ($isDsh == false) {
    echo render_menu(0, $actmenuid);
}
?>
Depending on how many records you have, try removing this query from inside the loop, since it runs once for every record returned by the first query.
$childnum = $obj->recordcount("SELECT * FROM tbl_menu WHERE parent_id='".$row['id']."'");
Change it to a single query like this, which returns a count for each parent id, and place it outside of the loop:
$parentcount = mysqli_query($con, "SELECT parent_id, COUNT(*) AS cnt FROM tbl_menu GROUP BY parent_id");
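A sketch of how that might be wired in (assuming the same $con connection and column names used in the question):

// One query up front: number of children per parent_id
$counts = array();
$countResult = mysqli_query($con, "SELECT parent_id, COUNT(*) AS cnt FROM tbl_menu GROUP BY parent_id");
while ($c = mysqli_fetch_assoc($countResult)) {
    $counts[$c['parent_id']] = $c['cnt'];
}
// Later, inside the menu loop, replace the per-row query with an array lookup:
$childnum = isset($counts[$row['id']]) ? $counts[$row['id']] : 0;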
There may be other issues, so please post the database structure and number of records that you're working with too.
Don't make recursive queries.
Having "more than 1000" rows is not too big. You can simply call everything from the table into php, then perform the recursive html build in php this will have a memory overhead, but far less processing overhead because you only ever make one trip to the db.
Alternatively (when your db table is prohibitively large), you can avoid gathering rows unnecessarily by adding a new column. The new column stores all "descendants" of the respective row when the row is INSERTed, and is updated when the row is UPDATEd. Then you only need to reference this column when you need to call specific rows. In other words, do the recursive processing only once (when writing to the db) and not every time you need to display the data. This will, again, produce a finite result set in one query, which can then be recursively traversed to build the desired output.
Basically you need to do what #spudly has suggested.
But there is a small catch in his solution: depending on the number of rows in your tbl_menu table, you may use a big chunk of memory fetching all the records.
You can optimise it further by using his solution but changing the query to:
select
parent_tbl_menu.id,
count(child_tbl_menu.id) as cnt
from
tbl_menu as parent_tbl_menu
left join
tbl_menu as child_tbl_menu
on parent_tbl_menu.id = child_tbl_menu.parent_id
where
parent_tbl_menu.parent_id = ?
group by
parent_tbl_menu.id
This way you will only fetch the child records of a specific parent.
And please consider using prepared statements, as your code has an SQL injection vulnerability.
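For example, the query above as a mysqli prepared statement might look like this (a sketch; it assumes the same $con connection and that the mysqlnd driver is available for mysqli_stmt_get_result):

$stmt = mysqli_prepare($con,
    "SELECT parent_tbl_menu.id, COUNT(child_tbl_menu.id) AS cnt
     FROM tbl_menu AS parent_tbl_menu
     LEFT JOIN tbl_menu AS child_tbl_menu
       ON parent_tbl_menu.id = child_tbl_menu.parent_id
     WHERE parent_tbl_menu.parent_id = ?
     GROUP BY parent_tbl_menu.id");
mysqli_stmt_bind_param($stmt, 'i', $parent_id);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt); // needs mysqlnd
while ($row = mysqli_fetch_assoc($result)) {
    // $row['id'] is the child menu item, $row['cnt'] is how many children it has
}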
Connect (from PHP to MySQL) only once for the entire web page.
Don't put a SELECT inside a loop if you can do all the work in a single SELECT, such as with a JOIN. (Exception: A "hierarchical" table needs the nested SELECT. Exception to the exception: MySQL 8.0 and MariaDB 10.2 can do it with a "recursive CTE".)
Don't fetch all the columns (SELECT *) when all you want is a record count. Instead, SELECT COUNT(*) ... and use the number returned (a quick sketch follows after these points).
1000 of anything is probably excessive for a web page. Re-think the UI.
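A minimal sketch of the COUNT(*) point, using the menu table from the earlier question (5 is just an example parent id):

// Instead of SELECT * plus mysqli_num_rows():
$res = mysqli_query($con, "SELECT COUNT(*) AS cnt FROM tbl_menu WHERE parent_id = 5");
$row = mysqli_fetch_assoc($res);
$childnum = (int)$row['cnt'];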
I am doing an example project for University and got a problem that I can't solve.
In general, the project is to create an automated pizza order system in PHP and MySQL on Apache. The system works through the following steps:
- Customer places order -> Baker receives order, proceeds -> Driver receives order at certain state, proceeds
- Customer can view order at all time through session
Now I'm hung up on the last step: the driver can see a page with a table containing the information that the baker worked with and passed on (all changes happen on the database side). The driver can only see a whole package (once all pizzas are marked with a certain status, which is also saved in the DB).
For this, I have the following SQL statement
SELECT PizzaID, BestellungID, Adresse, PizzaName, Preis, Status
FROM angebot, bestelltepizza, bestellung
WHERE bestellung.bestellungid = bestelltepizza.fbestellungid
  AND angebot.PizzaName = bestelltepizza.fPizzaName
  AND (SELECT MIN(status) FROM bestelltepizza WHERE bestellung.bestellungid = fbestellungid) > 2
ORDER BY Status, BestellungID
Now, when I use var_dump() to get the mysqli_num_rows() output, I get no errors and the output int 26. Compared to the database rows, that is the correct number. I fetch the result like this:
while($row = mysqli_fetch_array($this->result)) {
    var_dump(mysqli_num_rows($this->result));
    var_dump($row);
    ...
}
Inside the while() loop there is another query:
$this->query = "SELECT fPizzaName FROM bestelltepizza WHERE fBestellungID = '$BestellID'";
var_dump($this->query);
$tmpResult = $this->_database->query($this->query);
$count = mysqli_num_rows($tmpResult);
Now here is the problem: the while() loop skips a random $BestellID, which can contain x rows of data. Yet when I count the output of var_dump(), everything is correct. And var_dump($this->query); does not show the query statement for the skipped ID either.
Any ideas what this could be? Full link to pastebin below.
To not extend this question to the fullest, I uploaded the whole code to pastebin here: http://pastebin.com/u888CPLw
Off-topic: I appreciate any help, thanks. If I failed to make my exact problem clear, or if any questions come up, please comment and I will clarify. Thanks.
while($row = mysqli_fetch_array($this->result)) {
    $count = mysqli_num_rows($tmpResult);
    for($i = 0; $i < $count; $i++) {
        $tmpVar = mysqli_fetch_array($this->result);
I've snipped the code to show the problem.
$count is based on $tmpResult, but you are then doing a fetch_array on $this->result; you should be doing it on $tmpResult.
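In other words, the inner loop should fetch from $tmpResult; a sketch based on the snippet above:

while ($row = mysqli_fetch_array($this->result)) {
    // ... run the inner fPizzaName query into $tmpResult here ...
    $count = mysqli_num_rows($tmpResult);
    for ($i = 0; $i < $count; $i++) {
        $tmpVar = mysqli_fetch_array($tmpResult); // was $this->result, which silently consumed rows from the outer loop
    }
}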
As Marc B says, it's simple to either INNER JOIN or LEFT JOIN this onto the main query. It would be better to use the join.
So, I had this code working earlier today, and all of a sudden it decided to only display the first result from the query. I cannot figure out what I've changed since then; I actually believe that I haven't changed anything... anyway... I've gone into the DB and altered the table so that all the "upgrades" meet the requirements to be displayed, and yet still only one result is being shown.
$sql = "SELECT id, name, cost, count(*) FROM upgrades
WHERE id NOT IN (Select upgrade_id FROM thehave8_site1.user_upgrades WHERE uid = :uid)
AND nullif NOT IN(SELECT upgrade_id FROM thehave8_site1.user_upgrades WHERE uid = :uid2)
AND prereq IN (SELECT upgrade_id FROM thehave8_site1.user_upgrades WHERE uid = :uid3)
;";
$que = $this->db->prepare($sql);
#$que->bindParam(':id', $id); //note the : before id
#$que->bindParam(':id2', $id);
$que->bindParam(':uid', $this->uid);
$que->bindParam(':uid2', $this->uid);
$que->bindParam(':uid3', $this->uid);
try {
    $que->execute();
    while($row = $que->fetch(PDO::FETCH_BOTH)) {
        echo "<div class='upgrade' id={$row[0]}><p>{$row[1]}</p><p>{$row[2]}</p></div>";
    }
} catch(PDOException $e) { echo $e->getMessage(); }
The problem has been solved, though I'm not sure of the exact reason why: the count(*) at the end of the query string was preventing the code from running properly. (An aggregate like count(*) with no GROUP BY collapses the result set to a single row, which would explain why only the first upgrade was displayed.)
Aren't you missing some line of code that advances the result in the query to the next row? When I do loops through recordsets (slightly different than what you are doing but probably not much different) there is usually a MoveNext or something like that - I see nothing like that here.
I don't know this language you are using.
I am not being allowed to add a comment to your response, so I will add it here.
Cool. You really looked like you knew what you were doing, far more advanced than anything I've written! I'm new here also and didn't catch onto the tag system; thanks for pointing that out so I don't embarrass myself in future. Glad you've solved it. I think count(*) would work if you included it as a subquery:
SELECT FIELD1, FIELD2, (SELECT count (*) FROM ... ) AS FIELD3
FROM (ETC)
or if you just want its value in the recordset result, compute it ahead of time via query, and then include its value as a dummy/constant field in your select statement. Depending on the query plan in your query engine this may or may not be more efficient.
Or just wait to get Recordcount from your recordset.
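To illustrate both suggestions with the question's PDO setup (a sketch; total_upgrades is a made-up alias and only the first WHERE clause is shown):

// Option 1: keep the count, but as a scalar subquery so it doesn't collapse the row set:
$sql = "SELECT id, name, cost,
               (SELECT COUNT(*) FROM upgrades) AS total_upgrades
        FROM upgrades
        WHERE id NOT IN (SELECT upgrade_id FROM thehave8_site1.user_upgrades WHERE uid = :uid)";
// Option 2: compute the count ahead of time with its own query and reuse the value:
$total = $this->db->query("SELECT COUNT(*) FROM upgrades")->fetchColumn();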
I have just started to convert to mysqli due to the added security benefits. The main reason for converting to mysqli, though, was the mysqli_multi_query function. I had several long, complicated queries with a lot of JOINs, and when I found out about this function I thought I could make things simpler by breaking everything down into separate queries.
However, I have been unable to get the query to work as I want it to and the PHP manual isn't helping me.
Here's what I have so far:
$qry = "SELECT T.T_ID, T.name, T.pic, T.timestamp AS T_ts,
(SELECT COUNT(*) FROM track_plays WHERE T_ID = T.T_ID) AS plays,
(SELECT COUNT(*) FROM track_downloads WHERE T.T_ID = T_ID) AS downloads
FROM tracks T WHERE ID = '$ID';
";
$qry .= "SELECT S_ID, status, timestamp AS S_ts FROM status WHERE ID = '$ID';";
$qry .= "SELECT G_ID, gig_name, date_time, lineup, price, ticket, venue, G_pic, timestamp AS G_ts FROM gigs WHERE ID = '$ID'";
// Execute multi query
if (mysqli_multi_query($con,$qry)){
do {
// Store first result set
if ($result=mysqli_store_result($con)){
while ($row=mysqli_fetch_array($result)){
print_r($row);
}
}
}
while (mysqli_next_result($con));
}
This works fine and the array is printed as expected. However, I want to set a lot of variables for use later on. I won't list all of the variables, but I have a few like this:
$S_ID = htmlspecialchars($row['S_ID']);
$T_ID = htmlspecialchars($row['T_ID']);
$G_ID = htmlspecialchars($row['G_ID']);
The variables are retrieved, but because of the looping nature of the function, PHP errors are also thrown saying that $S_ID or $G_ID is undefined. $T_ID does not get an undefined error because it comes from the first query's result set.
If you need any more information or I have not explained well enough just ask!
A while back I had the same issues, and for the same reason I migrated from mysql to mysqli.
Sadly, it wasn't the best solution in many cases, as I realised along the way.
While implementing things with mysqli and multi_query I still had some of the issues I'd had with mysql.
The solution I recommend is to use views for your complicated queries.
It makes your code and your queries far less complicated.
I know it is not the answer you have been looking for, but give it a thought.
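For instance, the first of the three queries above could be wrapped in a view once (track_summary is a made-up name; the column list follows the query in the question, assuming ID is a column of tracks as the WHERE clause suggests):

mysqli_query($con, "
    CREATE VIEW track_summary AS
    SELECT T.ID, T.T_ID, T.name, T.pic, T.timestamp AS T_ts,
           (SELECT COUNT(*) FROM track_plays WHERE T_ID = T.T_ID) AS plays,
           (SELECT COUNT(*) FROM track_downloads WHERE T.T_ID = T_ID) AS downloads
    FROM tracks T
");
// The application-side query then becomes much simpler:
$result = mysqli_query($con, "SELECT * FROM track_summary WHERE ID = '$ID'");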
So far, every time I use a mysqli_multi_query, my do-while condition is:
while(mysqli_more_results($connection) && mysqli_next_result($connection));
You may like to replace your while statement and see if that helps.
If not, are you differentiating between each subsequent query when you are assigning these $S_ID, $G_ID variables? Are you inadvertently overwriting a variable with an empty $row[] value?
I also didn't see mysqli_free_result($result) in your code.
I'll even offer some error checking as a garnish.
preg_match_all("/(?:SELECT\s(.*?),\s.*?(?:;|$))/",$qry,$first_columnname);
if (mysqli_multi_query($con,$qry)){
    do {
        if ($result=mysqli_store_result($con)){
            $query_identifier=array_shift($first_columnname[1]);
            while ($row=mysqli_fetch_array($result)){
                if($query_identifier=="T.T_ID"){
                    //do something...
                }elseif($query_identifier=="S_ID"){
                    //do something...
                }elseif($query_identifier=="G_ID"){
                    //do something...
                }
            }
            mysqli_free_result($result);
        }
    } while(mysqli_more_results($con) && mysqli_next_result($con));
}
if($error_mess=mysqli_error($con)){echo "Error = $error_mess";}
Let me know if any of these small changes do the trick.
I pull back a lot of information and, as a result, my page is loading in about 22-24 seconds. Is there anything I can do to optimize my code?
Here is my code:
<?php
$result_rules = $db->query("SELECT source_id, destination_id FROM dbo.rules");
while ($row_rules = sqlsrv_fetch_array($result_rules)) {
    $result_destination = $db->query("SELECT pk_id, project FROM dbo.destination WHERE pk_id=" . $row_rules['destination_id'] . " ORDER by project ASC");
    while ($row_destination = sqlsrv_fetch_array($result_destination)) {
        echo "Destination project: ";
        echo "<span class='item'>" . $row_destination['project'] . "</span>";
        echo "ID: " . $row_rules['destination_id'] . "<br>";
        if ($row_rules['source_id'] == null) {
            echo "Source ID for Destination ID" . $row_rules['destination_id'] . " is NULL<br>";
        } else {
            $result_source = $db->query("SELECT pk_id, project FROM dbo.source WHERE pk_id=" . $row_rules['source_id'] . " ORDER by project ASC");
            while ($row_source = sqlsrv_fetch_array($result_source)) {
                echo "Source project: ";
                echo $row_source['project'];
                echo " ID: " . $row_rules['source_id'] . "<br>";
            }
        }
    }
}
?>
Here's what my tables look like:
Source table: pk_id:int, project:varchar(50), feature:varchar(50), milestone:varchar(50), reviewGroup:varchar(125), groupId:int
Rules table: pk_id:int, source_id:int, destination_id:int, login:varchar(50), status:varchar(50), batchId:int, srcPGroupId:int, dstPGroupId:int
Destination table: pk_id:int, project:varchar(50), feature:varchar(50), milestone:varchar(50), QAAssignedTo:varchar(50), ValidationAssignedTo:varchar(50), Priority:varchar(50), groupId:int
If you want help with optimizing queries then please provide details of the schema and the output of the explain plan.
Running nested loops is bad for performance. Running queries inside nested loops like this is a recipe for VERY poor performance. Using '*' in SELECT is bad for performance too (particularly as you're only ever using a couple of columns).
You should start by optimizing your PHP and merging the queries:
$result_rules = $db->query(
"SELECT rule.destination_id, [whatever fields you need from dbo.rules]
dest.project AS dest_project,
src.project AS src_project,
src.pk_id as src_id
FROM dbo.rules rule
INNER JOIN dbo.destination dest
ON dest.pk_id=rule.destination_id
LEFT JOIN dbo.source src
ON src.pk_id=rule.source_id
ORDER BY rule.destination_id, dest.project, src.project");
$last_dest = false;
$last_src = false;
while ($row = sqlsrv_fetch_array($result_rules)) {
    if ($row['destination_id'] !== $last_dest) {
        echo "Destination project: ";
        echo "<span class='item'>" . $row['dest_project'] . "</span>";
        echo "ID: " . $row['destination_id'] . "<br>";
        $last_dest = $row['destination_id'];
    }
    if (null === $row['src_id']) {
... I'll let you sort out the rest.
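For completeness, one way the rest of the loop might look (a sketch only; it reuses the if test above and the column aliases defined in the merged query):

    if (null === $row['src_id']) {
        echo "Source ID for Destination ID " . $row['destination_id'] . " is NULL<br>";
    } else {
        echo "Source project: " . $row['src_project'];
        echo " ID: " . $row['src_id'] . "<br>";
    }
}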
Add an index on (pk_id, project) so it includes all fields important for the query.
Make sure that pk_Id is indexed: http://www.w3schools.com/sql/sql_create_index.asp
Rather than using select *, return only the columns you need, unless you need all of them.
I'd also recommend moving your SQL code to the server and calling the stored procedure.
You could consider using LIMIT if your back end is mysql: http://php.about.com/od/mysqlcommands/g/Limit_sql.htm .
I'm assuming that the else clause is what's slowing down your code. I would suggest saving all the data you're going to need at the start and then accessing the array in the else clause. Basically, you don't need this to run every time:
$result_destination = $db->query("SELECT * FROM dbo.destination WHERE pk_id=" . $row_rules['destination_id'] . " ORDER by project ASC")
You could grab the data earlier and use PHP to iterate over it.
$result_destinations = $db->query("SELECT * FROM dbo.destination ORDER by project ASC")
And then later in your code use PHP to determine the correct destination. Depending on exactly what you're doing it should shave some amount of time off.
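A sketch of that idea, keeping the question's mix of $db->query() and sqlsrv_fetch_array():

// Load every destination once, keyed by pk_id, before the rules loop:
$destinations = array();
$result_destination = $db->query("SELECT pk_id, project FROM dbo.destination ORDER BY project ASC");
while ($row = sqlsrv_fetch_array($result_destination)) {
    $destinations[$row['pk_id']] = $row;
}
// Then, inside the rules loop, look the destination up in PHP instead of querying again:
$row_destination = isset($destinations[$row_rules['destination_id']])
    ? $destinations[$row_rules['destination_id']]
    : null;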
Another consideration is the time it takes for your browser to render the html generated by your php code. The more data you are presenting, the longer it's going to take. Depending on the requirements of your audience, you might want to display only x records at a time.
There are jquery methods of increasing the number of records displayed without going back to the server.
For starters you would want to lower the number of queries run. For example, doing a query, looping through those results and running another query, then looping through that result set and running more queries is generally considered bad. The number of queries multiplies with every level of nesting.
Suppose the first query returns 100 rows and each sub-query returns 10 rows. The first query is 1 query. You loop over its 100 rows and run a query for each, so you are now at 101 queries. Each of those returns 10 rows, and for every one of those 1,000 rows you run yet another query, so you are now at 1,101 queries. Every query has to send data to the server (the query text), wait for a response and get data back. That is what takes so long.
Use a join to do a single query on all the tables and loop over the single result.