Large result set (500,000 rows): query loop ends before processing all rows - PHP

I have a large result set returned in my query. For some reason, when I loop through the results, doing what I need with each row, I'm finding that the script is ending early.
$stmt1 = $mysqli1->prepare("select distinct p.tn, n.model, n.software_version, c.card_part_num
    from N_DB.DSLAMS n
    join N_DB.dslam_card c on n.nodealias = c.nodealias
    join N_DB.dslam_port p on c.index = p.dslam_card_index
    WHERE ping_reply = '1'
    order by p.tn
    LIMIT 60000000");
//it's less than 60 million, but I set that limit to make sure my issue wasn't that I was setting the limit too low
if (!$stmt1->bind_result($num, $model, $software_version, $card)) {
    echo "Binding results failed: (" . $stmt1->errno . ") " . $stmt1->error;
}
print "will query depo next \n";
if (!$stmt1->execute()) {
    $tempErr = "Error querying depo db: " . $stmt1->error;
    printf($tempErr . "\n"); // show mysql execute error if it exists
    $err->logThis($tempErr);
}
print "done querying depo \n";
$total_records = $stmt1->num_rows();
print "Total records to process: " . $total_records . " \n"; // says 0, but it clearly has many
// when it loops through them, printing things
print " Please wait. \n";
$count = 0;
while ($stmt1->fetch()) {
    $count++;
    if (($count % 50000) == 0) {
        print "Working on record " . $count . " \n"; // I never see this before it prints "done"
    }
    // do other things that are more interesting
} // while
print "\n done looping through tn's "; // it gets to this too soon
I think I'm processing too many records, but there isn't a logical way to cut the result set below 500,000, so the only solution is to divide up the results. I've been looking at other posts about queries taking forever and about LIMIT, and I'd like to divide up my results too, but I'm having trouble understanding what those people are doing.
How would I divide my results into chunks so I don't, presumably, run out of memory? I'm relatively new to MySQL, so I apologize if I'm a little green.
Oh, and I checked max_execution_time => 0 => 0, so that looks fine.
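One way to divide the work, sketched below under the assumption that you page with LIMIT/OFFSET: compute the chunk boundaries first, then run one bounded query per chunk. The chunkRanges() helper is hypothetical (not from the question); the table and column names in the commented part are taken from the question, and the chunk size is arbitrary.

```php
<?php
// Hypothetical helper: compute (offset, limit) pairs for paging a result set.
function chunkRanges(int $totalRows, int $chunkSize): array
{
    $ranges = [];
    for ($offset = 0; $offset < $totalRows; $offset += $chunkSize) {
        $ranges[] = [$offset, min($chunkSize, $totalRows - $offset)];
    }
    return $ranges;
}

// Sketch only -- assumes $mysqli1 is the connection from the question.
// Each iteration fetches one bounded page instead of the whole result set.
// foreach (chunkRanges(500000, 50000) as [$offset, $limit]) {
//     $stmt = $mysqli1->prepare(
//         "SELECT DISTINCT p.tn, n.model, n.software_version, c.card_part_num
//            FROM N_DB.DSLAMS n
//            JOIN N_DB.dslam_card c ON n.nodealias = c.nodealias
//            JOIN N_DB.dslam_port p ON c.index = p.dslam_card_index
//           WHERE ping_reply = '1'
//           ORDER BY p.tn
//           LIMIT ? OFFSET ?");
//     $stmt->bind_param('ii', $limit, $offset);
//     $stmt->execute();
//     // ... bind_result()/fetch() each page as before ...
// }
```

Note that large OFFSET values get slow, since MySQL still has to skip the earlier rows; keyset pagination (WHERE p.tn > last-seen value) scales better if tn is indexed.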

Getting Count on Number of Instances in Array (subtracting after continue; statement in loop)

I have a 'Pricing History' table in MySQL.
Because we have a lot of alike listings, many of these entries appear as duplicates, aside from, of course, the ItemID.
Here's how I'm currently weeding out duplicates (not true duplicates, but similar listings that share the same SKUs, Y-m-d date, previous price, new price, and sales channel), while also trying to get a count (see the $count logic below) of how many alike/"duplicate" rows were found:
$inventoryhistoryrows = array();
$skuarray = array();
foreach ($pricehistoryarray as $ph_key => $ph_values) {
    // defining variables, etc.
    . . . . .
    $ph_sku_timestamp = $inventoryskus . ' ' . $ph_timestamp . ' ' . $ph_previousprice . ' ' . $ph_newprice . ' ' . $ph_channel;
    if (in_array($ph_sku_timestamp, $skuarray)) { continue; }
    $skuarray[] = $ph_sku_timestamp;
    if (isset($prev)) {
        $newcount = $count - $prev;
        $prev = $count;
    } else {
        $prev = $count;
        $newcount = $count;
    }
    $inventoryhistoryrows[] = array($newcount, $ph_itemid, $inventoryskus, $ph_previousprice, $ph_newprice, $ph_channel, $ph_timestamp);
}
This is working... but my $newcount is always one row ahead!
Here's an illustration of the output table (screenshot omitted):
Note the arrows on the far left side. The $newcount values are correct, but the entire first column needs to be moved up by one row.
Meaning, 1 should be removed from the first row, 3 should be in the first row, and 17 should be in the second row.
I can of course see that the reason 1 is showing up in the first row is due to this statement:
else {
$newcount = $count;
}
Meaning it will always return 1 for the first row, since $prev does not exist yet. But I simply put this there because I was unsure of a proper way to get the data as I wanted.
Does anyone have ideas on how I can do this? Either in the initial foreach loop over $pricehistoryarray (I suppose this would be the better solution), or with a relatively simple method for shifting up only the first column once the $inventoryhistoryrows array is constructed?
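For the PHP-side approach, one option is two passes: tally the rows per composite key first, then emit one output row per unique key carrying its own count, so no count ever lands on the wrong row. A minimal sketch; the $rows sample data and its field names are placeholders, not the question's actual variables:

```php
<?php
// Sample input standing in for $pricehistoryarray (field names are made up).
$rows = [
    ['sku' => 'A', 'date' => '2019-05-01', 'prev' => 10, 'new' => 9, 'channel' => 'web'],
    ['sku' => 'A', 'date' => '2019-05-01', 'prev' => 10, 'new' => 9, 'channel' => 'web'],
    ['sku' => 'B', 'date' => '2019-05-01', 'prev' => 5,  'new' => 4, 'channel' => 'ebay'],
];

// Pass 1: tally how many rows share each composite key.
$counts = [];
foreach ($rows as $r) {
    $key = $r['sku'] . ' ' . $r['date'] . ' ' . $r['prev'] . ' ' . $r['new'] . ' ' . $r['channel'];
    $counts[$key] = ($counts[$key] ?? 0) + 1;
}

// Pass 2: emit one output row per unique key, with the correct count attached.
$out = [];
$seen = [];
foreach ($rows as $r) {
    $key = $r['sku'] . ' ' . $r['date'] . ' ' . $r['prev'] . ' ' . $r['new'] . ' ' . $r['channel'];
    if (isset($seen[$key])) { continue; }
    $seen[$key] = true;
    $out[] = [$counts[$key], $r['sku'], $r['prev'], $r['new'], $r['channel']];
}
// Two 'A' rows collapse into one entry with count 2; 'B' gets count 1.
```

The second pass scans the original rows (rather than the key tally) so the output preserves the input order of first appearance.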
Apparently this might be better handled in the MySQL query (see comments).
This seems to be working:
SELECT COUNT(*) AS count
     , itemid
     , previousprice
     , newprice
     , channel
     , timestamp
  FROM listing_price_history
 WHERE listing_price_history.timestamp > DATE_SUB(NOW(), INTERVAL 30 DAY)
 GROUP BY previousprice
     , newprice
     , channel
     , DATE(timestamp)
 ORDER BY listing_price_history.timestamp DESC
Big thanks to @Cid for the guidance.

Insert into MySQL db occurs twice in a loop

I'm currently building a small site to retrieve player stats for a computer game from a MySQL db and display them online.
I get player statistics for a list of players from a third-party API and am trying to insert them into a table in the db. However, my code is performing the inserts twice - the primary key restriction on the table stops the duplicate entries, but I don't want to just accept that.
Something is amiss with my looping logic, and I'm going in my own loop trying to figure this one out.
Sequence of events is:
I query my db to get the player IDs needed for the API calls
I put these in an array
I query the third party api in a loop to get all the player stats
I want to insert the stats (1 row per player) to my db (I plan to escape the strings etc, it's a work in progress)
if ($result->num_rows > 0) {
    while ($row = $result->fetch_assoc()) {
        $member_data[] = $row;
    }
    foreach ($member_data as $id) {
        $endpoint = "http://www.bungie.net/Platform/Destiny/Stats/2/" . $id[destiny_membership_id] . "/" . $id[destiny_character_id] . "/";
        $bungie_result = hitBungie($endpoint);
        $response = json_decode($bungie_result, true);
        $destiny_membership_id = $id[destiny_membership_id];
        $destiny_character_id = $id[destiny_character_id];
        $kills_deaths_ratio = $response[Response][allPvP][allTime][killsDeathsRatio][basic][displayValue];
        // DB insert
        $sql = "INSERT INTO xax_pvp_stats (destiny_membership_id,destiny_character_id,kills_deaths_ratio) ";
        $sql .= "VALUES ('$destiny_membership_id','$destiny_character_id','$kills_deaths_ratio') ";
        $result = $conn->query($sql);
        if ($conn->query($sql) === FALSE) {
            echo "<br />Error: " . $sql . "<br />" . $conn->error;
        } else {
            echo $destiny_character_id . " is in the DB";
        }
    }
} else {
    echo "0 results";
}
You're executing the query twice. Once here:
$result = $conn->query($sql);
and once here:
if ($conn->query($sql) === FALSE) {
I'm guessing you meant to examine the result in the second line:
if ($result === FALSE) {
Several issues, starting with the one you're worried about (insert happening twice):
1) You're calling $conn->query twice, which, of course, executes the INSERT query twice:
Here:
$result = $conn->query($sql);//insert1
if ($conn->query($sql) === FALSE) {//insert 2
echo "<br />Error: " . $sql . "<br />" . $conn->error;
} else {
echo $destiny_character_id." is in the DB";
}
2) Your code is vulnerable to SQL injection; learn about prepared statements and use them.
3) Accessing values in an associative array requires the keys to be quoted: $response[Response][allPvP][allTime][killsDeathsRatio][basic][displayValue] issues notices. When developing, always enable display_errors and set the error level as strict as you can; E_STRICT | E_ALL is recommended.
You're running the query twice: once when you set $result then again in the following if statement. Remove the $result line (you're not using that variable anyway) and you'll be good to go.
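Following up on the prepared-statement advice, the insert might be rewritten as sketched below. The buildInsertSql() helper is hypothetical (not part of mysqli); the table and column names come from the question, and the commented part assumes the question's $conn mysqli connection.

```php
<?php
// Hypothetical helper: build an INSERT template with "?" placeholders.
function buildInsertSql(string $table, array $columns): string
{
    $placeholders = implode(',', array_fill(0, count($columns), '?'));
    return "INSERT INTO $table (" . implode(',', $columns) . ") VALUES ($placeholders)";
}

$sql = buildInsertSql('xax_pvp_stats',
    ['destiny_membership_id', 'destiny_character_id', 'kills_deaths_ratio']);

// Sketch only -- assumes $conn is the mysqli connection from the question.
// $stmt = $conn->prepare($sql);
// $stmt->bind_param('sss', $destiny_membership_id, $destiny_character_id, $kills_deaths_ratio);
// if ($stmt->execute() === false) {
//     echo "<br />Error: " . $stmt->error;
// }
```

With bound parameters the values never touch the SQL string, which removes both the injection risk and the need to escape anything by hand.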

Data loss using mysqli_data_seek

I run a query and loop through it with the code below, modifying one of the fields. I only need the modified number for a short time, and I do not need it to go back to the database. This works correctly, and the echo statements printed the expected values.
while ($d1 = mysqli_fetch_array($d1_query)) {
    echo "Before: " . $d1['d1_name'] . ": " . $d1['d1_earn_rate'] . "<br>";
    if ($e1['e_id'] == $h1['e_id']) {
        $d1['d1_earn_rate'] = $d1['d1_earn_rate'] * 1.2;
    }
    echo "After: " . $d1['d1_name'] . ": " . $d1['d1_earn_rate'] . "<br><br>";
}
Afterwards, I want to calculate the total of a subset of the results. I use mysqli_data_seek to reset the pointer to the first row so I can loop through the results again. However, when I do, it calculates the total based on the original numbers from the query, not the revised ones.
I have used mysqli_data_seek previously with no issues, but this is the first time I have modified data in the results before looping back through them. I don't understand why I'm losing the data.
mysqli_data_seek($d1_query, 0);
$counter = 0;
while ($counter < 15) {
    $counter++;
    $d1 = mysqli_fetch_array($d1_query);
    echo $d1['d1_name'] . ": " . $d1['d1_earn_rate'] . "<br>";
    $total_earn_rate += $d1['d1_earn_rate'];
}
You appear to be thinking way too deep here. The matter is pretty simple:
the MySQL server holds a result set in its memory, the result of your previous query
mysqli_fetch_array pulls the data from the MySQL server into PHP's memory and returns it
you're assigning that fetched data to $d1
you're manipulating $d1
you reset MySQL's internal pointer of the result set and repeat the above process
At no point are you manipulating the result set that is held by the MySQL server, and you're always pulling data afresh from said MySQL result set via mysqli_fetch_array. Every time you call that function you'll get the unmodified data from the result set held by MySQL.
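One way around this is to keep the modified copy of each row in a PHP array on the first pass and total from that array instead of re-fetching. A minimal sketch with hard-coded sample rows standing in for the fetched data (the 1.2 multiplier and field names follow the question; $bonusApplies stands in for the $e1['e_id'] == $h1['e_id'] check):

```php
<?php
// Sample rows standing in for mysqli_fetch_array() results.
$rows = [
    ['d1_name' => 'alpha', 'd1_earn_rate' => 10.0],
    ['d1_name' => 'beta',  'd1_earn_rate' => 20.0],
];
$bonusApplies = true; // placeholder for the question's e_id comparison

// First pass: apply the modification and keep the modified copy.
$modified = [];
foreach ($rows as $d1) {
    if ($bonusApplies) {
        $d1['d1_earn_rate'] *= 1.2;
    }
    $modified[] = $d1;
}

// Second pass: total from the PHP array, not from the MySQL result set.
$total_earn_rate = 0;
foreach ($modified as $d1) {
    $total_earn_rate += $d1['d1_earn_rate'];
}
// $total_earn_rate is 36.0 here (10 * 1.2 + 20 * 1.2)
```

Since the second loop reads the PHP copies, the revised earn rates survive, and no mysqli_data_seek is needed at all.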

PHP + MySQL: PHP hangs/freezes when inserting many rows into a table in a loop

I have a strange problem with inserting rows into a MySQL table in a loop.
Let me show you the PHP code I use first, then I'll describe some statistics.
I tend to think it is some MySQL issue, but I have absolutely no idea what kind. A maximum number of inserts per table per minute? (It can't be a maximum row count being reached - there's plenty of space on disk.)
echo " For character=" . $row[1];
$xml = simplexml_load_file($api_url);
$i = 0;
foreach ($xml->result->rowset->row as $value) {
    $newQuery = 'INSERT INTO ' . $tableName . ' (transactionDateTime, quantity, typeName, price, clientName, station, transactionType, seller) VALUES ("' . $value['transactionDateTime'] . '",' . $value['quantity'] . ',"' . $value['typeName'] . '","' . $value['price'] . '","' . $value['clientName'] . '","' . $value['stationName'] . '","' . $value['transactionType'] . '","' . $row[1] . '")';
    $i++;
    if (!mysqli_query($conn, $newQuery)) {
        die('Error while adding transaction record: ' . mysqli_error($conn));
    } // if END
} // foreach END
echo " added records=" . $i;
I have the same data in the XML, and it doesn't change. (The XML has something like 1400+ rows that I would insert.)
It always inserts a different number of rows. The maximum it has inserted was around 800+.
If I insert a ~10 sec delay into the foreach loop at $i == 400, it adds even fewer rows. And more delays mean fewer rows.
It never reaches the part of the code with mysqli_error($conn).
It never reaches the echo " added records=" . $i; part of the code.
Since it always stops on a different record, I have to assume nothing is wrong with the INSERT query.
Since it never reaches the line after the foreach loop (echo " added records=" . $i;), I also assume the XML data wasn't processed to the end.
If I use another source of data (another character) where there are fewer records in the XML, this code works just fine.
What could possibly be my problem?
It could be that you're firing multiple queries at your SQL server. It's better to build a single SQL query in your foreach, then fire it once.
Something like this, basically:
$db = new mysqli($hostname, $username, $password, $database);
if ($db->connect_errno > 0) {
    $error[] = "Couldn't establish connection to the database.";
}

$commaIncrement = 1;
$commaCount = count($result);

$SQL[] = "INSERT INTO $table $columns VALUES";
foreach ($result as $value) {
    $comma = $commaCount == $commaIncrement ? "" : ",";
    $SQL[] = "('$value[0]','$value[1]','$value[2]','$value[3]')$comma";
    $commaIncrement++;
}
$SQL[] = ";";
$completedSQL = implode(' ', $SQL);

if (!$db->query($completedSQL)) {
    // handle $db->error here
}
$db->close();
Scrowler is right: your PHP is timing out. As a test, you can add
set_time_limit(0);
to the start of your PHP script.
WARNING - don't use this in production or anywhere else. Always set a reasonable time limit for the script.
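To make the batching idea above concrete, here's a small self-contained sketch of building one multi-row INSERT string. The table name, columns, and sample rows are made up for illustration; real code must still escape each value (e.g. with mysqli_real_escape_string) or, better, use placeholders:

```php
<?php
// Hypothetical example: batch rows into a single multi-row INSERT statement.
$rows = [
    ['2015-01-01 10:00:00', 5, 'Widget', '19.99'],
    ['2015-01-01 10:05:00', 2, 'Gadget', '4.50'],
];

$values = [];
foreach ($rows as $r) {
    // Real code must escape each value or use prepared-statement placeholders.
    $values[] = "('" . implode("','", $r) . "')";
}
$sql = "INSERT INTO transactions (transactionDateTime, quantity, typeName, price) VALUES "
     . implode(',', $values);
// One query now inserts all rows at once.
```

One round trip per batch instead of one per row cuts both network overhead and per-statement commit cost, which is usually the bottleneck in loops like the question's.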

How to get a doctrine loop to find the number of grouped records

I am working on a query that returns all unpublished records, grouping and ordering them by the latest updated_at.
Then I filter that result to show the records that were updated in the last 24 hours, week, and month.
All works dandy. However, I would like to add the condition that if no records have been updated in one of these three date ranges, it should echo "No records have been updated".
I am curious whether I can isolate these groups in the query and check against that, or possibly set a variable in the loop. The issue with the loop is that I can get the zero-result condition in the loop, but because it is checking INSIDE the loop, I get an echo of "No results found" for each record in the loop.
OK, Here is the query:
// return all unpublished records
$draftEntries = Doctrine::getTable("foo")
    ->createQuery()
    ->where('published = 0')
    ->groupBy('updated_at')
    ->orderBy("updated_at DESC")
    ->execute();
And the loop:
$message .= "These profiles were edited within the last 24 hours: \n";
foreach ($draftEntries as $entry) {
    $currentDay = substr($entry['updated_at'], 0, 10);
    $condition = false;
    if ($currentDay == $today) {
        $condition = true;
        $message .= $entry['last_name'] . ", " . $entry['first_name'] . "\n";
    } else {
        $message .= "There were records updated yesterday";
        echo $condition;
    }
}
This is just one of the three loops, but I think you get the gist of my intention. Obviously, this is the loop that returns:
There were records updated yesterday.
There were records updated yesterday.
There were records updated yesterday.
...
which is not desired.
So, from the query, can I check whether the groupBy result is greater than 0?
Surely I can do this without setting up three separate queries to get a result, right?
This is how I solved it. If there is a better answer, let me know.
I did not try to isolate the group in the query; I just used a few conditional statements in the loop:
// Edited in the last 24 hours
$message .= "<p>These profiles were edited within the last 24 hours: </p><ul>\n";
$lineitem = "";
foreach ($draftEntries as $draftentry) {
    $currentDay = substr($draftentry['updated_at'], 0, 10);
    if ($currentDay == $today) {
        $listpiece = $draftentry['id'];
        $listpiece .= "&profile_type=faculty'>" . $draftentry['last_name'] . ", " . $draftentry['first_name'] . "</a> / Edited by: " . $draftentry['last_updater'] . "</li> \n";
        $lineitem .= $listpiece;
    }
}
if ($lineitem == "") {
    $message .= "No edits were made.";
} else {
    $message .= $lineitem;
}
I am concatenating the $message, but I needed to formulate the condition for what gets included in $message outside the loop. $lineitem represents a single entry from the loop. It is initialized outside the loop, and then data can be appended to it according to the if statement inside the loop.
So far, it works pretty well.
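The same idea can be generalized to all three date ranges at once: bucket the entries by age in a single pass, then report per bucket, with a fallback message when a bucket is empty. A minimal sketch; the field names follow the question, while the sample data, bucket labels, and cutoffs are illustrative:

```php
<?php
// Bucket entries by how recently they were updated, relative to $now.
function bucketByAge(array $entries, int $now): array
{
    $buckets = ['day' => [], 'week' => [], 'month' => []];
    foreach ($entries as $e) {
        $age = $now - strtotime($e['updated_at']);
        if ($age <= 86400) {
            $buckets['day'][] = $e;
        } elseif ($age <= 7 * 86400) {
            $buckets['week'][] = $e;
        } elseif ($age <= 30 * 86400) {
            $buckets['month'][] = $e;
        }
    }
    return $buckets;
}

$now = strtotime('2020-01-31 00:00:00');
$entries = [
    ['last_name' => 'Doe', 'first_name' => 'Jane', 'updated_at' => '2020-01-30 12:00:00'],
    ['last_name' => 'Roe', 'first_name' => 'Rick', 'updated_at' => '2020-01-10 09:00:00'],
];
$buckets = bucketByAge($entries, $now);

// One report per bucket, with the empty-bucket fallback outside the entry loop.
foreach ($buckets as $label => $group) {
    echo $label . ': ';
    echo $group === [] ? "No records have been updated.\n"
                       : count($group) . " record(s) updated.\n";
}
```

Because the "no records" check happens once per bucket rather than once per entry, the fallback message can only appear once per date range.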
