PHP: get a maximum of 1000 entries at a time from MySQL

So my problem is the following:
I am developing a WordPress plugin and reading values from a custom database table; let's call the table devices. From that table I currently fetch all entries in the id column. Here is my problem: if this column has more than 1000 entries, I need to "split" the work, i.e. first fetch the first 1000 entries, then the next 1000, and so on until I reach the end. Any ideas how to do this in PHP?
For example, this is how I get the ids now:
function hlp_getIds() {
    global $wpdb;

    $table_name = $wpdb->prefix . 'devices';
    $devices    = array();

    $sql = "SELECT id FROM $table_name";
    $res = $wpdb->get_results($sql);

    if ($res != false) {
        foreach ($res as $row) {
            array_push($devices, $row->id);
        }
    }

    return $devices;
}
So how can I change this to fetch only 1000 entries per call instead of all entries at once?

Use LIMIT:
select id
from devices
order by id
limit 0, 1000
To get the next 1000, do:
select id
from devices
order by id
limit 1000, 1000
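Applied to the WordPress function from the question, paging could look like the sketch below. This is only a sketch: the function name hlp_getIdsBatch, the default batch size of 1000 and the do/while driver loop are assumptions, not part of the original plugin.
// Minimal sketch of a batched version of the function from the question.
// hlp_getIdsBatch and the default batch size are assumed names/values.
function hlp_getIdsBatch($offset = 0, $batch_size = 1000) {
    global $wpdb;

    $table_name = $wpdb->prefix . 'devices';
    $devices    = array();

    // Let $wpdb escape the numeric arguments.
    $sql = $wpdb->prepare(
        "SELECT id FROM $table_name ORDER BY id LIMIT %d OFFSET %d",
        $batch_size,
        $offset
    );
    $res = $wpdb->get_results($sql);

    if ($res) {
        foreach ($res as $row) {
            $devices[] = $row->id;
        }
    }

    return $devices;
}

// Example driver: keep fetching until a batch comes back smaller than 1000.
$offset = 0;
do {
    $batch = hlp_getIdsBatch($offset, 1000);
    // ... process $batch here ...
    $offset += 1000;
} while (count($batch) === 1000);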

Related

Selecting data from DB using LIMIT

I have a total of 5000 rows in my database. What I want to achieve is that my supervisor script selects 1000 rows at a time, and a record that has already been selected should not be selected again. I could achieve this by toggling a status flag on each selected row, but is there a way to do it with LIMIT?
In the query below I am selecting the data in batches of 1000. After the first 1000 rows are processed, the script must select another 1000 unique rows. Is there a way to achieve that with LIMIT?
Index.php
$sql = "SELECT * FROM `table` WHERE `crude_at` BETWEEN '" . $date_from . "'
        AND '" . $date_to . "' LIMIT 1000";
$res = $mysqli->query($sql);
$no_of_counts = 0; // initialise the counter outside the loop so it isn't reset on every row
while ($row = mysqli_fetch_assoc($res)) {
    // do something for the selected rows here
    $no_of_counts++;
    if ($no_of_counts == 1000) {
        // fetch the next 1000 rows
    }
}
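One way to get "the next 1000" without a status column is to page with LIMIT/OFFSET and a stable ORDER BY, re-running the query with a growing offset until a batch comes back short. The sketch below is not the poster's code: it assumes an indexed id column to order by, and $stmt->get_result() requires the mysqlnd driver.
// Sketch: offset-based batching with mysqli prepared statements.
$batch_size = 1000;
$offset     = 0;
do {
    $stmt = $mysqli->prepare(
        "SELECT * FROM `table`
         WHERE `crude_at` BETWEEN ? AND ?
         ORDER BY `id`
         LIMIT ? OFFSET ?"
    );
    $stmt->bind_param('ssii', $date_from, $date_to, $batch_size, $offset);
    $stmt->execute();
    $res  = $stmt->get_result();
    $rows = $res->num_rows;

    while ($row = $res->fetch_assoc()) {
        // do something for the selected rows here
    }

    $offset += $batch_size;
} while ($rows === $batch_size); // a short batch means we reached the end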

Check if a value exists in a MySQL column

Is there a way to check if a value exists in a MySQL column? I have a table songs with several columns; one of them is called 'agent_ip', where I store a list of all user IPs that have visited the site. I need to check whether the current user's IP is present in the 'agent_ip' column. Here is some of my code:
public function voteSong($song_id, $case, $agent_ip) {
    $query = $this->link->prepare("SELECT * FROM songs WHERE id = ? LIMIT 1");
    $query->bindValue(1, $song_id);
    $query->execute();
    $rowcount = $query->rowCount();
    if ($rowcount != 0)
    {
        if (!in_array($agent_ip, $r['ip']))
        {
            if ($case === 'like')
            {
                while ($r = $query->fetch())
                {
                    $vote = $r['votes'] + 1;
                }
            }
            elseif ($case === 'dislike')
            {
                while ($r = $query->fetch())
                {
                    if ($r['votes'] > 0)
                    {
                        $vote = $r['votes'] - 1;
                    }
                    else
                    {
                        $vote = 0;
                    }
                }
            }
            $query = $this->link->prepare("UPDATE songs SET datetime = ?, votes = ?, agent_ip = ? WHERE id = ?");
            $query->execute(array(date("Y-m-d H:i:s"), $vote, $agent_ip, $song_id));
        }
    }
}
The line if (!in_array($agent_ip, $r['ip'])) uses the wrong function and won't work, but I need an alternative for MySQL. The $r['ip'] variable holds the data from the 'agent_ip' column, which looks like 127.0.0.1, 127.0.0.1, 127.0.0.1 (using 127.0.0.1 just as an example; each entry is actually a different IP).
If you're only checking against a single IP, why don't you just modify your query from:
"SELECT * FROM songs WHERE id = ? LIMIT 1"
To:
"SELECT * FROM songs WHERE id = ? AND agent_ip = ? LIMIT 1"
It seems a bit wasteful to query the whole row set when you are only checking against a specific IP and returning a single row.
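In PDO terms that check could look like the sketch below. Note this assumes one IP per row, which is what the suggested query implies; it will not match against the comma-separated list described in the question.
// Sketch: has this IP already voted on this song?
$query = $this->link->prepare("SELECT 1 FROM songs WHERE id = ? AND agent_ip = ? LIMIT 1");
$query->execute(array($song_id, $agent_ip));
$alreadyVoted = ($query->fetchColumn() !== false);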
EDIT: Your current method is quite inefficient: you pass a unique agent_ip each time you want to check whether the IP exists for a song, which would be fine, but every call creates a new DB connection and pulls back all the information belonging to that song.
Let's say we have 1 song and 3 IPs. Currently the application works like this:
1) Call the method, passing IP_1
2) Query the database, getting all data for song ID 1
3) Check if IP_1 is in the result set and process it
4) Call the method, passing IP_2
5) Query the database, getting all data for song ID 1
6) Check if IP_2 is in the result set and process it
7) Call the method, passing IP_3
8) Query the database, getting all data for song ID 1
9) Check if IP_3 is in the result set and process it
As you can see, there is a lot of repetition here that will hinder your app's performance as it scales. You would be much better off modifying your function to accept the result set for a song, queried only once, and then calling a check function with that result array and each unique IP address.
UPDATE: You stated: "I understand that I need to have 2 tables (1 = songs; 2 = votes). But I cannot imagine how I will get songs from the database, arranged by vote count."
You should read up on SQL's JOIN. The concept is simple: JOIN allows you to pull back a more detailed set of information based on what you want to query. In your example, you may want to find out how many votes a specific song has.
Your tables may look like:
Songs
SONG_ID Primary Key
SONG_TITLE
SONG_DURATION
SONG_TAGS
Votes
VOTE_ID Primary Key
SONG_ID Foreign Key (references songs.song_id)
VOTE_RES Bool (either 0 for no, 1 for yes)
AGENT_IP Who sent the vote
You could then find out how many people said they liked the song by performing a join:
SELECT * FROM songs
JOIN votes
ON songs.song_id = votes.song_id
WHERE songs.song_id = 1
AND votes.vote_res = 1;
This would return the song with id 1 along with all of its associated likes. Hope that helps a bit :)
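For the follow-up question about listing songs arranged by vote quantity, a GROUP BY over the same join would do it. The following is only a sketch against the example songs/votes layout above (reusing the PDO connection $this->link from the question), not the poster's actual tables:
// Sketch: all songs ordered by how many positive votes they have.
$sql = "SELECT songs.song_id, songs.song_title, COUNT(votes.vote_id) AS vote_count
        FROM songs
        LEFT JOIN votes
          ON votes.song_id = songs.song_id
         AND votes.vote_res = 1
        GROUP BY songs.song_id, songs.song_title
        ORDER BY vote_count DESC";
$query = $this->link->prepare($sql);
$query->execute();
$songsByVotes = $query->fetchAll();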
First you need to deserialize/decode the data from the column into a proper PHP array, and then you can use the in_array function. In your post edit you stated that you have a comma-separated list of IPs, so to convert it to an array you can use explode:
$ip_list = explode(', ', $r['ip']);
now you can use the in_array function on the new array:
if(!in_array($agent_ip, $ip_list))
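Putting the two pieces together inside voteSong() from the question, the row has to be fetched before the check (the original code tests $r['ip'] before $r exists). A rough sketch only; the array key agent_ip is an assumption based on the column name, where the snippet above used the $r['ip'] shorthand:
$r = $query->fetch();
if ($r !== false) {
    // split the comma-separated list stored in the column
    $ip_list = explode(', ', $r['agent_ip']);
    if (!in_array($agent_ip, $ip_list)) {
        // ... apply the like/dislike logic and run the UPDATE from the question ...
    }
}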

How to handle/optimize thousands of SELECT queries that need to be executed?

I need to synchronize specific information between two databases (one MySQL, the other a remotely hosted SQL Server database) for thousands of rows. When I execute this PHP file it gets stuck/times out after several minutes, so I wonder how I can fix this issue and maybe also optimize the way of "synchronizing" it.
What the code needs to do:
Basically, for every row (= one account) in my database that gets updated, I want to fetch two specific pieces of information (= 2 SELECT queries) from another SQL Server database. I therefore use a foreach loop that runs 2 SQL queries for each row and afterwards writes that information into 2 columns of this row. We are talking about ~10k rows that need to run through this foreach loop.
My idea, which may help?
I have heard of things like PDO transactions, which are supposed to collect all those queries and send them afterwards as one package of SELECT queries, but I have no idea whether I am using them correctly or whether they even help in cases like this.
This is my current code, which times out after a few minutes:
// DBH => MSSQL DB | DB => MySQL DB
$dbh->beginTransaction();

// Get all referral IDs which need to be updated:
$listAccounts = "SELECT * FROM Gifting WHERE refsCompleted <= 100 ORDER BY idGifting ASC";
$ps_listAccounts = $db->prepare($listAccounts);
$ps_listAccounts->execute();

foreach ($ps_listAccounts as $row) {
    $refid = $row['refId'];

    // refsInserted
    $refsInserted = "SELECT count(username) as done FROM accounts WHERE referral='$refid'";
    $ps_refsInserted = $dbh->prepare($refsInserted);
    $ps_refsInserted->execute();
    $row = $ps_refsInserted->fetch();
    $refsInserted = $row['done'];

    // refsCompleted
    $refsCompleted = "SELECT count(username) as done FROM accounts WHERE referral='$refid' AND finished=1";
    $ps_refsCompleted = $dbh->prepare($refsCompleted);
    $ps_refsCompleted->execute();
    $row2 = $ps_refsCompleted->fetch();
    $refsCompleted = $row2['done'];

    // Update fields in the local order DB
    $updateGifting = "UPDATE Gifting SET refsInserted = :refsInserted, refsCompleted = :refsCompleted WHERE refId = :refId";
    $ps_updateGifting = $db->prepare($updateGifting);
    $ps_updateGifting->bindParam(':refsInserted', $refsInserted);
    $ps_updateGifting->bindParam(':refsCompleted', $refsCompleted);
    $ps_updateGifting->bindParam(':refId', $refid);
    $ps_updateGifting->execute();

    echo "$refid: $refsInserted Refs inserted / $refsCompleted Refs completed<br>";
}

$dbh->commit();
You can do all of that in one query with a correlated sub-query:
UPDATE Gifting
SET
refsInserted=(SELECT COUNT(USERNAME)
FROM accounts
WHERE referral=Gifting.refId),
refsCompleted=(SELECT COUNT(USERNAME)
FROM accounts
WHERE referral=Gifting.refId
AND finished=1)
A correlated sub-query is essentially a sub-query (a query within a query) that references the parent query. Notice that each sub-query references the Gifting.refId column in its WHERE clause. While this isn't the best for performance, because each of those sub-queries still has to run independently, it will perform much better (and likely as well as you are going to get) than what you have there.
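For reference, if both tables were reachable from one connection (which is not quite the case in the question, where accounts lives on the SQL Server side and Gifting in MySQL), running that combined statement from PHP would only be a couple of lines. $pdo here is a hypothetical PDO handle that can see both tables:
// Sketch only: assumes a single connection that can see Gifting and accounts.
$sql = "UPDATE Gifting
        SET refsInserted = (SELECT COUNT(username) FROM accounts WHERE referral = Gifting.refId),
            refsCompleted = (SELECT COUNT(username) FROM accounts WHERE referral = Gifting.refId AND finished = 1)";
$pdo->prepare($sql)->execute();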
Edit:
And just for reference: I don't know if a transaction will help here at all. Typically they are used when you have several queries that depend on each other, and they give you a way to roll back if one fails. For example, banking transactions: you don't want the balance to be deducted until the purchase has been inserted, and if the purchase fails to insert for some reason, you want to roll back the change to the balance. So when inserting a purchase, you start a transaction, run the update-balance query and the insert-purchase query, and only if both go in correctly and have been validated do you commit to save.
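For illustration only (this is the banking example above, not part of the poster's sync code, and the purchases/accounts tables are made up), the usual PDO pattern looks like:
// Sketch: assumes PDO::ERRMODE_EXCEPTION so failures throw.
try {
    $db->beginTransaction();

    // Insert the purchase first...
    $db->prepare("INSERT INTO purchases (account_id, amount) VALUES (?, ?)")
       ->execute(array($accountId, $amount));

    // ...then deduct the balance.
    $db->prepare("UPDATE accounts SET balance = balance - ? WHERE id = ?")
       ->execute(array($amount, $accountId));

    // Only persist if both statements succeeded.
    $db->commit();
} catch (Exception $e) {
    // One of the queries failed, so undo both.
    $db->rollBack();
    throw $e;
}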
Edit2:
If I were doing this without an export/import, this is what I would do. It makes a few assumptions, though: first, that you are using MSSQL 2008 or newer, and second, that the referral id is always a number. I'm also using a temp table that I insert the numbers into, because you can insert multiple rows easily with a single query and then run a single update query against the Gifting table. The temp table follows the structure CREATE TABLE tempTable (refId int, done int, total int).
//get list of referral accounts
//if you are using one column, only query for one column
$listAccounts = "SELECT DISTINCT refId FROM Gifting WHERE refsCompleted <= 100 ORDER BY idGifting ASC";
$ps_listAccounts = $db->prepare($listAccounts);
$ps_listAccounts->execute();

//loop over and get list of refIds from above.
$refIds = array();
foreach ($ps_listAccounts as $row) {
    $refIds[] = $row['refId'];
}

if (count($refIds) > 0) {
    //implode into string for use in the queries below
    $refIds = implode(',', $refIds);

    //select out total count
    $totalCount = "SELECT referral, COUNT(username) AS cnt FROM accounts WHERE referral IN ($refIds) GROUP BY referral";
    $ps_totalCounts = $dbh->prepare($totalCount);
    $ps_totalCounts->execute();

    //array of counts
    $counts = array();
    //loop over total counts
    foreach ($ps_totalCounts as $row) {
        //if referral id not found, add it
        if (!isset($counts[$row['referral']])) {
            $counts[$row['referral']] = array('total' => 0, 'done' => 0);
        }
        //add to count
        $counts[$row['referral']]['total'] += $row['cnt'];
    }

    //select out completed count
    $doneCount = "SELECT referral, COUNT(username) AS cnt FROM accounts WHERE finished=1 AND referral IN ($refIds) GROUP BY referral";
    $ps_doneCounts = $dbh->prepare($doneCount);
    $ps_doneCounts->execute();

    //loop over done counts
    foreach ($ps_doneCounts as $row) {
        //if referral id not found, add it
        if (!isset($counts[$row['referral']])) {
            $counts[$row['referral']] = array('total' => 0, 'done' => 0);
        }
        //add to count
        $counts[$row['referral']]['done'] += $row['cnt'];
    }

    //now loop over counts and generate insert rows for the temp table.
    //I suggest using a temp table because you can insert multiple rows
    //in one query and then the update is one query.
    $sqlInsertList = array();
    foreach ($counts as $refId => $count) {
        $sqlInsertList[] = "({$refId}, {$count['done']}, {$count['total']})";
    }

    //clear out the temp table first so we are only inserting new rows
    $truncSql = "TRUNCATE TABLE tempTable";
    $ps_trunc = $db->prepare($truncSql);
    $ps_trunc->execute();

    //build the insert sql with multiple insert rows
    $insertSql = "INSERT INTO tempTable (refId, done, total) VALUES " . implode(',', $sqlInsertList);
    //prepare sql for insert
    $ps_insert = $db->prepare($insertSql);
    $ps_insert->execute();

    //sql to update existing rows
    $updateSql = "UPDATE Gifting
                  SET refsInserted=(SELECT total FROM tempTable WHERE refId=Gifting.refId),
                      refsCompleted=(SELECT done FROM tempTable WHERE refId=Gifting.refId)
                  WHERE refId IN (SELECT refId FROM tempTable)
                  AND refsCompleted <= 100";
    $ps_update = $db->prepare($updateSql);
    $ps_update->execute();
} else {
    echo "There were no reference ids found from \$dbh";
}

Database Insert into Random Unused Row - With Transaction

I am writing an app that randomly assigns a number to each user and then puts it into a MySQL database. Many people use it at the same time, and as such I don't want parallel uses to overwrite each other.
My current code is the following:
$sql_get = "SELECT * FROM database";
$results = mysql_query($sql_get, $bd);
$list = array();
while($row = mysql_fetch_array($results))
{
if ($row['userId'] == "")
{
array_push($list, $row['number']);
}
}
$rand_nums = array_rand($list , 1);
$sql_update = "UPDATE database SET userId='". $userId ."' WHERE number=". $rand_nums;
$results = mysql_query($sql_update, $bd);
So basically, it gets the empty rows, puts them into a list, chooses the number of a random empty row and puts the data into that row. The current issue is that the SELECT and the UPDATE can happen at the same time for multiple users and may overwrite data written concurrently.
How can I structure this code (with a transaction or otherwise) to ensure concurrent use has no bad effects?
Thank you
You could do it all in one query:
UPDATE database AS d
SET d.userId = $userId
WHERE d.userId = ''
ORDER BY RAND()
LIMIT 1
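A rough sketch of running that from PHP with mysqli instead of the deprecated mysql_* functions; the table and column names are taken from the question, and get_result() requires the mysqlnd driver:
// The UPDATE is a single atomic statement, so two users cannot claim the same row.
$stmt = $mysqli->prepare(
    "UPDATE `database`
     SET userId = ?
     WHERE userId = ''
     ORDER BY RAND()
     LIMIT 1"
);
$stmt->bind_param('s', $userId);
$stmt->execute();

if ($stmt->affected_rows === 1) {
    // Look up which number was just assigned to this user.
    $sel = $mysqli->prepare("SELECT number FROM `database` WHERE userId = ?");
    $sel->bind_param('s', $userId);
    $sel->execute();
    $row = $sel->get_result()->fetch_assoc();
    $number = $row['number'];
}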

Show a row only 100 times in PHP

How can I limit how often the results are shown? I need to limit it to 100 views.
In DB I have:
ID|NAME|PAGE|COUNT|DATE
In COUNT I want to count up to 100 and then stop showing that ID. I could do it with count < 100 and then update the specific ID. I can get the records with fewer than 100 views, but I couldn't manage to update the count for the specific ID.
A row is shown with this PHP code:
foreach ($bannerGroups[0] as $ban) {
    echo '<li class="right1">' . $ban->html() . '</li>';
}
But I just don't know where to put the update in there. I tried, but all I managed was to update only one ID, while the page shows 4 of them and randomizes them on refresh. So I don't know what to do.
Also, I would like to say I am only learning PHP. Sorry for all the mess.
Code at http://pastebin.com/A9hJTPLE
If I understand correctly, you want to show all banners that have previously been displayed fewer than 100 times?
If that's right, you can just add that to your WHERE clause:
$bannerResult = mysql_query("SELECT * FROM table WHERE page='cat' AND `COUNT` < 100");
To update them all, you can either run a query while displaying each individual banner, or "record" the id of each and run a single query at the end, like:
$ids = array();
foreach ($bannerGroups[0] as $ban) {
    $ids[] = $ban['ID']; // record the ID; don't know how the Banner
                         // class works, assuming array access; maybe an ID() method?
    echo '<li class="right1">' . $ban->html() . '</li>';
}
...
mysql_query('UPDATE table SET `COUNT` = `COUNT` + 1 WHERE ID IN (' . join(',', $ids) . ')');
UPDATE:
Based on a comment, your Banner class doesn't have a method to retrieve an individual banner's ID. In this case, you can record the ID values while you're building your banners array:
$ids = array();
while ($row = mysql_fetch_assoc($bannerResult)) {
    $banners[] = new Banner($row);
    $ids[] = $row['ID']; // record the ID
}
// update the `count` on each record:
mysql_query('UPDATE table SET `COUNT` = `COUNT` + 1 WHERE ID IN (' . join(',', $ids) . ')');
Sorry, but I got your question wrong at first...
First you have to add a new SQL column like "viewcount" to the table.
On every read you have to increment the value in viewcount.
For that behaviour (because MySQL does not allow sub-selects in an UPDATE clause on the same table), you have to fetch the results from the DB as you already do, and push all the primary keys of the fetched records into an array.
After the view logic you have to fire off a query like:
UPDATE foo SET viewcount = viewcount + 1 WHERE id IN (1,2,3,4,5,6...,100);
where the IN clause can easily be generated from your primary-keys array with implode(',', $arr);
Hope this helps.
$bannerResult = mysql_query("SELECT * FROM table WHERE page='cat' AND `count`<100");
#newfurniturey figured it out. In each foreach($bannerGroups...) I added $ids = $ban->getValue('id'); and then mysql_query("UPDATE dataa SET COUNT = COUNT + 1 WHERE id = '$ids'");. But is there any way to update them all with only one query? Also, if an ID has already been shown 100 times I get "Warning: Invalid argument supplied for foreach() in". Any idea how to fix that? I have 4 IDs in the DB; if one of them already has 100 views (count), I get the error!
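On the foreach() warning: it usually means $bannerGroups[0] is not an array, which can happen here when the banner query matches no rows (e.g. every banner has reached 100 views). A defensive check, sketched with the $bannerGroups variable from the question:
if (!empty($bannerGroups[0]) && is_array($bannerGroups[0])) {
    foreach ($bannerGroups[0] as $ban) {
        echo '<li class="right1">' . $ban->html() . '</li>';
    }
} else {
    // No banners left with fewer than 100 views; show nothing or a fallback.
}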
Try limiting your data source to 100 items.
It's like OFFSET x LIMIT 100 in a MySQL/PostgreSQL query, or TOP 100 in MSSQL.
