Generating unique combinations without running out of memory in PHP

I am writing an algorithm to generate combinations of items from a database. They need to be unique combinations, not permutations (i.e. 145, 156 == 156, 145). The problem I am running into is how to keep track of previous combinations so that I do not end up with both 145, 156 and 156, 145.
Currently I am adding them to an array with an index of id1_id2... (sorted so IDs always run lowest to highest) and setting the value to 1 when a combo is generated, so that I can check whether $combos[$index] exists. If it does not exist, I create it. (There are other criteria to weed out permutations, but they are irrelevant here.) Once these combinations are generated, they are stored in a table in MySQL.
The problem I am running into is that with the test items I'm using (about 85), I cannot generate combinations of more than 3 items (id1_id2_id3) without running out of memory, as the number of combinations is massive and the $combos array takes up more than the 64M I am allotted in PHP memory.
Is there a way I can do this a) without keeping track of previous combos, or b) by skipping the $combos array route entirely, only adding unique rows to MySQL and letting MySQL handle the duplicate checking?
Here is some pseudo code for reference:
$items = array(/* 85 items */);
$combos = array();
foreach ($items as $item1) {
    generate(array($item1));
    foreach ($items as $item2) {
        generate(array($item1, $item2));
    }
}
function generate($items_array) {
    global $combos;
    $temp_array = array();
    foreach ($items_array as $item) {
        $temp_array[] = $item['id'];
    }
    sort($temp_array);
    $index = implode("_", $temp_array);
    if (!isset($combos[$index])) {
        $combos[$index] = 1;
        /* some code to generate query to store to db */
    }
}
The query ends up looking like this (the database is truncated at the beginning of the script):
INSERT INTO `combos` (combo_id, more_info) VALUES ('id1_id2', 'Item Name');
In the process of writing this question, I thought of a possible solution: making sure id3 > id2 > id1. Would this be a viable way to remove the need for $combos?
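For illustration, a minimal sketch of that idea (assuming $items is sorted by id): iterating with ascending indexes guarantees id1 < id2 < id3, so each combination is produced exactly once and $combos becomes unnecessary.
// Sketch only: ascending indexes mean every combination comes out pre-sorted, once.
$n = count($items);
for ($i = 0; $i < $n; $i++) {
    for ($j = $i + 1; $j < $n; $j++) {
        for ($k = $j + 1; $k < $n; $k++) {
            $index = $items[$i]['id'] . '_' . $items[$j]['id'] . '_' . $items[$k]['id'];
            // build and run the INSERT for $index here
        }
    }
}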

The reason I asked earlier about the data structure is that you could do something like this:
$sql = "SELECT id FROM test_a";
$result = mysql_query($sql);
while ($row = mysql_fetch_array($result)) {
$item1 = $row['id'];
$sql2 = "SELECT id FROM test_a";
$result2 = mysql_query($sql2);
while ($row2 = mysql_fetch_array($result2)) {
$item2 = $row2['id'];
$combo1 = $item1 . "_" . $item2;
$combo2 = $item2 . "_" . $item1;
$sql3 = "SELECT * FROM combos WHERE combo_id = '$combo1' OR combo_id = '$combo2'";
$result3 = mysql_query($sql3);
if (mysql_num_rows($result3) == 0) {
$sql4 = "INSERT INTO combos (combo_id, more_info) VALUES ('$combo1','Item Name')";
$result4 = mysql_query($sql4);
}
}
}
When table test_a has the values 1, 2, 3, and 4, this script inserts:
1_1
1_2
1_3
1_4
2_2
2_3
2_4
3_3
3_4
4_4
This shouldn't have any memory problems, although if you have a huge database you may run into an issue with PHP's time limit.

Here is the same concept as my other answer, but in an all-SQL format.
INSERT INTO combos (combo_id, more_info)
SELECT CONCAT_WS("_",t1.id,t2.id), "item_name"
FROM test_a t1, test_a t2
WHERE NOT EXISTS (SELECT * FROM combos WHERE combo_id = CONCAT_WS("_",t1.id,t2.id))
AND NOT EXISTS (SELECT * FROM combos WHERE combo_id = CONCAT_WS("_",t2.id,t1.id))
Assuming you can get item_name from the db somewhere, this will probably be your fastest and least memory-intensive solution. I am running a test on around 1000 IDs at the moment; I'll update this when it finishes.
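An untested variant of the same idea: enforcing an ordering on the pair directly means only one NOT EXISTS probe per candidate is needed (or none at all on a freshly truncated table).
INSERT INTO combos (combo_id, more_info)
SELECT CONCAT_WS('_', t1.id, t2.id), 'item_name'
FROM test_a t1, test_a t2
WHERE t1.id <= t2.id
AND NOT EXISTS (SELECT * FROM combos WHERE combo_id = CONCAT_WS('_', t1.id, t2.id));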

Yes. You can store and use the lexicographical index of the combination to reconstruct/iterate them, or Gray codes if you need to iterate over all of them.
Take a look at: "Algorithm 515: Generation of a Vector from the Lexicographical Index"; Buckles, B. P., and Lybanon, M. ACM Transactions on Mathematical Software, Vol. 3, No. 2, June 1977.
I've translated it into C here, and describe it more here.
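For illustration, a rough PHP sketch of lexicographic unranking; this is the same idea as Algorithm 515, though not a direct translation of the paper's routine:
// Number of k-subsets of an n-set (n choose k).
function binomial($n, $k) {
    if ($k < 0 || $k > $n) return 0;
    $result = 1;
    for ($i = 1; $i <= $k; $i++) {
        $result = $result * ($n - $k + $i) / $i;
    }
    return (int) round($result);
}
// Return the $rank-th (0-based) k-combination of {0..n-1} in lexicographic order.
function unrankCombination($n, $k, $rank) {
    $combo = array();
    $x = 0;
    for ($i = 0; $i < $k; $i++) {
        // Skip whole blocks of combinations whose element at position $i is below $x.
        while (binomial($n - $x - 1, $k - $i - 1) <= $rank) {
            $rank -= binomial($n - $x - 1, $k - $i - 1);
            $x++;
        }
        $combo[] = $x;
        $x++;
    }
    return $combo;
}
print_r(unrankCombination(85, 3, 0));     // [0, 1, 2]
print_r(unrankCombination(85, 3, 98769)); // [82, 83, 84], the last of C(85,3) = 98770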

If you don't need to enforce referential integrity automatically (which you're not if you use string concatenation), use one table for the 85 items, give them each an index (0-84), and use a second table to represent a given set of items, using a numeric datatype where each bit position in the number represents one item. (e.g. 000001101 represents items 0, 2, and 3)
For more than 64 items you may have to split them across more than one field, or use a BLOB or a string (gack!).
If you use this as a primary key field, you can enforce non-duplicates.
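As a hedged illustration of the bitmask idea (it assumes PHP's GMP extension, since 85 items won't fit in a native 64-bit integer):
// Encode a combination of item indexes (0-84) as one arbitrary-precision bitmask.
function comboToMask(array $indexes) {
    $mask = gmp_init(0);
    foreach ($indexes as $i) {
        gmp_setbit($mask, $i);
    }
    return gmp_strval($mask); // decimal string, usable as a primary key
}
echo comboToMask(array(0, 2, 3)); // "13" (binary 1101 = items 0, 2, 3)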

In T-SQL you can use a recursive CTE. Can't remember where I got it, but it's pretty sweet. Note that MySQL didn't get the WITH option until version 8.0, so this won't work on older MySQL.
WITH Numbers(N) AS (
    SELECT N
    FROM (VALUES (1), (2), (3), (4), (5), (6)) Numbers(N)),
Recur(N, Combination) AS (
    SELECT N, CAST(N AS VARCHAR(20))
    FROM Numbers
    UNION ALL
    SELECT n.N, CAST(r.Combination + ',' + CAST(n.N AS VARCHAR(10)) AS VARCHAR(20))
    FROM Recur r
    INNER JOIN Numbers n ON n.N > r.N)
SELECT Combination
FROM Recur
ORDER BY LEN(Combination), Combination;

To increase the memory limit, change:
memory_limit = 512M in your php.ini
or
ini_set('memory_limit', '512M'); in your PHP script
or
php_value memory_limit 512M in your .htaccess

Related

MySQL select multiple fields / columns from the table's row that are not empty

I have 12 columns in the code below, and I want the best, correct, and short way to select only the fields that are not empty/0 for a user's order (a row) in the table. (All fields are INT and the numbers are less than 500.)
$conn = mysqli_connect("localhost", "root", "", "table_name") or die("error");
$sql = "SELECT dqnt_91,deal_91,dqnt_92,deal_92,dqnt_93,deal_93,dqnt_94,deal_94,dqnt_95,deal_95,dqnt_96,deal_96 FROM table_name WHERE fid='$userid' AND COLUMNS ARE NOT EMPTY ";
$result = mysqli_query($conn, $sql);
while ($row = mysqli_fetch_array($result, MYSQLI_ASSOC)) {
    $d_91 = $row['deal_91']; $d_92 = $row['deal_92']; $d_93 = $row['deal_93'];
    $d_94 = $row['deal_94']; $d_95 = $row['deal_95']; $d_96 = $row['deal_96'];
    $qnt_91 = $row['dqnt_91']; $qnt_92 = $row['dqnt_92']; $qnt_93 = $row['dqnt_93'];
    $qnt_94 = $row['dqnt_94']; $qnt_95 = $row['dqnt_95']; $qnt_96 = $row['dqnt_96'];
}
echo 'You have selected: '.$d_91.' for '.$qnt_91.'<br>';
echo 'You have selected: '.$d_92.' for '.$qnt_92.'<br>';
echo 'You have selected: '.$d_96.' for '.$qnt_96.'<br>';
Again, I want to echo out only those fields that are not zero (0) or empty; if they have a zero value, don't show or echo them!
The code below can work, but because I have 12 columns and may add more, it is not handy:
SELECT [ all 12 x columns_name] FROM table_name WHERE table_name.column_name1!='' AND table_name.column_mame2!='' AND x another 10 times the last code ;
Thanks,
Short of modifying the data structure, you won't do much better in SQL than listing the 12 columns each with its own condition; and anything shorter will come at a cost of clarity, so personally I'd advise against it. But if brevity is more important than clarity for your application:
If you can add a view, it could provide a simple flag for you to test against.
create view my_view as (select -- each column
, case when -- all values are present
then 1 else 0 end nonempty_flg
from -- ...)
Then at least the complexity is hidden away from the calling app. If that's not an option:
You state the columns are small integer values. If you can further assume they're non-negative, then you can sum them up and compare against 0. If they might be negative, you could sum their absolute values.
In your examples the columns are treated as strings. If they are in fact CHAR or VARCHAR, you could concat them all and check the result against empty. You might have to coalesce null values to '' as you go.
But it sounds like what you really want is something that works without the query knowing in advance the set of columns; that's not doable in SQL. I suppose you could write a function/method/procedure in your calling program to generate the query based on examining the table columns in the catalog.
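If you go that route, a hedged sketch of the idea (mysqli plus information_schema; the database name, table name, and column-name pattern here are assumptions):
$conn = mysqli_connect("localhost", "root", "", "your_db") or die("error");
// Discover the deal_/dqnt_ columns from the catalog instead of hard-coding them.
$cols = array();
$res = mysqli_query($conn,
    "SELECT COLUMN_NAME FROM information_schema.COLUMNS
     WHERE TABLE_SCHEMA = 'your_db' AND TABLE_NAME = 'table_name'
       AND (COLUMN_NAME LIKE 'deal\_%' OR COLUMN_NAME LIKE 'dqnt\_%')");
while ($row = mysqli_fetch_assoc($res)) {
    $cols[] = $row['COLUMN_NAME'];
}
// Build one <> 0 condition per discovered column.
$conditions = array();
foreach ($cols as $c) {
    $conditions[] = "`$c` <> 0";
}
$sql = "SELECT " . implode(",", $cols)
     . " FROM table_name WHERE fid = '" . (int) $userid . "' AND "
     . implode(" AND ", $conditions);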

Generate a million unique random 12 digit numbers

I need to generate close to a million (100 batches of 10,000 numbers) unique and random 12-digit codes for a scratch-card application. This process will be repeated, and an equal number of codes will need to be generated every time.
Also, the generated codes need to be entered into a db so they can be verified later when a consumer enters one on my website. I am using PHP and MySQL to do this. These are the steps I am following:
a. Get admin input on the number of batches and the codes per batch.
b. Using a for loop, generate each code using mt_rand(100000000000,999999999999).
c. Every time a number is generated, check whether a duplicate exists in the db; if not, add it to the results variable, else regenerate.
d. Save the generated number in the db if unique.
e. Repeat b, c, and d over the required number of codes.
f. Output the codes to the admin in a CSV.
Code used (most comments removed to make it less verbose, and because I have already explained the steps above):
$totalLabels = $numBatch * $numLabelsPerBatch;
// file name for download
$fileName = $customerName . "_scratchcodes_" . date('Ymdhs') . ".csv";
$flag = false;
$generatedCodeInfo = array();
// headers for download
header("Content-Disposition: attachment; filename=\"$fileName\"");
header("Content-Type: application/vnd.ms-excel");
$codeObject = new Codes();
// get new batch number
$batchNumber = $codeObject->getLastBatchNumber() + 1;
$random = array();
for ($i = 0; $i < $totalLabels; $i++) {
    do {
        // need to optimize this to reduce collisions, given the database will grow
        $random[$i] = mt_rand(100000000000, 999999999999);
    } while (isCodeNotUnique($random[$i], $db));
    $codeObject = new Codes();
    $codeObject->UID = $random[$i];
    $codeObject->customerName = $customerName;
    $codeObject->batchNumber = $batchNumber;
    $generatedCodeInfo[$i] = $codeObject->addCode();
    // change batch number for next batch
    if ($i == ($numLabelsPerBatch - 1)) { $batchNumber++; }
    // $generatedCodeInfo[$i] = array("UID" => 10001, "OID" => $random[$i]);
    if (!$flag) {
        // display column names as first row
        echo implode("\t", array_keys($generatedCodeInfo[$i])) . "\n";
        $flag = true;
    }
    // filter data
    array_walk($generatedCodeInfo[$i], 'filterData');
    echo implode("\t", array_values($generatedCodeInfo[$i])) . "\n";
}
function filterData(&$str)
{
    $str = preg_replace("/\t/", "\\t", $str);
    $str = preg_replace("/\r?\n/", "\\n", $str);
    if (strstr($str, '"')) $str = '"' . str_replace('"', '""', $str) . '"';
}
function isCodeNotUnique($random, $db)
{
    $codeObject = new Codes();
    $codeObject->UID = $random;
    if (!empty($codeObject->getCodeByUID())) {
        return true;
    }
    return false;
}
Now this is taking really long to execute, and I believe it is not optimal.
1. How can I optimize this so that the unique random numbers are generated quickly?
2. Would it be faster if the numbers were generated in MySQL (or some other way) rather than PHP, and if so, how do I do that?
3. When the db starts growing, the duplicate check in step c will be really time consuming, so how do I avoid that?
4. Is there a limit on the number of rows in MySQL?
Note: the numbers need to be unique across all batches for the lifetime of the application.
1) Divide your range of numbers into smaller ranges based on the number of batches. E.g. if your range is 0 - 1000 and you have 10 batches, then one batch gets 0 - 99, the next 100 - 199, etc. When you generate the numbers for a batch, only generate random numbers from that batch's range. This way you know you can only have duplicate numbers within a batch.
Do not insert each number into the database individually; store them in an array instead. When you generate a new random number, check it against the array, not the database, using the in_array() function (see the sketch after this list). When the batch is complete, use a single insert statement to insert the contents of the batch:
insert into yourtable (bignumber) values (1), (2), ..., (n)
Check MySQL's max_allowed_packet setting to see if it is able to receive the complete SQL statement in one go.
Implement a fallback plan in case a duplicate value is still found during the insert (error handling and number regeneration).
2) MySQL is not that great at procedural stuff, so I would stick with an external language, such as PHP.
3) Add a unique index on the field containing the random numbers. If you try to insert a duplicate record, MySQL will prevent it and throw an error. It is really quick.
4) Depending on the actual table engine used (InnoDB, MyISAM, etc.), its configuration, and the OS, certain limits may apply to the size of the table. See the "Maximum number of records in a MySQL database table" question here on SO for a more detailed answer (check the most upvoted answer, not the accepted one).
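Here is a hedged sketch of points 1 and 3 combined (the names are illustrative; it assumes mysqli in $conn, a codes table with a unique uid column, and 64-bit PHP so mt_rand() can cover this range). It uses array keys instead of in_array() for the in-memory duplicate check, which is much faster:
$rangeSize = intdiv(900000000000, $numBatch); // carve 1e11..1e12-1 into per-batch ranges
for ($b = 0; $b < $numBatch; $b++) {
    $lo = 100000000000 + $b * $rangeSize;
    $hi = $lo + $rangeSize - 1;
    $batch = array();
    while (count($batch) < $numLabelsPerBatch) {
        $batch[mt_rand($lo, $hi)] = true; // duplicate keys simply overwrite
    }
    // One multi-row INSERT per batch; the unique index (point 3) is the safety net.
    $values = "(" . implode("),(", array_keys($batch)) . ")";
    mysqli_query($conn, "INSERT IGNORE INTO codes (uid) VALUES $values");
}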
You can do the following:
$random = getExistingCodes(); // Get what you already have (from the DB).
$random = array_flip($random); // Make them into keys
$existingCount = count($random); // The codes you already have
do {
    $random[mt_rand(100000000000, 999999999999)] = 1;
} while ((count($random) - $existingCount) < $totalLabels);
$random = array_keys($random);
When you generate a duplicate number it will just overwrite that key and not increase the count.
To insert you can start a transaction and do as many inserts as needed. MySQL will try to optimize all operations within a single transaction.
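For example (a sketch, assuming a PDO connection in $pdo and the same illustrative codes table):
$pdo->beginTransaction();
$stmt = $pdo->prepare("INSERT INTO codes (uid) VALUES (?)");
foreach ($random as $code) {
    $stmt->execute(array($code)); // all inserts commit as one unit of work
}
$pdo->commit();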
Here is a query that generates 1 million pseudo-random numbers without repetitions:
select cast( (@n := (13*@n + 97) % 899999999981) + 1e11 as char(12)) as num
from (select @n := floor(rand() * 9e11)) init,
(select 1 union select 2) m01,
(select 1 union select 2) m02,
(select 1 union select 2) m03,
(select 1 union select 2) m04,
(select 1 union select 2) m05,
(select 1 union select 2) m06,
(select 1 union select 2) m07,
(select 1 union select 2) m08,
(select 1 union select 2) m09,
(select 1 union select 2) m10,
(select 1 union select 2) m11,
(select 1 union select 2) m12,
(select 1 union select 2) m13,
(select 1 union select 2) m14,
(select 1 union select 2) m15,
(select 1 union select 2) m16,
(select 1 union select 2) m17,
(select 1 union select 2) m18,
(select 1 union select 2) m19,
(select 1 union select 2) m20
limit 1000000;
How it works
It starts by generating a random integer value n with 0 <= n < 900000000000. This number functions as the seed for the generated sequence:
@n := floor(rand() * 9e11)
Through multiple (20) joins with inline pairs of records, this single record is multiplied into 2^20 copies, which is just a bit over 1 million.
Then the selection starts, and as record after record is fetched, the value of the @n variable is modified according to this incremental formula:
@n := (13*@n + 97) % 899999999981
This formula is a linear congruential generator. The three constant numbers need to obey some rules to maximise the period (of non-repetition), but it is easiest when 899999999981 is prime, which it is. In that case we have a period of 899999999981, meaning that the first 899999999981 generated numbers will be unique (and we need far fewer). This number is in fact the largest prime below 900000000000.
As a final step, 100000000000 is added to the number to ensure it always has 12 digits, excluding numbers smaller than 100000000000. Because of the choice of 899999999981, there are 19 numbers that will never be generated, namely those between 999999999981 and 999999999999 inclusive.
As this generates 2^20 records, the limit clause will make sure this is chopped off at exactly one million records.
The cast to char(12) is optional, but may be necessary to visualise the 12-digit numbers without them being rendered on the screen in scientific notation. If you will use this to insert records, and the target data type is numeric, then you would leave out this conversion of course.
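If you want to convince yourself of the period claim, here is a quick PHP sanity check of the same generator (a sketch; it assumes 64-bit PHP, and holding a million keys costs a fair amount of memory):
$n = mt_rand(0, 899999999980); // seed, analogous to FLOOR(RAND() * 9e11)
$seen = array();
$unique = true;
for ($i = 0; $i < 1000000; $i++) {
    $n = (13 * $n + 97) % 899999999981; // the same linear congruential step
    if (isset($seen[$n])) {
        $unique = false;
        break;
    }
    $seen[$n] = true;
}
echo $unique ? "no repeats in the first million values\n"
             : "repeat found after $i values\n";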
CREATE TABLE x (v BIGINT(12) ZEROFILL NOT NULL PRIMARY KEY);
INSERT IGNORE INTO x (v) VALUES
(FLOOR(1e12*RAND())), (FLOOR(1e12*RAND())), (FLOOR(1e12*RAND())),
(FLOOR(1e12*RAND())), (FLOOR(1e12*RAND())), (FLOOR(1e12*RAND())),
(FLOOR(1e12*RAND())), (FLOOR(1e12*RAND())), (FLOOR(1e12*RAND())),
(FLOOR(1e12*RAND())), (FLOOR(1e12*RAND())), (FLOOR(1e12*RAND())),
(FLOOR(1e12*RAND())), (FLOOR(1e12*RAND())), (FLOOR(1e12*RAND()));
Do that INSERT 1e6/15 times.
Check COUNT(*) to see if you have a million. Do this until the table has a million rows:
INSERT IGNORE INTO x (v) VALUES
(FLOOR(1e12*RAND()));
Notes:
ZEROFILL is assuming that you want the display to have leading zeros.
IGNORE is because there will be some number of duplicates. This avoids the costly check after each insert.
"Batch insert" is faster than one row at a time. (Doing 100 at a time is about optimal, but I am lazy.)
Potential problem: While I think the pattern of values for RAND() does not repeat at, say 2^16 or 2^32 values, I do not know for a fact. If you can't get to a million, then the random number generator is bad; you should switch to PHP's rand, or something else.
Beware of linear congruential random number generators. They are probably easily hacked. (I assume there is some "money" behind the scratch cards.)
Do not plan on mt_rand() being unique for small ranges
<?php
// Does mt_rand() repeat?
TryMT(100);
TryMT(100);
TryMT(1000);
TryMT(10000);
TryMT(1e6);
TryMT(1e8);
TryMT(1e10);
TryMT(1e12);
TryMT(1e14);
function TryMT($max) {
    $h = [];
    for ($j = 0; $j < $max; $j++) {
        $v = mt_rand(1, $max);
        if (isset($h[$v])) {
            echo "Dup after $j iterations (limit=$max)<br>\n";
            return;
        }
        $h[$v] = 1;
    }
}
Sample output:
Dup after 7 iterations (limit=100)<br>
Dup after 13 iterations (limit=100)<br>
Dup after 29 iterations (limit=1000)<br>
Dup after 253 iterations (limit=10000)<br>
Dup after 245 iterations (limit=1000000)<br>
Dup after 3407 iterations (limit=100000000)<br>
Dup after 29667 iterations (limit=10000000000)<br>
Dup after 82046 iterations (limit=1000000000000)<br>
Dup after 42603 iterations (limit=1.0E+14)<br>
mt_rand() is a "good" random number generator precisely because it does have dups.

how to fix error with mysql random

I have a project in PHP + MySQL (over 2,000,000 rows). Please look at this PHP code.
<?php
for ($i = 0; $i < 20; $i++) {
    $start = rand(1, 19980);
    $select_images_url_q = "SELECT * FROM photo_gen WHERE folder='$folder' LIMIT $start,2";
    $result_select = mysql_query($select_images_url_q);
    while ($row = mysql_fetch_array($result_select)) {
        echo '<li class="col-lg-2 col-md-3 col-sm-3 col-xs-4" style="height:150px">
                  <img class="img-responsive" src="http://static.gif.plus/' . $folder . '/' . $row['code'] . '_s.gif">
              </li>';
    }
}
?>
This code works very slowly at the $start = rand(1,19980); position. Please help: how can I make the select request with MySQL's random function? Thank you.
Depending on what your code is doing with $folder, you may be vulnerable to SQL injection.
For better security, consider moving to PDO or MySQLi and using prepared statements. I wrote a library called EasyDB to make it easier for developers to adopt better security practices.
The fast, sane, and efficient way to select N distinct random elements from a database is as follows:
Get the number of rows that match your condition (i.e. WHERE folder = ?).
Generate a random number between 0 and this number.
Select a row with a given offset like you did.
Store the ID of the previously generated row in an ever-growing list to exclude from the results, and decrement the number of rows.
An example that uses EasyDB is as follows:
// Connect to the database here:
$db = \ParagonIE\EasyDB\Factory::create(
'mysql:host=localhost;dbname=something',
'username',
'putastrongpasswordhere'
);
// Maintain an array of previous record IDs in $exclude
$exclude = array();
$count = $db->single('SELECT count(id) FROM photo_gen WHERE folder = ?', $folder);
// Select _up to_ 40 values. If we have less than 40 in the folder, stop
// when we've run out of photos to load:
$max = $count < 40 ? $count : 40;
// The loop:
for ($i = 0; $i < $max; ++$i) {
    // The maximum value will decrease each iteration, which makes
    // sense given that we are excluding one more result each time
    $r = mt_rand(0, ($count - $i - 1));
    // Dynamic query
    $qs = "SELECT * FROM photo_gen WHERE folder = ?";
    // We add AND id NOT IN (2,6,7,19, ...) to prevent duplicates:
    if ($i > 0) {
        $qs .= " AND id NOT IN (" . implode(', ', $exclude) . ")";
    }
    $qs .= " ORDER BY id ASC LIMIT " . $r . ", 1";
    $row = $db->row($qs, $folder);
    /**
     * Now you can operate on $row here. Feel free to copy the
     * contents of your while($row=...) loop in place of this comment.
     */
    // Prevent duplicates
    $exclude[] = (int) $row['id'];
}
Gordon's answer suggests using ORDER BY RAND(), which in general is a bad idea and can make your queries very slow. Furthermore, although he says that you shouldn't need to worry about there being less than 40 rows (presumably, because of the probability involved), this will fail in edge cases.
A quick note about mt_rand(): It's a biased and predictable random number generator with only 4 billion possible seeds. If you want better results, look into random_int() (PHP 7 only, but I'm working on a compatibility layer for PHP 5 projects. See the linked answer for more information.)
Actually, even though the table has 2+ million rows, I'm guessing that a given folder has many fewer. Hence, this should be reasonable with an index on photo_gen(folder):
SELECT *
FROM photo_gen
WHERE folder = '$folder'
ORDER BY rand()
LIMIT 40;
If a folder can still have tens or hundreds of thousands of examples, I would suggest a slight variation:
SELECT pg.*
FROM photo_gen pg cross join
     (select count(*) as cnt from photo_gen where folder = '$folder') c
WHERE folder = '$folder' and
      rand() < 500 / c.cnt
ORDER BY rand()
LIMIT 40;
The WHERE expression should get about 500 rows (subject to the vagaries of sample variation). There is a really high confidence that there will be at least 40 (you don't need to worry about it). The final sort should be fast.
There are definitely other methods, but they are complicated by the where clause. The index is probably the key thing you need for improved performance.
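For reference, that index would be created with something like this (the index name is illustrative):
CREATE INDEX idx_photo_gen_folder ON photo_gen (folder);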
It's better to compose your SQL query (as a string in PHP) once and then execute it just once.
Or you could use this way to select values if it fits your case: Select n random rows from SQL Server table

Optimizing for loop in php

I have been running a foreach loop 1000 times on a PHP page. The code inside the foreach loop looks like this:
$first = mysql_query("SELECT givenname FROM first_names order by rand() LIMIT 1");
$first_n = mysql_fetch_array($first);
$first_name = $first_n['givenname'];
$last = mysql_query("SELECT surname FROM last_name order by rand() LIMIT 1");
$last_n = mysql_fetch_array($last);
$last_name = $last_n['surname'];
$first_lastname = $first_name . " " . $last_name;
$add = mysql_query("SELECT streetaddress FROM user_addresss order by rand() LIMIT 1");
$addr = mysql_fetch_array($add);
$address = $addr['streetaddress'];
$unlisted = "unlisted";
$available = "available";
$arr = array(
$first_lastname,
$address,
$unlisted,
$available
);
Then I have been using the array_rand function to get a randomized value each time the loop runs:
<td><?php echo $arr[array_rand($arr)] ?></td>
So loading the PHP page is taking a really long time. Is there a way I could optimize this code? I need a unique value each time the loop runs.
The problem is not your PHP foreach loop. If you order your MySQL table by RAND(), you are making a serious mistake. Let me explain to you what happens when you do this.
Every time you make a MySQL request, MySQL will attempt to map your search parameters (WHERE, ORDER BY) to indices to cut down on the data read. It will then load the relevant info in memory for processing. If the info is too large, it will default to writing it to disk and reading from disk to perform the comparison. You want to avoid disk reads at all costs as they are inefficient, slow, repetitive and can sometimes be flat-out wrong under specific circumstances.
When MySQL finds an index that is possible to be used, it will load the index table instead. An index table is a hash table between memory location and the value of the index. So, for instance, the index table for a primary key looks like this:
id    location
1     0 bytes in
2     17 bytes in
3     34 bytes in
This is extremely efficient as even very large index tables can fit in tiny amounts of memory.
Why am I talking about indices? Because by using RAND(), you are preventing MySQL from using them. ORDER BY RAND() forces MySQL to create a new random value for each row. This requires MySQL to copy all your table data in what is called a temporary table, and to add a new field with the RAND() value. This table will be too big to store in memory, so it will be stored to disk.
When you tell MySQL to ORDER BY RAND(), and the table is created, MySQL will then compare every single row by pairs (MySQL sorting uses quicksort). Since the rows are too big, you're looking at quite a few disk reads for this operation. When it is done, it returns, and you get your data -at a huge cost.
There are plenty of ways to prevent this massive overhead SNAFU. One of them is to select an ID from RAND() up to the maximum index and limit by 1. This does not require the creation of an extra field. There are plenty of similar Stack questions.
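One common shape for that trick, as an untested sketch (it assumes ids are roughly contiguous; rows right after gaps get picked slightly more often, and the derived table makes RAND() evaluate only once):
SELECT f.*
FROM first_names f
JOIN (SELECT CEIL(RAND() * (SELECT MAX(id) FROM first_names)) AS rid) r
WHERE f.id >= r.rid
ORDER BY f.id ASC
LIMIT 1;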
It has already been explained why ORDER BY RAND() should be avoided, so I simply provide a way to do it with some faster queries.
First, get a random position based on your table size:
SELECT FLOOR(RAND()*COUNT(*)) FROM first_names
Second, use the random number in a LIMIT clause:
SELECT * FROM first_names LIMIT $pos,1
Unfortunately I don't think there is any way to combine the two queries into one.
Also, you can do a SELECT COUNT(*) FROM first_names once, store the number, and generate random $pos values in PHP as many times as you like.
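A sketch of that count-once variant, sticking with the question's mysql_* style:
$res = mysql_query("SELECT COUNT(*) FROM first_names");
$count = (int) mysql_result($res, 0);
for ($i = 0; $i < 1000; $i++) {
    $pos = mt_rand(0, $count - 1); // random offset, no ORDER BY RAND() needed
    $sel = mysql_query("SELECT givenname FROM first_names LIMIT $pos,1");
    $rec = mysql_fetch_array($sel);
    // ... use $rec['givenname']
}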
You should switch to using either mysqli or PDO if your host supports it, but something like this should work. You will have to decide what to do if you don't have enough records in one of the tables, though (array_pad, or wrap the indexes and restart).
function getRandomNames($qty) {
    $qty = (int) $qty;
    $fnames = array();
    $lnames = array();
    $address = array();
    $sel = mysql_query("SELECT givenname FROM first_names ORDER BY rand() LIMIT " . $qty);
    while ($rec = mysql_fetch_array($sel)) { $fnames[] = $rec[0]; }
    $sel = mysql_query("SELECT surname FROM last_name ORDER BY rand() LIMIT " . $qty);
    while ($rec = mysql_fetch_array($sel)) { $lnames[] = $rec[0]; }
    $sel = mysql_query("SELECT streetaddress FROM user_addresss ORDER BY rand() LIMIT " . $qty);
    while ($rec = mysql_fetch_array($sel)) { $address[] = $rec[0]; }
    // let's stitch the results together
    $results = array();
    for ($x = 0; $x < $qty; $x++) {
        $results[] = array(
            "given_name"    => $fnames[$x],
            "surname"       => $lnames[$x],
            "streetaddress" => $address[$x]
        );
    }
    return $results;
}
Hope this helps
UPDATE
Based on Sébastien Renauld's answer, a more complete solution may be to structure the queries more like:
"SELECT givenname from first_names where id in (select id from first_names order by rand() limit ".$qty.")";

GROUP BY give priority to in MySQL

I have the following query.
$query_assignments = "SELECT * FROM tb_scheduler_assignments
WHERE company_id = '".$company_id."' OR
dept_id = '".$dept_id."' OR
user_id = '".$user_id."' ORDER BY
due_date GROUP BY purchase_id";
What I'd like is a single query solution that would keep the results for user_id over dept_id and dept_id over company_id.
For example:
if the same purchase_id occurs for rows that were gotten via dept_id and user_id, then I only want the result for the user_id;
if the same purchase_id occurs for rows that were gotten via company_id and user_id, then I only want the result for the user_id.
First, you're interpolating variables into your SQL, which suggests you might be vulnerable to SQL injection; do make sure. PHP offers prepared statements and escaping functions.
Second, your SQL statement won't compile, because you're using GROUP BY on one column while selecting *, which includes at least three more columns.
Third, it sounds like you're misunderstanding SQL in thinking that a query such as you're trying to formulate (without UNION ALL) might retrieve duplicate rows, i.e. the same row multiple times because it matches multiple criteria. This is not so.
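On the first point, a hedged sketch of parameter binding for a query of this shape, using mysqli prepared statements (the GROUP BY problem is left aside here):
$stmt = $mysqli->prepare(
    "SELECT * FROM tb_scheduler_assignments
     WHERE company_id = ? OR dept_id = ? OR user_id = ?
     ORDER BY due_date"
);
$stmt->bind_param("iii", $company_id, $dept_id, $user_id);
$stmt->execute();
$result = $stmt->get_result(); // then fetch rows as usual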
The "single query" solution that I was looking for doesn't seem to exist, or if it does, it would be way slower than just handling all the sorting in php.
So, I ran 3 separate queries, put each of them into arrays, and then in order to put them all into a final array with the hierarchy that I needed, I did the loops below to see if the purchaseID existed for the levels up the hierarchy. If it didn't, then I put it in to the array.
$finalArray = array();
foreach ($companyArray as $purchaseID => $companyData) {
if (empty($deptArray[$purchaseID]) && empty($userArray[$purchaseID])) {
$finalArray[] = $companyData;
}
}
foreach ($deptArray as $purchaseID => $deptData) {
if (empty($userArray[$purchaseID])) {
$finalArray[] = $deptData;
}
}
foreach ($userArray as $purchaseID => $userData) {
$finalArray[] = $userData;
}
Then I can sort that array however I want and loop through that to echo what I need to.
Not sure if that's the best way, but it worked well and is lightning fast for me.
$query_assignments = "SELECT *,
IF(user_id = {$user_id}, 30,
IF(dept_id = {$dept_id}, 20,
IF(company_id = {$company_id}, 10, 0)
)
) as priority
FROM tb_scheduler_assignments
WHERE company_id = {$company_id} OR
dept_id = {$dept_id} OR
user_id = {$user_id}
GROUP BY purchase_id
ORDER BY due_date, priority DESC";
You can make a virtual field with an IF statement:
user_id: 30 pts
dept_id: 20 pts
company_id: 10 pts
else: 0 pts
WARNING: it cannot be indexed!
Syntax fix: GROUP BY and ORDER BY reordered.
