How to rewrite a PHP script to JavaScript - php

I have an Express server running on Node.js, and I'm using SQL syntax to query data from my MySQL database.
The query is to get all members under the current user ID, then the members under each of its children, and so on, to build a genealogy tree for a user.
I have a PHP script for this query, which worked fine.
This is the PHP script:
$memory = $_GET['memory'];
echo '<div style="background:#ccc;display:flex;width:50px;height:50px;justify-content:center;align-items:center;border-radius:50%;">'.$memory.'</div>';
$transend = array($memory);
$step = 1;
for ($i=0; $i < $step; $i++) {
    $under = $transend[$i];
    $get = $con->query("SELECT * FROM multilevel WHERE `under`='$under'");
    $cnt = $get->num_rows;
    if ($cnt>0) {
        for ($g=0; $g < $cnt; $g++) {
            $fet = $get->fetch_object();
            $id = $fet->id;
            $position = $fet->position;
            array_push($transend, $id);
            $step = $step*2;
            echo '<div style="background:#ccc;display:flex;width:50px;height:50px;justify-content:center;align-items:center;border-radius:50%;">'.$id.'-'.$position.'</div>';
        }
    } else {
        break;
    }
}
I am trying to rewrite it in Node.js using the same SQL. This is my Node.js code:
A Tree constructor:
const Tree = function (binary) {
  this.position = binary.position;
  this.level = binary.level;
  this.user_id = binary.user_id;
  this.under_user_id = binary.under_user_id;
  this.brought_by = binary.brought_by;
  this.username = binary.username;
};
A method to get all binaries for a user:
Tree.findBinaries = async (id, result) => {
  const theId = parseInt(id);
  let transend = [theId];
  let step = 1;
  let data = [];
  for (let i = 0; i < step; i++) {
    const under = transend[i];
    connection.query(
      `SELECT * FROM multilevel WHERE under = ?`,
      under,
      (err, res) => {
        if (err) {
          result(null, err);
          return;
        }
        if (res.length) {
          for (let index = 0; index < res.length; index++) {
            const { id } = res[index];
            // push id
            transend.push(id);
            data.push(res[index]);
            step = step * 2;
          }
          result(null, data);
        } else {
          result(null, "no data found");
        }
      }
    );
  }
};
The transend.push(id) and data.push(res[index]) calls are not pushing to the top-level arrays, and step is not being updated, so the loop runs just once.
The expected result is 16 items of children and children of children, but it currently returns only the first level. I don't know what I'm doing wrong.
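The callback you pass to connection.query runs asynchronously, so the for loop finishes (with step still 1) before any rows come back, and transend/data are only filled after result() has already fired for the first level. One way around this is to make the query awaitable and let the growing queue drive the loop. A rough sketch, using util.promisify around the existing connection instead of the result callback (untested against your schema, so treat it as a starting point rather than a drop-in fix):

const { promisify } = require("util");
// assumes `connection` is the same mysql connection used above
const query = promisify(connection.query).bind(connection);

Tree.findBinaries = async (id) => {
  const transend = [parseInt(id, 10)];
  const data = [];
  // transend grows as children are found, so the loop keeps visiting
  // newly discovered ids until no more children turn up
  for (let i = 0; i < transend.length; i++) {
    const rows = await query("SELECT * FROM multilevel WHERE under = ?", [transend[i]]);
    for (const row of rows) {
      transend.push(row.id);
      data.push(row);
    }
  }
  return data.length ? data : "no data found";
};

Because the loop condition re-reads transend.length on every pass, there is no need for the step doubling from the PHP version, and the whole tree (children and children of children) is collected before anything is returned.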

Related

Google cloud task divide into subtasks by firebase data

I have a Google Cloud Task which gets all companies from my Firebase database. I then go through each of those companies in a loop and call an additional task to update each specific company. My problem is that the number of companies keeps increasing, and with a foreach like this I can run into memory limit issues. Here is the actual code for calling the tasks and subtasks:
$router->get('companies', function () use ($router) {
    $slackDataHelpersService = new \App\Services\SlackDataHelpersService();
    $companiesDocuments = $slackDataHelpersService->getCompanies();
    foreach ($companiesDocuments->documents() as $document) {
        $cid = $document->id();
        createTask('companies', 'updateCompany', "{$cid}");
    }
    return res(200, 'Task done');
});
How can I separate my initial companies documents into chunks and call a task for each of those chunks? For example, a task that goes through every 100 documents instead of the whole list?
Here is what I tried without success (I used members in this case):
$router->get('test2', function () use ($router) {
    $db = app('firebase.firestore')->database();
    $membersRef = $db->collection('companies')->document('slack-T01L7H2NDPB')->collection('members');
    $query = $membersRef->orderBy('created', 'desc')->limit(10);
    $perPage = 10;
    $batchCount = 10;
    $lastCreated = null;
    while ($batchCount == $perPage) {
        $loopQuery = clone $query;
        if ($lastCreated != null) {
            $loopQuery->startAfter($lastCreated);
        }
        $docs = $loopQuery->documents();
        $docsRows = $docs->rows();
        $batchCount = count($docsRows);
        if ($batchCount > 1) {
            $lastCreated = $docsRows[$batchCount - 1];
        }
        echo $lastCreated['created'];
        //createTasksByDocs($docs);
    }
    //return res(200, 'Task done');
});
I ended up making a function which uses a while loop and loops until it reaches the limit:
function paginateCollections($ref, $limit, $functionName)
{
    $query = $ref->orderBy('created', 'desc')->limit($limit);
    $perPage = $limit;
    $batchCount = $limit;
    $lastCreated = null;
    while ($batchCount == $perPage) {
        $loopQuery = clone $query;
        if ($lastCreated != null) {
            $loopQuery = $loopQuery->startAfter([$lastCreated]);
        }
        $docs = $loopQuery->documents();
        $docsRows = $docs->rows();
        $batchCount = count($docsRows);
        if ($batchCount > 1) {
            $lastCreated = $docsRows[$batchCount - 1]['created'];
        }
        if (function_exists($functionName)) {
            $functionName($docs);
        }
    }
}
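For reference, a hypothetical call for the members collection from the snippets above might look like this (the chunk size of 100 and the createTasksByDocs callback are taken from the question's own code, not tested here):

$db = app('firebase.firestore')->database();
$membersRef = $db->collection('companies')->document('slack-T01L7H2NDPB')->collection('members');
// process the collection 100 documents at a time, handing each batch to createTasksByDocs
paginateCollections($membersRef, 100, 'createTasksByDocs');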

MariaDB synchronous UPDATE query

I'm working with PHP and MariaDB and I've run into a problem.
I update a value on multiple rows, and then SELECT those rows to recalculate the data for another task.
The problem is that I get the wrong number. My guess is that MariaDB has not finished the UPDATE query but returns the finished flag to PHP, and PHP then proceeds with the SELECT query. [I'm just guessing.]
I'm open to any ideas. If I'm wrong, please correct me.
Thank you for sharing.
This is my code:
$modelAdminOrderBidSys = $this->load->model('Admin\Order\BidSys');
$acceptedItem = typeCast($modelAdminOrderBidSys->getItem($cartItemId));
if (!$acceptedItem) {
    return array(
        'result' => 'error',
        'message' => 'Cannot find item #' . $cartItemId
    );
}
$acceptedItem['lastOffer'] = $acceptedItem['offer'];
$acceptedItem['accepted'] = 1;
$acceptedItem['isBot'] = 0;
$modelAdminOrderBidSys->updateItem($cartItemId, array2object($acceptedItem));
$cartItems = typeCast($modelAdminOrderBidSys->getItems($acceptedItem['cartId']));
$accepted = 1;
$total = 0;
$offer = 0;
$lastOffer = 0;
foreach ($cartItems as $cartItem) {
    if ((int)$cartItem['accepted'] < 1) {
        $accepted = 0;
    }
    $total += (float)$cartItem['total'];
    $offer += (float)$cartItem['offer'];
    $lastOffer += (float)$cartItem['lastOffer'];
}
$postField = new \stdClass();
$postField->accepted = $accepted;
$postField->total = $total;
$postField->offer = $offer;
$postField->lastOffer = $lastOffer;
$modelAdminOrderBidSys->updateCart($acceptedItem['cartId'], $postField);
It sounds like your SELECT transaction starts before the UPDATE has committed. Try changing the transaction_isolation (in config) / tx_isolation (at runtime with SET GLOBAL) to READ-COMMITTED. Default is REPEATABLE-READ.
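For example (the variable name depends on the server version; newer MariaDB/MySQL releases use transaction_isolation, older ones tx_isolation), in the server config:

[mysqld]
transaction_isolation = READ-COMMITTED

or at runtime:

SET GLOBAL TRANSACTION ISOLATION LEVEL READ COMMITTED;
-- or only for the current connection:
SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED;

Note that a GLOBAL change only affects connections opened after it is made; existing connections keep their old isolation level.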

Same serial number is generated for different requests

I am inserting a serial number into a table; it should always increment by one, but when multiple requests come in at the same time, the same serial number is inserted for different requests. I am using a MySQL database.
I know I am fetching the max serial number too early in the code, so if requests arrive at the same time they will fetch the same serial number. Is it a good idea to update the serial number after all the work is done? And what if inserting a record for a new request and updating the serial number for the previous one happen at the same time?
public function add(){
    $session = $this->request->session();
    $company_id = $session->read('Admin.company_id');
    $emp_id = $session->read('Admin.emp_id');
    $user_email_id = $session->read('Admin.email_id');
    $employee_name = $session->read('Admin.employee_name');
    $conn = ConnectionManager::get('default');
    if ($this->request->is('post')) {
        try {
            $conn->begin();
            $department = $this->request->data['department'];
            $data = $this->request->data;
            if (!array_key_exists('is_requisition_for_contractor', $data)) {
                $is_requisition_for_contractor = 0;
            } else {
                $is_requisition_for_contractor = $data['is_requisition_for_contractor'];
            }
            if (!array_key_exists('is_requisition_for_employee', $data)) {
                $is_requisition_for_employee = 0;
            } else {
                $is_requisition_for_employee = $data['is_requisition_for_employee'];
            }
            if (!array_key_exists('is_boulder_requisition', $data)) {
                $is_requisition_for_boulder = 0;
            } else {
                if ($data['is_boulder_requisition'] == '') {
                    $is_requisition_for_boulder = 0;
                } else {
                    $is_requisition_for_boulder = $data['is_boulder_requisition'];
                }
            }
            $is_requisition_for_plant = 0;
            if (!array_key_exists('is_plant_requisition', $data)) {
                $is_requisition_for_plant = 0;
            } else {
                if ($data['is_plant_requisition'] == '') {
                    $is_requisition_for_plant = 0;
                } else {
                    $is_requisition_for_plant = $data['is_plant_requisition'];
                }
            }
            if (array_key_exists("files", $this->request->data)) {
                $files = $this->request->data['files'];
                if (count($files)) {
                    $files_uploading_response = $this->uploadMultipleFiles($files, 'files/requisitions/');
                }
            }
            $last_material_insert_id = '';
            if ($this->request->data('material_id')[0] == '') {
                if ($this->request->data('department') == 1) {
                    $type = 1;
                } elseif ($this->request->data('department') == 3) {
                    $type = 3;
                } elseif ($this->request->data('department') == 2) {
                    $type = 2;
                }
                if ($this->request->data('department') == 1 || $this->request->data('department') == 3) {
                    $conn->execute("INSERT INTO material (material_name, material_type_id, company_id, status, is_approved_by_admin) VALUES (?,?,?,?,?)", [$this->request->data('material_name'), $type, $company_id, 1, 0]);
                    $last_material_insert_id = $conn->execute("SELECT LAST_INSERT_ID() AS last_id")->fetchAll('assoc');
                } elseif ($this->request->data('department') == 2) {
                    // todo for unapproved material
                    $conn->execute("INSERT INTO material (part_no, material_type_id, company_id, status, is_approved_by_admin, unique_category_id) VALUES (?,?,?,?,?,?)", [$this->request->data('part_no')[0], $type, $company_id, 1, 0, $this->request->data('unique_category_id')[0]]);
                    $last_material_insert_id = $conn->execute("SELECT LAST_INSERT_ID() AS last_id")->fetchAll('assoc');
                }
            }
            // here I am fetching the max serial number from the table
            $requistion_number = $conn->execute("SELECT IF(MAX(requisition_no) IS NULL, 0,MAX(requisition_no)) AS requisition_no FROM requisition WHERE site_id = ?", [$this->request->data('site_id')])->fetchAll('assoc');
            $Requisition = TableRegistry::get('requisition');
            $requisition = $Requisition->newEntity();
            $requisition->registered_on = $this->request->data['date'];
            $requisition->department_id = $this->request->data('department');
            $requisition->site_id = $this->request->data('site_id');
            $requisition->issues_to_id = $this->request->data['prepared_by_id'];
            $requisition->prepared_by_id = $this->request->data['prepared_by_id'];
            $requisition->approved_by_id = $this->request->data['hod_id'];
            $requisition->hod_id = $this->request->data['hod_id'];
            $requisition->is_diesel_requisition_for_employee = $is_requisition_for_employee;
            $requisition->is_diesel_requisition_for_contractor = $is_requisition_for_contractor;
            $requisition->is_requisition_for_boulder = $is_requisition_for_boulder;
            $requisition->is_requisition_for_plant = $is_requisition_for_plant;
            if (array_key_exists('for_tanker_stock', $this->request->data)) {
                $requisition->for_tanker_stock = 1;
            }
            if ($last_material_insert_id != '') {
                $requisition->is_material_approved_by_admin = 0;
            }
            $requisition->status = 1;
            $site_id = $this->request->data['site_id'];
            $requisition->requisition_no = $requistion_number[0]['requisition_no'] + 1;
            $requistionnumber = $requistion_number[0]['requisition_no'] + 1;
            $saveRequsition = $Requisition->save($requisition);
            $conn->commit();
        }
I am expecting a different serial number for each request. Is there an optimised way to do this? Thanks in advance.
Ok, how about the same strategy, setting the $requisition_number after the row has been inserted (see my other answer), but using a single query with the same method you use to determine the new requisition id:
$conn->execute("UPDATE requisition
SET requisition_no = (SELECT IF(MAX(requisition_no) IS NULL, 0,MAX(requisition_no)) AS requisition_no FROM requisition WHERE site_id = ?) + 1",
[$this->request->data('site_id')]);
The idea here is that a single query will be executed in one step, without another, similar query, being able to interfere.
What you currently do is first get the old requisition number like this:
$requistion_number = $conn->execute("SELECT IF(MAX(requisition_no) IS NULL, 0,MAX(requisition_no)) AS requisition_no
FROM requisition WHERE site_id = ?",[$this->request->data('site_id')])->fetchAll('assoc');
and then increase it before you save and commit.
My suggestion is to not set the $requistion_number at all before you save and commit the requisition row, but to determine the $requistion_number afterwards.
You now wonder how?
Well, you need to count the total number of requisition rows in the table for the site the requisition is for, and add one, like this:
$last_requisition_id = $conn->execute("SELECT LAST_INSERT_ID() AS last_id")->fetchAll('assoc');
$site_id = $this->request->data('site_id');
$requisition_number = $conn->execute("SELECT COUNT(*) AS requisitionsCount
FROM requisition
WHERE <primary_key> <= ? AND
site_id = ?",
[$last_requisition_id, $site_id]) + 1;
$conn->execute("UPDATE requisition
SET requisition_no = ?
WHERE <primary_key> <= ?",
[$requisition_number, $last_requisition_id]);
I know this code is not working. The $requisition_number will probably contain an array with the requisitionsCount as a value, but you can correct that.
Because you're using data that is already present in the database table you don't run the risk that two rows will get the same $requisition_number. The assumption here is that requisitions are never deleted.
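A different, commonly used pattern (not from the answers above, just a standard InnoDB technique) is to keep the MAX()+1 read but make it a locking read inside the transaction the controller already opens with $conn->begin(); a second concurrent request then blocks on the lock until the first one commits, so the two requests cannot read the same maximum. A hypothetical sketch against the question's code:

// assumes the requisition table uses InnoDB and the surrounding
// $conn->begin() ... $conn->commit() from the add() action above
$row = $conn->execute(
    "SELECT IFNULL(MAX(requisition_no), 0) AS requisition_no
     FROM requisition
     WHERE site_id = ?
     FOR UPDATE",
    [$this->request->data('site_id')]
)->fetchAll('assoc');

$requisition->requisition_no = $row[0]['requisition_no'] + 1;
// ...save the entity and $conn->commit() as before; the lock is released at commit

Under heavy concurrency this can produce deadlocks that have to be retried, so it trades some throughput for the guarantee that two requests never see the same number.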

Weighted Load Balancing Algorithm in a PHP Application

I want to resolve an Adapter, weighted, from a factory that can be configured by the user (enable/disable and weight %).
Example:
AdapterW ≃ 20% of transaction
AdapterX ≃ 30% of transaction
AdapterY ≃ 40% of transaction
AdapterZ ≃ 10% of transaction
I can guarantee that all items will never sum to more than one hundred (100%), but any adapter may be deactivated at times.
I have the following parameters:
public function handleAdapter()
{
    $isWActive = (boolean)$this->_config[self::W];
    $isXActive = (boolean)$this->_config[self::X];
    $isYActive = (boolean)$this->_config[self::Y];
    $isZActive = (boolean)$this->_config[self::Z];
    $WPercentage = (int)$this->_config[self::LOAD_BALANCE_W];
    $XPercentage = (int)$this->_config[self::LOAD_BALANCE_X];
    $YPercentage = (int)$this->_config[self::LOAD_BALANCE_Y];
    $ZPercentage = (int)$this->_config[self::LOAD_BALANCE_Z];
    .
    .
    .
    return (self::W | self::X | self::Y | self::Z);
}
How can I balance between these adapters dynamically by weight?
Edit
I created a gist with executable code: https://gist.github.com/markomafs/5d892d06d6670909f9b4
This may not be the best approach, but you can try something like this:
public function handleAdapter()
{
    // an array to return the balanced entries
    $balancedEntries = array();
    // verifies which of the options are active
    $isWActive = (boolean)$this->_config[self::W];
    $isXActive = (boolean)$this->_config[self::X];
    $isYActive = (boolean)$this->_config[self::Y];
    $isZActive = (boolean)$this->_config[self::Z];
    // get the configured percentage of each
    $WPercentage = (int)$this->_config[self::LOAD_BALANCE_W];
    $XPercentage = (int)$this->_config[self::LOAD_BALANCE_X];
    $YPercentage = (int)$this->_config[self::LOAD_BALANCE_Y];
    $ZPercentage = (int)$this->_config[self::LOAD_BALANCE_Z];
    // here you fill the array according to the proportion defined by the percentages
    if ($isWActive) {
        for ($i = 0; $i < $WPercentage; $i++) {
            $balancedEntries[] = self::W;
        }
    }
    if ($isXActive) {
        for ($i = 0; $i < $XPercentage; $i++) {
            $balancedEntries[] = self::X;
        }
    }
    if ($isYActive) {
        for ($i = 0; $i < $YPercentage; $i++) {
            $balancedEntries[] = self::Y;
        }
    }
    if ($isZActive) {
        for ($i = 0; $i < $ZPercentage; $i++) {
            $balancedEntries[] = self::Z;
        }
    }
    return $balancedEntries;
}
And then, in case you want a proportion of 1 to 100 (as in percentages):
$balancedResult = $balancedEntries[array_rand($balancedEntries, 1)];
Since array_rand will return one key from the original array, you use it to get its value.
Another try; this should work for your case, but it only works if each adapter is a single-character string, which is not clear from your question.
public function handleAdapter()
{
    # a map with all adapters
    $map = array(
        self::W => self::LOAD_BALANCE_W,
        self::X => self::LOAD_BALANCE_X,
        self::Y => self::LOAD_BALANCE_Y,
        self::Z => self::LOAD_BALANCE_Z
    );
    # generate a string map with one char per percentage point
    $stringMap = "";
    foreach ($map as $key => $value) {
        # skip if disabled
        if (!$this->_config[$key]) continue;
        # repeat the key for each percentage point
        $stringMap .= str_repeat($key, (int)$this->_config[$value]);
    }
    # return a random string char from the map
    return $stringMap[rand(0, strlen($stringMap) - 1)];
}
Edit: I've misunderstood the question; the answer below is wrong.
I understood your question to mean that you always want to return the adapter with the lowest load, in order to force traffic to that adapter.
public function handleAdapter()
{
    $isWActive = (boolean)$this->_config[self::W];
    $isXActive = (boolean)$this->_config[self::X];
    $isYActive = (boolean)$this->_config[self::Y];
    $isZActive = (boolean)$this->_config[self::Z];
    $WPercentage = (int)$this->_config[self::LOAD_BALANCE_W];
    $XPercentage = (int)$this->_config[self::LOAD_BALANCE_X];
    $YPercentage = (int)$this->_config[self::LOAD_BALANCE_Y];
    $ZPercentage = (int)$this->_config[self::LOAD_BALANCE_Z];
    $map = array();
    if ($isWActive) $map[self::W] = $WPercentage;
    if ($isXActive) $map[self::X] = $XPercentage;
    if ($isYActive) $map[self::Y] = $YPercentage;
    if ($isZActive) $map[self::Z] = $ZPercentage;
    asort($map);
    return key($map);
}
Edit: Fixed the wrong sort(); you need asort() to maintain the keys.
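For completeness, a common alternative to building one array entry (or string character) per percentage point is a cumulative-weight pick: draw a random number up to the sum of the active weights and return the first adapter whose running total reaches it. A minimal sketch (the $weights array is a stand-in for the active adapters and their configured percentages, not the question's config object):

function pickWeighted(array $weights)
{
    // e.g. array('W' => 20, 'X' => 30, 'Y' => 40, 'Z' => 10),
    // already filtered down to the active adapters
    $total = array_sum($weights);
    if ($total <= 0) {
        return null; // nothing active
    }
    $roll = mt_rand(1, $total);
    $running = 0;
    foreach ($weights as $adapter => $weight) {
        $running += $weight;
        if ($roll <= $running) {
            return $adapter;
        }
    }
    return null;
}

This also keeps working when the weights do not add up to exactly 100, since the draw is bounded by the actual total.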

Most effective way of data collection?

Let's first get to an important note about my situation:
I have 1 table in my MySQL database with approx 10 thousand entries
Currently, when collecting information from table #1, I collect a total of 20-24 rows per page.
Example being:
Q1 : SELECT * FROM table WHERE cat = 1 LIMIT 0,25
R1: id: 1, name: something, info: 12
The PHP file that runs these queries is called by the jQuery ajax function and creates an XML file that the jQuery function reads and shows to the user.
My question here is: how do I improve the speed and stability of this process? I can have up to 10 thousand visitors picking up information at the same time, which makes my server extremely sluggish and in some cases even crash.
I'm pretty much out of ideas, so I'm asking for help here. Here's the actual code for my current data collection (:
public function collectItems($type, $genre, $page = 0, $search = 0)
{
    // Call Core (necessary for database interaction)
    global $plusTG;
    // If Search
    if($search)
    {
        $searchString = ' AND (name LIKE "%'.$search.'%")';
    }
    else
    {
        $searchString = '';
    }
    // Validate Query
    $search = $plusTG->validateQuery($search);
    $type = $plusTG->validateQuery($type);
    $genre = $plusTG->validateQuery($genre);
    // Check Numeric
    if((!is_numeric($genre)))
    {
        return false;
    }
    else
    {
        if(!is_numeric($type))
        {
            if($type != 0)
            {
                $typeSelect = '';
                $split = explode(',', $type);
                foreach($split as $oneType)
                {
                    if($typeSelect == '')
                    {
                        $typeSelect .= 'type = '.$oneType.' ';
                    }
                    else
                    {
                        $typeSelect .= 'OR type = '.$oneType.' ';
                    }
                }
            }
        }
        else
        {
            $typeSelect = 'type = ' . $type . ' ';
        }
        //echo $typeSelect;
        $limit = ($page - 1) * 20;
        if(($type != 0) && ($genre != 0))
        {
            $items = $plusTG->db->query('SELECT * FROM dream_items WHERE active = 1 AND genre = '.$genre.' AND ('.$typeSelect.')'.$searchString.' ORDER BY name LIMIT '.$limit.',20');
            $total = $plusTG->db->query('SELECT COUNT(*) as items FROM dream_items WHERE active = 1 AND genre = '.$genre.' AND ('.$typeSelect.')'.$searchString);
        }
        elseif(($type == 0) && ($genre != 0))
        {
            $items = $plusTG->db->query('SELECT * FROM dream_items WHERE active = 1 AND genre = '.$genre.$searchString.' ORDER BY name LIMIT '.$limit.',20');
            $total = $plusTG->db->query('SELECT COUNT(*) as items FROM dream_items WHERE active = 1 AND genre = '.$genre.$searchString);
        }
        elseif(($type != 0) && ($genre == 0))
        {
            $items = $plusTG->db->query('SELECT * FROM dream_items WHERE active = 1 AND ('.$typeSelect.')'.$searchString.' ORDER BY name LIMIT '.$limit.',20');
            $total = $plusTG->db->query('SELECT COUNT(*) as items FROM dream_items WHERE active = 1 AND ('.$typeSelect.')'.$searchString);
        }
        elseif(($type == 0) && ($genre == 0))
        {
            $items = $plusTG->db->query('SELECT * FROM dream_items WHERE active = 1'.$searchString.' ORDER BY name LIMIT '.$limit.',20');
            $total = $plusTG->db->query('SELECT COUNT(*) as items FROM dream_items WHERE active = 1'.$searchString);
        }
        $this->buildInfo($items->num_rows, $total->fetch_assoc());
        while($singleItem = $items->fetch_assoc())
        {
            $this->addItem($singleItem);
        }
    }
    return true;
}
The buildInfo and addItem calls add the items to the XML DOM.
This is my JavaScript (domain and filename filtered):
function itemRequest(type, genre, page, search)
{
    if (ajaxReady != 0)
    {
        ajaxReady = 0;
        $('#item_container').text('');
        var searchUrl = '';
        var searchLink;
        var ajaxURL;
        if (search != 0)
        {
            searchUrl = '&search=' + search;
            searchLink = search;
            ajaxURL = "/////file.php";
        }
        else
        {
            searchLink = 0;
            ajaxURL = "////file.php";
        }
        $.ajax({
            type: "GET",
            url: ajaxURL,
            data: "spec=1&type=" + type + "&genre=" + genre + "&page=" + page + searchUrl,
            success: function(itemListing){
                $(itemListing).find('info').each(function()
                {
                    var total = $(this).find('total').text();
                    updatePaging(total, page, type, genre, searchLink);
                });
                var items = $(itemListing).find('items');
                $(items).find('item').each(function()
                {
                    var itemId = $(this).find('id').text();
                    var itemType = $(this).find('type').text();
                    var itemGenre = $(this).find('genre').text();
                    var itemTmId = $(this).find('tm').text();
                    var itemName = $(this).find('name').text();
                    buildItem(itemId, itemType, itemGenre, itemTmId, itemName);
                });
                $('.item_one img[title]').tooltip();
            },
            complete: function(){
                ajaxReady = 1;
            }
        });
    }
}
The buildItem call runs this function:
function buildItem(itemId, itemType, itemGenre, itemTmId, itemName)
{
    // Pick up Misc. Data
    var typeName = nameOfType(itemType);
    // Create Core Object
    var anItem = $('<div/>', {
        'class': 'item_one'
    });
    // Create Item Image
    $('<img/>', {
        'src': '///' + typeName + '_' + itemTmId + '_abc.png',
        'alt': itemName,
        'title': itemName,
        click: function(){
            eval(typeName + 'Type = ' + itemTmId);
            $('.equipped_item[name=' + typeName + ']').attr('src', '//' + typeName + '_' + itemTmId + '_abc.png');
            $('.equipped_item[name=' + typeName + ']').attr('alt', itemName);
            $('.equipped_item[name=' + typeName + ']').attr('title', itemName);
            $('.equipped_item[title]').tooltip();
            recentEquipped(typeName, itemTmId, itemName);
            updateSelfy();
        }
    }).appendTo(anItem);
    // Favs
    var arrayHack = false;
    $(favEquips).each(function(){
        if (arrayHack == false)
        {
            if (in_array(itemTmId, this))
            {
                arrayHack = true;
            }
        }
    });
    var itemFaved = '';
    if (arrayHack == true)
    {
        itemFaved = 'activated';
    }
    $('<div/>', {
        'class': 'fav',
        'id': itemFaved,
        click: function(){
            if ($(this).attr('id') != 'activated')
            {
                $(this).attr('id', 'activated');
            }
            else
            {
                $(this).removeAttr('id');
            }
            itemFav(itemTmId, typeName, itemName);
        }
    }).appendTo(anItem);
    $(anItem).appendTo('#item_container');
}
If anyone could help me improve this code, it'd be very much appreciated.
Add an index to your table for the cat column.
Figure out what the bottleneck is; if it is your XML, then try JSON.
If it is your network, try enabling gzip compression.
I agree with Zepplock, it is important to find out where the bottleneck is - if not, you're only guessing. Zepplock's list is good but I would also add caching:
Find out where the bottleneck is.
Use indexes in your db table.
Cache your query results
Find the Bottleneck.
There are a number of opinions and ways to do this... Basically, when your site is under load, measure the time it takes to complete each step in the process: the DB queries, the server-side processing, and the client-side processing.
Use Indexes.
If your DB is slow, chances are you can get a lot of improvement by optimizing your queries. A table index may be in order... Use EXPLAIN to help identify where indexes should be placed to optimize your queries:
EXPLAIN SELECT * FROM dream_items WHERE active = 1 AND (name LIKE "%foo%") ORDER BY name LIMIT 0,20;
(I bet an index on active and name would do the trick)
ALTER TABLE `dream_items` ADD INDEX `active_name` (`active` , `name`);
Also try to avoid using the wildcard '*'. Instead only ask for the columns you need. Something like:
SELECT `id`, `type`, `genre`, `tm`, `name` FROM `dream_items` WHERE...
Cache Your Results.
If the records in the DB have not changed, there is no reason to re-query the results. Use some sort of caching to reduce the load on your DB (memcached, flat file, etc.). Depending on the database class/utilities you're using, it may already be capable of caching results.
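As a rough illustration of the caching idea (not the poster's code; the key layout, the 60-second TTL, and the memcached server address are assumptions), the query in collectItems could be wrapped along these lines, assuming $plusTG->db is a mysqli connection as in the question:

// requires the memcached extension and a memcached server on localhost
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

// build a cache key out of everything that changes the result set
$cacheKey = 'items_' . md5($type . '|' . $genre . '|' . $page . '|' . $search);

$rows = $memcached->get($cacheKey);
if ($rows === false) {
    // cache miss: run the real query once, then keep the rows for 60 seconds
    $result = $plusTG->db->query('SELECT id, type, genre, tm, name FROM dream_items WHERE active = 1 ORDER BY name LIMIT ' . $limit . ',20');
    $rows = array();
    while ($singleItem = $result->fetch_assoc()) {
        $rows[] = $singleItem;
    }
    $memcached->set($cacheKey, $rows, 60);
}
// $rows now holds the page of items, whether it came from the cache or the database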
