I have a recursive MySQL query that returns about 25000 results, which takes a long time to load.
I would like to know if there is a way to paginate the results or to make things faster.
The code is below:
function getChildIds($id) {
    echo ' <tr><td> Client No.: </td><td> '; echo $id; echo ' </td></tr> ';
    $sql_query = "select id from rev_r_clients WHERE parent_client_id='$id'";
    $res = mysql_query($sql_query);
    $ids = array();
    while ($row = mysql_fetch_object($res)) {
        if ($row->id) {
            $ids[] = $row->id;
        }
    }
    array_walk($ids, 'getChildIds');
}
As others mentioned, the main issue is that you are running one MySQL query for every user. The overhead for that can be huge, so you'd better reduce the number of queries.
One way is to query all the data in a single query and then process it recursively as you are doing. But, from the looks of it, you just want to get all children of all your users. You can do that in a very simple way with a single MySQL query:
SELECT
    p.id AS parent,
    GROUP_CONCAT(c.id) AS children
FROM
    rev_r_clients AS p
    JOIN rev_r_clients AS c ON c.parent_client_id = p.id
GROUP BY
    p.id
That will provide you with every user and their children in a single query. You can get them in an array using explode afterwards (as GROUP_CONCAT will provide you a comma-separated list).
Based on your comments, you just want to build a user tree. This is the code you could use:
# NOTE: I am using mysqli, which is what you should do as well
$res = $mysqli_conn->query(<THE_QUERY_ABOVE>);
$parents = array();
while ($row = $res->fetch_assoc()) {
    $parents[$row['parent']] = explode(',', $row['children']);
}
After that code runs, the $parents array is a mapping of IDs to arrays of their children's IDs that you can use for whatever you need/want. If an ID is not in the array, it means it has no children.
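If you then need to render the tree recursively (as in your original function), you can walk that map in PHP without touching the database again. A minimal sketch, assuming a known top-level $rootId (a placeholder, not from your code):
// Walk the $parents map recursively; no further queries are needed.
function printTree(array $parents, $id, $depth = 0) {
    echo ' <tr><td> Client No.: </td><td> ' . str_repeat('-', $depth) . ' ' . $id . ' </td></tr> ';
    if (isset($parents[$id])) {
        foreach ($parents[$id] as $childId) {
            printTree($parents, $childId, $depth + 1);
        }
    }
}
printTree($parents, $rootId); // $rootId is an assumption: the top-level client id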
Ideally you would fetch the database results once and use a recursive function on the result - passing the result or part of it around - not make database calls in the recursive function itself.
That would use more memory, but it would definitely speed things up.
For pagination, you will also need to pass $limit and $offset values in through the next/previous buttons.
One concern: are you handling SQL injection? Interpolating variables directly, as below, is unsafe:
$sql_query="select id from rev_r_clients WHERE parent_client_id='$id' limit $limit offset $offset";
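A hedged sketch of the same paginated query using mysqli prepared statements instead of the deprecated mysql_* functions ($mysqli_conn is assumed to be an open mysqli connection, the ids are assumed to be integers, and get_result() needs the mysqlnd driver):
$stmt = $mysqli_conn->prepare(
    'SELECT id FROM rev_r_clients WHERE parent_client_id = ? LIMIT ? OFFSET ?'
);
$stmt->bind_param('iii', $id, $limit, $offset); // values are bound, not interpolated
$stmt->execute();
$result = $stmt->get_result(); // requires mysqlnd
$ids = array();
while ($row = $result->fetch_assoc()) {
    $ids[] = $row['id'];
}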
Related
I have three tables: "users", "posts", and "likes", roughly formatted as follows.
For example, the three tables' entries are:
users (two users):
    uid: 12
    uid: 15
posts (three posts):
    pid: 3, publisherId = 12, likers = 2
    pid: 6, publisherId = 12, likers = 0
    pid: 7, publisherId = 12, likers = 1
likes (three likes):
    lid: 1, postId = 3, likerId = 12
    lid: 2, postId = 7, likerId = 15
    lid: 3, postId = 3, likerId = 15
What I need is: to get all the posts in a multidimensional array, with an array for the unique publisher (user) and another array for the likers (also users). The output I am looking for is something like:
Array: (
    post: (
        pid = 3,
        publisher = Array (uid = 12),
        likers = Array (uid = 12, uid = 15)
    ),
    post: ( ...
    )
).
I am already getting that with the following (I believe time-consuming) code:
$sql = "SELECT posts.* FROM posts";
if (!$result = mysql_query($sql)) die("Query failed.");
$response = array();
while ($result_array = mysql_fetch_object($result)) {
    $entries = array();
    foreach ($result_array as $key => $value) {
        if ($key == "byUserId") {
            $publisherID = $result_array->byUserId;
            $anotherSql = "SELECT * FROM users WHERE users.uid = $publisherID";
            if ($anotherResult = mysql_query($anotherSql)) {
                $anothers = array();
                while ($anotherResult_array = mysql_fetch_object($anotherResult)) {
                    $another = array();
                    foreach ($anotherResult_array as $anotherKey => $anotherValue) {
                        $another[$anotherKey] = $anotherValue;
                    }
                    $anothers[] = $another;
                }
                $entries[$key] = $anothers;
            }
        }
        else if ($key == "likes") {
            if ($value > 0) {
                $PID = $result_array->pid;
                $anotherSql = "SELECT likes.*, users.* FROM likes LEFT JOIN users ON likes.likeUserId = users.uid WHERE $PID = likes.likePostId";
                if ($anotherResult = mysql_query($anotherSql)) {
                    $anothers = array();
                    while ($anotherResult_array = mysql_fetch_object($anotherResult)) {
                        $another = array();
                        foreach ($anotherResult_array as $anotherKey => $anotherValue) {
                            $another[$anotherKey] = $anotherValue;
                        }
                        $anothers[] = $another;
                    }
                    $entries[$key] = $anothers;
                }
            }
            else {
                $entries[$key] = array();
            }
        }
        else {
            $entries[$key] = $value;
        }
    }
    $posts[] = $entries;
}
Any suggestions are appreciated. I am still looking for join and left join solutions!
it really depends on what you're looking for:
user data for all posts:
SELECT user.*, post.*
FROM post
LEFT JOIN user ON (post.publisherid=user.id)
since it's only one publisher per post, this should give the user's data for each and every post.
liker ids
SELECT post.*, GROUP_CONCAT(likes.likerid) as likerids
FROM post
LEFT JOIN likes ON (likes.postid=post.pid)
GROUP BY post.pid
this will give you rows:
["pid" => 3, "publisherid" => 12, "likerids" => "15,17,19"]
and all you have to do in php then is:
$likerids = explode(',', $row['likerids']);
combine for fun and profit
of course, you can combine both queries into one. However, the second query only works well if you only need the ids of the likers. If you want the user data as well, it might be good (depending on your actual use case) to collect the liker ids first and fetch their user data later:
SELECT *
FROM user
WHERE user.uid IN (15,17,19)
Also, you should REALLY REALLY REALLY use prepared statements to protect against SQL injections. (This is not bold by accident! This is important!) If you don't know what SQL injections are, read up on them. If anyone finds a query that's vulnerable to user-provided input and SQL injection, all your users' data can (and most likely will) leak into the darkness that is the internet.
Also, please use the pdo or mysqli libraries for your database queries. The mysql library is deprecated and was removed entirely in PHP 7.
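To illustrate, a minimal PDO sketch with a prepared statement (the connection details are placeholders, not from your setup):
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// the user-provided value is bound as a parameter, never concatenated into the SQL
$stmt = $pdo->prepare('SELECT * FROM posts WHERE publisherId = ?');
$stmt->execute(array($publisherId));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);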
update
There are a bunch of problems associated with fetching both sides of an m:n relation. I mean, essentially it's easy, just fetch it:
SELECT post.*, user.*
FROM post
LEFT JOIN likes ON (post.pid=likes.postid)
LEFT JOIN user ON (likes.likerid=user.uid)
ORDER BY post.pid
however, this will produce these rows:
pid1, publisherid1, userid1, username1
pid1, publisherid1, userid2, username2
...
pid2, publisherid2, userid1, username1
...
as you will notice, the post itself appears multiple times, once for each liker. This is a problem which cannot be avoided by standard sql alone, because of the fundamentals of sql (being row-based).
This is essentially the data you want, but I suppose in a more aggregated form. This form also contains lots and lots of redundant data, especially assuming the post data is way bigger than the user data. To gather the data, you would have to check the pid of every row: if it's the same pid as in the row before, you somehow merge the records (see the sketch below). ... But I would strongly advise against this approach.
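For completeness, that merging would look roughly like this (a sketch of the approach I just advised against; it assumes $stmt is the executed statement for the query above, and the column names are taken from it):
$posts = [];
while ($row = $stmt->fetch()) {
    $pid = $row['pid'];
    if (!isset($posts[$pid])) {
        // first row for this post: keep its data, start an empty likers list
        $posts[$pid] = ['pid' => $pid, 'publisherid' => $row['publisherid'], 'likers' => []];
    }
    if ($row['uid'] !== null) { // LEFT JOIN yields NULL user columns for posts without likers
        $posts[$pid]['likers'][] = ['uid' => $row['uid'], 'username' => $row['username']];
    }
}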
I would also advise against using GROUP_CONCAT for every single field of user, although it might work. The problem is that GROUP_CONCAT needs a delimiter, which you need to guarantee is different from any character in the username field (or any other field you want to retrieve). This might or might not be a problem, but it's dirty nonetheless. In any case, you would then have to explode every one of those aggregated fields in php and rebuild the users' data to build your wanted structure.
Another solution might be to create a new field that holds aggregated userdata as json or something, and with the intelligent use of GROUP_CONCAT and CONCAT one could create a hierarchical string for each row that could be json itself. But this goes beyond this post. (Also, I don't condone such use of databases, which weren't made nor designed for this.) There is however a JSON data type, that could be interesting ...
Ultimately, in those cases, you let the database server do the work that IMHO should be done by the client.
I would do this:
two queries, for a limited number of users (because YAGNI)
first we're going to fetch the posts we want; we also add a count of likes, and the publisher's user data is included as well (if you add a WHERE with data that comes from outside the server, like a browser, use prepared statements! also read up on SQL if you don't understand all or parts of this query!) - I would assume this is all the data you would show to a user at first. (With the power of caching, showing likers for distinct posts could be quite efficient.)
$pdo = new PDO('#yourdatabasestring#'); // rtfm!
$postresult = $pdo->query(
    'SELECT p.*, '.
    '       pub.uid, pub.username, '.
    '       COUNT(likers.uid) as likecount '.
    'FROM post p '.
    'LEFT JOIN user as pub ON (pub.uid=p.publisherid) '.
    'LEFT JOIN likes ON (p.pid=likes.postid) '.
    'LEFT JOIN user as likers ON (likers.uid=likes.likerid) '.
    'GROUP BY p.pid '.
    'LIMIT 50' // learn about offsets!!!
);
now, put all results into an array
$pids = []; // this will contain post ids for which we want to fetch likes
$posts = [];
while ($post = $postresult->fetch()) {
    $pids[] = $post['pid'];
    $post['likers'] = []; // prepare for later
    $posts[$post['pid']] = $post;
}
At this point, this array only contains the data that was requested in the first query (post, user data of the publisher). Next, we query for the likes, using the temporarily stored post ids.*
$likers = $pdo->query(
    'SELECT likes.postid, user.* '.
    'FROM likes '.
    'LEFT JOIN user ON (likes.likerid=user.uid) '.
    'WHERE likes.postid IN ('.implode(',', $pids).')'
);
Then we fetch them and assign them to the right post:
while ($like = $likers->fetch()) {
    $posts[$like['postid']]['likers'][] = $like;
}
now ... this solution should actually work for almost every sql database. GROUP_CONCAT doesn't provide any benefit here. Two queries are actually quite alright here. If you have a very large set of posts that you want to fetch at once, this might absolutely not be the right approach. For fairly small data sets (some hundred posts or so), this should be very much okay.
*) the WHERE clause could be replaced by WHERE postid IN ([first query with only post.pid in the select]). For certain use cases, this could be preferable.
word of advice
However, for the usual web case, I can't imagine anyone wanting to see more than 50 posts at once with user data, likers' data and so on already displayed. Don't try to show everything at once; fetch what's necessary, and try to cluster information (as I did with the $pids) to reduce the number of queries. Doing a few well-designed, short-running queries in general beats doing many queries (as in your original code), but is also more appropriate than running one huge query where most of the data will (on average) be irrelevant.
I am displaying images on my website with their details; however, of late the site is very slow at loading.
The main culprit is this foreach loop, which runs 100 times to display 100 posts in a grid. It takes 14 seconds to run:
foreach ($posts as $post) {
    $hashtags[] = $this->HashTagsModel->get_hashtags($post["id"]);
    $author[] = $this->UserModel->get_user_details($post["user_id"]);
    $comment_count[] = $this->CommentModel->get_comments_count($post["id"]);
    $is_favourited[] = $this->FavouriteModel->is_favourited($post["id"]);
    $is_reposted[] = $this->RepostModel->is_reposted($post["id"]);
    $vote_status[] = $this->vote_status($post["id"]);
    $comments[] = $this->count_comments($post["id"]);
}
How can I do this differently to make it more efficient? This worked before our website's database became massive.
Any help would be appreciated,
regards,
Almost Fired
The efficient way to query the database in a foreach loop is to not query the database in a foreach loop. As written, you are allowing an unknown number of queries to be fired off, which will cause massive queueing. What if suddenly 5000 images get added; do you loop through them all? Those queries will take a very long time.
You have $post["id"] as your where variable I am assuming, so you could reduce this process significantly by doing a single query after formulating an array of post ids, something like this:
$postids = array();
foreach ($posts as $post) {
    $postids[] = $post['id'];
}
// Selecting from 1 table
$query = 'SELECT * FROM hashtags WHERE id IN ('. implode(",", $postids) .')';
This would fetch all the information on hashtags where the id is one of your post ids. That is just one table; you would likely want to fetch from multiple. Without knowing your database structure I'm going to be generic, so something like:
// Selecting and joining data from multiple tables
$query = 'SELECT author.name, table.col FROM posts
          LEFT JOIN author ON author.id = posts.author_id
          LEFT JOIN table ON table.id = posts.table_id
          WHERE posts.id IN ('. implode(",", $postids) .')';
It is a bit difficult to be more accurate without your schema. I think joining tables would give you a better result; you can even join counts for votes/comments. If that is not possible, you could query all data related to your posts and then assemble it in PHP, so you know exactly how many queries you have. For example, change your models to accept an array instead of a single post ID, and change your "WHERE post_id = x" to "WHERE post_id IN (x)". Then you can do something like:
$postids = array();
foreach ($posts as $post) {
    $postids[] = $post['id'];
}
$hashtags = $this->HashTagsModel->get_hashtags($postids);
$author = $this->UserModel->get_user_details($postids);
$comment_count = $this->CommentModel->get_comments_count($postids);
$is_favourited = $this->FavouriteModel->is_favourited($postids);
$is_reposted = $this->RepostModel->is_reposted($postids);
$vote_status = $this->vote_status($postids);
$comments = $this->count_comments($postids);
This gets your queries outside of the loop, and you know there will only ever be 7 SQL queries, not queries * posts. In PHP you would then loop through the results of each array to assign each one back to its post based on its ID, roughly as sketched below.
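A rough sketch of that reassignment step for one of the result sets (it assumes each returned row carries a post_id column, which is a guess on my part):
// group hashtag rows by the post they belong to
$hashtagsByPost = array();
foreach ($hashtags as $tag) {
    $hashtagsByPost[$tag['post_id']][] = $tag;
}
// attach each group back onto its post
foreach ($posts as &$post) {
    $post['hashtags'] = isset($hashtagsByPost[$post['id']]) ? $hashtagsByPost[$post['id']] : array();
}
unset($post); // break the reference left by the foreach
Repeat the same grouping for authors, counts and the rest.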
Whilst populating a table based on ids and labels from different tables, it seemed apparent there must be a better way of achieving the same result with less code and a more direct approach using LEFT JOIN, but I am puzzled after trying to work out whether it is actually capable of achieving the desired result.
Am I correct in thinking a LEFT JOIN is usable in this instance: referencing two tables against one another, where one lists ids related to another table and that other table holds the titles allocated for each reference?
I know full well that if there's independent information for each row a LEFT JOIN is suitable, but where there are (in this case) only several ids to reference for many rows, I just am not clicking with how I could get it to work...
The current way I am achieving my desired result in PHP/MySQL:
$itemid = $row['item_id'];
$secid = mysql_query(" SELECT * FROM item_groups WHERE item_id='$itemid' ");
while ($secidrow = mysql_fetch_assoc($secid)) {
    //echo $secidrow["section_id"]; //testing
    $id = $secidrow["section_id"];
    $secnameget = mysql_query(" SELECT * FROM items_section_list WHERE item_sec_id='$id' ");
    while ($secname = mysql_fetch_assoc($secnameget)) {
        echo $secname["section_name"];
    }
}
Example of the data:
Item groups:
    drink
    food
    shelf
Item List:
    itemId, groupId
Group List:
    groupId, groupTitle
The idea is that, when outputting the data to a table, the title actually appears in place of the ID number instead of "Item & ID Number". I have achieved the desired result, but I am always interested in seeking better ways to achieve it.
If I've deciphered your code properly, you should be able to use the following query to get both values at the same time.
$itemid = $row['item_id'];
$secid = mysql_query("
    SELECT *
    FROM item_groups
    LEFT JOIN items_section_list
        ON items_section_list.item_sec_id = item_groups.section_id
    WHERE item_id='$itemid'
");
while ($secidrow = mysql_fetch_assoc($secid)) {
    //$id = $secidrow["section_id"];
    echo $secidrow["section_name"];
}
I have quite a complicated situation here. I can't find a better way to solve this without putting a SELECT query inside a loop that runs over 70000 times when I enter that page (don't worry, I use array_chunk to split the array into pages). I guess this would be a resource killer if I used a query here. Because of this, here I am, asking a question.
I have this big array I need to loop on:
$images = scandir($imgit_root_path . '/' . IMAGES_PATH);
$indexhtm = array_search('index.htm', $images);
// drop the '.' and '..' entries plus index.htm itself
unset($images[0], $images[1], $images[$indexhtm]);
Now I have an array with the file names of all the images in my IMAGES_PATH. Here comes the problem:
Some of these images are registered in the database, because registered users have their images listed there. I need to retrieve the user_id based on the image name that the array above gives me.
Inside a loop I simply did this:
foreach ($images as $image_name)
{
    $query = $db->prepare('SELECT user_id FROM imgit_images WHERE image_name = :name');
    $query->bindValue(':name', $image_name, PDO::PARAM_STR);
    $query->execute();
    $row = $query->fetch(PDO::FETCH_ASSOC);
    $user_id = $row['user_id'];
    echo $user_id;
}
This works just fine, but the efficiency is close to zero. Using that user_id, I plan on getting other stuff from the imgit_users table, such as the username, which would require yet another query inside that loop.
This is too much and I need a simpler way to deal with this.
Is there a way to get those user_ids before going inside the loop and use them IN the loop?
(The table structures of imgit_images and imgit_users are not reproduced here; from the queries below, imgit_images has image_name and user_id columns, and imgit_users holds user_id and username.)
Something like this would work (I'm not sure if it's possible to prepare the WHERE IN query since the number of values is unknown... else, make sure you sanitize $images):
$image_names = "'".implode("', '", $images)."'";
$query = $db->prepare("SELECT img.user_id, image_name, username
                       FROM imgit_images img
                       INNER JOIN imgit_users u ON u.user_id = img.user_id
                       WHERE image_name IN(".$image_names.")");
$query->execute();
while ($row = $query->fetch(PDO::FETCH_ASSOC))
{
    echo $row['user_id']."'s image is ".$row['image_name'];
}
You might need to tweak it a little (haven't tested it), but you seem to be able to, so I'm not worried!
Not sure if it is going to help, but I see a couple of optimizations that may be possible:
Prepare the query outside the loop, and rebind/execute/fetch within the loop (see the sketch after this list). If query preparation is expensive, you may be saving quite a bit of time.
You can pass an array, as in Passing an array to a query using a WHERE clause, and obtain the image and user id together; that way you may be able to consolidate your queries into a smaller number of them.
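For the first point, a minimal sketch of preparing once and executing per image name (same query as in the question, just hoisted out of the loop):
$query = $db->prepare('SELECT user_id FROM imgit_images WHERE image_name = :name');
foreach ($images as $image_name) {
    // only the bound value changes on each iteration
    $query->bindValue(':name', $image_name, PDO::PARAM_STR);
    $query->execute();
    $row = $query->fetch(PDO::FETCH_ASSOC);
    if ($row) {
        echo $row['user_id'];
    }
}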
Can you not just use an INNER JOIN in your query? This way each iteration of the loop will return the details of the corresponding user with it. Change your query to something like (I'm making assumptions as to the structure of your tables here):
SELECT imgit_users.user_id
,imgit_users.username
,imgit_users.other_column_and_so_on
FROM imgit_images
INNER JOIN imgit_users ON imgit_users.user_id = imgit_images.user_id
WHERE imgit_images.image_name = :name
This obviously doesn't avoid the need for a loop (you could probably use string concatenation to build up the IN part of your WHERE clause, though you'd probably use a join here anyway), but it would return the user's information on each iteration and prevent the need for further queries to get the user's info.
PDO makes writing your query securely a cinch.
$placeholders = implode(',', array_fill(0, count($images), '?'));
$sql = "SELECT u.username
        FROM imgit_images i
        INNER JOIN imgit_users u ON i.user_id = u.id
        WHERE i.image_name IN ({$placeholders})";
$stmt = $db->prepare($sql);
$stmt->execute($images);
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // use $row['username']
}
Create a string of comma separated ?s and write them into IN's parentheses. Then pass the array of images to execute(). Easily done, and now all of your desired data is available in a single resultset from a single query. (Add additional columns to your query's SELECT clause as needed.)
I have a table A with just two columns: 'id' and 'probably'. I need to go over a long list of ids and determine, for each one, whether it is in A with a probability of '1'. What is the best way to do it?
I figured it would be best to have one big query on A at the beginning of the script, and after that, when I loop over each id, to check against that first query. But then I realized I don't know how to do that (efficiently). I mean, is there any way to load all results from the first query into one array and then do an in_array() check? I should mention that the first query should have few results, under 10 (while table A can be very large).
The other solution is doing a separate query on A for each id while I loop over them. But this seems not very efficient...
Any ideas?
If you have the initial list of ids in an array, you can use the php implode function like this:
$query = "select id
          from A
          where id in (".implode(',', $listOfIds).")
          and probably = 1";
Now you pass this string as the first parameter of mysql_query and receive the list of ids with probably = 1 that are within your initial list.
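For example (sticking with the mysql_* functions used in the question; this sketch assumes $listOfIds contains integers only, otherwise escape them first):
$result = mysql_query($query);
$matchingIds = array();
while ($row = mysql_fetch_assoc($result)) {
    $matchingIds[] = $row['id'];
}
// any id from $listOfIds missing from $matchingIds
// is either not in A or doesn't have probably = 1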
// $skip  the number of results you wish to skip over
// $limit the max number of results you wish to return
function returnLimitedResultsFromTable($skip=0, $limit=10) {
    // build the query
    $query = "SELECT `id` FROM `A` WHERE `probably`=1 LIMIT $skip, $limit";
    // store the result of the query
    $result = mysql_query($query);
    // if the query returned rows
    if (mysql_num_rows($result) > 0) {
        $results = array();
        // while there are rows in $result
        while ($row = mysql_fetch_assoc($result)) {
            // populate the results array with each row as an associative array
            $results[] = $row;
        }
        return $results;
    } else {
        return false;
    }
}
Each time you call this function, you would need to increase $skip by ten in order to retrieve the next ten results.
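For example, a hypothetical paging loop over the function above:
// fetch ten rows at a time until the function returns false
for ($skip = 0; ($results = returnLimitedResultsFromTable($skip, 10)) !== false; $skip += 10) {
    // ... work with this batch of $results here ...
}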
Then, to use the values in the $results array (for example, to print the ids; each entry is an associative row, so print a column rather than the row itself):
foreach ($results as $row) {
    print $row['id'] . "<br/>";
}
Build a comma-separated list with your ids and run a query like:
SELECT id
FROM A
WHERE id IN (id1, id2, id3, ... idn)
AND probably = 1;
Your first solution proposal states that:
You will query the table A, probably using a LIMIT clause since A is a large table.
You will place the retrieved data in an array.
You will iterate through the array to look for the ids with a probability of '1'.
You will repeat the first three steps several times until table A is fully iterated.
That is very inefficient!
The algorithm described above would require lots of database accesses and unnecessary memory (for the temporary array). Instead, just use a SELECT statement with a WHERE clause and process the data you want.
You need a query like the following I suppose:
SELECT id, probably FROM A WHERE A.probably = 1
If I understood you correctly, you should filter in the SQL query:
SELECT * FROM A WHERE A.probably = 1