Pluck not working as expected in Laravel (PHP)

Following on from my previous question, I need to get an array of random ids from a model. I do:
User::all('id')->random(2)->pluck('id')->toArray()
The above works as expected, plucking 2 ids and putting them in an array:
array:2 [
0 => 63
1 => 270
]
When I switch it to just one random user though:
User::all('id')->random(1)->pluck('id')->toArray()
The entire table is returned.
array:200 [
0 => 63
1 => 270
2 => 190
...
]
My plan is to use the following to get between 1 and 10 users:
User::all('id')->random(rand(1,10))->pluck('id')->toArray()
But how can I make this work when only 1 random user is selected? Why is the entire table being returned?

You can use your database to randomize your users, and take full advantage of the query builder and its take method.
User::inRandomOrder()
->take(rand(1,10))
->pluck('id')
->toArray();
inRandomOrder was added in Laravel 5.2, so if you are on an earlier version you can do orderBy(DB::raw('RAND()')) for MySQL (see this post to learn more: Laravel - Eloquent or Fluent random row).
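For older versions, a rough sketch of that raw-order variant (MySQL only):

// Pre-5.2 sketch: let MySQL randomize the order, then pluck the ids.
// On Laravel 4.x, lists('id') returns an array directly instead of pluck('id')->toArray().
$ids = User::orderBy(DB::raw('RAND()'))
    ->take(rand(1, 10))
    ->pluck('id')
    ->toArray();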

random(1) will not return a collection, while random(2) and above will. You can first run the query and then handle the result differently depending on whether or not it is a collection.
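A rough sketch of that idea, assuming the single-item case returns a plain model as it does in the question's Laravel version:

$random = User::all('id')->random(rand(1, 10));

// random(1) hands back a single model here, not a collection
if ($random instanceof \Illuminate\Support\Collection) {
    $ids = $random->pluck('id')->toArray();
} else {
    $ids = [$random->id];
}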

Laravel whereIn not returning all

Happy Holidays!
I'm stuck! I dabble here and there but am not extremely proficient so some help would be appreciated.
I have an incident record in the MySQL database which contains the field incident_uid. This is a comma-separated string which contains user IDs. This is then exploded across the application to form an array to look up user names when multiple users are associated with the incident.
I also have a scheme_vehicle_log record which contains a responders column - again a comma-separated string of user IDs.
I am trying to run the following but it isn't functioning as intended.
$incident_responders = explode(',', $incident->incident_uid);
$schemevehicles = SchemeVehicleLog::whereIn('responders', $incident_responders)
->whereDate('signout_date', '<=', $incident->incident_date)
->where(function ($q) use ($incident) {
$q->where('return_date', '>=', $incident->incident_date)
->orWhere('return_date', null);
})
->get();
It either returns only some rows or none at all. A dd($incident_responders) confirms the array is as intended.
array:2 [ // app/Http/Controllers/IncidentController.php:63
0 => "97"
1 => "17"
]
However dd($schemevehicles) shows as empty.
If I replace $incident_responders within the $schemevehicles query with [97,17], I get the intended result.
Tried for the past hour and reckon this will be something very simple - what am I overlooking here?
Thanks both.
I managed to find the problem.
dd($incident_responders) was returning the data as strings rather than integers, and therefore the query was failing ("97" rather than 97).
Adding an array_map helped.
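Something along these lines, with the field names taken from the question:

// Cast the exploded id strings to integers so whereIn matches the integer column
$incident_responders = array_map('intval', explode(',', $incident->incident_uid));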
Thanks.

PHP Mongo driver cursor traversal takes too long

I have a query like this
$results = $collection->find([
'status' => "pending",
'short_code' => intval($shortCode),
'create_time' => ['$lte' => time()],
])
->limit(self::BATCH_NUMBER)
->sort(["priority" => -1, "create_time" => 1]);
Where BATCH_NUMBER is 70.
I use the result of the query like below:
foreach ($results as $mongoId => $result) {
}
or trying to convert in to array like :
iterator_to_array($results);
The Mongo fetch time and the iteration time are:
FetchTime: 0.003173828125 ms
IteratorTime: 4065.1459960938 ms
As you can see, fetching the data from Mongo is very fast, but iterating over it (whether with iterator_to_array or with foreach) is slow.
This is a queue for sending messages to another server. The destination server accepts fewer than 70 documents per request, so I am forced to fetch 70 documents at a time. Anyway, I want to fetch 70 documents out of 1,300,000, and that is where the problem lies.
The query is meant to fetch the first 70 documents matching the conditions, send them, and finally delete them from the collection.
Can anybody help? Why does it take so long, and is there any configuration that would speed up PHP or Mongo here?
Another thing: when the total number of documents is around 100,000 (instead of 1,300,000), iteration is fast. The iteration time grows as the total number of documents increases.
It was because of the sorting.
The problem:
Fetching data from Mongo was fast, but traversing the iterator with foreach was slow.
The solution:
The query sorted by priority DESC and create_time ASC, but those fields were only indexed ASC individually. Creating a compound index on priority DESC and create_time ASC together fixed the problem.
db.queue_1.createIndex( { "priority" : -1, "create_time" : 1 } )
The order of the fields in the index matters: priority must come first, then create_time, because that is the order you sort by in the query:
.sort({priority : -1, create_time : 1});
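For completeness, a sketch of creating the same compound index from PHP, assuming the legacy mongo extension that the find()/limit()/sort() chain in the question suggests:

// Compound index matching the sort: priority DESC, create_time ASC
$collection->createIndex(["priority" => -1, "create_time" => 1]);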

Doctrine Paginator selects entire table (very slow)?

This is related to a previous question here: Doctrine/Symfony query builder add select on left join
I want to perform a complex join query using Doctrine ORM. I want to select 10 paginated blog posts, left joining a single author, a like value for the current user, and the hashtags on the post. My query builder looks like this:
$query = $em->createQueryBuilder()
->select('p')
->from('Post', 'p')
->leftJoin('p.author', 'a')
->leftJoin('p.hashtags', 'h')
->leftJoin('p.likes', 'l', 'WITH', 'l.post_id = p.id AND l.user_id = 10')
->where("p.foo = bar")
->addSelect('a AS post_author')
->addSelect('l AS post_liked')
->addSelect('h AS post_hashtags')
->orderBy('p.time', 'DESC')
->setFirstResult(0)
->setMaxResults(10);
// FAILS - because left joined hashtag collection breaks LIMITS
$result = $query->getQuery()->getResult();
// WORKS - but is extremely slow (count($result) shows over 80,000 rows)
$result = new \Doctrine\ORM\Tools\Pagination\Paginator($query, true);
Strangely, count($result) on the paginator shows the total number of rows in my table (over 80,000) but traversing the $result with foreach outputs 10 Post entities, as expected. Do I need to do some additional configuration to properly limit my paginator?
If this is a limitation of the paginator class what other options do I have? Writing custom paginator code or other paginator libraries?
(bonus): How can I hydrate an array, like $query->getQuery()->getArrayResult();?
EDIT: I left out a stray orderBy in my function. It looks like including both groupBy and orderBy causes the slowdown (using groupBy rather than the paginator). If I omit one or the other, the query is fast. I tried adding an index on the "time" column in my table, but didn't see any improvement.
Things I Tried
// works, but makes the query about 50x slower
$query->groupBy('p.id');
$result = $query->getQuery()->getArrayResult();
// adding an index on the time column (no improvement)
indexes:
    time_idx:
        columns: [ time ]
// the above two solutions don't work because MySQL ORDER BY
// ignores indexes if GROUP BY is used on a different column
// e.g. "ORDER BY p.time GROUP BY p.id" is slow
You should simplify your query. That would shave off some execution time. I can't test your query but here are a few pointers:
don't sort while executing count()
you could sort by orderBy('p.id', 'DESC'); that index would be used
instead of leftJoin() you could use join() if at least one record always exists in the joined table; otherwise that record is skipped
KNP/Paginator uses DISTINCT() to read only distinct records, but that could lead to using a disk tmp table
$query->getArrayResult() uses array hydration mode, which returns a multidimensional array and is way faster than object hydration for large result sets
you could use a partial select('partial p.{id, other used fields}'); this way you would load only the needed fields and maybe skip unneeded relations when using object hydration
check the SF profiler EXPLAIN on the given query under the doctrine section; maybe indexes are not used
do p.hashtags and p.likes return only one row, or are they oneToMany relations, which would multiply the result?
maybe make some Post design changes that would remove some joins:
have the p.hashtags field defined as @ORM\Column(type="array") and store the string values of the tags there; later maybe use full-text search on the serialized array
have a p.likesCount field defined as @ORM\Column(type="integer") which would hold the count of likes
I use KnpLabs/KnpPaginatorBundle and it can also have speed issues for complex queries.
Usually using LIMIT x,y is slow for the DB because it runs COUNT on the whole dataset; if indexes are not used it is painfully slow.
You could use a different approach and do some custom pagination by advancing the ID, but that would complicate your code. I have used this with large datasets like SYSLOG tables, but you lose sorting and total-record-count functionality.
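A rough sketch of that ID-advancing (keyset) idea, reusing the Post entity from the question; $lastId here is an assumed variable holding the smallest id shown on the previous page:

// Keyset pagination: fetch the next page strictly after the last seen id.
$posts = $em->createQueryBuilder()
    ->select('p')
    ->from('Post', 'p')
    ->where('p.id < :lastId')
    ->setParameter('lastId', $lastId)
    ->orderBy('p.id', 'DESC')
    ->setMaxResults(10)
    ->getQuery()
    ->getResult();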
At the end of the day, many of the queries used in my application are too complex to make proper use of the Paginator, and I wasn't able to use array hydration mode with the Paginator.
According to MySQL documentation, ORDER BY cannot be resolved by indexes if GROUP BY is used on a different column. Thus, I ended up using a couple post-processing queries to populate my base results (ORDERed and LIMITed) with one-to-many relations (like hashtags).
For joins that load a single row from the joined table, I was able to join the desired values in the base ordered query. For example, when loading the "like status" for a current user, only one like from the set of likes needs to be loaded to indicate whether or not the current post has been liked. Similarly, the presence of only one author for a given post produces a single joined author row. e.g.
$query = $em->createQueryBuilder()
->select('p')
->from('Post', 'p')
->leftJoin('p.author', 'a')
->leftJoin('p.likes', 'l', 'WITH', 'l.post_id = p.id AND l.user_id = 10')
->where("p.foo = bar")
->addSelect('a AS post_author')
->addSelect('l AS post_liked')
->orderBy('p.time', 'DESC')
->setFirstResult(0)
->setMaxResults(10);
// SUCCEEDS - because joins only join a single author and single like
// no collections are joined, so LIMIT applies only to the posts, as intended
$result = $query->getQuery()->getArrayResult();
This produces a result in the form:
[
[0] => [
['id'] => 1
['text'] => 'foo',
['author'] => [
['id'] => 10,
['username'] => 'username',
],
['likes'] => [
[0] => [
['post_id'] => 1,
['user_id'] => 10,
]
],
],
[1] => [...],
...
[9] => [...]
]
Then in a second query I load the hashtags for posts loaded in the previous query. e.g.
// we don't care about orders or limits here, we just want all the hashtags
$query = $em->createQueryBuilder()
->select('p, h')
->from('Post', 'p')
->leftJoin('p.hashtags', 'h')
->where("p.id IN :post_ids")
->setParameter('post_ids', $pids);
Which produces the following:
[
[0] => [
['id'] => 1
['text'] => 'foo',
['hashtags'] => [
[0] => [
['id'] => 1,
['name'] => '#foo',
],
[2] => [
['id'] => 2,
['name'] => '#bar',
],
...
],
],
...
]
Then I just traverse the results containing hashtags and append them to the original (ordered and limited) results. This approach ends up being much faster (even though it uses more queries), as it avoids GROUP BY and COUNT, fully leverages MySQL indexes, and allows for more complex queries, such as the one I posted here.
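The merge itself is just a couple of loops; roughly (the variable names here are assumptions, not from the original code):

// Index the hashtag rows from the second query by post id...
$hashtagsByPost = [];
foreach ($hashtagRows as $row) {
    $hashtagsByPost[$row['id']] = $row['hashtags'];
}

// ...then attach them to the ordered, limited base results from the first query.
foreach ($baseResults as &$post) {
    $post['hashtags'] = isset($hashtagsByPost[$post['id']])
        ? $hashtagsByPost[$post['id']]
        : [];
}
unset($post);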
You can configure the paginator to use a simpler 'count' sql strategy by doing one or more of the optimizations below.
$paginator = new Paginator($query, false);
$paginator->setUseOutputWalkers(false);
If results are unexpected you may want to do a DISTINCT select (select('DISTINCT p'))
For us it made massive improvements and we had no need to write or use a custom paginator.
More details can be found on this site. Note that I am the owner of that website.

SPARQL query returning incomplete/inconsistent results

I am trying to run the following query to get all the properties of a resource:
select distinct ?property
where {
<http://dbpedia.org/resource/Bildøy> ?property ?value
}
on http://dbpedia.org/snorql/
However, I only get a few results, and not the ones I was expecting. Most of the properties shown on this page are missing: http://dbpedia.org/page/Bild%C3%B8y
Could this be because of the ø letter in the URI? The query seems to work fine with other resources, but I have the same problem with other resources containing the ø letter (example: http://dbpedia.org/page/Rad%C3%B8y).
When I run the query in a PHP script I get the following results:
array (
0 => 'dbpedia-owl:wikiPageInLinkCountCleaned',
1 => 'dbpedia-owl:wikiPageRank',
2 => 'dbpedia-owl:wikiHITS',
3 => 'dbpedia-owl:wikiPageOutLinkCountCleaned',
)
array (
0 => 'http://www.w3.org/2002/07/owl#sameAs',
)
It was the ø letter causing the problem. By using the PHP urlencode() function on the resource name (percent-encoding its UTF-8 bytes) before running the query, the properties are returned. ø is translated to %C3%B8, which is also the value used in the DBpedia URI http://dbpedia.org/page/Bild%C3%B8y.
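Roughly like this; the variable names are just for illustration:

$name = 'Bildøy';
// urlencode() percent-encodes the UTF-8 bytes, so ø becomes %C3%B8
$uri = 'http://dbpedia.org/resource/' . urlencode($name);

$query = "select distinct ?property where { <$uri> ?property ?value }";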

Laravel: How to take the last n rows after ordering in ascending order?

I have 3 columns id, msg and created_at in my Model table. created_at is a timestamp and id is primary key.
I also have 5 rows: world => time4, hello => time2, haha => time1, hihio => time5 and dunno => time3, and these rows are arranged in ascending order (as listed here) based on their id.
In Laravel 4, I want to fetch these rows, arrange them in ascending order and take the last n (in this case, 3) records. So I want the dunno, world and hihio rows displayed like this in a div:
dunno,time3
world,time4
hihio,time5
What I have tried
Model::orderBy('created_at','asc')->take(3);
undesired result :
haha,time1
hello,time2
dunno,time3
Also tried
Model::orderBy('created_at','desc')->take(3);
undesired result :
hihio,time5
world,time4
dunno,time3
I have also tried the reverse with no luck
Model::take(3)->orderBy('created_at','asc');
This problem seems fairly simple but I just can't seem to get my logic right. I'm still fairly new to Laravel 4, so I would give bonus points to solutions better than using orderBy() and take(), if there are any. Thank you very much!
You are very close.
It sounds like you want to first order the results in descending order
Model::orderBy('created_at','desc')->take(3);
but then reverse the result. You can do this in one of two ways: either the traditional PHP way (using array_reverse)
$_dates = Model::orderBy('created_at', 'desc')->take(3)->get();
$dates = array_reverse($_dates->all());
Or the Laravel way, using the reverse method of Laravel's Collection class.
$dates = Model::orderBy('created_at', 'desc')->take(3)->get()->reverse();
Check out Laravel's Collection documentation at their API site at http://laravel.com/api/class-Illuminate.Support.Collection.html
Now $dates will contain the output you desire.
dunno,time3
world,time4
hihio,time5
You're pretty close with your second attempt. After retrieving the rows from the database, you just need to reverse the result. Assuming you have an instance of Illuminate\Support\Collection, you just need to do the following:
$expectedResult = $collection->reverse();
To get last three rows in ascending order:
$_dates = Model::orderBy('created_at', 'desc')->take(3)->get()->reverse();
Now, the JSON output of $_dates will give you an object of objects.
To get array of objects use:
$_dates = Model::orderBy('created_at', 'desc')->take(3)->get()->reverse()->values();
$reverse = Model::orderBy('created_at', 'desc')->take(3)->get();
$show = $reverse->reverse();
