I have this code:
$fee = sys_fee::where('payment', '=', 'Paid')->get();
$totalFee = $fee->sum('amount');
Can anyone tell me whether the database is queried for both of these lines, or does it only go to the database once, in the first line?
Put simply: does the following line execute another query on the database, or does it only work with the collection in memory?
$totalFee = $fee->sum('amount');
My current understanding is that it doesn't execute another query on the database.
Your $fee is an instance of Collection; its sum() method doesn't make any SQL calls, it simply iterates over all the elements of that collection.
https://laravel.com/docs/5.7/collections#method-sum
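For comparison, a minimal sketch (model and column names taken from the question): summing on the Collection happens in PHP after the single query above, whereas calling sum() directly on the query builder lets the database do the aggregation in one query instead.
// One query on get(); sum() then iterates the Collection in PHP (no extra SQL).
$fee = sys_fee::where('payment', '=', 'Paid')->get();
$totalFee = $fee->sum('amount');
// Alternative: the database aggregates (SELECT SUM(amount) ...) in a single query.
$totalFee = sys_fee::where('payment', '=', 'Paid')->sum('amount');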
I am trying to extract some data from my PHP Laravel application like this:
$x = Carbon::now()->timestamp;
$data = Notifications::all()->where('happened_at', '>', $x)
    ->where('id_user', Auth::user()->getAuthIdentifier());
If I enter this sql statement in my oracle database directly it works perfectly:
select * from notifications where happened_at > '10-06-2017 00:11:12,000000000';
This returns the correct rows. In my PHP Laravel code, however, it doesn't return the right rows (it returns all the rows from the database).
Later edit: the where is the problem. I just want to compare the timestamp column in my DB ('happened_at') with the current time, but I don't know how.
Let's break down your model call code.
// first, you retrieve all notifications and assign them to $data
$data = Notifications::all()
    // then, you try to start a new query
    ->where('happened_at', '>', $x)
    // then, you continue the new query
    ->where('id_user', Auth::user()->getAuthIdentifier())
    // and finally, you finish the call
    ;
So basically, you already get all rows at the beginning, and then you try to start building a query that you never execute. (But you can't, since all() returns a collection.)
Remove the all() and finish up with ->get() at the end, and it should work. (Although I don't know anything about Oracle timestamps.)
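A minimal sketch of the corrected call, assuming Notifications is an Eloquent model (whether ->timestamp or the Carbon instance itself matches your Oracle column format is a separate question):
$x = Carbon::now()->timestamp;
// build one query and execute it once with get()
$data = Notifications::where('happened_at', '>', $x)
    ->where('id_user', Auth::user()->getAuthIdentifier())
    ->get();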
I have a query that returns roughly 6,000 results. Although this query executes in under a second in MySQL, once it is run through Zend Framework 2, it experiences a significant slowdown.
For this reason, I tried to do it a more "raw" way with PDO:
class ThingTable implements ServiceLocatorAwareInterface
{
    // ...
    public function goFast()
    {
        $db_config = $this->getServiceLocator()->get('Config')['db'];
        $pdo = new PDO($db_config['dsn'], $db_config['username'], $db_config['password']);
        $statement = $pdo->prepare(
            'SELECT objectNumber, thingID, thingmaker, hidden, title FROM Things',
            array(PDO::MYSQL_ATTR_COMPRESS, PDO::CURSOR_FWDONLY)
        );
        $statement->execute();
        return $statement->fetchAll(PDO::FETCH_ASSOC);
    }
}
This doesn't seem to have much of a speedup, though.
I think the problem might be that Zend is still trying to create a new Thing object for each record, even though it is only a partial list of columns. I'd really be okay not populating any objects. I really just need a few columns with those attributes to iterate over.
As suggested by user MonkeyZeus, the following was used for benchmarking:
$start = microtime(true);
$result = $statement->fetchAll(PDO::FETCH_ASSOC);
echo (microtime(true) - $start).' seconds';
And in response:
In a VM, that returns 0.0050520896911621. This is in line with what it is when I just run the command straight in MySQL. I believe the overhead is in Zend, but I'm not sure quite how to track that down. Again, if I had to guess, I'd say it is because Zend is adding overhead while trying to be nice with the results, but I'm not quite sure how to proceed after that.
[I'm] not so worried about the query. It is a single select statement. goFast() gets called by the Zend indexAction(), similar to other actions used across the project; this one is just way slower at returning the page. One problem I found was that Zend's $this->url() was slowing things down a bit. So I removed those, but the performance still isn't great.
How can I speed this up?
When you say that the query runs in under a second in MySQL, what do you mean? Did you run the query and print all 6,000 rows, or did you just run it and the command line printed only the first/last few of them?
The problem might be that by fetching them all, going through the cursor, you are copying all of the data (6,000 rows) from MySQL into PHP and then returning it. Are you sure you want to do this?
Maybe you could return a statement/cursor for the query and then iterate through the rows only when you really need them?
Your problem is not the SQL itself, but fetching it all into a PHP array at once.
You can test this by logging how long it takes to actually execute the SQL versus how long it takes to fetch the results into a PHP array.
Do not use fetchAll(); return the statement itself, and in the function/code where you need to foreach over the rows, use the statement to fetch each row one by one.
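A rough sketch of that idea, reusing the goFast() method from the question (the caller variable $thingTable is an assumption):
public function goFast()
{
    $db_config = $this->getServiceLocator()->get('Config')['db'];
    $pdo = new PDO($db_config['dsn'], $db_config['username'], $db_config['password']);
    $statement = $pdo->prepare('SELECT objectNumber, thingID, thingmaker, hidden, title FROM Things');
    $statement->execute();
    // return the statement instead of fetchAll()'ing 6,000 rows into an array
    return $statement;
}
// caller: iterate one row at a time
$statement = $thingTable->goFast();
while ($row = $statement->fetch(PDO::FETCH_ASSOC)) {
    // work with $row
}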
When I execute a PDO statement, internally a result set is stored, and I can use ->fetch() to get a row from the result.
If I wanted to convert the entire result to an array, I could do ->fetchAll().
With Laravel, in the Query Builder docs, I only see a way to get an array result from executing the query.
// example query, ~30,000 rows
$scores = DB::table("highscores")
->select("player_id", "score")
->orderBy("score", "desc")
->get();
var_dump($scores);
// array of 30,000 items...
// unbelievable ...
Is there any way to get a result set from Query Builder like PDO would return? Or am I forced to wait for Query Builder to build an entire array before it returns a value?
Perhaps some sort of ->lazyGet(), or ->getCursor()?
If this is the case, I can't help but see Query Builder as an extremely short-sighted tool. Imagine a query that selects 30,000 rows. With PDO I can step through row by row, one ->fetch() at a time, and handle the data with very little additional memory consumption.
Laravel Query Builder on the other hand? "Memory management, huh? It's fine, just load 30,000 rows into one big array!"
PS: yes, I know I can use ->skip() and ->take() to offset and limit the result set. In most cases this works fine, as presenting a user with 30,000 rows is not even usable. But if I want to generate large reports, I can see PHP running out of memory easily.
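For reference, this is the kind of PDO loop I mean (a minimal sketch; the DSN and credentials are placeholders):
$pdo = new PDO($dsn, $username, $password);
$statement = $pdo->prepare('SELECT player_id, score FROM highscores ORDER BY score DESC');
$statement->execute();
// one row at a time; memory use stays roughly constant regardless of result size
while ($row = $statement->fetch(PDO::FETCH_ASSOC)) {
    // handle $row
}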
After #deczo pointed out an undocumented function ->chunk(), I dug around in the source code a bit. What I found is that ->chunk() is a convenience wrapper that splits my query into several queries, automatically populating the ->skip($m)->take($n) parameters. If I wanted to build my own iterator using ->chunk() with my data set, I'd end up with 30,000 queries on my DB instead of 1.
This doesn't really help either, because ->chunk() takes a callback, which forces me to couple my looping logic to the point where I'm building the query. Even if the callback were defined somewhere else, the query is going to happen in the controller, which should have little interest in the intricacies of my View or Presenter.
Digging a little further, I found that all Query Builder queries inevitably pass through \Illuminate\Database\Connection#run.
// https://github.com/laravel/framework/blob/3d1b38557afe0d09326d0b5a9ff6b5705bc67d29/src/Illuminate/Database/Connection.php#L262-L284
/**
 * Run a select statement against the database.
 *
 * @param  string  $query
 * @param  array   $bindings
 * @return array
 */
public function select($query, $bindings = array())
{
    return $this->run($query, $bindings, function($me, $query, $bindings)
    {
        if ($me->pretending()) return array();
        // For select statements, we'll simply execute the query and return an array
        // of the database result set. Each element in the array will be a single
        // row from the database table, and will either be an array or objects.
        $statement = $me->getReadPdo()->prepare($query);
        $statement->execute($me->prepareBindings($bindings));
        return $statement->fetchAll($me->getFetchMode());
    });
}
See that nasty $statement->fetchAll near the bottom?
That means arrays for everyone, always and forever; your wishes and dreams abstracted away into an unusable tool, the Laravel Query Builder.
I can't express the valley of my depression right now.
One thing I will say though is that the Laravel source code was at least organized and formatted nicely. Now let's get some good code in there!
Use chunk:
DB::table('highscores')
    ->select(...)
    ->orderBy(...)
    ->chunk($rowsNumber, function ($portion) {
        foreach ($portion as $row) {
            // do whatever you like
        }
    });
Obviously the returned result will be just the same as calling get, so:
$portion; // array of stdObjects
// and for Eloquent models:
Model::chunk(100, function ($portion) {
    $portion; // Collection of Models
});
Here is a way to use the Laravel query builder to build the query, but then use the underlying PDO fetch to loop over the record set, which I believe will solve your problem: it runs one query and loops over the record set, so you don't run out of memory on 30k records.
This approach will use all the config you set up in Laravel, so you don't have to configure PDO separately.
You could also abstract out a method to make this easy to use, one that takes in the query builder object and returns the record set (the executed PDO statement), which you would then while-loop over as below.
$qb = DB::table("highscores")
->select("player_id", "score")
->orderBy("score", "desc");
$connection = $qb->getConnection();
$pdo = $connection->getPdo();
$query = $qb->toSql();
$bindings = $qb->getBindings();
$statement = $pdo->prepare($query);
$statement->execute($bindings);
while ($row = $statement->fetch($connection->getFetchMode())) {
    // do stuff with $row
}
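If you wanted to pull that out into the reusable helper mentioned above, it could look something like this (the function name getStatement() and where you put it are assumptions):
function getStatement($qb)
{
    $connection = $qb->getConnection();
    // build the SQL and bindings from the query builder, then hand them to PDO
    $statement = $connection->getPdo()->prepare($qb->toSql());
    $statement->execute($qb->getBindings());
    return $statement;
}
// usage
$statement = getStatement(
    DB::table('highscores')->select('player_id', 'score')->orderBy('score', 'desc')
);
while ($row = $statement->fetch(PDO::FETCH_ASSOC)) {
    // do stuff with $row
}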
I have a query that returns a single result, and I'm wondering if there's a way to access the properties of that result without having to loop through it, since I am limiting it to a single result.
Here is my query:
$user = Model_User::find()
->where('email_address', Input::post('email_address'))
->where('password', Input::post('password'))
->limit(1);
The only way I've found to access the results is to run the get() method on $user and loop through the result, but I figure I'm missing something and that there's an easier way to return $user as a single object that I can work with, since I am limiting it to a single result.
What's the most efficient way to do this?
Did you try
$user->get_one()?
You could also do
$user = Model_User::find_by_email_address_and_password(Input::post('email_address'), Input::post('password'));
Have a nice day :)
Uku Loskit already gave you the right syntax.
If you always want to retrieve a single result, you can merge the code:
$user = Model_User::find()
->where('email_address', Input::post('email_address'))
->where('password', Input::post('password'))
->get_one();
A piece of advice: be careful about using Input::post('var_name') directly; it would be better to run validation before saving the variables.
Another way is to set the framework to perform some action, like htmlentities(), on every $_GET and $_POST variable.
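A rough sketch of that kind of validation, assuming FuelPHP's Validation class (field names taken from the question):
$val = Validation::forge();
$val->add_field('email_address', 'Email address', 'required|valid_email');
$val->add_field('password', 'Password', 'required');
if ($val->run())
{
    // only hand validated values to the query
    $user = Model_User::find()
        ->where('email_address', $val->validated('email_address'))
        ->where('password', $val->validated('password'))
        ->get_one();
}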
I want to use a single query to retrieve:
items of any category (no filter applied);
only items of a single category (limited to a particular category).
For that purpose I should be able to write a Doctrine query which includes a where clause only when a certain condition is met (e.g. a part of the URL exists); otherwise the where clause is not included in the query.
Of course, I tried using an if statement, but since the Doctrine query is chained, an error is thrown.
So I guess the solution might be some (to me unknown) way of writing Doctrine queries in an unchained form (not having each row start with "->" and having each row of the query end with a semicolon ";").
That way the use of an if statement would be possible, I guess.
Or, maybe there's already some extremely simple solution to this matter?
Thanks for your reply!
I am unfamiliar with CodeIgniter, but can't you write something like this?
$q = Doctrine_Query::create()
    ->from('items');
if ($cat) {
    $q->where('category = ?', $cat);
}
In your model, pass the condition for the where clause as a parameter to a function.
In the example below I assume the function name is filter_query() and pass the where condition as a parameter.
function filter_query($condition = '')
{
    $this->db->select('*');
    $this->db->from('TABLE NAME');
    if ($condition != '') {
        // apply the where clause only when a condition was passed in
        $this->db->where($condition);
    }
    return $this->db->get()->result();
}
In the above example I have used CodeIgniter's Active Record class.
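A hypothetical controller call showing both cases (the model name items_model and the category value are assumptions):
$this->load->model('items_model');
// no filter applied: items of any category
$all_items = $this->items_model->filter_query();
// limited to a particular category
$books = $this->items_model->filter_query("category = 'books'");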