I have been working with Laravel's Eloquent query builder. I have a situation in which I am filtering records based on search queries. I have an integer field for which I want to filter data in ranges. The user can select any of the available ranges, for example:
0-15, 15-30, and 30 and above.
For this purpose I found the query builder's whereBetween() clause very helpful. But it becomes difficult for the last option, when I want to select 30 and above.
I would really appreciate it if someone could help me with some trick to make this query
->whereBetween('time_taken', [$lower_limit, $upper_limit])
work for all cases.
I don't want to write an additional line for this case, where I could use a simple where clause:
->where('time_taken', '>=', $upper_limit)
The practical solution here is to just choose the appropriate SQL condition (I'm using a ternary operator here to keep it more compact):
$query = App\Operation::query();
(empty($upper_limit)) ? $query->where('time_taken','>=', $lower_limit)
: $query->whereBetween('time_taken', [$lower_limit, $upper_limit]);
$results = $query->get();
Sure it would be nice to have just one line:
$results = App\Operation::whereBetween('time_taken', [$lower_limit, $upper_limit])->get();
But that's not possible in this case, not unless you want to extend the Laravel Query Builder and modify the way it handles empty parameters in the range passed as the value.
Writing clean and concise code is something we all strive to achieve, but one line solutions are not always possible. So my advice is to stop fixating (something I sometimes do myself) on things like this, because in some cases it's a lost cause that just ends up wasting time.
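That said, if the goal is simply to keep the call chain in one fluent statement, Laravel's when() method on the query builder lets you branch inline. A minimal sketch, assuming the same $lower_limit/$upper_limit variables as above:

```php
// Sketch using the query builder's when() method: the second closure
// is the default branch, which runs when the condition is falsy, so an
// empty upper limit falls back to a simple >= comparison.
$results = App\Operation::query()
    ->when(
        empty($upper_limit),
        fn ($q) => $q->where('time_taken', '>=', $lower_limit),
        fn ($q) => $q->whereBetween('time_taken', [$lower_limit, $upper_limit])
    )
    ->get();
```

This keeps the chain in one statement without extending the builder, at the cost of slightly less obvious control flow than the explicit ternary.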
You can try any of these:
Method 1:
->whereBetween('time_taken', [$lower_limit, ModelName::max('time_taken')])
Method 2:
->whereBetween('time_taken', [$lower_limit, DB::table('table_name')->max('time_taken')])
Method 3:
$max = ModelName::max('time_taken');
//or
$max = DB::table('table_name')->max('time_taken');
//then
->whereBetween('time_taken', [$lower_limit, $max])
max() returns the highest value from the corresponding column.
My application dynamically builds and runs complex queries to generate reports. In some instances I need to get multiple, somewhat arbitrary date ranges, with all other parameters the same.
So my code builds the query with a bunch of joins, wheres, sorts, limits etc and then runs the query. What I then want to do is jump into the Builder object and change the where clauses which define the date range to be queried.
So far, I have made it so that the date range is set up before any other wheres, and then tried to manually change the value of the relevant attribute in the wheres array, like this:
$this->data_qry->wheres[0]['value'] = $new_from_date;
$this->data_qry->wheres[1]['value'] = $new_to_date;
Then I do (having already run the query once)
$this->data_qry->get();
It doesn't work though; the query just runs with the original date range. Even if my way worked, I still wouldn't like it, as it seems to be shot through with a precarious dependence (some sort of coupling?), i.e. if the date wheres aren't set up first then it all falls apart.
I could set the whole query up again from scratch, just with a different date range, but that seems over the top, as everything else in the query needs to be the same as the previous time it was used.
Any ideas for how to achieve this in the correct / neatest way are very welcome.
Thanks,
Geoff
You can use clone to duplicate the query and then run it with different where statements. First, build the query without the from-to constraints, then do something like this:
$query1 = $this->data_qry;
$query2 = clone $query1;
$result1 = $query1->where('from', $from1)->where('to', $to1)->get();
$result2 = $query2->where('from', $from2)->where('to', $to2)->get();
The suggestion from @lukasgeiter to use clone is definitely the way to go; the reason is that an Eloquent\Builder object contains an internal reference to a Query\Builder that also needs to be duplicated.
To keep the flow of your app and get back to a more functional style, you can use Laravel's with() helper, which simply returns the object passed in:
$result1 = with(clone $this->data_qry)->where('from', $from1)->where('to', $to1)->get();
$result2 = with(clone $this->data_qry)->where('from', $from2)->where('to', $to2)->get();
For those who want a simpler and shorter syntax, you can chain the clone() method directly on the query builder.
$result1 = $this->data_qry->clone()->where('from', $from1)->where('to', $to1)->get();
$result2 = $this->data_qry->clone()->where('from', $from2)->where('to', $to2)->get();
Kind of a strange question; I assume PHP is faster, but maybe someone knows better.
Question is the following:
A table with approximately 10k rows and a varchar(550) field. The field is indexed.
The SELECT that retrieves the data has a lot of OR cases and does a LIKE scan for every condition.
Query is run from PHP script.
Would it be faster to lowercase the search substrings in the script before running the query?
e.g.
SELECT * FROM table WHERE field LIKE LOWER('%substring_1%') OR field LIKE LOWER('%substring_2%') OR ... field LIKE LOWER('%substring_100%')
Or something similar to:
//$substr_array contains all substrings
$condition = "";
foreach ($substr_array as $substring) {
$condition .= "field LIKE '%".mb_strtolower($substring, "UTF-8")."%' OR ";
}
$condition = substr($condition, 0, -4);
$query = "SELECT * FROM table WHERE $condition";
$result = query($query);
This is an opinion based question, but a question nonetheless. So here goes my answer:
SELECT *
FROM table
WHERE field LIKE LOWER('%substring_1%')
OR field LIKE LOWER('%substring_2%')
OR ... field LIKE LOWER('%substring_100%')
This scenario depends on how well your DB is structured. If you don't have a lot of records in your database, then it should not matter, given the "outstanding" performance we have nowadays...
However: "Another way for case-insensitive matching is to use a different “collation”. The default collations used by SQL Server and MySQL do not distinguish between upper and lower case letters—they are case-insensitive by default." (extract from use-the-index-luke)
You should read that article, it's very good.
Now down to the second part. The most expensive part, performance-wise, of the PHP code will be the foreach() statement. HERE you can find more info on how much time a foreach() loop takes, and a very good comparison with for() and while() loops. It's good to know them. So basically, what you do here is first spend some time preparing the query, which costs time, and then actually run it, which also costs time.
In conclusion, the real battle here will be: which is faster, a foreach() in PHP or the LOWER() function in SQL?
Given the materials above to read and form your own conclusions from, in my opinion the best way to go about this is to just leave the query as it is and drop the foreach(), because queries are the most cost-expensive part, and adding a foreach() will just slow your performance down. At least you cut something out of the picture. :)
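If you do end up keeping the PHP-side lowering anyway, the string building at least doesn't need the manual trailing-OR trim. A small sketch with made-up substrings (note this still interpolates values straight into SQL, so real code should escape or bind them):

```php
// Build one LIKE condition per substring with array_map(), then join
// them with " OR " via implode(), so no trailing separator to trim.
$substr_array = ['Foo', 'BAR', 'BaZ']; // hypothetical search substrings
$conditions = array_map(
    fn ($s) => "field LIKE '%" . mb_strtolower($s, 'UTF-8') . "%'",
    $substr_array
);
$query = 'SELECT * FROM table WHERE ' . implode(' OR ', $conditions);

echo $query;
// SELECT * FROM table WHERE field LIKE '%foo%' OR field LIKE '%bar%' OR field LIKE '%baz%'
```
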
I really hope that this helps you!
Keep on coding!
Ares.
I would like to know if there is a way / method to set conditional boosts
Ex:
if( category:1 )( field_1^1.5 field_2^1.2 )else if( category:3 )( field_3^7.5 field_4^5.2 )
I'm planning to set the "qf" and "pf" parameters this way in order to boost my results. Is it possible?
Conceptually, yes. It could be done using function queries (http://wiki.apache.org/solr/FunctionQuery), which include an if function, but I wasn't able to do it myself, since I couldn't use the == operator.
Also, you could write your own function query.
But anyway, right now this looks more like a good place to start than a concrete answer.
I think you have two ways of doing this...
The first way is to simplify things at index time: create another set of redundant fields in the schema (e.g. boostfield_1, boostfield_2, etc.), and if the document's category is 1, set the value of boostfield_1 to field_1 and boostfield_2 to field_2. But if the category is 2, you can set them to other fields.
This will allow you to use "pf" straight away without any conditions, as you already specified the conditions at index time and indexed the document differently based on its category. The problem with that is you won't be able to change the boost values of the fields according to the category, but it is a simpler way anyway.
The second way is to use the _val_ hook or the bq parameter to specify a boost query; you can then write the same condition as follows:
url?q=query AND _val_:"(category:1 AND (field_1:query OR field_2:query)) OR (category:3 AND field_2:query)"
The small problem here is that you repeat the query text in every inner query, which is not a big deal anyway.
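For illustration, the same conditional boost could be expressed as a single boost query. This is a sketch only: the field names and boost values are the ones from the question, and bq/defType are the standard dismax/edismax request parameters:

```
q=query
defType=edismax
bq=(category:1 AND (field_1:query^1.5 OR field_2:query^1.2)) OR (category:3 AND (field_3:query^7.5 OR field_4:query^5.2))
```

Each clause only adds to the score when its category matches, which approximates the if/else behaviour asked for without function queries.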
I'm not sure that I have the terminology correct, but basically I have a website where members can subscribe to topics they like, and their details go into a 'favourites' table. When an update is made to a topic, I want each subscribed member to be sent a notification.
What I have so far is:
$update_topic_sql = "UPDATE topics SET ...My Code Here...";
$update_topic_res = mysqli_query($con, $update_topic_sql) or die(mysqli_error($con));
$get_favs_sql = "SELECT member FROM favourites WHERE topic = '$topic'";
$get_favs_res = mysqli_query($con, $get_favs_sql);
//Here I Need to update the Members selected above and enter them into a notes table
$insert_note_sql = "INSERT INTO notes ....";
Does anyone know how this can be achieved?
Ok, so we've got our result set of users. Now, I'm going to assume from the nature of the question that you may be a bit of a newcomer to PHP, SQL (MySQL in this case), web development, or all of the above.
Your question part 1:
I have no idea how to create an array
This is easier than you may think, and if you've already tried this then I apologize; I don't want to insult your intelligence. :)
Getting an array from a mysqli query is just a matter of a function call and a loop. When you ran your select query and saved the return value to a variable, you stored a mysqli result set. The mysqli library supports both procedural and object oriented styles, so since you're using the procedural method, so will I.
You've got your result set
$get_favs_res = mysqli_query($con, $get_favs_sql);
Now we need an array! At this point we need to think about exactly what our array should contain and what we need to do with its contents. You've stated that you want to make an array out of the results of the SELECT query.
For the purposes of example, I'm going to assume that the "member" field you've returned is an ID of some sort, and therefore a numeric type, probably of type integer. I also don't know what your tables look like, so I'll be making some assumptions there too.
Method 1
//perform the operations needed on a per row basis
while($row = mysqli_fetch_assoc($get_favs_res)){
echo $row['member'];
}
Method 2
//instead of having to do all operations inside the loop, just make one big array out of the result set
$memberArr = array();
while($row = mysqli_fetch_assoc($get_favs_res)){
$memberArr[] = $row;
}
So what did we do there? Let's start from the beginning to give you an idea of how the array is actually being generated. First, the conditional in the while loop. We're setting a variable as the loop condition? Yup! And why is that? Because when PHP (like a lot of other languages) sets that variable, the conditional will check the value of the variable for true or false.
Ok, so how does it get set to false? Remember, anything that is not boolean false, not null, and not 0 (assuming no type checking) resolves to true when it's assigned to something (again, no type checking).
The function returns one row at a time in the format of an associative array (hence the _assoc suffix). The keys to that associative array are simply the names of the columns from the row. So, in your case, there will be one value in the row array with the name "member". Each time mysqli_fetch_assoc() is called with your result set, a pointer is pointed to the next result in the set (it's an ordered set) and the process repeats itself. You essentially get a new array each time the loop iterates, and the pointer goes to the next result too. Eventually, the pointer will hit the end of the result set, in which case the function will return a NULL. Once the conditional picks up on that NULL, it'll exit.
In the second example, we're doing the exact same thing as the first. Grabbing an associative array for each row, but we're doing something a little differently. We're constructing a two dimensional array, or nested array, of rows. In this way, we can create a numerically indexed array of associative arrays. What have we done? Stored all the rows in one big array! So doing things like
$memberArr[0]['member'];//will return the id of the first member returned
$memberArr[1]['member'];//will return the id of the second member returned
$lastIndex = sizeof($memberArr) - 1;
$memberArr[$lastIndex]['member'];//will return the id of the last member returned.
Nifty!!!
That's all it takes to make your array. If you choose either method and do a print_r($row) (method 1) or print_r($memberArr) (method 2) you'll see what I'm talking about.
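To make the shape of that two-dimensional array concrete, here is the same loop run against a mocked result set (the IDs are invented; in real code each row would come from mysqli_fetch_assoc()):

```php
// Each element mimics one associative row as mysqli_fetch_assoc()
// would return it: column name => value.
$rows = [
    ['member' => 3],
    ['member' => 7],
    ['member' => 12],
];

$memberArr = [];
foreach ($rows as $row) {
    $memberArr[] = $row; // same append as inside the while loop
}

echo $memberArr[0]['member'], "\n"; // first member id: 3
echo $memberArr[count($memberArr) - 1]['member'], "\n"; // last member id: 12
```

print_r($memberArr) shows the full nested structure: a numeric outer array whose elements are the associative row arrays.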
Your question part 2:
Here I Need to update the Members selected above and enter them into a notes table
This is where things can get kind of murky and crazy. If you followed method 1 above, you'd pretty much have to call
mysqli_query($con, "INSERT INTO notes VALUES ({$row['member']})");
for each iteration of the loop. That'll work, but if you've got 10000 members, that's 10000 inserts into your table, kinda crap if you ask me!
If you follow method two above, you have an advantage. You have a useful data structure (that two dim array) that you can then further process to get better performance and make fewer queries. However, even from that point you've got some challenges, even with our new processing friendly array.
The first thing you can do, and this is fine for a small set of users, is use a multi-insert. This just involves some simple string building (which in and of itself can pose some issues, so don't rely on this all the time). We're gonna build a SQL query string to insert everything using our results. A multi-insert query in MySQL is just like a normal INSERT except for one difference: INSERT INTO notes VALUES (1),(2),(x)
Basically, for each row you are inserting, you separate each value set, delineated by parentheses, with a comma.
$query = 'INSERT INTO notes VALUES ';
//now we need to iterate over our array. You have your choice of loops, but since it's numerically indexed, just go with a simple for loop
$numMembers = sizeof($memberArr);
for($i = 0; $i < $numMembers; $i++){
if($i > 0){
$query .= ",({$memberArr[$i]['member']})";//all but the first row need a leading comma
}
else {
$query .= "({$memberArr[$i]['member']})";//the first row does not need a comma
}
}
mysqli_query($con, $query);//add all the rows.
Doesn't matter how many rows you need to add; this will add them. However, this is still going to be a costly way to do things, and if you think your sets are going to be huge, don't use it. You're going to end up with a huge string, TONS of string concats, and an enormous query to run.
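As an aside, the first-row special case can be avoided entirely with implode(). A sketch with hypothetical member IDs (the (int) cast is only a crude guard here; real code should use prepared statements):

```php
// Mocked two-dim array as built from the result set above.
$memberArr = [
    ['member' => 3],
    ['member' => 7],
    ['member' => 12],
];

// One "(id)" group per row, then join the groups with commas,
// so no branch is needed for the first row.
$values = array_map(fn ($row) => '(' . (int) $row['member'] . ')', $memberArr);
$query = 'INSERT INTO notes VALUES ' . implode(',', $values);

echo $query; // INSERT INTO notes VALUES (3),(7),(12)
```
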
However, given the above, you can do what you're trying to do.
NOTE: These are grossly simplified ways of doing things, but I do it this way because I want you to get the fundamentals down before trying something that's going to be way more advanced. Someone is probably going to comment on this answer without reading this note telling me I don't know what I'm doing for going about this so dumbly. Remember, these are just the basics, and in no way reflect industrial strength techniques.
If you're curious about other ways of generating arrays from a mysqli result set:
The one I used above
An even easier way to make your big array, but I wanted to show you the basic way of doing things before giving you the shortcuts. This is also one of those functions you shouldn't use much anyway.
Single row as associative (as above), numeric, or both.
Some folks recommend using load files for SQL, as they are faster than inserts (meaning you would dump your data out to a file and use a LOAD query instead of running inserts).
Another method you can use with MySQL is as mentioned above by using INSERT ... SELECT
But that's a bit more of an advanced topic, since it's not the kind of query you'd see someone making a lot. Feel free to read the docs and give it a try!
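For this particular problem, an INSERT ... SELECT could do the whole job in one statement on the database side. A sketch only, assuming notes has a member column matching favourites.member (adjust column names to your actual schema):

```sql
-- Copy every subscribed member for the topic straight into notes,
-- without pulling the rows into PHP at all.
INSERT INTO notes (member)
SELECT member
FROM favourites
WHERE topic = 'some_topic';
```

This sidesteps both the loop and the string building entirely, since the SELECT feeds the INSERT inside MySQL.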
I hope this at least begins to solve your problem. If I didn't cover something, something didn't make sense, or I didn't answer your question fully, please drop me a line and I'll do whatever I can to fix it for you.
So I have a variable and a recordset:
$firstRecordID = 1;
$records = Recordset::all();
I want to filter the recordset:
$filteredRecords = $records->find(function($record) use ($firstRecordID){
return ($record->id == $firstRecordID);
});
In this example, assuming the record id is a primary key, only one row will be returned; however, I think the filter will keep on going.
My question is, how do I force the filter to stop after a certain condition is met?
edit: I'll add another more complex example:
$filteredRecords = $records->find(function($record) use ($lastRecordID){
$conditions = $records->conditions;
// in here I would find some value
// based on some other unknown record based on the $conditions
// if the value matches any of the $conditions return the row
// if the value matches a specified stop condition
// (which is defined by the user) stop retrieving rows.
});
I think the solution is to use first($filter):
$filteredRecords = $records->first(function($record) use ($firstRecordID){
return ($record->id == $firstRecordID);
});
http://li3.me/docs/lithium/util/Collection::first
Short answer
Throw an exception inside the filter and catch it on the outside. You will probably have to collect the included items in an array yourself since you won't have access to find's return value.
Long answer
You are focusing too much on a particular tactic (stopping find using a filter) when it might do you some good to step back and consider what you are trying to accomplish.
It honestly sounds like you are trying to eat your cake and have it too: you want to be flexible and specify your search conditions in a function you can easily change, but you also want to be efficient and not pull more data out of the database than you have to.
The thing is, passing a filter function into find is already as inefficient as you can get it, because li3 will have to pull every record from the database before your filter is even called. That's where the inefficiency is. Stopping the filtering process once you hit 10 items or whatever won't make much of a difference (unless your filtering is also very expensive).
I would recommend thinking of the following:
Do you really need to allow an arbitrary filter function at all? Is it possible to use declarative conditions? If so, you can supply limit options and let the database do all the work.
If you can't, consider using find to fetch a small batch of records and collect the ones you want to keep in an array. Repeat with another small batch until you've checked every item in the database or filled your array.
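The batching idea in that last point can be sketched with li3's limit and page find options. The batch size and the stop condition (10 kept rows) are placeholders, and the filter shown is just the id check from the question:

```
// Sketch: scan the records in batches of 100, keep matches, and stop
// as soon as a user-defined stop condition is met.
$kept = [];
$page = 1;
do {
    $batch = Recordset::all(array('limit' => 100, 'page' => $page));
    foreach ($batch as $record) {
        if ($record->id == $firstRecordID) { // your real condition here
            $kept[] = $record;
        }
        if (count($kept) >= 10) { // hypothetical stop condition
            break 2; // exit both loops
        }
    }
    $page++;
} while (count($batch) === 100);
```

Each iteration only pulls one batch from the database, so stopping early actually saves work, unlike a filter passed to find, which sees every row regardless.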