Creating PDO Iterator - php

I'm following this article to create a Database Iterator. I have similar tables and records, but when I execute the code I get a blank page. I think the issue is with $data = new DbRowIterator($stmt);, since it's not returning a proper iterator for the accept() method in the FilterIterator class to run against. I'm using Laravel.

This article is a hoax. It claims that the offered technique speeds up database calls with PDO, but in fact it slows them down significantly.
The premises it is grounded on are also wrong. PDOStatement is already traversable; you don't need any tricks to iterate over a PDOStatement using foreach.
The benchmarking section (as often happens) is a blatant swindle. The author compares fetchAll(), which obviously consumes a lot of time and memory, with fetching a single row at a time. And even then his timing is much worse than with a proper solution.
The author of this article knows no PDO and, worse yet, no SQL.
Everything he invented already exists in PDO and SQL:
PDOStatement is already traversable. No need to reinvent the wheel.
The most important part: filtering has to be done in SQL, not in PHP. That's a textbook rule. If you need only 63992 out of 250000 records, you should select only those 63992 in the first place. And if you ask the database to select the records for you, it will be incomparably faster.
So, to get everything this guy wrote with such effort, you need only a few lines of PDO:
$period = date_create("last_week")->format('Y-m-d 00:00:00');
$sql = 'SELECT * FROM `gen_contact` WHERE contact_modified > ? ORDER BY `contact_modified` DESC';
$stmt = $pdo->prepare($sql);
$stmt->execute([$period]);
$stmt->setFetchMode(PDO::FETCH_OBJ); // so $row->contact_name below works
foreach ($stmt as $row) {
    echo sprintf(
        '%s (%s) | modified %s',
        $row->contact_name,
        $row->contact_email,
        $row->contact_modified
    ) . PHP_EOL;
}
And this code will be many times faster, as it does the filtering at the database level instead of fetching the whole table and then filtering on the PHP side.
As for the error you get, just set PDO to exception mode and watch for the regular PHP errors.

PDOStatement is not an Iterator, it's Traversable:
It doesn't implement the \Iterator interface but the \Traversable interface, meaning it can be used in a foreach but you can't iterate over it manually, because it doesn't expose its inner behavior.
I don't think this makes much difference in the present case, but there is still a major difference between the two interfaces.
I stumbled upon this question looking for a way to properly iterate over a PDOStatement (without using a foreach), and the article mentioned, although pretty clumsy, actually looks fine to me.

As @VincentChalnot said, PDO doesn't return instances of the Iterator interface, but it does return Traversables, and it turns out that PHP provides an Iterator for Traversable objects: the IteratorIterator (http://php.net/manual/en/class.iteratoriterator.php).
So if you want to use any of PHP's built-in Iterator classes, for example the CachingIterator, with a PDOStatement, then you would do it like this:
$iterator = new CachingIterator(new IteratorIterator($stmt));
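As a minimal, self-contained sketch (an in-memory SQLite database stands in for a real connection here, and the table is invented for illustration):

```php
<?php
// In-memory SQLite stands in for a real database connection.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE contacts (name TEXT)");
$pdo->exec("INSERT INTO contacts (name) VALUES ('Alice'), ('Bob')");

$stmt = $pdo->query("SELECT name FROM contacts");

// PDOStatement is only Traversable, so wrap it to get a full Iterator.
$iterator = new CachingIterator(new IteratorIterator($stmt));

$names = [];
foreach ($iterator as $row) {
    $names[] = $row['name'];
    // hasNext() is the extra capability CachingIterator provides.
    if (!$iterator->hasNext()) {
        echo "last row reached" . PHP_EOL;
    }
}
```

The same wrapping works for LimitIterator, NoRewindIterator, and the other SPL iterators that require a full Iterator rather than a plain Traversable.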

Related

Is better one function with variable table/field/value, or 60 functions with variable value?

In a php/mysql project design, is it better to write one function like this
function getFromTableOneInt($l, $t, $f, $v) {
    $stmt = mysqli_prepare($l, "select * from $t where $f = ?");
    mysqli_stmt_bind_param($stmt, "i", $v);
    mysqli_stmt_execute($stmt);
    $res = mysqli_stmt_get_result($stmt);
    $out = array();
    while ($r = mysqli_fetch_assoc($res)) {
        $out[] = $r;
    }
    return $out;
}
where the table name, field name, and value are variables, or is it better to write N functions, each with a fixed table name and field name?
Consider that table names and field names come from direct calls and are not defined by the user. User input defines only the last value, $v.
Is there some exploit on table/field names that I can't see?
Is it heavier on memory to load 1200 lines of functions saved in a PHP file on each include? Or does PHP load just pointers to the function declarations into memory?
Which one is faster or more readable, based on your experience?
Thanks for any input/suggestions.
Generalize your database handling (opening connections, running queries, etc.) into a class, but not the statements themselves.
Then provide multiple methods that use that general class, like:
registerUser($name, $password);
findUserByName($name);
findUserByEmail($email);
Think about it like this: can my single method handle all of the above?
No, you will still end up writing multiple methods.
Reusing the same statement to save a few bytes of memory is a concern when you're writing a virus, not a website.
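A rough sketch of that split, with hypothetical class and method names (an in-memory SQLite database keeps it self-contained; a real app would pass a mysql: DSN):

```php
<?php
// General-purpose wrapper: owns the connection, runs prepared queries.
class Database {
    private PDO $pdo;

    public function __construct(string $dsn) {
        $this->pdo = new PDO($dsn);
        $this->pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    }

    public function run(string $sql, array $params = []): PDOStatement {
        $stmt = $this->pdo->prepare($sql);
        $stmt->execute($params);
        return $stmt;
    }
}

// The specific methods live in a small repository, not in one generic function.
class UserRepository {
    public function __construct(private Database $db) {}

    public function registerUser(string $name, string $password): void {
        $hash = password_hash($password, PASSWORD_DEFAULT);
        $this->db->run("INSERT INTO users (name, password) VALUES (?, ?)", [$name, $hash]);
    }

    public function findUserByName(string $name): ?array {
        $row = $this->db->run("SELECT * FROM users WHERE name = ?", [$name])
                        ->fetch(PDO::FETCH_ASSOC);
        return $row ?: null;
    }
}
```

Each named method documents its intent and keeps table/column names out of user reach, which is the real win over one generic lookup function.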
TL;DR - For me it's about extensibility and scalability. IMHO, use a supported framework for this kind of thing or stick with inline SQL.
Frameworks
There are frameworks for this. Codeigniter, Laravel, Symfony, CakePHP and others abstract out the database functions to make them agnostic (PDO, MSSQL, MySQL users all use the same syntax to run their queries). So at any given moment you could move your MySQL DB over to MSSQL and not have to update your code. This can make the syntax more readable for most simple queries and help with things like query binding and value escaping. I have found with larger, more complex queries, these frameworks get in the way and I just want to look at a block-structured query to troubleshoot.
DIYs
I was once brought in to fix code like this from another dev: he had abstracted all the mysql queries in the app (as well as other tasks) into helper functions. Over time, the special conditions and special needs added up, and these helper functions became quite large. What's worse, to other developers (like me) it made the code less readable. I had to go back and undo all that mess.
That's not to say you don't have a fine function there, just my 2 cents.

is this the wrong/inefficient way to create and use PDO objects in php?

I'm messing around with an example site from teamtreehouse's PHP/MySQL project... In their 'model' code, they have all DB calls inside functions in a file called products.php. Each of these functions creates a new PDO object by importing an include file, for example:
function get_products_recent() {
    require(ROOT_PATH . "inc/database.php"); // this instantiates a new PDO object called $db
    try {
        $results = $db->query("
            SELECT name, price, img, sku, paypal
            FROM products
            ORDER BY sku DESC
            LIMIT 4");
    } catch (Exception $e) {
        echo "Data could not be retrieved from the database. get_products_recent";
        exit;
    }
    $recent = $results->fetchAll(PDO::FETCH_ASSOC);
    $recent = array_reverse($recent);
    return $recent;
}
But I was finding the db queries were slowing down page loads significantly.
After some googling I found the PDO::ATTR_PERSISTENT => true attribute that can be added to the PDO constructor... and that sped the page loads back up to 'normal'.
But is this wrong/inefficient practice for real-world scenarios? Is there a better way to open and use the PDO object rather than creating a new PDO object inside every function that makes a db call?
Yes, you are doing it wrong.
You should create a single PDO object at the beginning of the script and use it throughout the whole application. This is a very basic rule that has to be followed in all circumstances. It is not just the "best" design; it is the only acceptable one.
In fact, you are practically killing your DB server, opening a new connection every time a PDO object is created. And the time consumed by the connection is not the only problem: on a live server the maximum number of connections will be reached immediately, and using persistent connections will make it even worse.
And yes, you should have titled your question differently, as people here never bother to read the question body but judge the question by its title alone.
Setting a persistent connection is a false solution. That feature has its own reasons for use, which have nothing to do with correcting an initially wrong design.
The reason your page loads slowly is simply that you create a new database connection every time you run a query. As simple as that. So create a PDO object only once and then pass it as a parameter (the most convenient solution, and the one least hated by fanatics).
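A minimal sketch of the create-once-and-pass approach (the table and sample data are stand-ins, and an in-memory SQLite DB replaces the real connection so the sketch runs as-is):

```php
<?php
// Create the connection once, at the top of the entry script.
$db = new PDO('sqlite::memory:'); // a real app would use a mysql: DSN
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE products (name TEXT, price REAL, sku INTEGER)");
$db->exec("INSERT INTO products (name, price, sku) VALUES ('A', 1.0, 1), ('B', 2.0, 2)");

// Every function that touches the database receives the same instance.
function get_products_recent(PDO $db): array {
    $stmt = $db->query("SELECT name, price FROM products ORDER BY sku DESC LIMIT 4");
    return array_reverse($stmt->fetchAll(PDO::FETCH_ASSOC));
}

$recent = get_products_recent($db); // no new connection opened here
```

However many times get_products_recent() is called, only one connection to the server ever exists.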
I typically create a database connection class, usually a singleton. The PDO object is a protected property of the class, and my model files extend this class. Thus, I have a single PDO object which is secure, because it can only be accessed by classes that extend the superclass.
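That pattern might be sketched roughly like this (not the poster's actual class; the names and DSN are placeholders, with in-memory SQLite keeping it runnable):

```php
<?php
// Lazy, shared connection: every caller gets the same PDO instance.
class Db {
    private static ?PDO $pdo = null;

    private function __construct() {} // block direct instantiation

    public static function pdo(): PDO {
        if (self::$pdo === null) {
            // Real code would use a mysql: DSN here.
            self::$pdo = new PDO('sqlite::memory:');
            self::$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        }
        return self::$pdo;
    }
}
```

Usage from any model class is then Db::pdo()->prepare(...), and repeated calls always return the one shared connection.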
Are you using prepared statements? They will enhance performance and also greatly reduce the risk of SQL injection attacks.
Here are some other things to consider when optimizing DB performance:
what storage engine are you using?
what are your mysql configuration settings?
are you using primary keys?
can you use a join instead of performing multiple queries?
can you cache queries?
do you have any particularly slow queries which could possibly be refactored?
HTH -- n

With Doctrine what are the benefits of using DQL over SQL?

Can someone provide me a couple clear (fact supported) reasons to use/learn DQL vs. SQL when needing a custom query while working with Doctrine Classes?
I find that if I cannot use an ORM's built-in relational functionality to achieve something, I usually write a custom method in the extended Doctrine or DoctrineTable class. In that method I write what's needed in straight SQL (using PDO with proper prepared statements/injection protection, etc.). DQL seems like an additional language to learn/debug/maintain that doesn't appear to provide enough compelling reasons to use it in most common situations. DQL does not seem to be much less complex than SQL, which would warrant its use; in fact, I doubt you could effectively use DQL without already having a solid SQL understanding. Most core SQL syntax ports fairly well across the most common DBs you'll use with PHP.
What am I missing/overlooking? I'm sure there is a reason, but I'd like to hear from people who have intentionally used it significantly and what the gain was over trying to work with plain-ole SQL.
I'm not looking for an argument supporting ORMs, just DQL when needing to do something outside the core 'get-by-relationship' type needs, in a traditional LAMP setup (using mysql, postgres, etc...)
To be honest, I learned SQL using Doctrine 1.2 :) I wasn't even aware of foreign keys, cascade operations, complex functions like group_concat, and many, many other things. Indexed search is also a very nice and handy thing that simply works out of the box.
DQL is much simpler to write, and the code is easier to understand. For example, take this query:
$query = ..... // some query for Categories
->leftJoin("c.Products p")
It will do a left join between Categories and Products, and you don't have to write ON p.category_id = c.id.
And if in the future you change the relation from one-to-many to, say, many-to-many, the same query will work without any changes at all; Doctrine will take care of that. If you did that using SQL, all the queries would have to be changed to include the intermediary many-to-many table.
I find DQL more readable and handy. If you configure it correctly, it will be easier to join objects and queries will be easier to write.
Your code will be easy to migrate to any RDBMS.
And most important of all, DQL is an object query language for your object model, not for your relational schema.
Using DQL helps you deal with objects.
In the case of inserting into the database, you will insert an object:
$test = new Test();
$test->attr = 'test';
$test->save();
In the case of selecting from the database, you will select an array, and then you can fill your object with it:
public function getTestParam($testParam)
{
    $q = Doctrine_Query::create()
        ->select('t.test_id, t.attr')
        ->from('Test t');
    $p = $q->execute();
    return $p;
}
You can check the Doctrine documentation for more details.
Zeljko's answer is pretty spot-on.
Most important reason to go with DQL instead of raw SQL (in my book): Doctrine separates entity from the way it is persisted in database, which means that entities should not have to change as underlying storage changes. That, in turn, means that if you ever wish to make changes on the underlying storage (i.e. renaming columns, altering relationships), you don't have to touch your DQL, because in DQL you use entity properties instead (which only happen to be translated behind the scenes to correct SQL, depending on your current mappings).

Multiple PDO instances in a script?

I've got a 'best practice' question about using PDO. I'm trying to finally get into object-oriented development and away from the PHP habits I developed ten years ago.
My normal method of development was to open a DB connection at the beginning of a script/page, then do a bunch of mysql_query calls as needed (mostly SELECT and INSERT). I'm using PDO for the first time, and it looks like the practice here is to create distinct PDO objects for each query/transaction. This seems like it would create multiple connections to the same DB during a single script, which seems like a lot of unnecessary overhead.
Is my read on this totally wrong?
My apologies if this is covered somewhere I missed. I did look through StackOverflow, php.net, and a PHP 5.3 book I have.
No, you should not create multiple instances of PDO in that case. Just create 1 instance and use PDO::query() on it. For example:
$pdo = new PDO(...);
/* ... */
$pdo->query("SELECT * FROM table1");
/* ... */
$pdo->query("SELECT * FROM table2");
/* ... etc ... */
If the query contains parameters, then prefer using PDO::prepare() and PDOStatement::execute() instead of PDO::query(). You can find an example in the documentation for PDO::prepare().
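For instance (reusing the answer's table name; an in-memory SQLite database stands in for a real one):

```php
<?php
$pdo = new PDO('sqlite::memory:'); // stand-in for the single real connection
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE table1 (id INTEGER, name TEXT)");
$pdo->exec("INSERT INTO table1 (id, name) VALUES (1, 'first'), (2, 'second')");

// One connection, one prepared statement, reused with different parameters.
$stmt = $pdo->prepare("SELECT name FROM table1 WHERE id = ?");

$stmt->execute([1]);
$a = $stmt->fetchColumn();

$stmt->execute([2]);
$b = $stmt->fetchColumn();
```

Note that even here only one PDO instance exists; preparing and executing statements never requires a second connection.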
The PDO object stores the connection on which the queries are carried out. Normally, you only need one of those.
While there are cases where two connections might be convenient (when connecting to entirely different databases, for instance), you generally only need one instance of the PDO class.
What you will have multiple instances of is the PDOStatement class, which stores the query statements themselves (as well as the results). So for every query you will have a single PDOStatement instance.
Yes, you initially connect to the database by creating an instance of the PDO object. But you use this object to run queries, and the results are packed into another class.
So you mainly have several classes with your query results, but just one connection to the database, which runs the queries.

Designing a general database interface in PHP

I'm creating a small framework for my web projects in PHP so I don't have to do the basic work over and over again for every new website. It is not my goal to create a second CakePHP or Codeigniter and I'm also not planning to build my websites with any of the available frameworks as I prefer to use things I've created myself in general.
I have had no problems in designing and coding the framework when it comes to parts like the core structure, request handling, and so on, but I'm getting stuck with designing the database interface for my modules.
I've already thought about using the MVC pattern but found that it would be a bit of overkill for my rather small project(s).
So the exact problem I'm facing is how my frameworks modules (viewCustomers could be a module, for example) should interact with the database.
Is it (still) a good idea to mix SQL directly into PHP code? (That would be the "old way": mysql_query( 'SELECT firstname, lastname(.....))?
How could I abstract a query like the following?
SELECT firstname, lastname FROM customers WHERE id=X
Would MySQL "helper" functions like
$this->db->customers->getBy( 'id', $x );
be a good idea?
I'm not really sure, because such helpers tend to become useless when dealing with queries any more complicated than the pretty much trivial one above.
Is the "Model" pattern from MVC my only real option to solve this?
What do you currently use to solve the problems shown above?
I believe you just want to access your DB from your module. I'd avoid using mysql_query directly from the code. Rather, going for a simple model with abstracted DB access would be easy and straightforward.
For example, you can have a file like models/Customers.php with this code:
<?php
class Customers {
    public function getById($id) {
        global $DB; // the DB helper assumed to be instantiated elsewhere
        // note: callers must sanitize $id (see the intval() below)
        $sql = "SELECT first_name, last_name FROM customers WHERE id='$id'";
        $res = $DB::getRow($sql);
        return $res;
    }
}
I am assuming some kind of DB helper is already instantiated and available as $DB. Here is a simple one which uses PDO.
Now, you should include this in your module and use it the following way:
<?php
include_once "models/Customers.php";
$customers = new Customers();
$theCustomer = $customers->getById(intval($_REQUEST['cust_id']));
echo "Hello " . $theCustomer['first_name'];
Cheers.
Have you looked into http://www.doctrine-project.org/ or other PHP ORM frameworks (Zend_Db comes to mind)?
If you need speed, then use raw queries (but you should really use PDO with prepared queries).
If you want something more OOP, you can, as you suggest, design this with helpers.
Once, I've designed something similar which had the following concept:
DB connection/handler classes (handling multi-connections to different databases and different servers such as MySQL, Oracle, etc.);
A class per action (ie. SELECT, DELETE, etc.);
Filter classes (eg. RangeFilter);
The code looked something like this:
$select = new Select('field1', 'field2');
$result = $select->from('myTable')
    ->addFilter(SQLFilter::RangeFilter, 'field2')
    ->match(array(1, 3, 5))
    ->unmatch(array(15, 34))
    ->fetchAll();
It's a simple example of how you can build it.
You can go further and implement automated handling of table relations, field type checks (using introspection on your tables), table and field alias support, etc.
It might seem like long, hard work, but actually it won't take you that much time to build all these features (≈1 month).
Three tips:
Use stored procedures (so you can separate the PHP from the DB)
Use PDO/MySQLi for prepared statements: CALL NEWS_LIST(?, ?)
Use a static class for your DB. That allows you to access it from within any module.
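The three tips might combine into something like this sketch (a hypothetical DB class; an in-memory SQLite connection stands in, so the stored-procedure call is only noted in a comment):

```php
<?php
// A static DB wrapper: reachable from any module without passing a handle around.
class DB {
    private static ?PDO $pdo = null;

    public static function init(string $dsn): void {
        self::$pdo = new PDO($dsn);
        self::$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    }

    public static function rows(string $sql, array $params = []): array {
        // With MySQL the $sql here could just as well be "CALL NEWS_LIST(?, ?)".
        $stmt = self::$pdo->prepare($sql);
        $stmt->execute($params);
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }
}

DB::init('sqlite::memory:'); // real code would pass a mysql: DSN
```

Any module can then call DB::rows(...) without ever constructing its own connection.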
Raw SQL is still the winner for me; I like to control what I send to the server (for cases like index usage, complex JOIN clauses, etc.), so I generally stay away from helper functions.
You should use PDO, which already provides a lot of power, and if that's not enough you can extend it (possibly with your own functions, such as checking for hits in Memcached/APC before actually querying the database). You can also extend the class to implement your own SQL functions like:
function getUser($user_id) {
    return $this->query("SELECT * FROM users WHERE id = " . (int) $user_id);
}
Of course, from the model you should still be able to send:
$this->db->query("SELECT * FROM users WHERE id = " . (int) $user_id);
and get the same result. The functions should act merely as shortcuts, and the extended class should not be included with the framework, as it will be site-dependent.
The MVC pattern will fit nicely into this because you can use the database merely as a driver and your model can then transform the data into what you need. It's not hard to create a simple MVC structure and it will bring you benefits later.
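A sketch of such an extended class (the getUser variant here uses a prepared statement rather than the cast-and-concatenate style shown above; the table and DSN are stand-ins):

```php
<?php
// A site-specific subclass layering shortcut methods on top of PDO.
class AppDb extends PDO {
    public function getUser(int $user_id): ?array {
        $stmt = $this->prepare("SELECT * FROM users WHERE id = ?");
        $stmt->execute([$user_id]);
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        return $row ?: null; // null when no such user exists
    }
}

$db = new AppDb('sqlite::memory:'); // stand-in DSN; the table is invented
$db->exec("CREATE TABLE users (id INTEGER, name TEXT)");
$db->exec("INSERT INTO users (id, name) VALUES (7, 'joe')");
$user = $db->getUser(7);
```

Since AppDb is a PDO, the plain $db->query(...) form from the answer still works side by side with the shortcuts.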
You sound like me. Have you seen http://github.com/Xeoncross/micromvc and the one-file ORM in http://github.com/Xeoncross/database? Dig through my code and I think you will find what you're looking for.
The solution is to use the full raw power of SQL for some queries, while still allowing an ORM and query builders (like CodeIgniter's AR) for other things.
Both are good.
Not that I know the definitive answer (nor do I think one exists), but I thought I could just share what I have here. I use my own db 'framework', lightweight (~1000 lines currently) and easy to use. My main goal was to simplify the use of SQL, not to 'hide' it from the programmer (me :). Some examples:
// row() is 'query' + 'fetch' in one
$user = $db->row("select * from users where id=25");
// the same, injection safe
$user = $db->row("select * from users where id=?", $_GET['id']);
// ? placeholders are smart
$someUsers = $db->rows("select * from users where id IN(?)", array(1, 2, 10));
// ...and even smarter
$data = array('name' => 'Joe', 'age' => 50);
$id = 222;
$db->exec("update users set ?a where id=?", $data, $id);
// 'advanced' fetch functions
$topNames = $db->vlist("select name from users order by name limit 10");
$arrayOfIds = $db->nlist("select id from users where age > 90");
// table() returns a Table Gateway
$db->table('users')->delete('where id=?', 25);
// yes, this is safe
$db->table('users')->insert($_POST);
// find() returns a Row Gateway object
$db->table('users')
    ->find('where name=?', 'Joe')
    ->set('status', 'confirmed')
    ->save();
Understand this: database interaction is a solved problem.
So unless you really want to do it a) for the experience or b) because you're OCD and want to know every character of the code you'll be using, then I'd choose an existing solution.
And there are many: PEAR::MDB2, Zend::Db, Creole, Doctrine, Propel, just to name a few.
I've just come off the "helper functions" path, and the one thing that bugged me was that I kept adding functions to one file, which grew and grew with identical, similar, or defunct functions. I think the line count was at 600, and that is way too much for a single file in my opinion. This has not put me off the idea, but I'll be more organised on the next trek. I'll probably split the db functions into multiple files according to the db operation (select, insert, etc.).
So my advice is to go try the "helper functions" and be as organized as you can.
Also, I used PDO for the first time and quite liked it. It's not as low-tech as the mysql() functions, nor as bloated as some we could mention but won't. I'll be using PDO again.
It seems there are many different opinions on this topic, and as I haven't found a really satisfying answer here yet and the bounty is nearly over, I'll just write up what I've come up with over the last few days of trial and error:
I'm using a singleton MySQL class to handle the connection and the very basic queries as well as errors that may occur.
Single pages like /users/show/1 (using mod_rewrite) don't use raw SQL but a kind of lightweight ORM that works like the following example:
$user = $this->db
    ->users
    ->getBy( 'id', $id );
$this->db is an instance of a database abstraction class with a __get( $tableName ) method; accessing the undefined users property then triggers it. The rest explains itself: a query is formed from the arguments passed to getBy( ) (SQL escaping is also handled there), and its results are returned as an array.
I haven't finished the whole idea yet, but adding a new user to the database could look like the following:
$user = $this->db
    ->users
    ->new;
$user->id = 2;
$user->name = 'Joe';
$user->save( );
As I said, the concept isn't really complete and may have (huge) flaws in it. Yet I think it may be easier to write, more secure, and easier to maintain than plain MySQL.
Some other good sides of the whole "thing" would be that it is small, therefore rather fast and also pretty straightforward.
I know that this can't compete with the extremely powerful ORMs and frameworks already out there but I'm still creating this for some reasons mentioned in one of my comments above.
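The __get() idea described above might be sketched like this (illustrative names only, not the poster's actual code; an in-memory SQLite database keeps it self-contained):

```php
<?php
// Per-table gateway produced on demand by the Db wrapper below.
class TableGateway {
    public function __construct(private PDO $pdo, private string $table) {}

    public function getBy(string $column, $value): array {
        // $this->table and $column must come from code, never from user input.
        $stmt = $this->pdo->prepare("SELECT * FROM {$this->table} WHERE {$column} = ?");
        $stmt->execute([$value]);
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }
}

class Db {
    public function __construct(private PDO $pdo) {}

    // Accessing an undefined property ($db->users) lands here.
    public function __get(string $table): TableGateway {
        return new TableGateway($this->pdo, $table);
    }
}

$pdo = new PDO('sqlite::memory:');
$pdo->exec("CREATE TABLE users (id INTEGER, name TEXT)");
$pdo->exec("INSERT INTO users (id, name) VALUES (1, 'Joe'), (2, 'Ann')");

$db = new Db($pdo);
$rows = $db->users->getBy('id', 2);
```

Only the bound value goes through user input; table and column names stay under the programmer's control, which is what makes the shorthand safe.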
If you do plan on making a database class, it may be worth looking into making it statically accessible, allowing you to use it without declaring it/creating it, as...
global $db;
$db = new db;
$db->query ('... sql ...');
is kinda redundant when you can do
db::query ('... sql ...');
I have a set of SQL functions that I use on a near-regular basis to reduce what used to be multi-line escaped SQL to a single call, for example:
get_element ($table, $element, $value, $column='id');
get_row ($table, $value, $column='id');
So if you just want to get the name from a 'customers' table where the id is 4:
$name = db::get_element ('customers', 'name', 4);
There are also accompanying functions query_element and query_row, where you just pass it an SQL string and it returns a single element/row.
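A sketch of how get_element might be implemented on top of such a static class (hypothetical: the real implementation isn't shown here; the value is bound via a prepared statement, while the table/column names are expected to come from code):

```php
<?php
class db {
    public static ?PDO $pdo = null;

    // Hypothetical implementation of the helper described above.
    public static function get_element(string $table, string $element, $value, string $column = 'id') {
        // $table, $element, and $column come from code; only $value is data.
        $stmt = self::$pdo->prepare("SELECT $element FROM $table WHERE $column = ?");
        $stmt->execute([$value]);
        return $stmt->fetchColumn();
    }
}

db::$pdo = new PDO('sqlite::memory:'); // stand-in connection
db::$pdo->exec("CREATE TABLE customers (id INTEGER, name TEXT)");
db::$pdo->exec("INSERT INTO customers (id, name) VALUES (4, 'bob jones')");

$name = db::get_element('customers', 'name', 4);
```

The funnelling of every value through one prepared-statement path is what makes the "few core functions" escaping guarantee possible.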
Along with the functions for insert/update, e.g.
$array = array (
    'name' => 'bob jones',
    'age' => 28
);
$insert_id = db::insert_array ('customers', $array);
$customer_details = db::get_row ('customers', $insert_id);
$customer_details['age'] = 30;
db::update_array ('customers', $customer_details);
That would create a new row, pull the details back out, update the age, then write it back to the database.
Creating custom SQL access modules on a per-table basis is something I have always found to be a mistake; it's better to query the database generically using sensible functions.
If you do have to use anything with complex joins, then it is always best to create a function for it (getCustomerInfo(), for example), but if you just want a generic table-value lookup, writing lots of custom methods just increases the chances of a mistake in one of them. Plus, escaping data is very important: if you funnel as much as possible through a few core functions instead of hand-written SQL, you can ensure everything is properly escaped fairly easily.
If you want to look at my custom database class let me know.
