Currently I use the ADOdb library in my projects for database integration.
I want to migrate to PDO, but I have a question about queries.
Currently, with ADOdb I run a query and then walk through the result set several times using the MoveFirst() method.
Example:
// Run the query
$rs = $conn->execute('select * from mytable');
//Loop through the results
while(!$rs->EOF) {
echo $rs->fields('name');
$rs->MoveNext();
}
// Move the "pointer" back to the beginning of the result set
$rs->MoveFirst();
// Iterate over the results again without re-running the query
while(!$rs->EOF) {
echo $rs->fields('name');
$rs->MoveNext();
}
I wonder if there is any similar way in PDO, so that I do not need to run the query again.
The goal is to avoid hitting the database more often than necessary, since both loops use the same query.
I'm not sure why you want to loop through the result set more than once over the database connection. Why pull the data over the network again when you can simply retrieve and save it the first time? What you are looking for is a scrollable cursor, which isn't supported by MySQL, at least not through any PHP/MySQL driver. You may also want to look into buffered/unbuffered queries, which PDO does support.
PDO::CURSOR_SCROLL
http://www.php.net/manual/en/pdo.prepare.php
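For the use case in the question, the usual PDO approach is to fetch the whole result set into a PHP array once and then loop over that array as many times as needed. A minimal sketch, assuming placeholder connection credentials:
<?php
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// One round-trip to the server; everything ends up in $rows
$stmt = $pdo->query('SELECT * FROM mytable');
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
// First pass
foreach ($rows as $row) {
    echo $row['name'];
}
// Second pass: no re-query needed, the data is already in memory
foreach ($rows as $row) {
    echo $row['name'];
}
This trades memory for network round-trips, which is usually the right call unless the result set is huge.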
I wrote a database class which I use often, and I wrote it with mysqli. I wanted to write it with PDO, but PDO was slow (it is not about the IP connection :) ), and my website is really huge, so even this little PDO slowness would become a really big problem; that's why I chose the difficult way, mysqli. I wrote a dynamic class which binds params dynamically, with easy usage like this:
DB::getInstance()->query($sql, 'ss', array($a, $b));
This was really useful until today. I wanted to fetch the result and also count the rows, but I discovered a really big problem: when I use num_rows, mysqli's get_result will not work; when I use get_result, num_rows will never work; and when I use get_result and want to use it again for the same query, the second call will not work.
Also, get_result is not a good function because it is supported only by mysqlnd. Then I tried bind_result, which is useless because for every SELECT query I would have to write bind_result(params), which is also not good for the other developers at the company. What should I do? PDO is slow, and for my website it is really slow; mysqli is not developer-friendly and increases development time. How can I bind results dynamically for a query? I want to write an SQL statement, have the column names detected dynamically and bound automatically, and then call fetch() and read the result by column name. How can I do that?
When I use get_result, num_rows will never work
This is not true.
Besides, you never need num_rows anyway.
when I use get_result and want to use it again for the same query
You don't want that.
get_result is not a good function because it is supported only by mysqlnd
This is true.
However, if your site is really that big and special, there is no problem installing a required module or two.
How can I bind results dynamically for a query?
Use get_result().
To reuse a result, fetch all the rows into an array using fetch_all(), and then use that array anywhere you wish.
Instead of num_rows, just fetch the data and check whether anything was fetched.
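Putting those pieces together, a minimal sketch of the suggested approach, assuming the mysqlnd driver is available (the table, column, and variable names here are made up for illustration):
<?php
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$stmt = $mysqli->prepare('SELECT id, name FROM users WHERE status = ?');
$status = 'active';
$stmt->bind_param('s', $status);
$stmt->execute();
// get_result() + fetch_all() means no per-column bind_result() calls
$rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);
// Reuse the array instead of re-running the query or calling num_rows
if (!$rows) {
    echo 'Nothing found';
}
foreach ($rows as $row) {
    echo $row['name'];
}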
I have been trying to run my stored procedure using mysql unsuccessfully for quite some time. Whenever I use the code below
$link_id = DbConnection::getInstance('mycon')->connectMysql();
$table_count = mysql_query("SELECT TABLE_NAME FROM information_schema.tables WHERE table_schema = 'mycon' AND table_name LIKE 'table_%' ");
while($row = mysql_fetch_array($table_count)){
$table = $row["TABLE_NAME"];
$execute = mysql_query("CALL dummy_2('$table')") or die(mysql_error());
$result = mysql_fetch_assoc($execute);
var_dump($result);
}
it gives an error saying
Commands out of sync; you can't run this command now
so when I searched the internet, it said that I need to use PDO. Therefore, can anyone convert my statement above to PDO, since I have no clue about PDO whatsoever?
When you query something from the MySQL database, the result is presented as a result set; some queries may even have multiple result sets associated with them. But there can be only one active result set per connection, i.e. your script has to close all currently active result sets before it can issue another query.
If, e.g., your stored function uses multiple SELECTs, it produces multiple result sets, and you have to iterate/close/drop them all.
http://dev.mysql.com/doc/refman/5.1/en/stored-routines-syntax.html:
MySQL supports a very useful extension that enables the use of regular SELECT statements (that is, without using cursors or local variables) inside a stored procedure. The result set of such a query is simply sent directly to the client. Multiple SELECT statements generate multiple result sets, so the client must use a MySQL client library that supports multiple result sets.
The old, deprecated mysql_* functions do not support multiple result sets - you simply can't iterate/drop them.
The mysqli_* extension does: see http://docs.php.net/mysqli.next-result
And so does PDO: see http://docs.php.net/pdostatement.nextrowset.php
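A minimal PDO sketch of the code from the question, assuming placeholder credentials and keeping the dummy_2 procedure name; nextRowset() is what drains the extra result sets that a CALL produces:
<?php
$pdo = new PDO('mysql:host=localhost;dbname=mycon', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$tables = $pdo->query("SELECT TABLE_NAME FROM information_schema.tables
    WHERE table_schema = 'mycon' AND table_name LIKE 'table_%'")
    ->fetchAll(PDO::FETCH_COLUMN);
foreach ($tables as $table) {
    $stmt = $pdo->prepare('CALL dummy_2(?)');
    $stmt->execute(array($table));
    // A procedure may return several result sets; consume them all
    // so the connection is free for the next query
    do {
        var_dump($stmt->fetchAll(PDO::FETCH_ASSOC));
    } while ($stmt->nextRowset());
    $stmt->closeCursor();
}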
I was informed by someone senior in our company today that the PHP code I have written for performing prepared statements on a MySQL database is "inefficient" and "too taxing on our server". Since then I have been trying to understand what he meant and how to fix it. I have had no contact with said person for four days now, so I am asking other developers what they think of my code and whether there are any areas that might be causing bottlenecks or server performance issues.
My code works and returns the results of my query in the variable $data, so technically it does its job. There is another question, though, as to whether it is efficient and well written. Any advice as to what that senior employee meant or was referring to? Here is the method I use to connect to and query our databases.
(Please note, when I use the word method I do not mean a method inside a class. What I mean to say is this how I write/structure my code when I connect and query our databases.)
<?php
// Create database object and connect to database
$mysqli = new mysqli();
$mysqli->real_connect($hostname, $username, $password, $database);
// Create statement object
$stmt = $mysqli->stmt_init();
// Prepare the query and bind params
$stmt->prepare('SELECT `col` FROM `table` WHERE `col` > ?');
$stmt->bind_param('i', $var1);
// Execute the query
$stmt->execute();
// Store result
$stmt->store_result();
// Prepare for fetching result
$rslt = array();
$stmt->bind_result($rslt['col']);
// Fetch result and save to array
$data = array();
while ($stmt->fetch()) {
    foreach ($rslt as $key => $value) {
        $row[$key] = $value;
    }
    $data[] = $row;
}
// Free result
$stmt->free_result();
// Close connections
$stmt->close();
$mysqli->close();
?>
Any advice or suggestions are useful, please do contribute and help out even if you are only guessing. Thanks in advance :)
There are two kinds of code that may be inefficient here: the PHP code, the SQL code, or both.
For example, the SQL is a problem if the `col` column isn't indexed in the database. That puts a lot of load on the database, because it has to scan very many rows to answer the query; if `col` isn't indexed, every row in the table will be scanned. Also, if the value passed in isn't very selective, many rows, perhaps all of them, will have to be examined anyway, since MySQL will choose a table scan over an index scan when many rows would be examined. You will need to become familiar with MySQL's EXPLAIN feature to diagnose your queries, or add indexes to the database to support them.
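For instance, reusing the `table` and `col` names from the code above (the index name idx_col is made up), you could inspect the plan and then add an index:
EXPLAIN SELECT `col` FROM `table` WHERE `col` > 100;
ALTER TABLE `table` ADD INDEX idx_col (`col`);
A type of ALL in the EXPLAIN output means a full table scan.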
The PHP would be a problem if you followed something like the pattern:
select invoice_id from invoices where customer_id = ?
for each invoice_id
select * from line_items where invoice_id = ?
That kind of pattern will lead to "over querying" the database, which puts extra load on it. Instead use a join:
select li.* from invoices i join line_items li using (invoice_id)
Ask your database administrator to turn on the slow query log, and then process it with pt-query-digest.
You can use pt-query-digest to report on queries that are expensive (take a long time to execute), and also to report by frequency to detect over-querying.
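For reference, a typical invocation looks like this, assuming the slow log lives at the default Debian-style path (adjust for your server):
pt-query-digest /var/log/mysql/mysql-slow.log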
Well, the PHP code below successfully collects all the rows in the url field.
$sql = "SELECT * FROM table WHERE url <> ''";
$result = mysql_query($sql,$con);
$sql_num = mysql_num_rows($result);
while($sql_row=mysql_fetch_array($result))
{
$urls[] = $sql_row["url"];
}
The problem is that if the list of URLs runs into the millions, it takes a lot of time (especially on localhost). So I'd like to know another way of getting the SQL query result directly into an array without using a loop. Is that possible?
The problem is not the loop, it's that you are transferring millions of pieces of data (possibly large) from your database into memory. Whichever way you're doing that, it'll take time. And somebody needs to loop somewhere anyway.
In short: reduce the amount of data you get from the database.
You should consider using mysqli for that purpose. The fetch_all() method would allow you to do exactly that.
http://php.net/manual/en/mysqli-result.fetch-all.php
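A minimal sketch of that approach, assuming the mysqlnd driver and PHP 5.5+ for array_column(); the credentials are placeholders, and the table and column names come from the question. Selecting only the url column also reduces the amount of data transferred:
<?php
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$result = $mysqli->query("SELECT url FROM table WHERE url <> ''");
// All rows in one call, no explicit PHP loop
$rows = $result->fetch_all(MYSQLI_ASSOC);
$urls = array_column($rows, 'url');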
UPDATE
As per the comments, I tried both methods on a large table we have in production: mysql_fetch_array() in a loop, and the mysqli::fetch_all() method. mysqli::fetch_all() used less memory and ran faster than the mysql_fetch_array() loop.
The table has about 500,000 rows. mysqli::fetch_all() finished loading the data into an array in 2.50 seconds and didn't hit the 1G memory limit set in the script, while mysql_fetch_array() failed with memory exhaustion after 3.45 seconds.
mysql is deprecated, and the functionality you want is found in mysqli and PDO, so this is the perfect excuse to switch to the newer MySQL extensions. For PDO the method is fetchAll(); for mysqli it is fetch_all() (note that mysqli's fetch_all() requires the mysqlnd driver).
There is no option for it in mysql, though you can use PDO's fetchAll():
http://php.net/pdostatement.fetchall
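A one-line sketch with PDO, again with placeholder credentials; PDO::FETCH_COLUMN collects just the first selected column into a flat array:
<?php
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$urls = $pdo->query("SELECT url FROM table WHERE url <> ''")->fetchAll(PDO::FETCH_COLUMN);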
I know you can execute 10 SQL queries inside a mysql_query() (or mysqli_...) but how is that different from executing 10 mysql_query()s with one SQL query in each one?
If they are different and the first solution is more efficient, how would I use the mysql_fetch_assoc() function on one of the queries inside it?
If the first solution lets me limit the number of connections per page to one per mysql_query(), then I think I will have enough MySQL connections to handle my traffic; but if it doesn't, what SQL technology (or other?) can I use to connect to my database from PHP more efficiently (so I can handle more users)?
I am using Apache and PHP 5.4
You can only send a single query at a time with mysql_query(); see the docs:
mysql_query() sends a unique query (multiple queries are not supported) to the currently active database on the server that's associated with the specified link_identifier.
You could use mysqli_multi_query(); here is the example from the docs:
$query = "SELECT CURRENT_USER();";
$query .= "SELECT Name FROM City ORDER BY ID LIMIT 20, 5";
/* execute multi query */
if ($mysqli->multi_query($query)) {
do {
/* store first result set */
if ($result = $mysqli->store_result()) {
while ($row = $result->fetch_row()) {
printf("%s\n", $row[0]);
}
$result->free();
}
/* print divider */
if ($mysqli->more_results()) {
printf("-----------------\n");
}
} while ($mysqli->next_result());
}
/* close connection */
$mysqli->close();
The difference between running 10 queries in one call to mysql_query() and running 10 separate mysql_query() calls would be that, in the first case, you could only get the result from the very last query. But as long as you don't do a new mysql_connect(), it shouldn't reconnect between queries.
The alternative is to use mysqli_multi_query(), which lets you send several queries in one round-trip; that optimizes things some, but not a whole lot.
From the sound of it, you don't need a more efficient way to connect to your database; you need to optimize your queries, sort out indexes and the like, or perhaps the database machine is simply misconfigured or under-dimensioned hardware-wise. I'm guessing your actual problem is that your database queries are too slow?
Just running mysql_query(), one query per call, with MySQL on a separate machine from PHP, you can still do hundreds of queries per second without breaking a sweat.
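To illustrate that last point, a minimal sketch that reuses one connection for a series of single-query calls (placeholder credentials and throwaway queries):
<?php
$mysqli = new mysqli('db-host', 'user', 'pass', 'mydb'); // one connect per page
foreach (array('SELECT 1', 'SELECT 2', 'SELECT 3') as $sql) {
    $result = $mysqli->query($sql); // the same connection is reused for every query
    var_dump($result->fetch_row());
    $result->free();
}
$mysqli->close();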