I have a large and complicated system consisting of PHP and JavaScript code which makes many MySQL queries. Is there a way to 'backtrace' each MySQL query to the exact line of code that issues it?
In MySQL it is possible to log all queries (by adding a log statement to the MySQL config), but the log does not show which PHP or JavaScript code/module/line issued the query. Is it possible to find the offending line for each MySQL query?
No, there is no way for MySQL to know what line of code, class, function or file you're making a call from. It just receives a socket connection from the application running the code, accepts input, processes it and returns a result.
All it knows about is the data it receives, and who is sending it.
You can view active connections and a brief description of what they're doing using
SHOW PROCESSLIST;
You'll get output similar to this:
Id  User  Host            db    Command  Time  State  Info
48  test  10.0.2.2:65109  test  Sleep    4621
51  test  10.0.2.2:49717  test  Sleep    5
52  test  10.0.2.2:49718  test  Query    0            SHOW PROCESSLIST
Generally, when people want to log queries, the process looks something like this:
Before the query is run, log the query and any parameters
Run the query
Log the success/failure of the query, and any errors
To apply this process to systems with hundreds or thousands of queries, you'll generally find a wrapper function/class is created which accepts the appropriate parameters, processes the query as listed above, and returns the result. You could pass your wrapper method the PHP constants __FILE__ and __LINE__ when you call it, so it can log where the database call is being initiated from.
Pseudo code only:
// Wrapper method
function query_wrapper($stm, $file, $line)
{
    log_prequery($stm, $file, $line); // Log the query, file and line
    $result = $stm->execute();        // Execute the query
    log_postquery($result);           // Log the result (and any errors)
    return $result;                   // Return the result
}

// In your code, where you're making a database query
$db = new Database();
$stm = $db->prepare("SELECT foo FROM bar");
$result = query_wrapper($stm, __FILE__, __LINE__);
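If you'd rather not pass __FILE__ and __LINE__ at every call site, the wrapper could look up its caller itself with debug_backtrace(). A minimal sketch, reusing the same hypothetical log_prequery()/log_postquery() helpers from above:
// Sketch: let the wrapper discover its caller instead of being told.
function query_wrapper_auto($stm)
{
    // Limit the backtrace to one frame; only the immediate caller is needed.
    $caller = debug_backtrace(DEBUG_BACKTRACE_IGNORE_ARGS, 1)[0];
    log_prequery($stm, $caller['file'] ?? 'unknown', $caller['line'] ?? 0);

    $result = $stm->execute();
    log_postquery($result);
    return $result;
}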
Related
Suppose I have code like this:
mysqli_multi_query('<first query>');
include_once 'secondQuery.php';
This is an enormous simplification, and hopefully I haven't simplified the error out, but secondQuery.php relies on <first query> having completed in order to execute properly. When I run the two manually, in the correct order, everything works perfectly. But when I run this, the error I get is consistent with them being executed either in the wrong order or simultaneously.
How would I write the middle line of:
mysqli_multi_query('<first query>');
wait for mysqli_multi_query to conclude;
include_once 'secondQuery.php';
in correct PHP syntax?
Every time you use mysqli_multi_query() you need to execute a blocking loop after it, because this function sends SQL queries to be executed by MySQL asynchronously. An example of a blocking loop which waits for MySQL to process all queries asynchronously is this:
$mysqli->multi_query(/* ... */);
do {
    $result = $mysqli->use_result();
    if ($result) {
        // process the results here
        $result->free();
    }
} while ($mysqli->next_result()); // next_result() will block and wait for the next query to finish
$mysqli->store_result(); // Needed to fetch the error as an exception
It is always a terrible idea to use mysqli_multi_query() in your PHP code. 99.99% of the time there are better ways to achieve what you want. This function has so many downsides that using it should be avoided at all cost.
What you need are database transactions. If your queries depend on each other then you need to switch off implicit commit and commit when all of them execute successfully. You can't achieve this with mysqli_multi_query().
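For example, a minimal sketch of two dependent statements run inside a single mysqli transaction (the connection details and table/column names here are made-up placeholders):
// Sketch only: dependent statements in one transaction instead of mysqli_multi_query().
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT); // throw exceptions on errors
$mysqli = new mysqli('localhost', 'user', 'password', 'test');

$mysqli->begin_transaction();
try {
    $mysqli->query("INSERT INTO parent (name) VALUES ('foo')");
    $parentId = $mysqli->insert_id;

    $stmt = $mysqli->prepare("INSERT INTO child (parent_id) VALUES (?)");
    $stmt->bind_param('i', $parentId);
    $stmt->execute();

    $mysqli->commit();   // either both statements are kept...
} catch (mysqli_sql_exception $e) {
    $mysqli->rollback(); // ...or neither is
    throw $e;
}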
I'm getting the error "General error: 2006 MySQL server has gone away" when saving an object.
I'm not going to paste the code since it is way too complicated and I can explain it with this example, but first a bit of context:
I'm executing a function via the command line using Phalcon tasks. The task creates an object from a Model class, and that object calls a CasperJS script that performs some actions on a web page. When it finishes, it saves some data, and this is where I sometimes get "MySQL server has gone away", but only when the CasperJS script takes a bit longer.
Task.php
function doSomeAction(){
    $object = Class::findFirstByName("test");
    $object->performActionOnWebPage();
}
In Class.php
function performActionOnWebPage(){
    $result = exec("timeout 30s casperjs somescript.js");
    if($result){
        $anotherObject = new AnotherClass();
        $anotherObject->value = $result->value;
        $anotherObject->save();
    }
}
It seems like the $anotherObject->save() method is affected by the time exec("timeout 30s casperjs somescript.js") takes to return, when it shouldn't be.
It's not a matter of the data being saved, since it both fails and saves successfully with the same input; the only difference I see is the time CasperJS takes to return a value.
It seems as if, for some reason, Phalcon keeps the MySQL connection open during the whole execution of the "Class.php" function, causing the timeout when CasperJS takes too long. Does this make any sense? Could you help me fix it or find a workaround?
The problem seems to be that either you are trying to fetch more data in a single packet than your MySQL config file allows, or your wait_timeout variable is not set high enough for what your code requires.
Check your wait_timeout and max_allowed_packet values; you can check them with the commands below:
SHOW GLOBAL VARIABLES LIKE 'wait_timeout';
SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';
Then increase these values as needed in your my.cnf (Linux) or my.ini (Windows) config file and restart the MySQL service.
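For example, a my.cnf fragment with purely illustrative values (tune them to your own workload; these numbers are not recommendations):
[mysqld]
# Illustrative values only
wait_timeout       = 600   # seconds an idle connection may stay open
max_allowed_packet = 64M   # maximum size of a single packet/query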
I am using a timer function in MATLAB to continuously execute a certain script. Within this script, I am using urlread to retrieve data from web services, which works like a charm.
I am now trying to use urlread to execute a simple HTTP request within this script to insert data into a MySQL database. Thus, I simply specify the URL string and define the value to be passed to the PHP parser.
Code within the script being executed in the timer function:
db_url = 'http://someurl/update.php?value=';
db_url = strcat(db_url,num2str(value));
urlread(db_url);
clear db_url
My problem is the following: When I run the timer, it works fine for one execution, but then stops, displaying the following error:
"Either this URL could not be parsed or the protocol is not supported."
What is going wrong? When I check my MySQL database, I see that one new row has been added, which means it generally works; it just won't execute multiple times within the timer.
Any idea what is going wrong? Many thanks in advance!
I figured out what the problem was. The value variable is an array that grows in size with each iteration. Thus, what I needed to do was specify value(end), like so:
db_url = 'http://someurl/update.php?value=';
db_url = strcat(db_url,num2str(value(end)));
urlread(db_url);
clear db_url
I have a VPS with Dreamhost but the MySQL server is shared. I really want to start producing accessible logs of every MySQL query a particular site issues.
I can hand-roll this into my abstraction layer, but I was curious if there is something like sql_log_off that can be set at runtime so all queries get logged into files I can rotate and review?
From what I understand of the question:
What you could do is wrap your queries in some sort of wrapper that logs the queries to a file. This could be a text file, or a PHP file that only allows those with permission to view it (a log viewer script could include it so that only those with proper access can see it).
That is, of course, assuming you are able to do so. (If you want to log queries from sites you have no control over, then I am not sure.)
An example of a wrapper you might be interested in:
// Note: this uses the old mysql_* API; the same idea works with mysqli or PDO.
function sql_query($query, $show = 0)
{
    global $queries, $debugginglist;
    $thequery = mysql_query($query)
        or print(mysql_error()."<br>Query was: <code>".htmlspecialchars($query)."</code>");
    $queries++;
    if ($show == 1)
    {
        print "($queries): Query was: <i><code>".htmlspecialchars($query)."</code></i><br>";
    }
    $debugginglist .= "<br>($queries): Query was: <i><code>$query</code></i><br>";
    // This is just to give an idea for logging, NOT an exact solution
    $logquery = fopen("querylog.txt", "ab+");
    fputs($logquery, "\r\n$query");
    fclose($logquery);
    return $thequery;
}
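The snippet above uses the old, since-removed mysql_* extension; a rough PDO-based sketch of the same log-to-file idea might look like this (the $pdo connection and the log path are assumptions, not part of the original answer):
// Sketch: the same logging idea with PDO; $pdo and the log file location are placeholders.
function sql_query_pdo(PDO $pdo, $query, array $params = array())
{
    // Append the query and its parameters to a log file before running it
    file_put_contents(
        __DIR__ . '/querylog.txt',
        date('c') . ' ' . $query . ' ' . json_encode($params) . "\n",
        FILE_APPEND
    );

    $stmt = $pdo->prepare($query);
    $stmt->execute($params);
    return $stmt;
}

// Hypothetical usage:
// $stmt = sql_query_pdo($pdo, 'SELECT foo FROM bar WHERE id = ?', array(42));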
I wrote a utility for updating the DB from a list of numbered .sql update files. The utility stores inside the DB the index of the lastAppliedUpdate. When run, it reads lastAppliedUpdate, applies to the DB, in order, all the updates following lastAppliedUpdate, and then updates the value of lastAppliedUpdate in the DB. Basically simple.
The issue: the utility successfully applies the needed updates, but then when trying to store the value of lastAppliedUpdate, an error is encountered:
General error: 2014 Cannot execute queries while other unbuffered queries are active. Consider using PDOStatement::fetchAll(). Alternatively, if your code is only ever going to run against mysql, you may enable query buffering by setting the PDO::MYSQL_ATTR_USE_BUFFERED_QUERY attribute.
Any ideas what this error means and how it can be resolved?
Below is the essence of the code. It's PHP code within the Yii framework.
foreach ($numericlyIndexedUpdateFiles as $index => $filename)
{
    $command = new CDbCommand(Yii::app()->db, file_get_contents($filename));
    $command->execute();
}

$metaData = MDbMetaData::model()->find();
$metaData->lastAppliedUpdate = $index;
if (!$metaData->save()) throw new CException("Failed to save metadata lastAppliedUpdate.");
// On this line, instead of the exception that my code throws (if any),
// I receive the error described above.
MySQL version is 5.1.50, PHP version is 5.3.
Edit: the above code runs inside a transaction, and I want it to stay that way.
Check out PDO unbuffered queries.
You can also look at setting PDO::MYSQL_ATTR_USE_BUFFERED_QUERY:
http://php.net/manual/en/ref.pdo-mysql.php
The general answer is that you have to retrieve all the results of the previous query before you run another, or find out how to turn on buffered queries in your database abstraction layer.
Since I don't know the syntax to give you with these mysterious classes you're using (not a Yii person), the easy fix is to close the connection and reopen it between those two actions.
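For plain PDO, enabling query buffering looks roughly like this (the DSN and credentials below are placeholders):
// Sketch: enable MySQL query buffering on the PDO connection.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'password', array(
    PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => true,
    PDO::ATTR_ERRMODE                  => PDO::ERRMODE_EXCEPTION,
));

// Alternatively, make sure each statement's results are fully consumed
// before the next query runs:
$stmt = $pdo->query('SELECT foo FROM bar');
$rows = $stmt->fetchAll(); // retrieve all results...
$stmt->closeCursor();      // ...and release the cursor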