I need to run some SQL scripts against my database to update the schema and data (a kind of migration).
Because there is some logic to check before running each script, I'm writing a small PHP tool to execute the scripts, but I have a simple question: can I load and execute a "simple" SQL script (including table manipulation, triggers, and stored procedure updates) directly, or should I add markers to the script (to mark where each statement ends) and run the script statement by statement?
For database access I'm using PDO.
I had a similar situation today.
My solution is extremely simple, but is just smart enough to allow for comments and statements that span multiple lines.
// open the script file
$scriptfile = fopen($script_path, "r");
if (!$scriptfile) { die("ERROR: Couldn't open {$script_path}.\n"); }

// grab each line of the file, skipping comments and blank lines
$script = '';
while (($line = fgets($scriptfile)) !== false) {
    $line = trim($line);
    if (preg_match("/^#|^--|^$/", $line)) { continue; }
    $script .= $line . "\n"; // keep a separator so adjacent lines don't run together
}
fclose($scriptfile);

// explode the script by semicolon and run each statement
$statements = explode(';', $script);
foreach ($statements as $sql) {
    $sql = trim($sql);
    if ($sql === '') { continue; }
    $query = $pdo->prepare($sql);
    $query->execute();
    if ($query->errorCode() !== '00000') { die("ERROR: SQL error code: " . $query->errorCode() . "\n"); }
}
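For completeness, a minimal setup sketch for the snippet above; the DSN, credentials, and path are placeholder assumptions, and with PDO::ERRMODE_EXCEPTION enabled the manual errorCode() check could be dropped. Note that splitting on semicolons will still break on trigger or procedure bodies that contain semicolons, so those may need to run from separate files.
// Hypothetical setup: DSN, credentials, and script path are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION); // fail loudly
$script_path = '/path/to/migration.sql';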
I am having a problem getting an SQL query to interpolate variables the way I want, and I would be grateful for some help.
Within the manual page for pg_query_params, there is a code example for pg_query() passing a variable using curly braces. This appeared to be exactly what I need for my task. So, my code is as follows:
$fh = fopen('/home/www/KPI-Summary.sql', "r")
    or die("Problem opening SQL file.\n");
$dbh = pg_connect("$connect")
    or die('Could not connect: ' . pg_last_error());
$j = 0;
while (($line = fgets($fh)) !== false) {
    $tmp[$j] = array();              // Initialise temporary storage.
    $result = pg_query($dbh, $line); // Process the line read.
    if (!$result) { echo "Error: query did not execute"; }
    ...
    while ($row = pg_fetch_row($result)) { // Read SQL result.
        $tmp[$j][2][] = $row;
    }
    $j++;
}
fclose($fh);
The sql file contains several queries, one per line, like this:
SELECT count(*) from table WHERE value=0 AND msgid='{$arg[1]}';
However, currently my variable is not being replaced by its contents -- and therefore, although the query runs OK, it returns zero rows. What do I need to do in order to get the expected result? (Note: each SQL line varies, and the query parameters are not constant -- hence using variables within the SQL file.)
OK, I have a solution (although it might not be the correct approach).
This works -- but it needs polish, I think. Suggestions for a better regexp would be very much appreciated.
$bar = 'VALUE-A'; // Can we replace simple variable names?
$arg[1] = 'VALUE-B'; // What about an array, such as $arg[1]?
function interpolate($matches) {
    global $bar;
    global $arg;
    if ($matches[2]) {
        // Array element, e.g. {$arg[1]}
        $i = isset(${$matches[1]}[$matches[2]]) ? ${$matches[1]}[$matches[2]] : 'UNDEF';
    } else {
        // Simple variable, e.g. {$bar}
        $i = isset(${$matches[1]}) ? ${$matches[1]} : 'UNDEF';
    }
    return $i;
}
$fh = fopen('/home/www/file.sql', "r") or die("Failed.\n");
while (($line = fgets($fh)) !== false) {
    ...
    $line = preg_replace_callback('|\{\$([a-z]+)\[*(\d*)\]*}|i', "interpolate", $line);
    echo $line; // and continue with rest of code as above.
}
fclose($fh);
(Of course, the solution suggests that the question title is completely wrong. Is there any way to edit this?)
Did you use pg_escape_string()?
$arg[1] = pg_escape_string($arg[1]);
$line = "SELECT count(*) from table WHERE value=0 AND msgid='{$arg[1]}';";
I'm trying to insert data (160,000+ rows) using INSERT INTO and PHP PDO, but I have a bug.
When I launch the PHP script, I see more than the exact number of lines from my CSV inserted into my database.
Can someone tell me whether my loop is incorrect or something?
Here is the code I have:
$bdd = new PDO('mysql:host=<myhost>;dbname=<mydb>', '<user>', '<pswd>');

// I clean the table
$req = $bdd->prepare("TRUNCATE TABLE lbppan_ticket_reglements;");
$req->execute();

// I read and import the CSV file line by line
$handle = fopen('<pathToMyCsvFile>', "r");
while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    $reqImport =
        "INSERT INTO lbppan_ticket_reglements
        (<my31Columns>)
        VALUES
        ('$data[0]','$data[1]','$data[2]','$data[3]','$data[4]','$data[5]','$data[6]','$data[7]','$data[8]',
        '$data[9]','$data[10]','$data[11]','$data[12]','$data[13]','$data[14]','$data[15]','$data[16]',
        '$data[17]','$data[18]','$data[19]','$data[20]','$data[21]','$data[22]','$data[23]','$data[24]',
        '$data[25]','$data[26]','$data[27]','$data[28]','$data[29]','$data[30]')";
    $req = $bdd->prepare($reqImport);
    $req->execute();
}
fclose($handle);
The script mostly works, since the data does end up in the table, but I don't know why it inserts extra rows. I think that maybe, due to the file size (18 MB), the script crashes and relaunches, inserting some of the same rows again.
I can't use LOAD DATA on the server I'm using.
Thanks for your help.
This is not an answer but adding this much into comments is quite tricky.
Start by upping the maximum execution time.
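One way to do that from the script itself; a sketch (your host's configuration may cap or override this):
// Lift PHP's execution time limit for this long-running import.
// 0 means "no limit"; a finite value such as 600 may be safer.
set_time_limit(0);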
If that does not solve your issue, start working your way through the code line by line and handle every exception you can think of. For example, you are truncating the table BUT you say you have loads more data after execution: could the truncate be failing?
try {
    $req = $bdd->prepare("TRUNCATE TABLE lbppan_ticket_reglements;");
    $req->execute();
} catch (\Exception $e) {
    // Note: PDO only throws if PDO::ATTR_ERRMODE is set to ERRMODE_EXCEPTION
    exit($e->getMessage()); // Die immediately for ease of reading
}
Not the most graceful of try/catches, but it will allow you to easily spot a problem. You can also apply this to the insert query...
try {
    $req = $bdd->prepare($reqImport);
    $req->execute();
} catch (\Exception $e) {
    exit($e->getMessage());
}
...and also stick in some diagnostics: are you actually inserting 160k rows? You could optionally echo out $i on each loop and see if you can spot any breaks or abnormalities.
$i = 0;
while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    // ... your stuff
    $i++;
}
echo "Rows inserted " . $i . "\n\n";
Going beyond that, you can have the loop print out the SQL content for you to look at manually; perhaps it's doing something weird and fruity.
Hope that helps.
Assuming $data[0] is the unique identifier, you can try this to spot the offending row(s):
$i = 0;
while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    echo 'Row #' . ++$i . ' - ' . $data[0];
}
Since you are interpolating raw values rather than binding parameters, it is very possible that one of the $data array items is causing a double insert or some other unknown issue.
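A sketch of what binding could look like here; the placeholder count and the <my31Columns> list are assumptions carried over from the question:
// Hypothetical sketch: prepare once with placeholders, execute per row.
// Bound parameters stop stray quotes or commas in the CSV from breaking the SQL.
$placeholders = rtrim(str_repeat('?,', 31), ','); // "?,?,...,?" (31 of them)
$stmt = $bdd->prepare(
    "INSERT INTO lbppan_ticket_reglements (<my31Columns>) VALUES ($placeholders)"
);
while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    $stmt->execute($data); // $data must contain exactly 31 values
}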
I have a simple txt file with email addresses. I want to check if these addresses are in my database, and if so, delete them.
The txt file is not in CSV format; every email address is on its own line. I'm wondering what's the best way to do this.
Steps:
This regular expression will match a new line:
([^\n]*\n+)+
Add a comma after every line (or replace each newline with a comma), so the list goes from
email1#com.com
email2#com.com
email3#com.com
to email1#com.com,email2#com.com,email3#com.com
Quote each address and add brackets at the beginning and end:
('email1#com.com','email2#com.com','email3#com.com')
Add the following SQL (the addresses must be quoted, since they are strings):
DELETE FROM database.schema.table WHERE email_address IN ('email1#com.com','email2#com.com','email3#com.com');
Execute the SQL.
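If you want to do the whole thing from PHP, here is a minimal sketch of these steps, assuming a PDO connection in $pdo and using quote() for escaping:
// Hypothetical sketch: build a single DELETE ... IN (...) from the txt file.
$emails = file('emails.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$quoted = array_map(function ($email) use ($pdo) {
    return $pdo->quote(trim($email)); // quotes and escapes each address
}, $emails);
$sql = 'DELETE FROM database.schema.table WHERE email_address IN ('
     . implode(',', $quoted) . ');';
$pdo->exec($sql);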
You can execute the query from PHP or directly in the database. Please keep backups, or you might screw something up...
Hope this helps...
Good luck!
The function you're looking for is fgets().
<?php
$emails = array();
if ( ! $fh = fopen('file.txt', 'r') ) { die('could not open file'); }
while ( ($buffer = fgets($fh, 4096)) !== false ) {
    $emails[] = trim($buffer); // trim the trailing newline, or the LIKE match will fail
}
fclose($fh);
foreach ($emails as $email) {
    $query = sprintf("DELETE FROM table WHERE email LIKE '%s'", $email);
    // do the query
}
<?php
// FILE_IGNORE_NEW_LINES stops each address from keeping its trailing newline
$file = file("emails.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($file as $email) {
    $query = sprintf("DELETE FROM table WHERE email LIKE '%s'", $email);
    # execute query
}
?>
Read the emails in one at a time and then run the delete query with each value. DO NOT USE mysql_query (unless you trust the input from the text file, but even then, please just use the newer libraries). If the records are there, they will be deleted; if not, no biggie, nothing happens. Done.
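For instance, a minimal sketch with PDO and a bound parameter (DSN and credentials are placeholders):
// Hypothetical sketch: one prepared DELETE, executed once per address.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'secret');
$stmt = $pdo->prepare('DELETE FROM table WHERE email = ?');
foreach (file('emails.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $email) {
    $stmt->execute(array(trim($email)));
}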
How would I go about populating a database from the info in a CSV file using PHP code? I need to practice using PHP to make database calls, but at the moment all I have access to is this CSV file...
Design Considerations:
You probably don't want to load the entire file into memory at once using a function like file_get_contents. With large files this will eat up all of your available memory and cause problems. Instead, do like Adam suggested and read one line at a time.
See fgetcsv in the PHP manual.
// Here's how you would start your database connection
mysql_connect($serverName, $username, $password);
mysql_select_db('yourDBName');

// open the file as read-only
$file = fopen("file.csv", "r");

// lineLength is unlimited when set to 0; comma delimited
while ($data = fgetcsv($file, $lineLength = 0, $delimiter = ",")) {
    // You should sanitize your inputs first, e.g. with mysql_real_escape_string
    $success = mysql_query("INSERT INTO fileTable VALUES('" . $data[0] . "','" . $data[1] . "')");
    if (!$success) {
        throw new Exception('failed to insert!');
    }
}
Just do it through phpMyAdmin: http://vegdave.wordpress.com/2007/05/19/import-a-csv-file-to-mysql-via-phpmyadmin/
Use the built-in PHP functions to read the CSV and write an output file. Then you can import the SQL into your database. This should work with any type of database.
Don't forget to escape any strings you are using. I used sqlite_escape_string() for that purpose in this example.
$fd = fopen("mydata.csv", "r");
$fdout = fopen("importscript.sql", "w");
while (!feof($fd)) {
    $line = fgetcsv($fd, 1024); // Read a line of CSV
    if ($line === false) { continue; } // guard against the empty read at EOF
    fwrite($fdout, 'INSERT INTO mytable (id, name) '
        . 'VALUES (' . intval($line[0]) . ",'" . sqlite_escape_string($line[1]) . "');\r\n");
}
fclose($fdout);
fclose($fd);
function cpanel_populate_database($dbname)
{
    // populate the database ($mysqli is assumed to be an open connection)
    $sql = file_get_contents(dirname(__FILE__) . '/PHP-Point-Of-Sale/database/database.sql');
    $mysqli->multi_query($sql);
    $mysqli->close();
}
The SQL file is a direct export from phpMyAdmin, and about 95% of the time it runs without issue: all the tables are created and the data is inserted (I am creating a database from scratch).
The other 5% of the time, only the first table, or sometimes the first 4 tables, are created, but none of the other tables (there are 30 tables).
I have decided NOT to use multi_query because it seems buggy, and to see if the bug still occurs when using just mysql_query on each statement, splitting on semicolons. Has anyone run into issues like this?
Fast and effective:
system('mysql -h #hostname# -u #username# -p#password# #database# < #dump_file#');
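If those values come from PHP variables rather than being pasted in, a hedged sketch with shell-escaped arguments and an exit-code check:
// Hypothetical sketch: same one-liner with shell-escaped arguments.
$cmd = sprintf(
    'mysql -h %s -u %s -p%s %s < %s',
    escapeshellarg($host),
    escapeshellarg($user),
    escapeshellarg($password),
    escapeshellarg($database),
    escapeshellarg($dump_file)
);
system($cmd, $exit_code);
if ($exit_code !== 0) {
    die("mysql import failed with exit code $exit_code\n");
}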
I've seen similar issues when using multi_query with queries that can create or alter tables. In particular, I tend to get InnoDB 1005 errors that seem to be related to foreign keys; it's like MySQL doesn't completely finish one statement before moving on to the next, so the foreign keys lack a proper referent.
In one system, I split the problematic statements into their own files. In another, I have indeed run each command separately, splitting on semicolons:
function load_sql_file($basename, $db) {
    // Todo: Trim comments from the end of a line
    log_upgrade("Attempting to run the `$basename` upgrade.");
    $filename = dirname(__FILE__)."/sql/$basename.sql";
    if (!file_exists($filename)) {
        log_upgrade("Upgrade file `$filename` does not exist.");
        return false;
    }
    $file_content = file($filename);
    $query = '';
    foreach ($file_content as $sql_line) {
        $tsl = trim($sql_line);
        // Skip blank lines and "--" or "#" comment lines
        if ($sql_line and (substr($tsl, 0, 2) != '--') and (substr($tsl, 0, 1) != '#')) {
            $query .= $sql_line;
            // A trailing semicolon marks the end of a statement: run it
            if (substr($tsl, -1) == ';') {
                set_time_limit(300);
                $sql = trim($query, "\0.. ;");
                $result = $db->execute($sql);
                if (!$result) {
                    log_upgrade("Failure in `$basename` upgrade:\n$sql");
                    if ($error = $db->lastError()) {
                        log_upgrade("$error");
                    }
                    return false;
                }
                $query = '';
            }
        }
    }
    $remainder = trim($query);
    if ($remainder) {
        log_upgrade("Trailing text in `$basename` upgrade:\n$remainder");
        if (DEBUG) trigger_error('Trailing text in upgrade script: '.$remainder, E_USER_WARNING);
        return false;
    }
    log_upgrade("`$basename` upgrade successful.");
    return true;
}
I have never resorted to multi-query. When I needed something like that, I moved over to mysqli. Also, if you do not need any results from the query, passing the script to mysql_query will also work. You'll also get those errors if the export's statements are in an incorrect order that clashes with the tables required by foreign keys and other constraints.
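One multi_query pitfall worth ruling out: every pending result set must be consumed before later statements (and their errors) are reported. A sketch of the usual drain loop, assuming $mysqli and $sql as in the question:
// Hypothetical sketch: drain all result sets so a mid-batch failure surfaces.
if ($mysqli->multi_query($sql)) {
    do {
        if ($result = $mysqli->store_result()) {
            $result->free(); // discard result sets from SELECT-like statements
        }
    } while ($mysqli->more_results() && $mysqli->next_result());
}
if ($mysqli->errno) {
    printf("Batch stopped: (%d) %s\n", $mysqli->errno, $mysqli->error);
}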
I think the approach of breaking the SQL file into single queries would be a good idea, even if it's just for comparison purposes (to see if it solves the issue).
Also, I'm not sure how big your file is, but I've had a couple of cases where the file was incredibly big and splitting it into batches did the job.