Create GTT, insert CSV, using PHP?

I have eight CSV sheets, which are updated every two hours by an external service. I want to analyse these data and show them on a PHP page for my boss...
I imagined the easiest way would be to have the .csv in a GTT on my MSSQL Server. Maybe it is possible to create a GTT on every use (on commit preserve rows)? I still have no approach for converting the CSV into a normal format for an INSERT statement... is it even possible?
Or is there another option? I'm trying to collect some ideas for a solution...
Thank you!
EDIT
I have PHP code to read the .csv and put it into an array.
$csv_datei = "applicationData.csv";   // CSV file to read
$felder_trenner = ";";                // field separator
$zeilen_trenner = "\n";               // line separator
if (@file_exists($csv_datei) == false) {
    echo 'The CSV file ' . $csv_datei . ' does not exist!';
}
else {
    $datei_inhalt = @file_get_contents($csv_datei);
    $zeilen = explode($zeilen_trenner, $datei_inhalt);
    if (is_array($zeilen) == true) {
        foreach ($zeilen as $zeile) {
            $felder = explode($felder_trenner, $zeile);
            $i = 0;
            if (is_array($felder) == true) {
                foreach ($felder as $feld) {
                    if ($feld != '') {
                        echo (($i != 0) ? ', ' : '') . str_replace('"', '', $feld);
                        $i++;
                    }
                }
            }
            echo '<br>';
        }
    }
}
source: PHP-space.info
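One possible approach, as a minimal sketch only: parse the CSV with fgetcsv() and insert the rows into a session-local temporary table over PDO. The DSN, the three-column layout and the table name #csv_import are assumptions for illustration; a #-prefixed table on SQL Server lives for the duration of the connection, which is close to the "on commit preserve rows" behaviour asked about.
$pdo = new PDO('sqlsrv:Server=localhost;Database=mydb', $user, $pass);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// #-prefixed tables are private to this connection and vanish when it closes
$pdo->exec("CREATE TABLE #csv_import (col1 NVARCHAR(255), col2 NVARCHAR(255), col3 NVARCHAR(255))");
$stmt = $pdo->prepare("INSERT INTO #csv_import (col1, col2, col3) VALUES (?, ?, ?)");
if (($fh = fopen('applicationData.csv', 'r')) !== false) {
    while (($row = fgetcsv($fh, 0, ';')) !== false) {
        // pad/trim each CSV line to the assumed three columns
        $stmt->execute(array_pad(array_slice($row, 0, 3), 3, null));
    }
    fclose($fh);
}
// the data can now be analysed with ordinary SELECTs on the same connection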

Related

ZipArchive::getStreamIndex kills script

I decided to try out the latest PHP ZIP functions for extracting data, and something odd has happened. This PHP script is meant to read data from packed files of a specific extension (gob or goo) and display them in an XML format for further use later.
$zip = new ZipArchive;
$zipres = $zip->open($zipfilename);
if ($zipres === true)
{
    for ($i = 0; $i < $zip->numFiles; $i++)
    {
        echo "\t<entry id=\"".$i."\">".$zip->getNameIndex($i)."</entry>\n";
        if (strripos($zip->getNameIndex($i), ".gob") !== false || strripos($zip->getNameIndex($i), ".goo") !== false)
        {
            $zipentry = $zip->getStreamIndex($i);
            return; // moving this below the if() block produces the blank page
            if ($zipentry)
            {
                $packagenames[] = $zip->getNameIndex($i);
                $wholeGOB[] = stream_get_contents($zipentry);
                fclose($zipentry);
            }
            else
                echo "\t<error>Error reading entry ".$zip->getNameIndex($i)."</error>";
        }
    }
    $zip->close();
}
else
    echo "\t<error>Invalid ZIP file</error>\n\t<code>".$zipres."</code>\n";
Having the return; before getStreamIndex will print out the <entry>s, but as soon as I move the return; to after getStreamIndex I get a blank page (even in view-source). Am I doing something wrong, or has this function always been a bit off? And what alternative would you recommend?
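If getStreamIndex() misbehaves on your build, one alternative worth trying is ZipArchive::getFromIndex(), which returns the entry's contents as a string and avoids the stream handle entirely. A sketch, dropped into the loop above in place of the stream code:
$contents = $zip->getFromIndex($i);
if ($contents !== false) {
    $packagenames[] = $zip->getNameIndex($i);
    $wholeGOB[] = $contents;
} else {
    echo "\t<error>Error reading entry ".$zip->getNameIndex($i)."</error>";
}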

Large processes in PhalconPHP

I have a webapp that is a logging application, and I need backup/restore/import/export features there. I did this successfully with Laravel but have some complications with Phalcon. I don't see native functions in Phalcon that would split the execution of large PHP scripts into chunks.
The thing is that logs will be backed up and restored, as well as imported by users, in ADIF format (adif.org). I have a parser for that format which converts the file to an array of arrays; then every record should be searched against another table containing 2000 regular expressions, find 3-10 matches there, and connect the imported records in one table to those in another table (model relation hasMany). That means every imported record needs quite some processing time. Laravel managed it somehow with 3500 records imported; I don't know how it will handle more. The average import will contain 10000 records, and each of them needs to be verified against 2000 regular expressions.
The main issue is how to split this huge processing amount into smaller chunks so I wouldn't get timeouts.
Here is the function that could flawlessly do the job of adding 3862 records to one table and, as a result of processing every record, adding 8119 records to another table:
public function restoreAction()
{
    $this->view->disable();
    $user = Users::findFirst($this->session->auth['id']);
    if ($this->request->isPost()) {
        if ($this->request->isAjax()) {
            $frontCache = new CacheData(array(
                "lifetime" => 21600
            ));
            $cache = new CacheFile($frontCache, array(
                "cacheDir" => "../plc/app/cache/"
            ));
            $cacheKey = $this->request->getPost('fileName') . '.cache';
            $rowsPerChunk = 50;
            $records = $cache->get($cacheKey);
            if ($records === null) {
                $adifSource = AdifHelper::parseFile(BASE_URL.'/uploads/'.$user->getUsername().'/'.$this->request->getPost('fileName'));
                $records = array_chunk($adifSource, $rowsPerChunk);
            }
            // note: $size and $rowsPerChunk were only set on a cache miss in
            // the original, which broke the loop on cache hits; set them here
            $size = count($records);
            for ($i = 0; $i < $size; $i++) {
                if (!isset($records[$i])) {
                    break;
                }
                set_time_limit(50);
                for ($j = 0; $j < $rowsPerChunk; $j++) {
                    $result = $records[$i][$j];
                    if (!isset($result)) {
                        break;
                    }
                    if (isset($result['call'])) {
                        $p = new PrefixHelper($result['call']);
                    }
                    $bandId = (isset($result['band']) && (strlen($result['band']) > 2)) ? Bands::findFirstByName($result['band'])->getId() : null;
                    $infos = (isset($p)) ? $p->prefixInfo() : null;
                    if (is_array($infos)) {
                        if (isset($result['qsl_sent']) && ($result['qsl_sent'] == 'q')) {
                            $qsl_rcvd = 'R';
                        } else if (isset($result['eqsl_qsl_sent']) && ($result['eqsl_qsl_sent'] == 'c')) {
                            $qsl_rcvd = 'y';
                        } else if (isset($result['qsl_rcvd'])) {
                            $qsl_rcvd = $result['qsl_rcvd'];
                        } else {
                            $qsl_rcvd = 'i';
                        }
                        $logRow = new Logs();
                        $logRow->setCall($result['call']);
                        $logRow->setDatetime(date('Y-m-d H:i:s', strtotime($result['qso_date'].' '.$result['time_on'])));
                        $logRow->setFreq(isset($result['freq']) ? $result['freq'] : 0);
                        $logRow->setRst($result['rst_sent']);
                        $logRow->setQslnote(isset($result['qslmsg']) ? $result['qslmsg'] : '');
                        $logRow->setComment(isset($result['comment']) ? $result['comment'] : '');
                        $logRow->setQslRcvd($qsl_rcvd);
                        $logRow->setQslVia(isset($result['qsl_sent_via']) ? $result['qsl_sent_via'] : 'e');
                        $logRow->band_id = $bandId;
                        $logRow->user_id = $this->session->auth['id'];
                        $success = $logRow->save();
                        if ($success) {
                            foreach ($infos as $info) {
                                if (is_object($info)) {
                                    $inf = new Infos();
                                    $inf->setLat($info->lat);
                                    $inf->setLon($info->lon);
                                    $inf->setCq($info->cq);
                                    $inf->setItu($info->itu);
                                    if (isset($result['iota'])) {
                                        $inf->setIota($result['iota']);
                                    }
                                    if (isset($result['pfx'])) {
                                        $inf->setPfx($result['pfx']);
                                    }
                                    if (isset($result['gridsquare'])) {
                                        $inf->setGrid($result['gridsquare']);
                                    } else if (isset($result['grid'])) {
                                        $inf->setGrid($result['grid']);
                                    }
                                    $inf->qso_id = $logRow->getId();
                                    $inf->prefix_id = $info->id;
                                    $infSuccess[] = $inf->save();
                                }
                            }
                        }
                    }
                }
                sleep(1);
            }
        }
    }
}
I know the script needs a lot of improvement, but for now the task was just to make it work.
I think the good practice for large processing tasks in PHP is console applications, which don't have restrictions on execution time and can be set up with more memory for execution.
As for Phalcon, it has a built-in mechanism for running and processing CLI tasks: Command Line Applications (this link will always point to the documentation of the latest Phalcon version).
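For reference, a minimal sketch of such a CLI bootstrap and task, following the pattern from the Phalcon docs; the ImportTask name and file layout are illustrative, not from the question:
<?php
// cli.php -- minimal Phalcon CLI bootstrap (adapt paths and services)
use Phalcon\Di\FactoryDefault\Cli as CliDi;
use Phalcon\Cli\Console;

$di = new CliDi();
$console = new Console();
$console->setDI($di);

// map "php cli.php import restore myfile.adi" to ImportTask::restoreAction(['myfile.adi'])
$arguments = array();
foreach ($argv as $k => $arg) {
    if ($k === 1) {
        $arguments['task'] = $arg;
    } elseif ($k === 2) {
        $arguments['action'] = $arg;
    } elseif ($k >= 3) {
        $arguments['params'][] = $arg;
    }
}
$console->handle($arguments);

// app/tasks/ImportTask.php
class ImportTask extends \Phalcon\Cli\Task
{
    public function restoreAction(array $params)
    {
        set_time_limit(0); // no web-server timeout applies on the CLI anyway
        // parse the ADIF file and save Logs/Infos rows exactly as in the
        // controller above, without the chunking and sleep() workarounds
    }
}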

How to eliminate empty rows in excel upload from PHP using PHPExcel upload?

I am in the middle of development. I want to eliminate the empty rows while uploading an Excel file from PHP using the PHPExcel plugin.
$x = 1; // $x was not initialised in the original snippet
while ($x <= $excel->sheets[0]['numRows']) {
    $y = 1;
    while ($y <= $excel->sheets[0]['numCols']) {
        $cell = isset($excel->sheets[0]['cells'][$x][$y])
            ? $excel->sheets[0]['cells'][$x][$y]
            : '';
        echo $cell; // both branches of the original if/else echoed the same $cell
        $y++;
    }
    $x++;
}
I used this code to get the values from Excel into PHP, but if the Excel sheet has an empty record the result gets shifted, with something like extra cells appearing. How can I fix this?
if (!array_reduce(
        $excel->sheets[0]['cells'][$x],
        function ($state, $value) {
            return $state && !($value > '');
        },
        TRUE
)) {
    // execute your while($y<=$excel->sheets[0]['numCols']) loop here
}
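Put together with the question's loop, that check could look like this: a sketch against the same $excel structure, where rows whose cells are all empty are skipped.
$x = 1;
while ($x <= $excel->sheets[0]['numRows']) {
    // fully empty rows may be missing from the 'cells' array entirely
    $row = isset($excel->sheets[0]['cells'][$x]) ? $excel->sheets[0]['cells'][$x] : array();
    $rowIsEmpty = array_reduce(
        $row,
        function ($state, $value) { return $state && !($value > ''); },
        true
    );
    if (!$rowIsEmpty) {
        for ($y = 1; $y <= $excel->sheets[0]['numCols']; $y++) {
            echo isset($row[$y]) ? $row[$y] : '';
        }
    }
    $x++;
}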

How to repeat a function in php for every element in an array without it timing out

I have a function that takes an id and, with that, finds other information in the database relating to it, spread among 3 tables. It then compares this to a CSV file, which at most times is CPU intensive. Running this once with one id takes approx 8 to 10 seconds at most, but I have been asked to have it run automatically across a varying number of ids in the database. To do this I created an array of the ids that match the criteria in the database at any point, and then ran a while statement to repeat the function for each element in the array, but it gets as far as maybe 4 of them and I get the following error:
Server error!
The server encountered an internal error and was unable to complete
your request. Either the server is overloaded or there was an error in
a CGI script.
If you think this is a server error, please contact the webmaster.
Error 500
I'll admit that my code could be much, much cleaner as I'm still learning as I go, but the real bottleneck appears to be reading the CSV, which is a report whose size changes each day. I have tried different combinations and the best result is (please don't chastise me for this as I know it is stupid, but the other ways haven't worked as of yet) to run the code as follows:
$eventArray = eventArray($venueId);
$totalEvents = count($eventArray);
for ($i = 0; $i < $totalEvents; $i++)
{
    $eventId = $eventArray[$i];
    echo $eventId;
    echo $datename = getEventDetails($eventId, $zone);
    // date of event
    echo $eventDate = $datename['eventDate'];
    // vs team
    echo $eventName = $datename['eventName'];
    $file_handle = fopen("data/rohm/sales/today.csv", "r");
    while (!feof($file_handle))
    {
        $line_of_text = fgetcsv($file_handle, 200);
        include('finance_logic.php');
    }
    fclose($file_handle);
}
Yes, it is re-reading the CSV every time, but I couldn't get it to function at all any other way, so if this is the issue I would really appreciate some guidance on dealing with the CSV better. In case it is relevant, the code in 'finance_logic.php' is listed below:
if ($line_of_text[0] == "Event: $eventName ")
{
    $f = 1;
    $ticketTotalSet = "no";
    $killSet = 'no';
    // default totals zero
    $totalHolds = 0;
    $totalKills = 0;
    $ticketSold = 0;
    $ticketPrice = 0;
    $totalCap = 0;
    $col = 0; // $col was never initialised in the original snippet
}
if ($f == 1 && $line_of_text[0] == "$eventDate")
{
    $f = 2;
}
if ($f == 2 && $line_of_text[0] == "Holds")
{
    $f = 3;
    while ($line_of_text[$col] !== "Face Value Amt")
    {
        $col++;
    }
}
if ($f == 3 && $line_of_text[0] !== "Face Value Amt")
{
    if ($f == 3 && $line_of_text[0] == "*: Kill")
    {
        $totalKills = $line_of_text[$col];
    }
    $holdsArray[] = $line_of_text[$col];
}
if ($f == 3 && $line_of_text[0] == "--")
{
    $f = 4;
}
if ($f == 4 && $line_of_text[0] == "Capacity")
{
    $totalCap = $line_of_text[$col];
    $f = 5;
}
if ($f == 5 && $line_of_text[0] == "Abbreviated Performance Totals")
{
    $f = 6;
}
if ($f == 6 && $line_of_text[0] == "$eventName")
{
    // change when 1 ticket exists
    $ticketTotalSet = "yes";
    // set season tickets
    include("financial/seasontickets/$orgShortName.php");
    // all non season are single tickets
    if (!isset($category))
    {
        $category = 'single';
    }
    $ticketName = $line_of_text[2];
    $ticketSold = $line_of_text[3];
    $ticketPrice = $line_of_text[4];
    addTicketType($eventId, $ticketName, $category, $ticketSold, $ticketPrice);
    unset($category);
}
if ($f == 6 && $ticketTotalSet == "yes" && $line_of_text[0] !== "$eventName")
{
    $totalHolds = (array_sum($holdsArray) - $totalKills);
    // add cap, holds and kills
    addKillsHoldsCap($eventId, $totalCap, $eventId, $totalHolds, $totalKills);
    // reset everything
    $f = 0;
    $ticketTotalSet = "no";
    echo "$eventName updated!";
}
Thanks in advance!
p.s. The reason the report is read each time is so that the 'eventName' and 'eventDate' are searched for within 'finance_logic.php'. Obviously, if this were set up with all event names and dates already, it would take one pass of the report to find them all, but I'm not sure how I could do this dynamically. Any suggestions would be welcome, as I'm sure there is something out there that I just haven't learnt yet.
I have some heavy scripts I use on localhost sometimes, and if I don't add anything they will just time out.
A simple solution is to limit the number of executions of your function, then reload the page, then restart where you stopped.
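A sketch of that batch-and-resume pattern applied to the loop above; $batchSize, the process.php page name and processEvent() are illustrative, with processEvent() standing in for the existing per-event work:
$batchSize = 3;
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;

$eventArray = eventArray($venueId);
$slice = array_slice($eventArray, $offset, $batchSize);

foreach ($slice as $eventId) {
    processEvent($eventId); // hypothetical wrapper around the per-event work
}

if ($offset + $batchSize < count($eventArray)) {
    // more to do: send the browser back here at the next offset
    header('Location: process.php?offset=' . ($offset + $batchSize));
    exit;
}
echo 'All events processed.';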

Loading .sql files from within PHP

I'm creating an installation script for an application that I'm developing and need to create databases dynamically from within PHP. I've got it to create the database but now I need to load in several .sql files. I had planned to open the file and mysql_query it a line at a time - until I looked at the schema files and realised they aren't just one query per line.
So, how do I load an sql file from within PHP (as phpMyAdmin does with its import command)?
$db = new PDO($dsn, $user, $password);
$sql = file_get_contents('file.sql');
$qr = $db->exec($sql);
phpBB uses a few functions to parse their files. They are rather well-commented (what an exception!) so you can easily know what they do (I got this solution from http://www.frihost.com/forums/vt-8194.html). Here is the solution, and I've used it a lot:
<?php
ini_set('memory_limit', '5120M');
set_time_limit ( 0 );
/***************************************************************************
* sql_parse.php
* -------------------
* begin : Thu May 31, 2001
* copyright : (C) 2001 The phpBB Group
* email : support@phpbb.com
*
* $Id: sql_parse.php,v 1.8 2002/03/18 23:53:12 psotfx Exp $
*
****************************************************************************/
/***************************************************************************
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 2 of the License, or
* (at your option) any later version.
*
***************************************************************************/
/***************************************************************************
*
* These functions are mainly for use in the db_utilities under the admin
* however in order to make these functions available elsewhere, specifically
* in the installation phase of phpBB I have seperated out a couple of
* functions into this file. JLH
*
\***************************************************************************/
//
// remove_comments will strip the sql comment lines out of an uploaded sql file
// specifically for mssql and postgres type files in the install....
//
function remove_comments(&$output)
{
    $lines = explode("\n", $output);
    $output = "";
    // try to keep mem. use down
    $linecount = count($lines);
    $in_comment = false;
    for ($i = 0; $i < $linecount; $i++)
    {
        if (preg_match("/^\/\*/", preg_quote($lines[$i])))
        {
            $in_comment = true;
        }
        if (!$in_comment)
        {
            $output .= $lines[$i] . "\n";
        }
        if (preg_match("/\*\/$/", preg_quote($lines[$i])))
        {
            $in_comment = false;
        }
    }
    unset($lines);
    return $output;
}
//
// remove_remarks will strip the sql comment lines out of an uploaded sql file
//
function remove_remarks($sql)
{
    $lines = explode("\n", $sql);
    // try to keep mem. use down
    $sql = "";
    $linecount = count($lines);
    $output = "";
    for ($i = 0; $i < $linecount; $i++)
    {
        if (($i != ($linecount - 1)) || (strlen($lines[$i]) > 0))
        {
            if (isset($lines[$i][0]) && $lines[$i][0] != "#")
            {
                $output .= $lines[$i] . "\n";
            }
            else
            {
                $output .= "\n";
            }
            // Trading a bit of speed for lower mem. use here.
            $lines[$i] = "";
        }
    }
    return $output;
}
//
// split_sql_file will split an uploaded sql file into single sql statements.
// Note: expects trim() to have already been run on $sql.
//
function split_sql_file($sql, $delimiter)
{
    // Split up our string into "possible" SQL statements.
    $tokens = explode($delimiter, $sql);
    // try to save mem.
    $sql = "";
    $output = array();
    // we don't actually care about the matches preg gives us.
    $matches = array();
    // this is faster than calling count($tokens) every time thru the loop.
    $token_count = count($tokens);
    for ($i = 0; $i < $token_count; $i++)
    {
        // Don't wanna add an empty string as the last thing in the array.
        // (original had strlen($tokens[$i] > 0), a misplaced parenthesis)
        if (($i != ($token_count - 1)) || (strlen($tokens[$i]) > 0))
        {
            // This is the total number of single quotes in the token.
            $total_quotes = preg_match_all("/'/", $tokens[$i], $matches);
            // Counts single quotes that are preceded by an odd number of backslashes,
            // which means they're escaped quotes.
            $escaped_quotes = preg_match_all("/(?<!\\\\)(\\\\\\\\)*\\\\'/", $tokens[$i], $matches);
            $unescaped_quotes = $total_quotes - $escaped_quotes;
            // If the number of unescaped quotes is even, then the delimiter did NOT occur inside a string literal.
            if (($unescaped_quotes % 2) == 0)
            {
                // It's a complete sql statement.
                $output[] = $tokens[$i];
                // save memory.
                $tokens[$i] = "";
            }
            else
            {
                // incomplete sql statement. keep adding tokens until we have a complete one.
                // $temp will hold what we have so far.
                $temp = $tokens[$i] . $delimiter;
                // save memory..
                $tokens[$i] = "";
                // Do we have a complete statement yet?
                $complete_stmt = false;
                for ($j = $i + 1; (!$complete_stmt && ($j < $token_count)); $j++)
                {
                    // This is the total number of single quotes in the token.
                    $total_quotes = preg_match_all("/'/", $tokens[$j], $matches);
                    // Counts single quotes that are preceded by an odd number of backslashes,
                    // which means they're escaped quotes.
                    $escaped_quotes = preg_match_all("/(?<!\\\\)(\\\\\\\\)*\\\\'/", $tokens[$j], $matches);
                    $unescaped_quotes = $total_quotes - $escaped_quotes;
                    if (($unescaped_quotes % 2) == 1)
                    {
                        // odd number of unescaped quotes. In combination with the previous incomplete
                        // statement(s), we now have a complete statement. (2 odds always make an even)
                        $output[] = $temp . $tokens[$j];
                        // save memory.
                        $tokens[$j] = "";
                        $temp = "";
                        // exit the loop.
                        $complete_stmt = true;
                        // make sure the outer loop continues at the right point.
                        $i = $j;
                    }
                    else
                    {
                        // even number of unescaped quotes. We still don't have a complete statement.
                        // (1 odd and 1 even always make an odd)
                        $temp .= $tokens[$j] . $delimiter;
                        // save memory.
                        $tokens[$j] = "";
                    }
                } // for..
            } // else
        }
    }
    return $output;
}
$dbms_schema = 'yourfile.sql';
$sql_query = @fread(@fopen($dbms_schema, 'r'), @filesize($dbms_schema)) or die('problem ');
$sql_query = remove_remarks($sql_query);
$sql_query = split_sql_file($sql_query, ';');
$host = 'localhost';
$user = 'user';
$pass = 'pass';
$db = 'database_name';
// mysql_* is deprecated, prefer using mysqli_* instead
// mysql_connect($host,$user,$pass) or die('error connection');
// mysql_select_db($db) or die('error database selection');
$connection = mysqli_connect($host, $user, $pass) or die('error connection');
mysqli_select_db($connection, $db) or die('error database selection');
$i = 1;
foreach ($sql_query as $sql) {
    echo $i++;
    echo "<br />";
    // mysql_* is deprecated, prefer using mysqli_* instead
    // mysql_query($sql) or die('error in query');
    mysqli_query($connection, $sql) or die('error in query');
}
I'm getting the feeling that everyone here who's answered this question doesn't know what it's like to be a web application developer who allows people to install the application on their own servers. Shared hosting, especially, doesn't allow you to use SQL like the "LOAD DATA" query mentioned previously. Most shared hosts also don't allow you to use shell_exec.
Now, to answer the OP, your best bet is to just build out a PHP file that contains your queries in a variable and can just run them. If you're determined to parse .sql files, you should look into phpMyAdmin and get some ideas for getting data out of .sql files that way. Look around at other web applications that have installers and you'll see that, rather than use .sql files for their queries, they just package them up in PHP files and just run each string through mysql_query or whatever it is that they need to do.
The simplest solution is to use shell_exec() to run the mysql client with the SQL script as input. This might run a little slower because it has to fork, but you can write the code in a couple of minutes and then get back to working on something useful. Writing a PHP script to run any SQL script could take you weeks.
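A sketch of that approach; credentials and paths are placeholders, and anything that comes from user input should be escaped:
$cmd = 'mysql -u ' . escapeshellarg($user)
     . ' -p' . escapeshellarg($pass)
     . ' ' . escapeshellarg($dbName)
     . ' < ' . escapeshellarg('/path/to/file.sql');
shell_exec($cmd);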
Supporting SQL scripts is more complex than what people are describing here, unless you're certain that your scripts use only a subset of SQL's functionality. Below are some examples of things that may appear in an ordinary SQL script that make it complex to code a script to interpret it line by line.
-- Comment lines cannot be prepared as statements
-- This is a MySQL client tool builtin command.
-- It cannot be prepared or executed by server.
USE testdb;
-- This is a multi-line statement.
CREATE TABLE foo (
string VARCHAR(100)
);
-- This statement is not supported as a prepared statement.
LOAD DATA INFILE 'datafile.txt' INTO TABLE foo;
-- This statement is not terminated with a semicolon.
DELIMITER //
-- This multi-line statement contains a semicolon
-- but not as the statement terminator.
CREATE PROCEDURE simpleproc (OUT param1 INT)
BEGIN
SELECT COUNT(*) INTO param1 FROM foo;
END
//
If you only support a subset of SQL scripts, excluding some corner cases such as those above, it's relatively easy to write a PHP script that reads a file and executes the SQL statements within the file. But if you want to support any valid SQL script, that's much more complex.
See also my answers to these related questions:
Running MySQL *.sql files in PHP
is it possible to call a sql script from a stored procedure in another sql script?
PHP: multiple SQL queries in one mysql_query statement
In my projects I've used the following solution:
<?php
/**
 * Import SQL from file
 *
 * @param string path to sql file
 */
function sqlImport($file)
{
    $delimiter = ';';
    $file = fopen($file, 'r');
    $isFirstRow = true;
    $isMultiLineComment = false;
    $sql = '';
    while (!feof($file)) {
        $row = fgets($file);
        // remove BOM for utf-8 encoded file
        if ($isFirstRow) {
            $row = preg_replace('/^\x{EF}\x{BB}\x{BF}/', '', $row);
            $isFirstRow = false;
        }
        // 1. ignore empty string and comment row
        if (trim($row) == '' || preg_match('/^\s*(#|--\s)/sUi', $row)) {
            continue;
        }
        // 2. clear comments
        $row = trim(clearSQL($row, $isMultiLineComment));
        // 3. parse delimiter row
        if (preg_match('/^DELIMITER\s+[^ ]+/sUi', $row)) {
            $delimiter = preg_replace('/^DELIMITER\s+([^ ]+)$/sUi', '$1', $row);
            continue;
        }
        // 4. separate sql queries by delimiter
        $offset = 0;
        while (strpos($row, $delimiter, $offset) !== false) {
            $delimiterOffset = strpos($row, $delimiter, $offset);
            if (isQuoted($delimiterOffset, $row)) {
                $offset = $delimiterOffset + strlen($delimiter);
            } else {
                $sql = trim($sql . ' ' . trim(substr($row, 0, $delimiterOffset)));
                query($sql);
                $row = substr($row, $delimiterOffset + strlen($delimiter));
                $offset = 0;
                $sql = '';
            }
        }
        $sql = trim($sql . ' ' . $row);
    }
    if (strlen($sql) > 0) {
        query($sql); // original had query($row) here, which dropped the accumulated tail
    }
    fclose($file);
}
/**
 * Remove comments from sql
 *
 * @param string sql
 * @param boolean is multicomment line
 * @return string
 */
function clearSQL($sql, &$isMultiComment)
{
    if ($isMultiComment) {
        if (preg_match('#\*/#sUi', $sql)) {
            $sql = preg_replace('#^.*\*/\s*#sUi', '', $sql);
            $isMultiComment = false;
        } else {
            $sql = '';
        }
        if (trim($sql) == '') {
            return $sql;
        }
    }
    $offset = 0;
    while (preg_match('{--\s|#|/\*[^!]}sUi', $sql, $matched, PREG_OFFSET_CAPTURE, $offset)) {
        list($comment, $foundOn) = $matched[0];
        if (isQuoted($foundOn, $sql)) {
            $offset = $foundOn + strlen($comment);
        } else {
            if (substr($comment, 0, 2) == '/*') {
                $closedOn = strpos($sql, '*/', $foundOn);
                if ($closedOn !== false) {
                    $sql = substr($sql, 0, $foundOn) . substr($sql, $closedOn + 2);
                } else {
                    $sql = substr($sql, 0, $foundOn);
                    $isMultiComment = true;
                }
            } else {
                $sql = substr($sql, 0, $foundOn);
                break;
            }
        }
    }
    return $sql;
}
/**
 * Check if "offset" position is quoted
 *
 * @param int $offset
 * @param string $text
 * @return boolean
 */
function isQuoted($offset, $text)
{
    if ($offset > strlen($text))
        $offset = strlen($text);
    $isQuoted = false;
    for ($i = 0; $i < $offset; $i++) {
        if ($text[$i] == "'")
            $isQuoted = !$isQuoted;
        if ($text[$i] == "\\" && $isQuoted)
            $i++;
    }
    return $isQuoted;
}

function query($sql)
{
    global $mysqli;
    //echo '<strong>SQL CODE TO RUN:</strong><br>' . htmlspecialchars($sql) . ';<br><br>';
    if (!$query = $mysqli->query($sql)) {
        throw new Exception("Cannot execute request to the database {$sql}: " . $mysqli->error);
    }
}
set_time_limit(0);
$mysqli = new mysqli('localhost', 'root', '', 'test');
$mysqli->set_charset("utf8");
header('Content-Type: text/html;charset=utf-8');
sqlImport('import.sql');
echo "Peak MB: ", memory_get_peak_usage(true)/1024/1024;
On a test SQL file (41 MB), peak memory usage was 3.25 MB.
mysqli can run multiple queries separated by a ;
You could read in the whole file and run it all at once using mysqli_multi_query().
But I'll be the first to say that this isn't the most elegant solution.
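A sketch of that; note that multi_query() sends everything at once, and you must drain the result sets or later queries on the same connection will fail:
$mysqli = new mysqli('localhost', 'user', 'pass', 'dbname');
if ($mysqli->multi_query(file_get_contents('file.sql'))) {
    do {
        // free each result set before moving to the next one
        if ($result = $mysqli->store_result()) {
            $result->free();
        }
    } while ($mysqli->more_results() && $mysqli->next_result());
}
if ($mysqli->errno) {
    die('Import failed: ' . $mysqli->error);
}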
Since I can't comment on the answer, beware of using the following solution:
$db = new PDO($dsn, $user, $password);
$sql = file_get_contents('file.sql');
$qr = $db->exec($sql);
There is a bug in PHP PDO https://bugs.php.net/bug.php?id=61613
$db->exec('SELECT 1; invalidstatement; SELECT 2');
won't error out or return false (tested on PHP 5.5.14).
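One way to surface errors despite that bug is to enable exceptions and run the statements one at a time. A sketch only; the naive split breaks on semicolons inside string literals, as other answers here point out:
$db = new PDO($dsn, $user, $password);
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
foreach (array_filter(array_map('trim', explode(';', $sql))) as $statement) {
    $db->exec($statement); // throws on the first bad statement
}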
My suggestion would be to look at the source code of PHPMyBackup. It's an automated PHP SQL loader. You will find that mysql_query only loads one query at a time, and projects like PHPMyAdmin and PHPMyBackup have already done the hard work of parsing the SQL the correct way. Please don't re-invent that wheel :P
An updated version of Plahcinski's solution. Alternatively, you can use fopen and fread for bigger files:
$fp = file('database.sql', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$query = '';
foreach ($fp as $line) {
    if ($line != '' && strpos($line, '--') === false) {
        $query .= $line;
        if (substr($query, -1) == ';') {
            mysql_query($query);
            $query = '';
        }
    }
}
mysql_query("LOAD DATA LOCAL INFILE '/path/to/file' INTO TABLE mytable");
Briefly, the way I have done this is:
Read the file (a db dump, e.g. $ mysqldump db > db.sql):
$sql = file_get_contents('db.sql');
Import it using mysqli::multi_query:
if ($mysqli->multi_query($sql)) {
    $mysqli->close();
} else {
    throw new Exception($mysqli->error);
}
Watch out: mysqli::multi_query supports async queries. More here: http://php.net/manual/en/mysqli.multi-query.php and here https://stackoverflow.com/a/6652908/2002493
I noticed that the PostgreSQL PDO driver does not allow you to run scripts separated by semicolons. In order to run a .sql file on any database using PDO it is necessary to split the statements in PHP code yourself. Here is a solution that seems to work quite well:
https://github.com/diontruter/migrate/blob/master/src/Diontruter/Migrate/SqlScriptParser.php
The referenced class has done the trick for me in a database-independent way; please message me if there are any issues. Here is how you could use the script after adding it to your project:
$pdo = new PDO($connectionString, $userName, $password);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$parser = new SqlScriptParser();
$sqlStatements = $parser->parse($fileName);
foreach ($sqlStatements as $statement) {
    $distilled = $parser->removeComments($statement);
    if (!empty($distilled)) {
        $statement = $pdo->prepare($distilled); // original prepared an undefined $sql here
        $affectedRows = $statement->execute();
    }
}
Are you sure that it's not one query per line? Your text editor may be wrapping lines, but in reality each query may be on a single line.
At any rate, olle's method seems best. If you have reasons to run queries one at a time, you should be able to read in your file line by line, then use the semicolon at the end of each query to delimit. You're much better off reading in a file line by line than trying to split an enormous string, as it will be much kinder to your server's memory. Example:
$query = '';
$handle = @fopen("/sqlfile.sql", "r");
if ($handle) {
    while (!feof($handle)) {
        $query .= fgets($handle, 4096);
        if (substr(rtrim($query), -1) === ';') {
            // ...run your query, then unset the string
            $query = '';
        }
    }
    fclose($handle);
}
Obviously, you'll need to consider transactions and the rest if you're running a whole lot of queries in a batch, but it's probably not a big deal for a new-install script.
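For completeness, a sketch of that transaction wrapper in PDO terms; note that DDL statements on MySQL commit implicitly, so this mainly protects batches of INSERTs:
$db->beginTransaction();
try {
    foreach ($queries as $query) { // $queries collected by the loop above
        $db->exec($query);
    }
    $db->commit();
} catch (Exception $e) {
    $db->rollBack();
    throw $e;
}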
Unless you plan to import huge .sql files, just read the entire file into memory, and run it as a query.
It's been a while since I've used PHP, so, pseudo code:
all_query = read_file("/my/file.sql")
con = mysql_connect("localhost")
con.mysql_select_db("mydb")
con.mysql_query(all_query)
con.close()
Unless the files are huge (say, over several megabytes), there's no reason to execute it line-at-a-time, or try and split it into multiple queries (by splitting on ;, which, as I commented on cam8001's answer, will break if the query has semi-colons within strings).
Works on Navicat dumps. You might need to strip the first /* */ comment Navicat puts in.
$file_content = file('myfile.sql');
$query = "";
foreach ($file_content as $sql_line) {
    if (trim($sql_line) != "" && strpos($sql_line, "--") === false) {
        $query .= $sql_line;
        if (substr(rtrim($query), -1) == ';') {
            echo $query;
            $result = mysql_query($query) or die(mysql_error());
            $query = "";
        }
    }
}
This is the best code for restoring SQL with PHP; it worked 100% for me. Thanks a lot! (It is the same code as in the answer above.)
Try This:
// SQL File
$SQLFile = 'YourSQLFile.sql';
// Server Name
$hostname = 'localhost';
// User Name
$db_user = 'root';
// User Password
$db_password = '';
// DBName
$database_name = 'YourDBName';
// Connect MySQL
$link = mysql_connect($hostname, $db_user, $db_password);
if (!$link) {
    die("MySQL Connection error");
}
// Select MySQL DB
mysql_select_db($database_name, $link) or die("Wrong MySQL Database");
// Function For Run Multiple Query From .SQL File
function MultiQuery($sqlfile, $sqldelimiter = ';') {
    set_time_limit(0);
    if (is_file($sqlfile) === true) {
        $sqlfile = fopen($sqlfile, 'r');
        if (is_resource($sqlfile) === true) {
            $query = array();
            echo "<table cellspacing='3' cellpadding='3' border='0'>";
            while (feof($sqlfile) === false) {
                $query[] = fgets($sqlfile);
                if (preg_match('~' . preg_quote($sqldelimiter, '~') . '\s*$~iS', end($query)) === 1) {
                    $query = trim(implode('', $query));
                    if (mysql_query($query) === false) {
                        echo '<tr><td>ERROR:</td><td> ' . $query . '</td></tr>';
                    } else {
                        echo '<tr><td>SUCCESS:</td><td>' . $query . '</td></tr>';
                    }
                    while (ob_get_level() > 0) {
                        ob_end_flush();
                    }
                    flush();
                }
                if (is_string($query) === true) {
                    $query = array();
                }
            }
            echo "</table>";
            return fclose($sqlfile);
        }
    }
    return false;
}
/* * * Use Function Like This: ** */
MultiQuery($SQLFile);
The easiest and fastest way to load and parse a phpMyAdmin dump or mysqldump file:
$ mysql -u username -p -h localhost dbname < dumpfile.sql
None of the solutions I have seen here deal with needing to change the delimiter while creating a stored procedure, on a server where I can't count on having access to LOAD DATA INFILE. I was hoping to find that someone had already solved this, without having to scour the phpMyAdmin code to figure it out. Like others, I too was looking for someone else's GPL'ed way of doing it, since I am writing GPL code myself.
Some PHP libraries can parse a SQL file made of multiple SQL statements, explode it properly (not using a simple ";" explode, naturally), and then execute them.
For instance, check Phing's PDOSQLExecTask
Just to restate the problem for everyone:
PHP's mysql_query automatically end-delimits each SQL command, and the manual is additionally very vague about this. Everything beyond one command will yield an error.
On the other hand, mysql_query is fine with a string containing SQL-style comments, \n, \r...
The limitation of mysql_query reveals itself in that the SQL parser reports the problem to be directly at the next command, e.g.
You have an error in your SQL syntax; check the manual that corresponds to your
MySQL server version for the right syntax to use near 'INSERT INTO `outputdb:`
(`intid`, `entry_id`, `definition`) VALUES...
Here is a quick solution (assuming well-formatted SQL):
$sqlCmds = preg_split("/[\n|\t]*;[\n|\t]*[\n|\r]$/", $sqlDump);
Many hosts will not allow you to create your own database through PHP, but you seem to have solved that.
Once the DB has been created, you can manipulate and populate it simply:
mysql_connect("localhost");
mysql_query("SOURCE file.sql");
(Note that SOURCE is a mysql client builtin, as pointed out in an answer above, so the server may reject it.)
Some guys (Plahcinski) suggested the file()-based code shown in the answers above,
but I would update it with the one which worked for me:
//selecting my database
$database = 'databaseTitleInFile';
$selectDatabase = mysql_select_db($database, $con);
if (!$selectDatabase) {
    die('Could not select the database: ' . mysql_error());
}
echo "The database " . $database . " selected successfully\n";
//reading the file
$file_path = '..\yourPath\to\File';
if (!file_exists($file_path)) {
    echo "File Not Exists";
}
$file_content = file_get_contents($file_path);
$array = explode("\n", $file_content);
//making queries
$query = "";
foreach ($array as $sql_line) {
    $sql_line = trim($sql_line);
    // skip blank lines, "--" comments and lines containing "/*"
    // (original tested === "--", which would have kept only comment lines)
    if ($sql_line != "" && substr($sql_line, 0, 2) !== "--" && strpos($sql_line, "/*") === false) {
        $query .= $sql_line;
        if (substr(rtrim($query), -1) == ';') {
            $result = mysql_query($query) or die(mysql_error());
            $query = "";
        }
    }
}
because it is more comprehensive. ;-)
This may be helpful:
More or less, what it does is to first take the string given to the function (the file_get_contents() value of your file.sql) and remove all the line breaks. Then it splits the data by the ";" character. Next it goes into a while loop, looking at each line of the array that is created. If the line contains the " ` " character, it will know it is a query and execute the myquery() function for the given line data.
Code:
function myquery($query) {
    mysql_connect(dbhost, dbuser, dbpass);
    mysql_select_db(dbname);
    $result = mysql_query($query);
    if (!mysql_errno() && @mysql_num_rows($result) > 0) {
    }
    else {
        $result = "not";
    }
    mysql_close();
    return $result;
}

function mybatchquery($str) {
    $str = str_replace("\n", "", $str);
    $sql = explode(";", $str);
    $x = 0;
    while (isset($sql[$x])) {
        // the original regex was truncated; testing for a backtick
        // matches the description given above
        if (preg_match("/`/", $sql[$x])) {
            myquery($sql[$x]);
        }
        $x++;
    }
    return TRUE;
}

function myrows($result) {
    $rows = @mysql_num_rows($result);
    return $rows;
}

function myarray($result) {
    $array = mysql_fetch_array($result);
    return $array;
}

function myescape($query) {
    $escape = mysql_escape_string($query);
    return $escape;
}

$str = file_get_contents("foo.sql");
mybatchquery($str);
$sql = file_get_contents("sql.sql");
Seems to be the simplest answer
I use this all the time:
$sql = explode(";", file_get_contents('[your dump file].sql'));
foreach ($sql as $query) {
    mysql_query($query);
}
I hope the following code will solve your problem pretty well.
//Empty all tables' contents
$result_t = mysql_query("SHOW TABLES");
while ($row = mysql_fetch_assoc($result_t))
{
    mysql_query("TRUNCATE " . $row['Tables_in_' . $mysql_database]);
}

// Temporary variable, used to store current query
$templine = '';
// Read in entire file
$lines = file($filename);
// Loop through each line
foreach ($lines as $line)
{
    // Skip it if it's a comment
    if (substr($line, 0, 2) == '--' || $line == '')
        continue;
    // Add this line to the current segment
    $templine .= $line;
    // If it has a semicolon at the end, it's the end of the query
    if (substr(trim($line), -1, 1) == ';')
    {
        // Perform the query
        mysql_query($templine) or print('Error performing query \'<strong>' . $templine . '\': ' . mysql_error() . '<br /><br />');
        // Reset temp variable to empty
        $templine = '';
    }
}
This actually worked for me:
/* load sql-commands from a sql file */
function loadSQLFromFile($url)
{
    // ini_set ( 'memory_limit', '512M' );
    // set_time_limit ( 0 );
    global $settings_database_name;
    global $mysqli_object;
    global $worked;
    $worked = false;
    $sql_query = "";
    // read line by line
    $lines = file($url);
    $count = count($lines);
    for ($i = 0; $i < $count; $i++)
    {
        $line = $lines[$i];
        $cmd3 = substr($line, 0, 3);
        $cmd4 = substr($line, 0, 4);
        $cmd6 = substr($line, 0, 6);
        if ($cmd3 == "USE")
        {
            // cut away USE ``;
            $settings_database_name = substr($line, 5, -3);
        }
        else if ($cmd4 == "DROP")
        {
            $mysqli_object->query($line); // execute this line
        }
        else if (($cmd6 == "INSERT") || ($cmd6 == "CREATE"))
        {
            // sum all lines up until ; is detected
            $multiline = $line;
            while (!strstr($line, ';'))
            {
                $i++;
                $line = $lines[$i];
                $multiline .= $line;
            }
            $multiline = str_replace("\n", "", $multiline); // remove newlines/linebreaks
            $mysqli_object->query($multiline); // execute this line
        }
    }
    return $worked;
}
I have an environment with no mysql tool or phpMyAdmin, just my PHP application connecting to a MySQL server on a different host, but I need to run scripts exported by mysqldump or phpMyAdmin. To solve the problem I created a multi_query script, as I mentioned here.
It can process mysqldump output and phpMyAdmin exports without the mysql command line tool. I also made some logic to process multiple migration files based on timestamps stored in the DB, like Rails. I know it needs more error handling, but currently it does the work for me.
Check it out: https://github.com/kepes/php-migration
It's pure PHP and doesn't need any other tools. If you don't process user input with it, only scripts made by developers or export tools, you can use it safely.
This is from a project I am working on. It basically takes any text file and extracts the SQL statements while ignoring comments and gratuitous line breaks.
<?php
/*
ingestSql(string) : string

Read the contents of a SQL batch file, stripping away comments and
joining statements that are broken over multiple lines with the goal
of producing lines of sql statements that can be successfully executed
by PDO exec() or execute() functions.

For example:

-- My SQL Batch
CREATE TABLE foo(
    bar VARCHAR(80),
    baz INT NOT NULL);

Becomes:

CREATE TABLE foo(bar VARCHAR(80), baz INT NOT NULL);
*/
function ingestSql($sqlFilePath = __DIR__ . "/create-db.sql") {
    $sqlFile = file($sqlFilePath);
    $ingestedSql = "";
    $statement = "";
    foreach ($sqlFile as $line) {
        // Ignore anything between a double-dash and the end of the line.
        $commentStart = strpos($line, "--");
        if ($commentStart !== false) {
            $line = substr($line, 0, $commentStart);
        }
        // Only process non-blank lines.
        if (strlen($line)) {
            // Remove any leading and trailing whitespace and append what's
            // left of the line to the current statement.
            $line = trim($line);
            $statement .= $line;
            // A semi-colon ends the current statement. Otherwise what was a
            // newline becomes a single space;
            if (substr($statement, -1) == ";") {
                $ingestedSql .= $statement;
                $statement = "\n";
            }
            else {
                $statement .= " ";
            }
        }
    }
    return $ingestedSql;
}
?>
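A usage sketch, assuming a $pdo connection: since ingestSql() separates complete statements with newlines, the result can be split and fed to PDO::exec() one statement at a time.
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
foreach (array_filter(explode("\n", ingestSql())) as $statement) {
    $pdo->exec($statement); // throws on the first failing statement
}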
