I have a PHP function that reads a table over ODBC (from an IBM AS/400) and writes it to a text file on a daily basis. It works fine until the file grows past roughly 1 GB; then it simply stops after a certain number of rows and never writes the rest.
function write_data_to_txt($table_new, $query)
{
    global $path_data;
    global $odbc_db, $date2;

    if (!($odbc_rs = odbc_exec($odbc_db, $query))) die("Error executing query $query");

    $num_cols = odbc_num_fields($odbc_rs);

    $path_folder = $path_data . $table_new . "/";
    if (!file_exists($path_folder)) mkdir($path_folder, 0777);
    $filename1 = $path_folder . $table_new . "_" . $date2 . ".txt";

    $comma   = "|";
    $newline = chr(13) . chr(10);

    $handle = fopen($filename1, "w+");
    if (is_writable($filename1)) {
        $ctr = 0;
        while (odbc_fetch_row($odbc_rs)) {
            // function for writing all fields
            // for ($i = 1; $i <= $num_cols; $i++) {
            //     $data = odbc_result($odbc_rs, $i);
            //     if (!fwrite($handle, $data) || !fwrite($handle, $comma)) {
            //         print "Cannot write to file ($filename1)";
            //         exit;
            //     }
            // }
            // end of function for writing all fields

            $data = odbc_result($odbc_rs, 1);
            fwrite($handle, $ctr . $comma . $data . $newline);
            $ctr++;
        }
        echo "Write Success. Row = $ctr <br><br>";
    } else {
        echo "Write Failed<br><br>";
    }
    fclose($handle);
}
There are no errors, just the success message, but where there should be 3,690,498 rows (and still growing) I only get roughly 3,670,009.
My query is an ordinary select, like:
select field1 , field2, field3 , field4, fieldetc from table1
What I tried and what I assume:
I thought it was an fwrite limitation, so I tried not writing every field (just $ctr and the first field), but it still stopped at the same row, so I assume it is not fwrite exceeding a limit.
I tried selecting fewer fields and then it completed, so I assumed there is some limitation on the ODBC side.
I tried the same ODBC data source from SQL Server, selected all fields, and got the complete row count, so I assume it is not an ODBC limitation.
I even tried a 64-bit machine, but that was worse: it returned only roughly 3,145,812 rows, so I assume it is not a 32/64-bit infrastructure issue.
I tried increasing memory_limit in php.ini to 1024M, but that did not help either.
Does anyone know whether I need to set something in PHP for the ODBC connection?
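For reference, a minimal debugging sketch for ruling out a silent write failure inside the existing fetch loop: check every fwrite() return value and turn on full error reporting. The ini_set() and set_time_limit() values are assumptions meant to rule limits out, not a known fix.

error_reporting(E_ALL);
ini_set('display_errors', '1');
ini_set('memory_limit', '1024M');    // assumption: only to rule out a memory cap
set_time_limit(0);                   // assumption: long export, no execution time cap

$ctr = 0;
while (odbc_fetch_row($odbc_rs)) {
    $data  = odbc_result($odbc_rs, 1);
    $bytes = fwrite($handle, $ctr . "|" . $data . chr(13) . chr(10));
    if ($bytes === false) {
        die("fwrite failed at row $ctr");   // would pinpoint exactly where writing stops
    }
    $ctr++;
}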
I am having a problem getting an SQL query to interpolate as I would want, and would be grateful for some help please.
Within the manual page for pg_query_params, there is a code example for pg_query() passing a variable using curly braces. This appeared to be exactly what I need for my task. So, my code is as follows:
$fh = fopen('/home/www/KPI-Summary.sql', "r")
    or die("Problem opening SQL file.\n");
$dbh = pg_connect("$connect")
    or die('Could not connect: ' . pg_last_error());
$j = 0;
while (($line = fgets($fh)) !== false) {
    $tmp[$j] = array();                  // Initialise temporary storage.
    $result = pg_query($dbh, $line);     // Process the line read.
    if (!$result) { echo "Error: query did not execute"; }
    ...
    while ($row = pg_fetch_row($result)) {   // Read SQL result.
        $tmp[$j][2][] = $row;
    }
    $j++;
}
fclose($fh);
The SQL file contains several queries, one per line, like this:
SELECT count(*) from table WHERE value=0 AND msgid='{$arg[1]}';
However, currently my variable is not being replaced by its contents, so although the query runs OK it returns zero rows. What do I need to do to get the expected result? (Note: each SQL line varies and the query parameters are not constant, hence the use of variables within the SQL file.)
OK. I have a solution (although it might not be the correct approach).
This works -- but it needs polish I think. Suggestions regarding a better regexp would be very much appreciated.
$bar    = 'VALUE-A'; // Can we replace simple variable names?
$arg[1] = 'VALUE-B'; // What about an array element, such as $arg[1]?

function interpolate($matches)
{
    global $bar;
    global $arg;
    if ($matches[2]) {
        $i = isset(${$matches[1]}[$matches[2]]) ? ${$matches[1]}[$matches[2]] : 'UNDEF';
    } else {
        $i = isset(${$matches[1]}) ? ${$matches[1]} : 'UNDEF';
    }
    return $i;
}

$fh = fopen('/home/www/file.sql', "r") or die("Failed.\n");
while (($line = fgets($fh)) !== false) {
    ...
    $line = preg_replace_callback('|\{\$([a-z]+)\[*(\d*)\]*}|i', "interpolate", $line);
    echo $line; // and continue with rest of code as above.
}
fclose($fh);
(Of course, the solution suggests that the question title is completely wrong. Is there any way to edit this?)
Did you use pg_escape_string()?
$arg[1] = pg_escape_string($arg[1]);
$line="SELECT count(*) from table WHERE value=0 AND msgid='{$arg[1]}';";
I have a query that selects everything from a database table and writes it to a text file. If the state is small (say, a maximum of 200k rows), the code works and the text file is written. The problem arises when I query a state that has 2M rows; there is also the fact that the table has 64 columns.
Here's a part of the code:
// Create and open the file (write, append)
$file  = "file2.txt";
$fOpen = fopen($file, "a");

$qry = "SELECT * FROM tbl_two WHERE STE='48'";
$res = mysqli_query($con, $qry);
if (!$res) {
    echo "No data record" . "<br/>";
    exit;
}

$num_res = mysqli_num_rows($res);

for ($i = 0; $i < $num_res; $i++) {
    $row = mysqli_fetch_assoc($res);
    $STATE = (trim($row['STATE']) === "" ? " " : $row['STATE']);
    $CTY   = (trim($row['CTY'])   === "" ? " " : $row['CTY']);
    $ST    = (trim($row['ST'])    === "" ? " " : $row['ST']);
    $BLK   = (trim($row['BLK'])   === "" ? " " : $row['BLK']);
    ....
    ....
    // 64th column
    $data = "$STATE$CTY$ST$BLK(to the 64th variable)\r\n";
    fwrite($fOpen, $data);
}
fclose($fOpen);
I tried putting a limit on the query:
$qry = "SELECT * FROM tbl_two WHERE STE='48' LIMIT 200000";
The problem is that it writes only up to the 200,000th line and never writes the remaining 1.8M lines.
If I don't put a limit on the query, it fails with the error Out of memory .... TIA for any kind suggestions.
First, you need to use an unbuffered query to fetch the data. From the manual:
Queries use buffered mode by default. This means that query results are immediately transferred from the MySQL server to PHP and then kept in the memory of the PHP process.
Unbuffered MySQL queries execute the query and then return a resource while the data is still waiting on the MySQL server to be fetched. This uses less memory on the PHP side, but can increase the load on the server. Unless the full result set was fetched from the server, no further queries can be sent over the same connection. Unbuffered queries can also be referred to as "use result".
NOTE: buffered queries should be used in cases where you expect only a limited result set or need to know the number of returned rows before reading all of them. Unbuffered mode should be used when you expect larger results.
Also, skip the intermediate variables and write each row directly inside the while loop:
$pdo = new PDO("mysql:host=localhost;dbname=world", 'my_user', 'my_pass');
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$uresult = $pdo->query("SELECT * FROM tbl_two WHERE STE='48'");   // no LIMIT needed: rows are streamed
if ($uresult) {
    $lineno = 0;
    while ($row = $uresult->fetch(PDO::FETCH_ASSOC)) {
        echo $row['STATE'] . PHP_EOL;
        // write the value to the text file here
        $lineno++;
    }
}
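Since the question uses mysqli rather than PDO, here is a sketch of the same unbuffered idea using MYSQLI_USE_RESULT. It reuses the question's $con connection and column names; note that mysqli_num_rows() is not available until every row has been fetched, so the loop just processes rows as they arrive.

$fOpen = fopen("file2.txt", "a");

$qry = "SELECT * FROM tbl_two WHERE STE='48'";
$res = mysqli_query($con, $qry, MYSQLI_USE_RESULT);   // stream rows instead of buffering them

if ($res) {
    while ($row = mysqli_fetch_assoc($res)) {
        $STATE = (trim($row['STATE']) === "" ? " " : $row['STATE']);
        // ...build the rest of the 64 columns the same way...
        fwrite($fOpen, "$STATE\r\n");
    }
    mysqli_free_result($res);
}
fclose($fOpen);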
I'm embarrassed because this should be a pretty simple task, but I can't figure out why it is not working. I'm using a tab-separated file to get the values I need to populate a MySQL table. I have two MySQL tables, clients and data. The clients table has an ID I need to fetch and use in the insert into the data table.
<?php
// MySQL settings
define('DB_SERVER',   'localhost');
define('DB_USERNAME', 'USER');
define('DB_PASSWORD', 'pass');
define('DB_DATABASE', 'DB');

// Connect to the DB
if ($db = mysqli_connect(DB_SERVER, DB_USERNAME, DB_PASSWORD, DB_DATABASE)) {
} else {
    echo 'Connection to DB failed';
    die();
}

// Load the tab-delimited file
$file   = "file.csv";                  // TSV actually
$handle = fopen($file, "r");           // Make all conditions to avoid errors
$read   = file_get_contents($file);    // read
$lines  = explode("\n", $read);        // get lines
$i = 0;                                // initialize

// Loop over the file, one line at a time
foreach ($lines as $key => $value) {
    $cols[$i] = explode("\t", $value);

    // Get the order ID for this URL
    // $cols[$i]['6'] stores website URLs that match `salesurl` in the clients table
    $getidsql = 'select `id` FROM DB.clients WHERE `salesurl` = \'' . $cols[$i]['6'] . '\'';
    if ($result = mysqli_query($db, $getidsql)) {
        $totalcnt = mysqli_num_rows($result);
        $idrow    = mysqli_fetch_array($result);
        echo '<h1>:' . $idrow['id'] . ': ' . $totalcnt . '</h1>'; // prints ':: 0'
    } else {
        echo mysqli_error($db);
        echo 'OOPS<hr>' . $getidsql . '<hr>';
    }

    // If $idrow['id'] actually had a value, then
    $insertqry = 'INSERT INTO `data` ......';
    $i++;
} // end foreach, file line loop
?>
The $getidsql query does work when copy-pasted into phpMyAdmin (I get the id result), but within this script mysqli_num_rows is ALWAYS zero and $idrow is never populated. No errors, just an empty result.
Turns out my code was working fine. The problem was with the data file I was working with: every character was followed by a non-ASCII byte, so all the values contained non-printable characters. Running this preg_replace on the data before using it in my query solved the problem.
$data[$c] = preg_replace('/[\x00-\x08\x0B\x0C\x0E-\x1F\x80-\x9F]/u', '', $data[$c]);
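For completeness, a sketch of where that cleanup would sit in the loop from the question (the $cols array and column index 6 come from the original code; running the replacement over every column is an assumption):

$cols[$i] = explode("\t", $value);

// Strip the non-printable bytes from every column before building the query.
foreach ($cols[$i] as $c => $v) {
    $cols[$i][$c] = preg_replace('/[\x00-\x08\x0B\x0C\x0E-\x1F\x80-\x9F]/u', '', $v);
}

$getidsql = 'select `id` FROM DB.clients WHERE `salesurl` = \'' . $cols[$i]['6'] . '\'';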
I have a large study with about 50 questions and 70,000 entries, so manually editing or using pivot tables just won't work; I need to load the data into a database. But I can't get the Japanese characters to be read with any accuracy using fgetcsv(). I've tried setting the locale to UTF-8 and SJIS, but neither one seems to read all of the Japanese characters. I read somewhere this might be a bug, but I don't know.
The data looks like this:
Q-004 必須回答 あなたは、以下のどちらにお住まいですか? S/A
1 北海道 Hokkaido
2 青森県 Aomori
3 岩手県 Iwate
4 宮城県 Miyagi
5 秋田県 Akita
Here is my code:
setlocale(LC_ALL, 'ja_JP.SJIS');

$fp = fopen($_POST["filename"], 'r') or die("can't open file");
$csv_line = fgetcsv($fp, 1024);

$query = "";
$count = 0;
$question = false;

while ($csv_line = fgetcsv($fp, 1024)) {
    if (!$question && strpos($csv_line[0], "Q-") !== false) {
        echo "Found a question: " . $csv_line[2] . "<br>";
        $question = true;
    } else if ($question && strlen($csv_line[0]) == 0) {
        echo "<hr>";
        $question = false;
    } else if ($question && intval($csv_line[0]) > 0) {
        echo $csv_line[0] . " has value " . $csv_line[2] . " - " . $csv_line[3] . "<br>";
    }
    $count++;
}
echo "$count records read successfully";
fclose($fp) or die("can't close file");
Here is the result:
Found a question: A以下のどちらにお住まいですか?
1 has value k海道 - Hokkaido
2 has value X県 - Aomori
3 has value - Iwate
4 has value {城県 - Miyagi
5 has value H田県 - Akita
When it comes to reading a CSV in PHP, I would say... don't do it, and use an SQL database instead, wherein you can set a collation such as ujis_japanese_ci in MySQL.
You should be able to easily import your CSV into a MySQL database using phpMyAdmin, if that is what you have, and then render the data from the MySQL database instead of reading a CSV file.
It is a work-around, granted, but my general experience is that CSV + foreign/special characters == problems.
I believe it is at least worth a try. Good luck.
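A sketch of the import step that answer describes, done from PHP instead of phpMyAdmin. The table name, column layout, credentials, and Shift-JIS encoding are all assumptions, and LOCAL INFILE has to be enabled on both the client and the server:

$con = mysqli_init();
mysqli_options($con, MYSQLI_OPT_LOCAL_INFILE, true);              // allow LOAD DATA LOCAL
mysqli_real_connect($con, 'localhost', 'user', 'pass', 'surveydb'); // hypothetical credentials

$sql = "LOAD DATA LOCAL INFILE '/path/to/survey.csv'
        INTO TABLE survey
        CHARACTER SET sjis
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\\r\\n'";

if (!mysqli_query($con, $sql)) {
    echo mysqli_error($con);
}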
I want to run about 50,000 INSERT queries against a MySQL database.
For this I have two options:
1- Directly import the (.sql) file:
The following error occurs:
" You probably tried to upload too large file. Please refer to documentation for ways to workaround this limit. "
2- Use PHP code to run these queries in chunks read from the (.sql) file.
Here is my code:
<?php
// Configure DB
include "config.php";

// Get file data
$file = file('country.txt');

// Set pointers & position variables
$position = 0;
$eof = 0;

while ($eof < sizeof($file))
{
    for ($i = $position; $i < ($position + 2); $i++)
    {
        if ($i < sizeof($file))
        {
            $flag = mysql_query($file[$i]);
            if (isset($flag))
            {
                echo "Insert Successfully<br />";
                $position++;
            }
            else
            {
                echo mysql_error() . "<br>\n";
            }
        }
        else
        {
            echo "<br />End of File";
            break;
        }
    }
    $eof++;
}
?>
But a memory-size error occurs even though I have raised the memory limit from 128M to 256M and even 512M.
I think that if I could load a limited number of rows from the (.sql) file, say 1,000 at a time, and execute them, it might be possible to import all the records from the file into the DB.
But I have no idea how to keep track of the position in the file from start to end, and how to update the start and end location so that it will not fetch the rows already processed from the .sql file.
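A sketch of one way the offset tracking described above could work: remember ftell() in a small state file between runs and fseek() back to it on the next run. The state-file name, batch size, and continued use of mysql_query() are assumptions:

<?php
include "config.php";

$batchSize = 1000;
$stateFile = 'country.offset';                 // assumption: stores the last byte offset
$offset    = file_exists($stateFile) ? (int) file_get_contents($stateFile) : 0;

$fh = fopen('country.txt', 'r');
fseek($fh, $offset);                           // resume where the previous run stopped

for ($i = 0; $i < $batchSize && ($line = fgets($fh)) !== false; $i++) {
    $line = trim($line);
    if ($line === '') continue;                // skip blank lines
    if (!mysql_query($line)) {
        echo mysql_error() . "<br>\n";
    }
}

file_put_contents($stateFile, ftell($fh));     // save the new offset for the next run
fclose($fh);
?>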
Here is the code you need, now prettified! =D
<?php
include('config.php');

$file = @fopen('country.txt', 'r');

if ($file)
{
    while (!feof($file))
    {
        $line = trim(fgets($file));
        if ($line === '')            // skip blank lines
        {
            continue;
        }
        $flag = mysql_query($line);
        if ($flag !== false)         // mysql_query() returns false on failure
        {
            echo 'Insert Successfully<br />';
        }
        else
        {
            echo mysql_error() . '<br/>';
        }
        flush();
    }
    fclose($file);
}
echo '<br />End of File';
?>
Basically it's a less greedy version of your code: instead of loading the whole file into memory, it reads and executes the SQL statements in small chunks, one line at a time.
Instead of loading the entire file into memory, which is what happens when using the file function, a possible solution would be to read it line by line, using a combination of fopen, fgets, and fclose -- the idea being to read only what you need, deal with the lines you have, and only then read the next couple of them.
Additionally, you might want to take a look at this answer: Best practice: Import mySQL file in PHP; split queries
There is no accepted answer yet, but some of the given answers might already help you...
Use the command line client; it is far more efficient and should easily handle 50K inserts:
mysql -u user -p db_name < dump.sql
I recently read about inserting lots of queries into a database too quickly. The article suggested using the sleep() (or usleep()) function to delay a few seconds between queries so as not to overload the MySQL server.
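A sketch of that idea, pausing briefly after every batch of inserts rather than after every single statement (the batch size and delay are assumptions):

<?php
include 'config.php';

$fh = fopen('country.txt', 'r');
$count = 0;

while (($line = fgets($fh)) !== false) {
    $line = trim($line);
    if ($line === '') continue;

    mysql_query($line) or print(mysql_error() . "<br>\n");

    if (++$count % 1000 === 0) {   // after every 1,000 inserts...
        usleep(500000);            // ...pause for half a second
    }
}
fclose($fh);
?>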