Retrieve Data From Txt File And Save It in Database - PHP

I have data in a txt file that I want to save to a database.
mytxtfile
xxxxxxxxx
xxxxxxxxx
xxxxxxxxx
I want to save each line in the database...
I want to make a PHP script which takes the 1st line from the txt file and saves it in the DB, then the second line, and so on....
myphpcode
<?php
$myFile = "data.txt";
$fh = fopen($myFile, 'r');
$theData = fgets($fh);

$con = mysqli_connect("example.com", "peter", "abc123", "my_db");
// Check connection
if (mysqli_connect_errno()) {
    echo "Failed to connect to MySQL: " . mysqli_connect_error();
}

mysqli_query($con, "INSERT INTO Persons (ID) VALUES ('" . $theData . "')");
mysqli_close($con);
?>
This code only saves the 1st line.

I'll answer your question first. Your code is fine in its present condition, but the problem is that only one line is being read. If you want to read the complete file content, you'll have to loop through each line. Change your code to include a while loop, like so:
$fh = fopen('file.txt', 'r');
while (!feof($fh)) {
    $theData = fgets($fh);
    echo $theData; // save it to the DB here
}
feof() checks for end-of-file on a file pointer. If EOF (end of file) is reached, it'll exit the loop.
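As a side note, a slightly more defensive variant of the same loop checks the return value of fgets() directly, so the loop body never runs with a false value on the final iteration (a minimal sketch of the same idea):
$fh = fopen('file.txt', 'r');
while (($theData = fgets($fh)) !== false) {
    echo $theData; // save it to the DB here
}
fclose($fh);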
Alternative solution:
Using LOAD DATA INFILE:
Example:
LOAD DATA INFILE 'data.txt' INTO TABLE Persons
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
The advantage of using LOAD DATA INFILE is that it's considerably faster than other methods, but the speed may vary depending on other factors.
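If you would rather trigger the import from PHP than from the MySQL client, here is a minimal sketch using mysqli and LOAD DATA LOCAL INFILE. It assumes the local_infile option is enabled on the server, and it reuses the connection details and the Persons (ID) table from the question; adjust the line terminator to match your file:
$con = mysqli_init();
// LOCAL INFILE must be explicitly allowed on the client side
mysqli_options($con, MYSQLI_OPT_LOCAL_INFILE, true);
mysqli_real_connect($con, "example.com", "peter", "abc123", "my_db");

$sql = "LOAD DATA LOCAL INFILE 'data.txt'
        INTO TABLE Persons
        LINES TERMINATED BY '\\n'
        (ID)";

if (!mysqli_query($con, $sql)) {
    echo "Import failed: " . mysqli_error($con);
}
mysqli_close($con);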

You should try something like:
while (!feof($fh)) {
    $line = fgets($fh);
    mysqli_query($con, "INSERT INTO Persons (ID) VALUES ('" . $line . "')");
}
The loop goes through each line of the file and then inserts it into the database.
It is not the most efficient way, especially if you have a very large number of lines, but it does the trick. If you do have very many lines, then you should try to do everything in a single INSERT.
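For completeness, here is a rough sketch of that single-INSERT idea, assuming the same data.txt file, the Persons (ID) column from the question, and an already opened mysqli connection in $con; the values are escaped before being concatenated:
$values = array();
$fh = fopen('data.txt', 'r');
while (($line = fgets($fh)) !== false) {
    $line = trim($line);
    if ($line === '') {
        continue; // skip blank lines
    }
    $values[] = "('" . mysqli_real_escape_string($con, $line) . "')";
}
fclose($fh);

// one multi-row INSERT instead of one query per line
if (!empty($values)) {
    mysqli_query($con, "INSERT INTO Persons (ID) VALUES " . implode(',', $values));
}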

it's as simple as:
$ip = fopen('data.txt', "r");
$line = trim(fgets($ip));
while (!feof($ip)) {
    // do something with the line in $line
    // ...
    // read the next line
    $line = trim(fgets($ip));
}

There are two possibilities for this question:
The first is simply to parse the file using SplFileObject: http://php.net/manual/fr/class.splfileobject.php
In my opinion, using the SPL objects is always a good way of coding in PHP.
The second consists of using MySQL's LOAD DATA INFILE functionality: http://dev.mysql.com/doc/refman/5.6/en/load-data.html
This is a good way if you don't need to use the data from the file but just want to store it.
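For reference, a minimal sketch of the SplFileObject approach, assuming the data.txt file and Persons (ID) table from the question and an open mysqli connection in $con:
$file = new SplFileObject('data.txt');
// drop trailing newlines and skip empty lines while iterating
$file->setFlags(SplFileObject::READ_AHEAD | SplFileObject::DROP_NEW_LINE | SplFileObject::SKIP_EMPTY);

foreach ($file as $line) {
    $value = mysqli_real_escape_string($con, $line);
    mysqli_query($con, "INSERT INTO Persons (ID) VALUES ('" . $value . "')");
}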

Related

In PHP, read a text file and insert its data into MySQL line by line, but only one row is read instead of the complete file

I have a problem. I need some PHP code that reads a txt file line by line and inserts all the data into a MySQL database, but this code reads all the data and inserts only a single row. Can anyone help me solve this problem?
Note: I need code that reads the text file line by line and inserts all the data into the MySQL database. I am inserting data from a large txt file, so please, can anyone help me?
$file = fopen("members.txt", "r");
while (!feof($file)) {
$line_of_text= fgets($file);
$mem .= explode("\n", $line_of_text);
}
list($cname,$std,$longtext)=$mem;
$t=mysqli_real_escape_string($con,$longtext);
$sql=mysqli_query($con,"INSERT into test(class,student,info) values('$cname','$std','$t')");
fclose($file);
print_r($mem);
Put your INSERT inside your loop, otherwise you will only insert the last row in the loop.
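A rough sketch of what that could look like, assuming each line of members.txt holds the class, student, and info fields separated by commas (that layout is a guess, so adjust the delimiter to your actual file) and that $con is an open mysqli connection:
$file = fopen("members.txt", "r");
while (($line_of_text = fgets($file)) !== false) {
    $line_of_text = trim($line_of_text);
    if ($line_of_text === '') {
        continue; // skip blank lines
    }

    // hypothetical layout: class, student, info separated by commas
    list($cname, $std, $longtext) = array_pad(explode(',', $line_of_text, 3), 3, '');

    $cname    = mysqli_real_escape_string($con, $cname);
    $std      = mysqli_real_escape_string($con, $std);
    $longtext = mysqli_real_escape_string($con, $longtext);

    // the INSERT now runs once per line, inside the loop
    mysqli_query($con, "INSERT INTO test(class, student, info) VALUES ('$cname', '$std', '$longtext')");
}
fclose($file);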

Importing CSV file into SQL using PHP not working

I am trying to import a CSV file into my SQL database. This is what I have:
if ($_FILES['csvFile']['size'] > 0)
{
    $file = $_FILES['csvFile']['tmp_name'];
    $handle = fopen($file, "r");
    do {
        if ($data[0])
        {
            $insert_query = "REPLACE INTO `teacherNames` SET
                `schoolName` = '" . addslashes($schoolname) . "',
                `teacherName` = '" . addslashes($data[0]) . "'
            ;";
            $result = mysql_query($insert_query);
            echo $insert_query; // SEE RESULTING QUERY BELOW
            echo $data[0] . " added\n<br />";
        }
    } while ($data = fgetcsv($handle, 1000, ",", "'"));
}
The CSV file has 3 records and it looks correct. The procedure works to an extent but for some reason it is not reading the CSV file correctly and the resulting query is like this:
REPLACE INTO `teacherNames` SET `schoolName` = 'Brooks', `teacherName` = 'RMG JMC PMC';
I would expect to get 3 separate queries, one for each record. It does not seem to be reading the CSV file as 3 separate records but as 1. Can anyone see why?
UPDATE:
The CSV contents are:
RMG
JMC
PMC
The answer from Julio Martins is better if you have the file on the same computer as the MySQL server.
But if you need to read the file from inside PHP, there is a note on PHP.net at http://php.net/manual/en/function.fgetcsv.php :
Note: If PHP is not properly recognizing the line endings when reading
files either on or created by a Macintosh computer, enabling the
auto_detect_line_endings run-time configuration option may help
resolve the problem.
What are the line endings in your file? Since all the lines are being read as one, I guess that could be your case.
To turn auto_detect_line_endings on, use ini_set("auto_detect_line_endings", true); as Pistachio notes at http://php.net/manual/en/filesystem.configuration.php#107333
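A minimal sketch of how that fits into the question's upload handler (keeping fgetcsv and the same field layout; note that this ini setting is deprecated as of PHP 8.1):
// enable line-ending detection before opening the uploaded file
ini_set("auto_detect_line_endings", true);

$handle = fopen($_FILES['csvFile']['tmp_name'], "r");
while (($data = fgetcsv($handle, 1000, ",", "'")) !== false) {
    if (!empty($data[0])) {
        echo $data[0] . " read\n<br />"; // build and run the REPLACE query here
    }
}
fclose($handle);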
Use while instead of do-while:
while ($data = fgetcsv($handle,1000,",","'")) {
//...
}
Try LOAD DATA:
LOAD DATA INFILE '{$filepath}'
INTO TABLE `{$table}`
FIELDS TERMINATED BY ','
It is cleaner.

Importing CSV with odd rows into MySQL

I'm faced with a problematic CSV file that I have to import to MySQL.
Either through the use of PHP and then insert commands, or straight through MySQL's load data infile.
I have attached a partial screenshot of how the data within the file looks:
The values I need to insert are below "ACC1000" so I have to start at line 5 and make my way through the file of about 5500 lines.
It's not possible to skip to each next line because for some Accounts there are multiple payments as shown below.
I have been trying to get to the next row by scanning the rows for the occurrence of "ACC"
if (strpos($data[$c], 'ACC') !== FALSE) {
    echo "Yep ";
} else {
    echo "Nope ";
}
I know it's crude, but I really don't know where to start.
If you have a (foreign key) constraint defined in your target table such that records with a blank value in the type column will be rejected, you could use MySQL's LOAD DATA INFILE to read the first column into a user variable (which is carried forward into subsequent records) and apply its IGNORE keyword to skip those "records" that fail the FK constraint:
LOAD DATA INFILE '/path/to/file.csv'
IGNORE
INTO TABLE my_table
CHARACTER SET utf8
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 4 LINES
(#a, type, date, terms, due_date, class, aging, balance)
SET account_no = #account_no := IF(#a='', #account_no, #a)
There are several approaches you could take.
1) You could go with @Jorge Campos' suggestion and read the file line by line, using PHP code to skip the lines you don't need and insert the ones you want into MySQL. A potential disadvantage of this approach with a very large file is that you will either have to run a bunch of little queries or build up a larger one, and it could take some time to run.
2) You could process the file and remove any rows/columns that you don't need, leaving the file in a format that can be inserted directly into MySQL via the command line or whatever (a rough sketch of this is below).
Based on which approach you decide to take, either I or the community can provide code samples if you need them.
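For instance, a rough sketch of option 2, rewriting the file so every data row carries its account number before it is loaded (the file paths are made up, and the layout assumes the account number is in the first column with 4 header lines above the data):
$in  = fopen('/path/to/original.csv', 'r');
$out = fopen('/path/to/clean.csv', 'w');

$rowNumber = 0;
$currentAccount = '';
while (($row = fgetcsv($in)) !== false) {
    $rowNumber++;
    if ($rowNumber <= 4) {
        continue; // skip the header block above the data
    }
    if (!empty($row[0]) && strpos($row[0], 'ACC') === 0) {
        $currentAccount = $row[0]; // a new account starts here
    } else {
        $row[0] = $currentAccount; // payment row: reuse the last account number
    }
    fputcsv($out, $row);
}
fclose($in);
fclose($out);
The cleaned file can then go straight into LOAD DATA INFILE or any GUI import.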
This snippet should get you going in the right direction:
$file = '/path/to/something.csv';
if ( ! $fh = fopen($file, 'r') ) { die('bad file'); }
if ( ! $headers = fgetcsv($fh) ) { die('bad data'); }
while ($line = fgetcsv($fh)) {
    echo var_export($line, true) . "\n";
    if (preg_match('/^ACC/', $line[0])) { echo "record begin\n"; }
}
fclose($fh);
http://php.net/manual/en/function.fgetcsv.php

Break down a large number of rows in a CSV and process them batch-wise in PHP

I am about to edit and upload a CSV with more than 50000 records (shopping cart products), and it should update a number of tables in the system. I am using Zend Framework with my shopping cart.
I am planning to break them down (50000 CSV records) in memory before processing them batch-wise using PHP/MySQL.
Can anyone give me advice on this?
What I have so far is:
public function getDataFromPath($path = null) {
    if ($path == null) {
        $path = $this->_path;
    }
    ini_set("auto_detect_line_endings", 1);
    $fp = fopen($path, "r");
    while (($line = fgetcsv($fp, 5000, ",")) !== FALSE) {
        $line = self::trimArray($line);
        $this->data[] = $line;
    }
    fclose($fp);
    return $this->data;
}
regards
roshan
No RDBMS out there should have problems with 50,000 rows. That's nothing. There's no need to process them batch-wise.
Just use the LOAD DATA INFILE command and you will be fine.
For an example, see here: LOAD DATA INFILE easily convert YYYYMMDD to YYYY-MM-DD?
UPDATE (because of the comment on Ion Wood's answer): To create a CSV file you can use the SELECT ... INTO OUTFILE command.
SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
For more info, see the manual.
This is a job for...
DATABASE MAN!!!
Load your CSV directly into a table using the LOAD DATA INFILE business and do whatever magic you need to after that.

PHP MySQL query stops after 1273459 loops

I am using the script:
$file = fopen('part1.csv', 'r');
mysql_connect('localhost', '~~~', '~~~') or die(mysql_error());
mysql_select_db('Stubby') or die(mysql_error());
while (($buffer = fgets($file, 4096)) !== false) {
    //echo $buffer;
    $q = mysql_query('INSERT INTO allCombos (combo) VALUES (\'' . $buffer . '\')') or die(mysql_error());
}
fclose($file);
to load the very long contents of a CSV into a database. The CSV has around 3.5M lines. The queries stop at 1273459 lines. Why?
PHP generally sets its default script execution time limit to 30 seconds; you're probably hitting that limit. You can manually override it:
set_time_limit(0); //sets the time limit to infinity
set_time_limit(600); //sets the time limit to 10 minutes
Another possibility is that your script has run out of memory. You can raise it by doing something like:
ini_set('memory_limit', '32M'); //raises limit to 32 megabytes
Does it just stop, or is an error shown? On the other hand, you seem to just be copying a file into the database, so why not use the LOAD DATA INFILE command? Then you don't need a loop, and maybe not even a PHP application; that command can load a CSV file into the table (the fastest way to do it).
In your case you can execute the following command to import it:
LOAD DATA INFILE 'part1.csv' INTO TABLE allCombos
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
(just copied from the MySQL LOAD DATA description page with your parameters set). Btw, are there any other fields, or does the CSV file have everything that should be in that table?
I'm guessing you're running out of memory or exceeding the processing time allowed. Why not chunk it into smaller pieces and do it in groups of 500,000 or something? Or adjust the timeout.
http://php.net/manual/en/function.set-time-limit.php
Increasing the memory limit is something you would have to do through the php.ini file.
http://php.net/manual/en/ini.core.php
Build one big INSERT and hit the database once:
$q = 'INSERT INTO allCombos (combo) VALUES ';
while (($buffer = fgets($file, 4096)) !== false) {
    //echo $buffer;
    $q .= ' (\'' . $buffer . '\'), ';
}
$q = substr($q, 0, -2); // remove the trailing comma
mysql_query($q);
Playing with the time limits will also help; however, this will make more effective use of resources...
The most effective option is using LOAD DATA INFILE, from Maxym's answer.
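One caveat with a single giant INSERT for 3.5M rows is MySQL's max_allowed_packet limit, so a middle ground is to flush the batch every few thousand rows. A rough sketch using the same mysql_* calls as above (the batch size of 5000 is an arbitrary assumption; tune it to your server):
$file = fopen('part1.csv', 'r');
$values = array();
$batchSize = 5000; // arbitrary; keep each INSERT below max_allowed_packet

while (($buffer = fgets($file, 4096)) !== false) {
    $values[] = "('" . mysql_real_escape_string(trim($buffer)) . "')";
    if (count($values) >= $batchSize) {
        mysql_query('INSERT INTO allCombos (combo) VALUES ' . implode(',', $values)) or die(mysql_error());
        $values = array(); // start the next batch
    }
}
// flush whatever is left over
if (!empty($values)) {
    mysql_query('INSERT INTO allCombos (combo) VALUES ' . implode(',', $values)) or die(mysql_error());
}
fclose($file);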
