Importing CSV file into SQL using PHP not working - php

I am trying to import a CSV file into my SQL database. This is what I have:
if ($_FILES['csvFile']['size'] > 0)
{
    $file = $_FILES['csvFile']['tmp_name'];
    $handle = fopen($file, "r");
    do {
        if ($data[0])
        {
            $insert_query = "REPLACE INTO `teacherNames` SET
                `schoolName` = '".addslashes($schoolname)."',
                `teacherName` = '".addslashes($data[0])."'
                ;";
            $result = mysql_query($insert_query);
            echo $insert_query; // SEE RESULTING QUERY BELOW
            echo $data[0]." added\n<br />";
        }
    } while ($data = fgetcsv($handle, 1000, ",", "'"));
}
The CSV file has 3 records and looks correct. The procedure works to an extent, but for some reason it is not reading the CSV file correctly, and the resulting query is:
REPLACE INTO `teacherNames` SET `schoolName` = 'Brooks', `teacherName` = 'RMG JMC PMC';
I would expect to get 3 separate queries, one for each record, but it does not seem to be reading the CSV file as 3 separate records, only as 1. Can anyone see why?
UPDATE:
The CSV contents are:
RMG
JMC
PMC

The answer from Julio Martins is better if you have the file on the same computer as the MySQL server.
But if you need to read the file from inside PHP, there is a note on PHP.net at http://php.net/manual/en/function.fgetcsv.php :
Note: If PHP is not properly recognizing the line endings when reading
files either on or created by a Macintosh computer, enabling the
auto_detect_line_endings run-time configuration option may help
resolve the problem.
What are the line endings in your file? Since all lines are being read as one, that is likely your problem.
To turn auto_detect_line_endings on, use ini_set("auto_detect_line_endings", true); as Pistachio notes at http://php.net/manual/en/filesystem.configuration.php#107333
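A minimal sketch of how that fits together with the question's code (the file still comes from the upload, as before):

ini_set("auto_detect_line_endings", true); // recognize \r (old Mac) line endings too

$handle = fopen($_FILES['csvFile']['tmp_name'], "r");
while (($data = fgetcsv($handle, 1000, ",", "'")) !== false) {
    echo $data[0], "<br />"; // each record now arrives as its own row
}
fclose($handle);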

Use while instead of do-while; with a do-while, the loop body runs once before fgetcsv() has ever been called, so $data is undefined on the first pass:
while ($data = fgetcsv($handle,1000,",","'")) {
//...
}

Try LOAD DATA:
LOAD DATA INFILE '{$filepath}'
INTO TABLE `{$table}`
FIELDS TERMINATED BY ','
It is cleaner. Note that the non-LOCAL form requires the file to be readable on the MySQL server host, plus the FILE privilege.

Related

limitation with csv bulk insert in mysql

I have a CSV file upload feature. It works when the CSV has up to roughly 30k rows, but whenever the file has more than 30k rows the bulk insert fails. Below is my code for reading the CSV and inserting into the table.
$csvfile = fopen($file, 'r');
$i = 0;
$data4 = "";
while (!feof($csvfile))
{
    $csv_data[] = fgets($csvfile, 1024);
    $csv_array = explode(";", $csv_data[$i]);
    $data4 .= "('".$csv_array[2]."', '".$csv_array[4]."', '".$csv_array[0]."','".$csv_array[1]."'),";
    $i++;
}
fclose($csvfile);
$data4 = substr($data4,0,-1);
$sql = "INSERT INTO csv_table(`column1`,`column2`,`column3`,`column4`) VALUES $data4";
mysqli_query($mysqliConn,$sql);
The issue only appears when there are more than 30k records. Please suggest changes here.
Thanks in advance
Pro tip: "Not working," of course, can mean anything from "my server caught fire and my data center burned to the ground," to "all my values were changed to 42," to "the operation had no effect." Understand your errors. Check the errors that come back from operations like mysqli_query().
That being said...
You're slurping up your entire CSV file's contents and jamming it into a single text string. It's likely that method falls over when the csv file is too long.
There's a limit on the length of a MySQL query. It's large, but not infinite: it's governed by the max_allowed_packet server setting. Read this: https://dev.mysql.com/doc/refman/5.7/en/packet-too-large.html
PHP can run out of memory as well.
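To see the packet limit your MySQL server actually enforces, you can run:
SHOW VARIABLES LIKE 'max_allowed_packet';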
How to fix? Process your CSV file not all at once, but in chunks of fifty rows or so. Once you've read fifty rows, do an insert.
Pro tip 2: Sanitize your input data. What will happen to your table if somebody puts a row like this in an uploaded CSV file?
"bwahahahaha!'; -- DROP TABLE csv_table;", "something", "somethingelse"
You may be OK. But do you want to run a risk like this?
Be aware that the public net is crawling with cybercriminals, and somebody will detect and exploit this kind of vulnerability in days if you leave it running.
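Putting both tips together, here is a minimal sketch under the question's table and column names: it reads the CSV with fgetcsv() (semicolon-delimited, as in the question), flushes a parameterized multi-row INSERT every 50 rows, and lets the prepared statement handle quoting. The 50-row chunk size is arbitrary, and passing a parameter array to execute() needs PHP 8.1+:

$chunk = [];
$csvfile = fopen($file, 'r');
while (($row = fgetcsv($csvfile, 1024, ';')) !== false) {
    $chunk[] = [$row[2], $row[4], $row[0], $row[1]];
    if (count($chunk) === 50) {
        insert_chunk($mysqliConn, $chunk); // flush every 50 rows
        $chunk = [];
    }
}
if ($chunk) {
    insert_chunk($mysqliConn, $chunk);     // flush the remainder
}
fclose($csvfile);

function insert_chunk(mysqli $conn, array $rows) {
    // one parameterized multi-row INSERT: VALUES (?,?,?,?),(?,?,?,?),...
    $placeholders = implode(',', array_fill(0, count($rows), '(?,?,?,?)'));
    $stmt = $conn->prepare("INSERT INTO csv_table (`column1`,`column2`,`column3`,`column4`) VALUES $placeholders");
    $stmt->execute(array_merge(...$rows)) or die($conn->error);
}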

Retrieve Data From Txt File And Save It in Database

I have data in a txt file that I want to save in a database.
mytxtfile
xxxxxxxxx
xxxxxxxxx
xxxxxxxxx
I want to save each line in the database...
I want to make a PHP file which takes the 1st line from the txt file and saves it in the DB, then the second line, and so on...
myphpcode
<?php
$myFile = "data.txt";
$fh = fopen($myFile, 'r');
$theData = fgets($fh);
$con = mysqli_connect("example.com","peter","abc123","my_db");
// Check connection
if (mysqli_connect_errno())
{
    echo "Failed to connect to MySQL: " . mysqli_connect_error();
}
mysqli_query($con, "INSERT INTO Persons (ID)
VALUES ('".$theData."')");
mysqli_close($con);
?>
This code is only saving the 1st line.
I'll answer your question first. Your code is fine in its present condition, but the problem is that only one line is being read. If you want to read the complete file content, you'll have to loop through each line. Change your code to include a while loop, like so:
$fh = fopen('file.txt', 'r');
while(!feof($fh)){
$theData = fgets($fh);
echo $theData; //save it to DB
}
feof() checks for end-of-file on a file pointer. If EOF (end of file) is reached, it'll exit the loop.
Alternative solution:
Using LOAD DATA INFILE:
Example:
LOAD DATA INFILE 'data.txt' INTO TABLE Persons
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
The advantage of using LOAD DATA INFILE is that it's considerably faster than other methods, but the speed may vary depending on other factors.
You should try something like:
while (!feof($fh)) {
$line = fgets($fh);
mysqli_query($con,"INSERT INTO Persons (ID)
VALUES ('".$line."')");
}
The loop goes through each line of the file and then inserts it into the database.
It is not the most efficient way, especially if you have a very large number of lines, but it does the trick. If you do have very many lines, then you should try to do everything in a single INSERT.
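A sketch of that single-INSERT variant, collecting the values first (each line is escaped with mysqli_real_escape_string() since it is embedded directly in the SQL):

$values = [];
while (!feof($fh)) {
    $line = trim(fgets($fh));
    if ($line !== '') {
        $values[] = "('" . mysqli_real_escape_string($con, $line) . "')";
    }
}
if ($values) {
    // one multi-row INSERT instead of one query per line
    mysqli_query($con, "INSERT INTO Persons (ID) VALUES " . implode(',', $values));
}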
it's as simple as:
$ip = fopen('data.txt', "r");
$line = trim(fgets($ip));
while(!feof($ip)){
// do something with line in $line
//....
// reading next line
$line = trim(fgets($ip));
}
Two possibilities for this question:
The first is to parse the file using SplFileObject: http://php.net/manual/fr/class.splfileobject.php
In my mind, using the SPL objects is always a good way of coding in PHP.
The second consists in using the LOAD DATA INFILE functionality of MySQL: http://dev.mysql.com/doc/refman/5.6/en/load-data.html
This is a good way if you don't need to use the data from the file but just store it.
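A minimal sketch of the SplFileObject option (the file name is the one from the question; the insert itself is left as a placeholder):

$file = new SplFileObject('data.txt');
$file->setFlags(SplFileObject::READ_AHEAD | SplFileObject::SKIP_EMPTY | SplFileObject::DROP_NEW_LINE);
foreach ($file as $line) {
    // insert $line into the database here, e.g. with a prepared statement
    echo $line, "<br />";
}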

export php to excel with separate columns

I am trying to export my database to .csv. The export completes, but it puts all the fields in one column (without separating them).
the code :
$sql = "SELECT ARP_name ,Student_name ,institute ,id ,Major from istyle ";
$results=mysql_query($sql);
$filename = "uploaded/".time().".csv";
$handle = fopen($filename, 'w+');
fputcsv($handle, array_keys("ARP_name","Student_name","institute"));
while($row = mysql_fetch_assoc($results))
{
fputcsv($handle, array($row["ARP_name"], $row["Student_name"],$row["institute"]));
}
The result is: (screenshot of the spreadsheet, with all fields crammed into a single column)
Since your CSV export looks perfectly fine at first glance, I assume you mean that the spreadsheet you are importing the data into puts everything into one single column?
There are various settings you can adjust to describe the CSV format details when importing data into spreadsheet applications. Check the application's preferences dialog. The OpenOffice and LibreOffice Calc applications come with a great wizard for this.
For separate columns use '\t' as the delimiter; for separate rows use '\n'.
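If you do want tab-separated output, fputcsv() takes the delimiter as its third parameter; a small sketch against the question's loop (the output file name is arbitrary):

$handle = fopen("uploaded/export.tsv", 'w+');
fputcsv($handle, array("ARP_name","Student_name","institute"), "\t");
while($row = mysql_fetch_assoc($results))
{
    fputcsv($handle, array($row["ARP_name"], $row["Student_name"], $row["institute"]), "\t");
}
fclose($handle);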

csv data import into mysql database using php

Hi, I need to import a CSV file of 15000 lines.
I'm using the fgetcsv function and parsing each and every line,
but I get a timeout error every time.
The process is too slow and the data is only partially imported.
Is there any way to make the data import faster and more efficient?
if(isset($_POST['submit']))
{
    $fname = $_FILES['sel_file']['name'];
    $var = 'Invalid File';
    $chk_ext = explode(".", $fname);
    if(strtolower($chk_ext[1]) == "csv")
    {
        $filename = $_FILES['sel_file']['tmp_name'];
        $handle = fopen($filename, "r");
        $res = mysql_query("SELECT * FROM vpireport");
        $rows = mysql_num_rows($res);
        if($rows >= 0)
        {
            mysql_query("DELETE FROM vpireport") or die(mysql_error());
            for($i = 1; ($data = fgetcsv($handle, 10000, ",")) !== FALSE; $i++)
            {
                if($i == 1)
                    continue;
                $sql = "INSERT into vpireport
                        (item_code,
                        company_id,
                        purchase,
                        purchase_value)
                        values
                        (".$data[0].",
                        ".$data[1].",
                        ".$data[2].",
                        ".$data[3].")";
                //echo "$sql";
                mysql_query($sql) or die(mysql_error());
            }
        }
        fclose($handle);
        ?>
        <script language="javascript">
        alert("Successfully Imported!");
        </script>
        <?php
    }
}
The problem is that every time it gets stuck partway through the import process and displays the following errors:
Error 1:
Fatal Error: Maximum time limit of 30 seconds exceeded at line 175.
Error 2:
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'S',0,0)' at line 1
This error I am not able to pin down...
The file is only ever partially imported: around 200-300 lines out of 10000.
You can build a batch insert string for every 500 lines of CSV and execute it all at once, rather than running a MySQL insert on each line. It'll be faster.
Another solution is to read the file with an offset:
1. Read the first 500 lines,
2. Insert them into the database,
3. Redirect to csvimporter.php?offset=500,
4. Return to step 1, this time reading the 500 lines starting at offset 500.
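A rough sketch of that offset scheme (the script name comes from the answer; stashing the uploaded file's path in the session is an assumption, and the insert itself is left as a placeholder):

$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$handle = fopen($_SESSION['csv_path'], "r"); // path saved when the file was uploaded
for ($i = 0; $i < $offset; $i++) {
    fgetcsv($handle, 10000, ","); // skip the lines already imported
}
$read = 0;
while ($read < 500 && ($data = fgetcsv($handle, 10000, ",")) !== FALSE) {
    // ... insert $data into the table here ...
    $read++;
}
fclose($handle);
if ($read == 500) { // a full batch, so there may be more lines left
    header("Location: csvimporter.php?offset=" . ($offset + 500));
    exit;
}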
Another solution would be setting the timeout limit to 0 with:
set_time_limit(0);
Set this at the top of the page:
set_time_limit(0);
It will make the page run endlessly. That is not recommended, but if you have no other option it can't be helped!
You can consult the documentation here.
To make it faster, you need to check the various SQL statements you are sending and see if you have proper indexes created.
If you are calling user-defined functions that refer to global variables, you can reduce the time taken even further by passing those variables to the function and changing the code so the function uses the passed copies. Referring to global variables is slower than using local variables.
You can make use of LOAD DATA INFILE, which is a MySQL facility and much faster than fgetcsv.
More information is available at
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Simply use this at the beginning of your PHP import page:
ini_set('max_execution_time', 0);
PROBLEM:
There is a huge performance impact in the way you INSERT data into your table. For every one of your records you send an INSERT request to the server: 15000 INSERT requests is huge!
SOLUTION:
You should group your data the way mysqldump does. In your case you need just three INSERT statements, not 15000, as below:
Before the loop, write:
$q = "INSERT into vpireport(item_code,company_id,purchase,purchase_value) values ";
And inside the loop, concatenate the records onto the query as below:
$q .= "($data[0],$data[1],$data[2],$data[3]),";
Also inside the loop, check whether the counter has reached 5000, 10000, or 15000; if so, trim the trailing comma, insert the batch into the vpireport table, and reset $q to the INSERT INTO... prefix again.
Run the query and enjoy!!!
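A sketch of that batching loop, keeping the question's (now deprecated) mysql_* API for consistency with the original code; the batch size of 5000 follows the answer, and note that string values would still need quoting and escaping:

$prefix = "INSERT INTO vpireport (item_code,company_id,purchase,purchase_value) VALUES ";
$q = $prefix;
$count = 0;
while (($data = fgetcsv($handle, 10000, ",")) !== FALSE) {
    $q .= "($data[0],$data[1],$data[2],$data[3]),";
    if (++$count % 5000 == 0) {
        mysql_query(rtrim($q, ",")) or die(mysql_error()); // flush one batch of 5000
        $q = $prefix;
    }
}
if ($count % 5000 != 0) {
    mysql_query(rtrim($q, ",")) or die(mysql_error()); // flush the remainder
}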
If this is a one-time exercise, PHPMyAdmin supports Import via CSV.
import-a-csv-file-to-mysql-via-phpmyadmin
He also notes the option of leveraging MySQL's LOAD DATA LOCAL INFILE. This is a very fast way to import data into a database table. load-data MySQL docs link
EDIT:
Here is some pseudo-code:
// perform the file upload
$absolute_file_location = upload_file();
// connect to your MySQL database as you would normally
your_mysql_connection();
// execute the query
$query = "LOAD DATA LOCAL INFILE '" . $absolute_file_location .
"' INTO TABLE `table_name`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(column1, column2, column3, etc)";
$result = mysql_query($query);
Obviously, you need to ensure good SQL practices to prevent injection, etc.

How can I load an .xls file to a mysql database in php?

I want the exact opposite of this.
I want to load a .xls file full of emails into a database column.
What's the simplest way to do this?
Like Ponies said, you will have to first "Save As > CSV", then upload and go from there. Often permissions will not allow that SQL statement, however. My personal favorite is using csv2sql.com, because it does the heavy lifting and gives you the SQL to just execute; however, you could use fgetcsv, which is more secure and something you can implement in a script.
With fgetcsv, you could do something like this assuming you just had a one column file with the email addresses listed:
$row = 1;
if (($handle = fopen("test.csv", "r")) !== FALSE) {
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
$num = count($data);
//echo "<p> $num fields in line $row: <br /></p>\n";
$row++;
for ($c=0; $c < $num; $c++) {
if ($c==0){
//This is the first column
//Just run a generic MySQL insert command here (see below for link for more info on this)
}
}
}
fclose($handle);
}
There's a tutorial on PHP MySQL INSERT at W3Schools. That should get you going on this endeavor.
1. Export the Excel spreadsheet to CSV format.
2. Use MySQL's LOAD DATA INFILE syntax to load the CSV file created in Step 1:
LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
The file will have to be locally accessible to the MySQL instance, so you'll have to upload it if the server isn't physically accessible.
You can use PHPExcel, which can read most Excel file formats (BIFF5, BIFF8, spreadsheetML). Unfortunately the project's docs are in a word file, so can't link directly to the specific section, but it's section 6.4.1 / page 37 of this file.
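A rough sketch of the PHPExcel route, assuming the emails sit in column A of the first worksheet (the file name, include path, and insert step are hypothetical):

require 'PHPExcel/IOFactory.php';

$excel = PHPExcel_IOFactory::load('emails.xls'); // the reader detects BIFF5/BIFF8 automatically
$sheet = $excel->getActiveSheet();
foreach ($sheet->getRowIterator() as $row) {
    $email = $sheet->getCell('A' . $row->getRowIndex())->getValue();
    if ($email) {
        // insert $email into your database column here
        echo $email, "<br />";
    }
}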
