Automatically upload a CSV file to a database from a specified folder - PHP

Is it possible to upload data from a CSV file to my database automatically? I am building a PHP project in which the database needs to be updated from files placed in a specified folder. Kindly help!

This can be achieved with plain SQL on the server.
Take a look at the LOAD DATA INFILE statement: https://dev.mysql.com/doc/refman/5.7/en/load-data.html
LOAD DATA INFILE '/tmp/test.txt' INTO TABLE test
FIELDS TERMINATED BY ','

Yes, it's easy to implement. You have to create one PHP script that does the following:
get all the files from the specific folder using scandir()
parse the CSV files using fgetcsv()
write the database connection code and an INSERT query to insert the CSV records
set the PHP script up in a cron scheduler so it takes your CSV entries, formats them into INSERT SQL statements, and fires the query (a sketch follows the links below)
Useful links:
http://php.net/manual/en/function.scandir.php
How to parse a CSV file using PHP
http://coyotelab.org/php/upload-csv-and-insert-into-database-using-phpmysql.html
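For illustration, here is a minimal sketch of such a cron script, assuming a products table with product_name, spec and image_location columns; the folder path, credentials and column layout are placeholders you would replace with your own:

<?php
// Sketch of the steps above: scan a folder, parse each CSV, insert the rows.
// Folder path, credentials, table and column names are assumptions.
$folder = '/path/to/csv/folder';

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$insert = $pdo->prepare(
    'INSERT INTO products (product_name, spec, image_location) VALUES (?, ?, ?)'
);

foreach (scandir($folder) as $file) {
    if (pathinfo($file, PATHINFO_EXTENSION) !== 'csv') {
        continue; // skip anything that is not a CSV file
    }
    $handle = fopen($folder . '/' . $file, 'r');
    while (($row = fgetcsv($handle)) !== false) {
        $insert->execute([trim($row[0]), trim($row[1]), trim($row[2])]);
    }
    fclose($handle);
}

In practice you would also move or delete each file after processing (or compare file mtimes, as discussed further down the page) so the same CSV is not imported twice.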

Related

How can I update my records in MySQL using CSV

I have created a table in my DB and filled in all the records using a CSV file.
I need to do this weekly to keep the table updated.
I want to upload the new records onto the same table using CSV, without disturbing the old ones.
[I have to pick the data up from a remote host and upload it locally onto my server; I don't have access to the remote DB.]
Kindly guide me.
You can upload records from a CSV into a table VERY quickly using the LOAD DATA INFILE syntax (http://dev.mysql.com/doc/refman/5.1/en/load-data.html).
The syntax is pretty simple, but also flexible. This is an example:
LOAD DATA INFILE 'data.txt' INTO TABLE table2
FIELDS TERMINATED BY '\t';
You can kick these off from a console or via code.
This will append to the table, not replace it, so if you don't truncate it first, it should work like a charm.
You can of course also load the data manually by parsing the CSV file in your code and building an insert statement for each line of the file, but if the format is fixed already, LOAD DATA will be quicker and more efficient.
Edit: it appends the data. By default, no database will delete data from a table unless you specifically tell it to. Any INSERT statement is what you would consider an append statement.
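If you want to kick it off via code, a rough sketch using PDO could look like this (the file path and table name are placeholders; the LOCAL variant is used here, which must be enabled on both the client and the server):

<?php
// Sketch: run the LOAD DATA statement from PHP via PDO.
// File path, credentials and table name are placeholders.
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
    'user',
    'pass',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true] // required for the LOCAL variant
);

$pdo->exec("LOAD DATA LOCAL INFILE '/path/to/data.txt'
            INTO TABLE table2
            FIELDS TERMINATED BY '\\t'");

Because LOAD DATA appends by default, running this weekly simply adds the new rows on top of the existing ones.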

What's the best way to parse a huge SQL file?

I am trying to parse a large SQL file into a CSV file. I have considered using fread in PHP but can't figure out whether the SQL is separated by lines... because I am assuming that fread loads the data into RAM, and that would not work.
Any ideas on how to quickly convert SQL to CSV? (Also, I am running on a different machine than my DB is on, so I can't export as CSV, unfortunately.)
"Large" - what does it mean to you.
You can save to a file on server (the machine DB is running) and compress/download.
Exaple:
SELECT name,lastname,age FROM profile
The query returns three columns of the MySQL table. Now, to redirect/print the result into a file:
SELECT name,lastname,age FROM profile INTO OUTFILE '/tmp/userdata.txt'
This will output the data into the file passed in the statement above.
To output the data in CSV format, add more options to the query, as follows:
SELECT name,lastname,age FROM profile INTO OUTFILE '/tmp/userdata.txt'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
original post
Install MySQL on your local machine. Now you can import the SQL file, then freely export it as CSV or whatever you want.
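If you cannot use INTO OUTFILE because you are not on the database machine, one alternative (sketched below, with host, credentials and query as placeholders) is to stream the result set row by row into a CSV with fputcsv, so nothing large ever sits in RAM:

<?php
// Sketch: stream a large result set straight into a CSV file.
// Host, credentials, query and output path are placeholders.
$mysqli = new mysqli('db-host', 'user', 'pass', 'mydb');

$out = fopen('/tmp/userdata.csv', 'w');
fputcsv($out, array('name', 'lastname', 'age')); // header row

// MYSQLI_USE_RESULT fetches rows one at a time instead of buffering them all.
$result = $mysqli->query('SELECT name, lastname, age FROM profile', MYSQLI_USE_RESULT);
while ($row = $result->fetch_row()) {
    fputcsv($out, $row);
}

$result->close();
fclose($out);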

Import client data from CSV into MySQL

I am trying to write a script that imports a CSV file, parses it, and then imports it into a MySQL DB. I came across this article, but it seems to be written for MS SQL. Does anyone know of a tutorial for doing this in MySQL, or better still, a library or script that can do it?
Thanks :-)
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
Using the LOAD DATA INFILE SQL statement
Example :
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
If you are looking for a script/library, I'd refer you to the link below, where you can find:
PHP script to import CSV data into MySQL
This is a simple script that will allow you to import CSV data into your database. This comes in handy because you can simply edit the appropriate fields, upload it along with the CSV file, call it from the web, and it will do the rest.
It allows you to specify the delimiter in the CSV file, whether it is a comma, a tab, etc. It also allows you to choose the line separator and to save the output to a file (known as an SQL data dump).
It also permits you to include an empty field at the beginning of each row, which is usually an auto-increment integer primary key.
This script is useful mainly if you don't have phpMyAdmin, you don't want the hassle of logging in and prefer a few-clicks solution, or you simply are a command-prompt guy.
Just make sure the table is already created before trying to dump the data.
Kindly post your comments if you have any bug reports.

Automatically import CSV file and upload to database

One of my clients has all of his product information handled by an outside source. They have provided this to me in a CSV file which they will regularly update and upload to an FTP folder of my specification, say every week.
Within this CSV file is all of the product information: product name, spec, image location, etc.
The site which I have built for my client is running a MySQL database, which I thought would be holding all of the product information, and thus has been built to handle all of the product data.
My question is this: how would I go about creating and running a script that would find a newly added CSV file in the specified FTP folder, extract the data, and replace all of the data within the relevant MySQL table, all done automatically?
Is this even possible?
Any help would be greatly appreciated, as I don't want to use the IFrame option. S.
This should be pretty straightforward, depending on the CSV file:
some CSV files have quotes around text (""), some don't;
some have commas inside the quoted fields, etc.
Depending on your level of PHP skill, this should be reasonably easy.
You can get a modified timestamp from the file to see if it is new:
http://nz.php.net/manual/en/function.lstat.php
Open the file and import the data:
http://php.net/manual/en/function.fgetcsv.php
Insert it into the database:
http://nz.php.net/manual/en/function.mysql-query.php
If the CSV is difficult to parse with fgetcsv, then you could try something like the PHPExcel project, which has CSV reading capabilities:
http://phpexcel.codeplex.com
You can just write a script which reads the CSV file using PHP's fgetcsv() function, extracts each row, and formats it into an array to insert into the database.
$fileTemp = "path-of-the-file.csv";
$fp = fopen($fileTemp, 'r');
$datas = array();
while (($data = fgetcsv($fp)) !== FALSE)
{
    $row = array();
    $row['productName']   = trim($data[0]);
    $row['spec']          = trim($data[1]);
    $row['imageLocation'] = trim($data[2]);
    $datas[] = $row;
}
fclose($fp);
Now you have a prepared array $datas, which you can insert into the database by iterating over it.
All you need is:
to store the last file's mtime somewhere (let's say, for simplicity, in another file)
a script that runs every X minutes via cron
In this script you simply compare the mtime of the CSV file with the stored value. If the mtime differs, you run a SQL query that looks like this:
LOAD DATA LOCAL INFILE '/var/www/tmp/file.csv' REPLACE INTO TABLE mytable COLUMNS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n'
Optionally, you can just touch a helper file to record when you last performed a data load. If the CSV file's mtime is greater than your "helper" file's, you should touch the helper file and perform the query.
Documentation on the LOAD DATA INFILE SQL statement is here.
Of course there is room for query errors, but I hope you will handle that (you just need to be sure the data loaded properly, and only in that case touch the file or write the new mtime).
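As a rough illustration of that cron job (paths, credentials and the table name are placeholders, and LOCAL INFILE must be enabled on both client and server):

<?php
// Sketch: reload the CSV only when it is newer than the marker file.
$csv    = '/var/www/tmp/file.csv';
$marker = '/var/www/tmp/file.csv.loaded';

clearstatcache();
if (!file_exists($marker) || filemtime($csv) > filemtime($marker)) {
    $pdo = new PDO(
        'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
        'user',
        'pass',
        [PDO::MYSQL_ATTR_LOCAL_INFILE => true]
    );
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $pdo->exec("LOAD DATA LOCAL INFILE '/var/www/tmp/file.csv'
                REPLACE INTO TABLE mytable
                COLUMNS TERMINATED BY ','
                OPTIONALLY ENCLOSED BY '\"'
                LINES TERMINATED BY '\\r\\n'");

    // Touch the marker only if the query above did not throw.
    touch($marker);
}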
Have you had a look at fgetcsv? You will probably have to set up a cron job to check for a new file at regular intervals.

What do I do with this csv dataset I just downloaded from dbpedia?

I just downloaded this CSV of Wikipedia infoboxes from DBpedia. However, I have no idea how to use it :-S I want to import all this data into a database but am not sure how to take it from here. I downloaded it from http://wiki.dbpedia.org/Downloads32#infoboxes
I'm working in PHP.
Just for the record, this CSV file is around 1.8 GB. I'm actually going through all this trouble just to get a select set of infoboxes from a select set of Wikipedia articles. I would do it manually, except I need the infoboxes for over 10,000 entries, which include countries and cities. I'm just looking for an easy way to do this and frankly have exhausted all my options :(
To import CSV data into MySQL you can use a LOAD DATA INFILE statement, e.g.
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
Sometimes such data might need a little massaging; it's not tricky to write a script in Perl or similar to parse the file line by line and spit out SQL statements.
If you want to massage the data before importing it, you could take a look at my CSV stream editor, CSVfix - it's FOSS. It can also generate SQL INSERT statements for your database if for some reason your database's bulk loading of CSV data doesn't suit you.
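As a sketch of the "parse line by line and spit out SQL statements" approach in PHP rather than Perl - the infoboxes table and its three columns are purely illustrative, and addslashes() is only crude escaping for a one-off dump:

<?php
// Sketch: turn a CSV into a file of INSERT statements.
// Input/output paths, table and column names are placeholders.
$in  = fopen('/path/to/infobox-data.csv', 'r');
$out = fopen('/path/to/inserts.sql', 'w');

while (($row = fgetcsv($in)) !== false) {
    $values = array_map(function ($v) {
        return "'" . addslashes(trim($v)) . "'";
    }, array_slice($row, 0, 3)); // keep as many fields as the table has columns

    fwrite($out, 'INSERT INTO infoboxes (subject, property, value) VALUES ('
        . implode(', ', $values) . ");\n");
}

fclose($in);
fclose($out);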
