I have a .txt file on our webserver which gets updated and replaced by the client's third party property software.
Does anyone have a script (PHP/MySQL) where I can read this file and import it into a table in my database? Ideally something using codeigniter, but standard PHP will work just fine too.
It is in this format:
"BranchID","PropertyID","PropertyName","Street","DisplayStreet","Postcode","PricePrefix","Price","Bedrooms","Receptions","Bathrooms","ParkingSpaces","Numeric5","Numeric6","Numeric7","Numeric8","Numeric9","AREA","TYPE","FURNISHED","CHILDREN","SMOKING","PETS","GARDEN","DSS","PARKING","cFacility1","cFacility2","cFacility3","cFacility4","cFacility5","cFacility6","cFacility7","cFacility8","cFacility9","cFacility10","Tenure","ShortDescription","MainDescription","AvailabilityCode","AvailabilityDate","FullAddress","PricePrefixPos"
My field names match these headers exactly.
You can use the MYSQL LOAD DATA INFILE directly, see MySQL Reference
This will save some scripting time and will be much faster than importing it via a PHP script.
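For example, called from plain PHP it could look something like the sketch below (the properties table name, the file path, and the connection details are assumptions; the LOCAL keyword requires local-infile to be enabled on both the client and the server, and IGNORE 1 LINES skips the header row since your columns already match it):
<?php
// Hypothetical connection details and table name -- adjust to your setup.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');

// ENCLOSED BY '"' matches the quoted fields in the feed file;
// IGNORE 1 LINES skips the header row.
$sql = "LOAD DATA LOCAL INFILE '/path/to/properties.txt'
        INTO TABLE properties
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES";

if (!$mysqli->query($sql)) {
    die('Import failed: ' . $mysqli->error);
}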
You may also parse it with PHP.
It looks like a csv.
I would use fgetcsv() to parse the file.
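A minimal fgetcsv() sketch, assuming a properties table whose columns match the header row exactly (the file path and connection details are placeholders; the column names come straight from the trusted feed, which is why they are interpolated into the INSERT directly):
<?php
$pdo = new PDO('mysql:host=localhost;dbname=db_name', 'db_user', 'db_pass');

$handle  = fopen('/path/to/properties.txt', 'r');
$headers = fgetcsv($handle); // first row holds the column names

// Build one prepared INSERT with a placeholder per column.
$placeholders = rtrim(str_repeat('?,', count($headers)), ',');
$stmt = $pdo->prepare(
    'INSERT INTO properties (' . implode(',', $headers) . ') VALUES (' . $placeholders . ')'
);

// Read and insert one record at a time.
while (($row = fgetcsv($handle)) !== false) {
    $stmt->execute($row);
}
fclose($handle);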
The error I'm getting:
This page isn’t working didn’t send any data.
ERR_EMPTY_RESPONSE
I am using PHP to read the CSV file.
My PHP approach for processing the CSV data looks like this:
$csvAsArray = array_map('str_getcsv', file($tmpName));
I am sure the above code is causing the problem; after this point the script does not execute.
How can I import at least 1 million rows at a time?
Can anyone help me out? Which approach should I choose?
It looks like you're trying to grab the entire contents of the file in one gulp. Don't do that :) PHP array_map() isn't scalable to thousands ... or millions of lines.
SUGGESTION:
Read your data into a temp file (as you appear to be doing now).
Do a Postgresql COPY
EXAMPLE:
COPY my_table(my_columns, ...)
FROM 'my_file.csv' DELIMITER ',' CSV HEADER;
I would suggest using the league/csv package for CSV parsing and importing. #paulsm4 is correct that there is never a need to load the whole file into memory and then work with it; one should rather read line by line. This package is well maintained, does all of that under the hood, and does it quite effectively. It is also much more flexible than the Postgres COPY command, to my mind: you can filter the contents, map callbacks to fields/rows, and do all of this on the PHP side.
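A rough sketch with league/csv (installed via composer require league/csv); the table, columns and connection details are placeholders, and the PostgreSQL DSN is only there to match the COPY suggestion above:
<?php
require 'vendor/autoload.php';

use League\Csv\Reader;

$pdo  = new PDO('pgsql:host=localhost;dbname=db_name', 'db_user', 'db_pass');
$stmt = $pdo->prepare('INSERT INTO my_table (col_a, col_b, col_c) VALUES (?, ?, ?)');

$csv = Reader::createFromPath('/path/to/big_file.csv', 'r');
$csv->setHeaderOffset(0); // first row is the header

// getRecords() yields one row at a time, so the whole file never sits in memory.
foreach ($csv->getRecords() as $record) {
    $stmt->execute([$record['col_a'], $record['col_b'], $record['col_c']]);
}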
I am looking for the best way to export a CSV file. With MySQL and PHP.
Currently I'm generating a CSV with INTO OUTFILE. It works that way, but I don't think it's the right way.
Isn't there a better option for generating a CSV export each time a user clicks the download button?
An INTO OUTFILE export is only written once and the file cannot be overwritten.
I have to generate a timestamp and save the file, and then get the latest file from my directory.
This method looks a bit messy for downloading a CSV file from a server...
Has anyone got better solutions?
Thanks!
I think you are well off exporting via INTO OUTFILE. The reason is that writing the content to the CSV file is done by the MySQL server itself. Doing it in a PHP script would be slower (first of all because it is a script, second of all because the data from the SQL server needs to be passed to the script) and would cost you more resources.
If the CSV file(s) become large, keep in mind that your script may still time out. You can counter this issue by either setting a higher value for the maximum running time of a script in the configuration, or by having the CSV file created by an external process/script.
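If you stick with INTO OUTFILE, one way around the "file already exists" limitation is to build a unique target path per export; the table, columns and directory below are only placeholders:
<?php
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');

// MySQL refuses to overwrite an existing OUTFILE, so generate a unique name per export.
// The MySQL server (not PHP) writes this file, so the directory must be allowed by
// secure_file_priv and readable by the web server afterwards.
$path = '/var/lib/mysql-files/export_' . date('Ymd_His') . '.csv';

$sql = "SELECT col_a, col_b, col_c
        INTO OUTFILE '" . $mysqli->real_escape_string($path) . "'
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        FROM my_table";

if ($mysqli->query($sql)) {
    // Stream the freshly written file back to the browser as a download.
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="export.csv"');
    readfile($path);
}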
Maybe something like this:
// Backticks run the command through the shell: pipe the query to the mysql CLI and write the output to a unique file.
`echo $query | mysql > $unique`;
// Read the generated file back in as an array of lines.
$contents = file($unique);
I have generated a JSON file and it is formatted correctly, the same way as I exported it from my MySQL database. Now I need to put it back :-)
I would like to read up on how to do this, so if there is a good link, I welcome it. I have used the phrase
php - script to import JSON file into MySQL database
in Google, and others like it, but I am having no luck.
Also if I import a file and the record already exists, how do I deal with duplicate issues? Can I make it so that it overwrites automatically?
I am uploading it from an iOS app, so I do not see the php file at work.
You can use json_decode
json_decode($json_from_ios_app, true)
json_decode will return an associative array. Based on this structure, construct your SQL queries.
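A minimal sketch, assuming the decoded JSON is a list of records and the table has a unique key, so duplicates can simply be overwritten with ON DUPLICATE KEY UPDATE (table and column names are placeholders):
<?php
$pdo  = new PDO('mysql:host=localhost;dbname=db_name', 'db_user', 'db_pass');
$rows = json_decode($json_from_ios_app, true); // true => associative arrays

$stmt = $pdo->prepare(
    'INSERT INTO records (id, name, value) VALUES (:id, :name, :value)
     ON DUPLICATE KEY UPDATE name = VALUES(name), value = VALUES(value)'
);

foreach ($rows as $row) {
    $stmt->execute([
        ':id'    => $row['id'],
        ':name'  => $row['name'],
        ':value' => $row['value'],
    ]);
}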
I am trying to write a script that imports a CSV file, parses it, and then imports it into MySQL. I came across this article, but it seems to be written for MS SQL. Does anyone know of a tutorial for doing this in MySQL, or better still a library or script that can do it?
Thanks :-)
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
Using the LOAD DATA INFILE SQL statement
Example :
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
If you are looking for a script / library, I would refer you to the link below. There you can find:
PHP script to import csv data into mysql
This is a simple script that will allow you to import CSV data into your database. This comes in handy because you can simply edit the appropriate fields, upload it along with the CSV file, call it from the web, and it will do the rest.
It allows you to specify the delimiter in the CSV file, whether it is a comma, a tab, etc. It also allows you to choose the line separator and to save the output to a file (known as an SQL data dump).
It also permits you to include an empty field at the beginning of each row, which is usually an auto increment integer primary key.
This script is useful mainly if you don’t have phpmyadmin, or you don’t want the hassle of logging in and prefer a few clicks solution, or you simply are a command prompt guy.
Just make sure the table is already created before trying to dump the data.
Kindly post your comments if you have any bug reports.
I have software which gives me stock data in Excel format; the data is updated automatically every second. I have to show this data in a web page just as it is shown in Excel (i.e. the web data should also update in the same way). How can this be done?
Programmatically export the data into CSV format and import it into a relational database.
Extract the data with web language and display in webpage. Tutorials for these steps should all be available.
To convert from xls to csv see the question...
converting an Excel (xls) file to a comma separated (csv) file without the GUI
For the second part, you can have a cron job run a PHP script that reads in the csv file contents and inserts this into a database. Plenty of threads on this also.
To display, select from the database and format appropriately; you can follow any of the basic tutorials on the net for this part.
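For the display step, a minimal sketch could look like this (table and column names are placeholders):
<?php
$pdo = new PDO('mysql:host=localhost;dbname=db_name', 'db_user', 'db_pass');

// Render the latest rows as a plain HTML table.
echo '<table>';
foreach ($pdo->query('SELECT symbol, price, updated_at FROM stock_prices') as $row) {
    echo '<tr><td>' . htmlspecialchars($row['symbol']) . '</td>'
       . '<td>' . htmlspecialchars($row['price']) . '</td>'
       . '<td>' . htmlspecialchars($row['updated_at']) . '</td></tr>';
}
echo '</table>';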
Post your code if you get stuck :)
As you've been told, use PHPExcel to read the Excel file.
However, refreshing the data every second is going to put a very heavy load on your server.
I'd recommend you instead use server-side push with Comet technologies. Take a look at the Meteor server.
You will get a 'persistent' connection, so the server will push data to the client, and the need to refresh the page or make an AJAX request every second will be gone.
You've tagged this PHP, so I assume that's your scripting language of choice: use PHPExcel to read the Excel file and write it out as formatted HTML to your web page. PHPExcel's HTML Writer will retain all the style and formatting of the original Excel workbook.
However, an update every second is pretty extreme for a web page, and for a scripting language. Rather than reading the Excel file whenever the page is requested, run this as a background task converting the Excel to a static HTML file whenever an Excel file is received, and serve the latest static HTML.
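A rough sketch of that background conversion step with PHPExcel's HTML writer (the file paths are assumptions, and PHPExcel is expected to be on the include path):
<?php
require_once 'PHPExcel/IOFactory.php';

// Load the workbook the stock software dropped on the server...
$workbook = PHPExcel_IOFactory::load('/data/stock_feed.xls');

// ...and write it out as a static HTML page, keeping the original styling.
$writer = PHPExcel_IOFactory::createWriter($workbook, 'HTML');
$writer->save('/var/www/html/stock_feed.html');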
If this extreme timing is needed, then you might be better looking at a compiled language, or even a non-web solution.