I am wondering what is the best way to tackle the following problem. I need to provide a client with the option to download various ranges of records from a database in CSV format. I was thinking of doing the following.
Provide user with a webform to determine what rows to pull from the DB.
Pull the appropriate records and build a large string from the returned data.
Post the string to the script which creates the CSV for downloading.
Before I attempt the above, I was wondering if there is a better way to do this?
Thanks ^_^
You can use MySQL's SELECT ... INTO OUTFILE to immediately obtain your results in a CSV. You then need only read the contents of that CSV file back to the user.
This is much safer than attempting to construct a CSV yourself (where you must consider how to properly escape field delimiters, etc); however, if you wanted to do it that way, I'd recommend using fputcsv(STDOUT, $results) to avoid common pitfalls.
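A minimal sketch of the fputcsv() approach. $rows is a hypothetical stand-in for your query results, and it writes to a php://temp stream so the snippet is self-contained; in a real download you would open php://output and send the headers shown in the comment first:

```php
<?php
// Hypothetical stand-in for the rows returned by the user's range query.
$rows = [
    ['id' => 1, 'name' => 'Alice', 'city' => 'Oslo, NO'],
    ['id' => 2, 'name' => 'Bob',   'city' => 'Paris'],
];

// In a real download you would fopen('php://output', 'w') and first send:
//   header('Content-Type: text/csv');
//   header('Content-Disposition: attachment; filename="export.csv"');
$fh = fopen('php://temp', 'r+');

fputcsv($fh, array_keys($rows[0]));  // header row
foreach ($rows as $row) {
    fputcsv($fh, $row);              // quoting/escaping handled for you
}

rewind($fh);
$csv = stream_get_contents($fh);
fclose($fh);
```

Note how the field containing a comma ("Oslo, NO") comes out correctly quoted without any effort on your part.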
I have a question: I want to have a dynamic table in PHP without a database. The number of rows and columns in the table should be easy to adjust.
I want to make the rows and columns variable.
How can I do this? Is there a tutorial about it?
Thanks in advance.
There are a number of ways to store your data. If you choose not to use a database, then saving the data to a file after encoding it would probably be the next most common way.
Take a look at the XML example I've included here, let me know if you have any more questions. There is definitely more than one way you can do this.
PHP XML to dynamic table
I need help parsing the following CSV file in PHP, so I can insert the contents into a database.
I know I use file_get_contents() but after that I feel a bit lost.
What I'd like to store.
Collection1 - events.text & date
Collection2 - position & name.text & total
I'm not sure how best to structure the data to insert into a database table.
"**collection1**"
"events.href","**events.text**","**date**","index","url"
"tur.com/events/classic.html","John Deere Classic","Thursday Jul 9
- Sunday Jul 12, 2015","1","tur.com/r.html"
"collection2"
"**position**","name.href","**name.text**","**total**","index","url"
"--","javascript:void(0);","Scott","--","2","tur.com/r.html"
"--","javascript:void(0);","Billy","--","3","tur.com/r.html"
"--","javascript:void(0);","Jon","--","4","tur.com/r.html"
"--","javascript:void(0);","Bill","--","5","tur.com/r.html"
"--","javascript:void(0);","Tim","--","6","tur.com/r.html"
"--","javascript:void(0);","Carlos","--","7","tur.com/r.html"
"--","javascript:void(0);","Robert","--","8","tur.com/r.html"
"--","javascript:void(0);","Rod","--","9","tur.com/r.html"
As per your previous question, I think this needs to be broken down into sections. As it stands it is rather too broad to answer.
Read the information using file_get_contents(). Make sure this works first, by echoing it to the console. (It sounded from your other question that you felt this would not work if the URL does not have a .csv suffix. It should work regardless of the file extension - try it. If it fails it may be dependent on cookies or JavaScript or some other problem).
Design and create your table structure in MySQL. It seems like you have two tables. They should both have a primary key. Are they related in some fashion? If so, perhaps one has a foreign key to the other one?
Explode your text file on the new line character and loop across the resulting array of lines.
If your CSV data has a title row in the first row position, delete that from your array.
For each line, read the elements of interest using PHP's built-in CSV parsing functions, and store them in variables.
Pass these variables to a custom function that saves the data.
For each save, you'll need to do an INSERT. I recommend using PDO here. Make sure you bind your parameters.
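The steps above can be sketched as follows. An in-memory SQLite database stands in for MySQL purely so the snippet is self-contained, and the table and column names are hypothetical; swap in your own DSN and schema:

```php
<?php
// Sample lines shaped like "collection2" above; in practice $csv would come
// from file_get_contents().
$csv = "\"position\",\"name.text\",\"total\"\n\"--\",\"Scott\",\"--\"\n\"--\",\"Billy\",\"--\"";

// In-memory SQLite keeps the sketch self-contained; use your MySQL DSN here.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE results (position TEXT, name TEXT, total TEXT)');
$stmt = $pdo->prepare('INSERT INTO results (position, name, total) VALUES (?, ?, ?)');

$lines = explode("\n", trim($csv));   // explode on the newline character
array_shift($lines);                  // delete the title row
foreach ($lines as $line) {
    $fields = str_getcsv($line);      // built-in CSV parsing
    $stmt->execute($fields);          // bound parameters via PDO
}

$count = $pdo->query('SELECT COUNT(*) FROM results')->fetchColumn();
```

Binding the parameters via execute() (rather than concatenating values into the SQL string) is what protects you from injection and broken quoting.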
Where you get stuck on a specific problem, you can ask a new and focused question. At present, the task is to break things down into discrete and researchable pieces.
One trick worth remembering is this shortcut to the PHP manual. If you do not know how fgetcsv works, for example, type php.net/fgetcsv into your browser address bar, and the PHP site will find the function for you. The documentation is excellent.
I'm trying to develop a PHP script that lets users upload shapefiles to import to a postGIS database.
First of all, for the conversion part, AFAIK we can use shp2pgsql to convert the shapefile to a postgresql table; I was wondering if there is another way of doing the conversion, as I would prefer not to use the exec() command.
I would also appreciate any idea on storing the data in a way that does not require dozens of uniquely named tables.
There seems to be no other way than using PostgreSQL's shp2pgsql binary to convert the shapefile. Although it is not really a bad choice, I would rather not use exec() if there were a PHP-native function or Apache module to do it!
However, it sounds like exec is the only sane option available. So I'm going to use it.
No hard feelings! :)
About the last part, it's a different question and should be asked separately. Although, I'm afraid there is no other way of doing it.
UPDATE: example added
$queries = shell_exec("shp2pgsql -s ".SRID." -c $shpfilpath $tblname")
or respond(false, "Error parsing the shapefile.");
pg_query($queries) or respond(false, "Query failed!");
SRID is a constant containing the "SRID"!
$shpfilpath is a path to the desired ShapeFile
$tblname is the desired name for the table
See this blog post about loading shapefiles using the PHP shapefile reader plugin from http://www.phpclasses.org/package/1741-PHP-Read-vectorial-data-from-geographic-shape-files.html. The blog post focuses on using PHP on the backend to load data for a Flash app, but you should be able to ignore the Flash part and use the PHP portion for your needs.
Once you have the data loaded from the shapefile, you could convert the geometry to a WKT string and use ST_GeomFromText or other PostGIS functions to store in the database.
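As a sketch of that last step, here is how a ring of coordinates read from a shapefile might be turned into a WKT string for ST_GeomFromText(). The coordinates and the table/column names in the comment are invented for illustration:

```php
<?php
// A ring of lon/lat pairs as they might come out of a shapefile reader
// (coordinates invented for illustration).
$ring = [[10.0, 59.9], [10.1, 59.9], [10.1, 60.0], [10.0, 59.9]];

$pairs = array_map(fn($pt) => $pt[0] . ' ' . $pt[1], $ring);
$wkt   = 'POLYGON((' . implode(', ', $pairs) . '))';

// The WKT can then be bound into an INSERT, e.g. (table/column hypothetical):
//   INSERT INTO shapes (geom) VALUES (ST_GeomFromText(:wkt, 4326))
```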
Regarding the unique columns for a shapefile, I've found that to be the most straightforward way to store ad-hoc shapefile attributes and then retrieve that data. However, you could use a "tuple" system, and convert the attributes to strings, then store them in arbitrarily named columns (col1, col2, col3, etc.) if you don't care about attribute names or types.
If you cared about names and types, you could go one step further and store them as a shapefile "schema" in another table.
Write your shp2pgsql command and define its parameters using a text editor, i.e. Sublime, Notepad, etc.
Copy, paste and change the shapefile name for each layer.
Save it as a batch file (.bat).
Pull up a command window.
Navigate to the directory where your .bat file is saved.
Hit enter and it'll run the code for all your shapefiles, and they will be uploaded to the database you defined when writing your code.
Use QGIS, go to the PostGIS window and hit connect.
You are good to go; your shapefiles are now ready and can be added as layers to your map. Make sure the spatial reference matches what it was prior to running it. Does that make sense? I hope that helped; it's the quickest way.
Adding this answer just for the benefit of anyone who is looking for the same as the OP and does not want to rely on exec() or external tools.
As of August 2019, you could use PHP Shapefile, a free and open source PHP library I have been developing and maintaining for a few years that can read and write any ESRI Shapefile and convert it natively from/to WKT and GeoJSON, without any third party dependency.
Using my library, which provides WKT to use with the PostGIS ST_GeomFromText() function and an array containing all the data to perform a simple INSERT, makes this task trivial, fast and secure, without the need for exec().
I am writing a CMS system in PHP 5.3 on top of CodeIgniter. The content, I have determined, will vary in size and form, and I was wondering if storing the data in JSON format in MySQL would be a good choice. I need to be able to store data in a structured way, but the number of subtitles will vary, and the use of lists, tables, or other elements may be needed at times.
I want to end up with a library of tools for creating different types of sites, built very modularly so that you can strip it down to the minimum-sized footprint for each site.
Is JSON the right choice, or would it be too processor intensive?
JSON in MySQL? No, don't do that.
Arrays and structures are readily built from the result of a SQL query, and readily converted to JSON format (json_encode()) when needed, which is not often, except when creating JavaScript.
Going the other direction is straightforward too: convert JSON to PHP arrays/hashtables (json_decode()) and use that to create a SQL query to update or insert into a set of tables.
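A minimal sketch of that round trip, with hypothetical page rows standing in for a query result:

```php
<?php
// Hypothetical page rows as they would come back from a SQL query.
$rows = [
    ['id' => 1, 'title' => 'Home',  'body' => 'Welcome'],
    ['id' => 2, 'title' => 'About', 'body' => 'History'],
];

$json = json_encode($rows);        // only at the point JavaScript needs it
$back = json_decode($json, true);  // true => associative arrays, not objects

// $back is structurally identical to $rows, ready for building SQL again.
```

The point is that the relational rows stay the source of truth; JSON is just a serialization applied at the edges.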
I have a CSV file which is more than 5 MB. As an example, it looks like this:
id,song,album,track
200,"Best of 10","Best of 10","No where"
230,"Best of 10","Best of 10","Love"
4340,"Best of 10","Best of 10","Al Road"
I have plenty of records. I need to get one record at a time. Assume I need to get the details of id 8999. My main question is: is there any method to get an exact record from the CSV, just like we query a MySQL table?
Other than that, I have the following solutions in mind to perform the task. What would be the best way?
Read the CSV and load it into an array, then search through it to get the data. (I have to get all the records; the order is different, which is why I face this issue. Otherwise I could go record by record.)
Export the CSV to a MySQL database and then query that table.
Read the CSV each time without loading it into an array.
If I can search the CSV file in a quick way, it will be great. I appreciate your suggestions.
Thank you.
I'm not aware if there are any libraries that allow you to directly query a CSV file. If there are, that would be your best bet. If not, here's another solution.
If you want to get details of id 8999, it would be extremely memory inefficient to load in the entire file.
You could do something like this:
Read in a line using fgets(), explode on the comma and check the 0th element. If it is not the ID you want, discard the line and repeat.
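A minimal sketch of that loop, using str_getcsv() rather than a plain explode so the quoted fields survive. A php://temp stream stands in for fopen('songs.csv', 'r'), and the rows are taken from the question:

```php
<?php
// A php://temp stream stands in for fopen('songs.csv', 'r'); the rows are
// taken from the question (plus an id-8999 row to find).
$fh = fopen('php://temp', 'r+');
fwrite($fh, "id,song,album,track\n"
          . "200,\"Best of 10\",\"Best of 10\",\"No where\"\n"
          . "8999,\"Best of 10\",\"Best of 10\",\"Love\"\n");
rewind($fh);

$wanted = '8999';
$record = null;
fgets($fh);                                      // skip the header row
while (($line = fgets($fh)) !== false) {
    $fields = str_getcsv(rtrim($line, "\r\n"));  // parse one line
    if ($fields[0] === $wanted) {                // check the 0th element
        $record = $fields;
        break;                                   // stop: the rest is never loaded
    }
}
fclose($fh);
```

Only one line is ever held in memory at a time, which is the whole advantage over loading the file into an array.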
I'm not familiar with PHP, but if this query is frequent, consider keeping the data in memory and indexing it by id using a mapping data structure (in PHP, this would be an associative array).