Allowing a user to download, edit and upload a database table - PHP

I'm looking to create an easy way for a user to create a table and upload it to the server using FTP. On the server side, I'd like to query this table with something like an SQL query.
As I'd like the user to edit this in something like OO Calc or MS Excel, would CSV files be the best/fastest to parse? Is fgetcsv() a good way to do it? Could you suggest any alternatives?

Your best bet is to allow users to upload a CSV file whose first line contains the field names (and maybe field types, e.g. int, varchar, etc.). Then you can parse it and validate it, checking for invalid or malicious data. If it passes inspection, create a database table, escaping all relevant data, which you can then query.
This way not only do you validate the data first, which is always a good idea, but you also control everything else, like the naming of tables, which helps to keep your data out of malicious hands.
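For illustration, here is a minimal sketch of that flow using fgetcsv() and PDO. The upload path, DB credentials and the decision to type every column as VARCHAR are assumptions, not part of the answer above.

```php
<?php
// Hypothetical upload path; adjust to wherever the FTP'd file lands.
$handle  = fopen('/tmp/upload.csv', 'r');
$columns = fgetcsv($handle);                 // first line: field names

// Validate column names before they get anywhere near SQL.
foreach ($columns as $name) {
    if (!preg_match('/^[A-Za-z_][A-Za-z0-9_]*$/', $name)) {
        die('Invalid column name: ' . htmlspecialchars($name));
    }
}

$pdo   = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');
$table = 'user_upload_' . date('Ymd_His');   // the server, not the user, names the table
$pdo->exec(sprintf(
    'CREATE TABLE `%s` (%s)',
    $table,
    implode(', ', array_map(function ($c) { return "`$c` VARCHAR(255)"; }, $columns))
));

// Insert the remaining rows through bound parameters.
$placeholders = implode(', ', array_fill(0, count($columns), '?'));
$insert = $pdo->prepare("INSERT INTO `$table` VALUES ($placeholders)");
while (($row = fgetcsv($handle)) !== false) {
    $insert->execute($row);
}
fclose($handle);
```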

If the user needs to create a table, give them access to phpMyAdmin (properly secured, of course). It supports uploading data, too.

I think CSV would be the easiest thing to work with and will be very fast. Check out the fgetcsv() function. You can read the file into an array and search through that. If your files aren't huge and your queries are standard, then you can write your own search code and not worry about using a database.
If you need to handle any query your users type in, then you'll need to move the data into an SQL-compatible database to query it.
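As a rough sketch of the in-memory approach from the first paragraph (the file name and the filtered column are made up):

```php
<?php
// Read the whole CSV into an array of rows keyed by column name.
$rows = [];
if (($fh = fopen('data.csv', 'r')) !== false) {
    $header = fgetcsv($fh);                      // first line: column names
    while (($line = fgetcsv($fh)) !== false) {
        $rows[] = array_combine($header, $line);
    }
    fclose($fh);
}

// A simple "query": every row whose country column equals 'DE'.
$matches = array_filter($rows, function ($r) { return $r['country'] === 'DE'; });
```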

I've done this before using XML. The nice thing about this is that you can insert data that isn't flat (parent-child relationships). If someone is having you load a table, it is likely that eventually they will ask you to include child information.

Where to save a single value on server

I am creating an application with a click-to-call button on an HTML page.
There will be one person manning the phone. I want this person to be able to set a variable with a boolean value on my server: 1 is available, 0 is unavailable.
I could create a single-field SQL table, but this feels like overkill, or I could read and write to a text file containing just one character.
What is the most correct way to store a single value?
I know it seems like overkill to use a small database table for this.
If your application already uses a database, this is by far the best way to proceed. Your database technology has all kinds of support for storing data so it doesn't get lost. But, don't stand up a database and organize your application to use it just for this one data point; a file will be easier in that case.
(WordPress does something similar; it uses a table called wp_options containing a lot of one-off settings values.)
I suggest your table contain two columns (or maybe more), agent_id and available. Then, if you happen to add another person taking telephone calls, your app will be ready to handle that growth. Your current person can have agent_id = 0.
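A possible shape for that table, sketched with PDO; the names and connection details are illustrative, not from the answer:

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->exec('CREATE TABLE IF NOT EXISTS agent_availability (
    agent_id  INT UNSIGNED NOT NULL PRIMARY KEY,
    available TINYINT(1)   NOT NULL DEFAULT 0
)');

// The person manning the phone toggles their status:
$stmt = $pdo->prepare('REPLACE INTO agent_availability (agent_id, available) VALUES (?, ?)');
$stmt->execute([0, 1]);   // agent 0 is now available
```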
If you have a DB set up, I'd use it.
That's what DBs are for: persisting changeable data. Otherwise you are basically writing your own separate DB system for the sake of one setting, which would be overkill in my eyes!
There is value in consistency and flexibility. What if I suddenly need to store an expected return time? How do I do this in a text file? How do I separate the columns? How do I manipulate the data? MySQL already answers all these questions for you.
As a team member, I'd expect most of my dev colleagues (and new hires) to know how to use MySQL. I wouldn't want them to have to work with, extend or debug a separate bespoke file-persistence system that I had tacked on.
If you are worried about having lots of one-row tables dotted about, you could use a single table for miscellaneous singular config variables which need updating regularly.
We have a table like this:
Table: `setting`
Columns: `key_string` VARCHAR, `value` VARCHAR
And you could store your variable as:
['key_string' => 'telephone_service_available', 'value' => '1']
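Reading and updating that row could look roughly like this, assuming `key_string` is the table's primary key; the connection details are placeholders:

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Flip availability on (or off by writing '0').
$stmt = $pdo->prepare(
    'INSERT INTO setting (key_string, value) VALUES (?, ?)
     ON DUPLICATE KEY UPDATE value = VALUES(value)'
);
$stmt->execute(['telephone_service_available', '1']);

// Read it back for the click-to-call button.
$stmt = $pdo->prepare('SELECT value FROM setting WHERE key_string = ?');
$stmt->execute(['telephone_service_available']);
$available = ($stmt->fetchColumn() === '1');
```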
In this specific case, a simple file check (does a file exist or not?) is probably the simplest thing you can do here. It also has the benefit that you only need to check whether the file exists; you don't have to read the file contents.
But if you ever need to store even one more piece of information, you have to take a completely different approach.
It depends on what you want to do with the information afterwards.
If you use it within a web application, store it in the session.
Or try a flat-file database like SQLite (no running DBMS needed). It's easy to use and very easy to extend.
Or store just the binary information by creating a file: if the file is not there, the setting is off.
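A minimal sketch of that flag-file idea; the path is an arbitrary choice:

```php
<?php
$flag = __DIR__ . '/agent_available.flag';

// Set the value: the file's existence *is* the information.
touch($flag);          // available
// unlink($flag);      // unavailable

// Check it from the click-to-call page, no file contents needed.
$isAvailable = file_exists($flag);
```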

Writing records into a .dbf file using PHP?

I'm currently building a web app which displays data from .csv files to the user, lets them edit it, and stores the results in a MySQL database.
For the next phase of the app I'm looking at implementing the functionality to write the results into **existing .DBF** files using PHP, as well as into the MySQL database.
Any help would be greatly appreciated. Thanks!
Actually there's a third route which I should have thought of before, and it is probably better for what you want. PHP, of course, allows two or more database connections to be open at the same time. And I've just checked: PHP has an extension for dBase. You did not say what database you are actually writing to (several besides the original dBase use .dbf files), so if you have any more questions after this, state what your target database actually is. But this extension would probably work for all of them, I imagine; or check the list of database extensions for PHP at http://php.net/manual/en/refs.database.php. You would have to try it and see.
Then, to give an idea of how to open two connections at once, here's a code snippet (it actually has Oracle as the second DB, but it shows the basic principles):
http://phplens.com/adodb/tutorial.connecting.to.multiple.databases.html
There's a fair bit of guidance and even tutorials on the web about multiple database connections from PHP, so take a look at them as well.
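If the dbase extension mentioned above turns out to fit, appending to an existing .dbf file could look roughly like this. The file name and field values are made up, and the values must be passed in the same order as the columns defined in the .dbf file:

```php
<?php
// Requires the PECL dbase extension to be installed and enabled.
$db = dbase_open('/path/to/existing/table.dbf', 2);   // mode 2 = read/write
if ($db === false) {
    die('Could not open the .dbf file');
}

// Append one record; values must match the .dbf column order and types.
dbase_add_record($db, ['10023', 'Smith', '125.50']);

dbase_close($db);
```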
This is a standard kind of situation in data migration projects: how to get data from one database to another. The answer is that you have to find out what format the target files need to be in (in this case, the format of .dbf files), then you simply collect the data from your MySQL database, rearrange it into the required format, and write a new file using PHP's file-writing functions.
I am not saying it's easy to do; I don't know the format of .dbf files (it was a format used by dBase, but has been used elsewhere as well). You not only have to know the format of the .dbf records, but there will almost certainly be header info if you are creating new files (you say the files are pre-existing, so that shouldn't be a problem for you). But each record may also carry a small amount of header data, which you would need to work out and write in the required form.
So you need to find out the exact format of .dbf files - no doubt Googling will find you info on that. But I understand even .dbf files can have various differences, in which case you would need to look at the structure of your existing files to resolve those if needed.
The alternative solution, if you don't need instant copying to the target database, is that it may have an option to import data from CSV files, which is much easier - and you have CSV files already. But presumably the order of data fields in those files is different from the order of fields in the target database (unless they came from the target database, but then presumably you wouldn't be trying to write them back unless they are archived records). The point I'm making, though, is that you can write the data into CSV files from the PHP program, in the field order required by your target database, then read them into the target database as a separate step. A two-stage process, in other words. This is particularly suitable for migrations where you are doing a one-off transfer to the new database.
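A sketch of that two-stage idea in PHP; the table, columns and output path are invented. Select the rows in the field order the target database expects, write them with fputcsv(), and import the resulting file into the target as a separate step:

```php
<?php
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query('SELECT code, name, amount FROM results');   // target field order

$out = fopen('/tmp/export_for_target.csv', 'w');
foreach ($rows as $row) {
    fputcsv($out, [$row['code'], $row['name'], $row['amount']]);
}
fclose($out);
// The target database then imports /tmp/export_for_target.csv with its own tooling.
```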
All in all you have a challenging but interesting project!

Generate a .sql file vs. execute queries with PHP

I am trying to import data from a table in one database into a table in another database.
I cannot just copy it, because the formats of the two tables are different.
With the fetched data from one database, I am able to create insert queries.
I want to know which is better:
Execute those queries in PHP itself by creating a new connection to the second database.
Write all queries to a .sql file and then import it directly into the second database.
I am looking at the aspects of performance and ease of implementation.
Note: I am expecting the table to contain more than ten thousand rows.
If you go with the first option, there is a chance you could make some mistakes.
I would prefer the second option: write all queries to a .sql file and then import it directly into the second database. Thanks,
vJ
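A rough sketch of that second option; the source table, target table and column mapping are invented for illustration:

```php
<?php
$src = new PDO('mysql:host=localhost;dbname=source_db', 'user', 'pass');
$out = fopen('/tmp/import_into_target.sql', 'w');

// One INSERT per row, with string values escaped via PDO::quote().
foreach ($src->query('SELECT id, title, created_at FROM old_table') as $row) {
    fwrite($out, sprintf(
        "INSERT INTO new_table (legacy_id, name, created) VALUES (%d, %s, %s);\n",
        $row['id'],
        $src->quote($row['title']),
        $src->quote($row['created_at'])
    ));
}
fclose($out);
// Then import into the second database, e.g.:
//   mysql -u user -p target_db < /tmp/import_into_target.sql
```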
I would certainly go for the second option. Why use PHP for a one-time action?
You can solve this in the database with SQL alone.
I would go for the second option.
Then I would:
Get an overview of both table structures.
Export the data from the first table in a flat file format like CSV.
If necessary, transform the data from the first table to the second using a script or a tool.
Import the modified data into the second table.
The database vendors have good tools for exporting, manipulating and importing data.
If only the names of the tables are different, vendor tools' import features often have good functionality for mapping data from one table to another. In my own case I've used Oracle SQL Developer, but please let me know your vendor and I can give you a pointer in the right direction.

How can I handle 5M transactions every day with MySQL and the whole LAMP stack?

Well, maybe 5M is not that much, but the system needs to receive an XML document based on the following schema:
http://www.sat.gob.mx/sitio_internet/cfd/3/cfdv3.xsd
Therefore I need to save almost all of the information per row. By law we are required to keep the information for a very long time, and eventually this database will be very, very big.
Maybe create a table every day? Something like invoices_16_07_2012.
Well, I'm lost. I have no idea how to do this, but I know it is possible.
On top of that, I need to create a PDF and two more files based on each XML and keep them on disk.
And the files need to be quickly retrievable through a web site.
That's a lot of data to put into one field in a single row (not sure if that was something you were thinking about doing).
Write a script to parse the XML document and save each value from the XML in a separate field, or in whatever way makes sense for you (so you'll have to create a table with all the appropriate fields). You should be able to store your data as one row per XML document.
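A hedged sketch of that "one row per XML document" idea; the attribute names, table layout and connection details below are illustrative only and would need to be mapped to the real fields of the cfdv3 schema:

```php
<?php
$xml = simplexml_load_file('/path/to/invoice.xml');
if ($xml === false) {
    die('Could not parse the XML document');
}

$pdo  = new PDO('mysql:host=localhost;dbname=invoices', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT INTO invoices (folio, issued_at, total, raw_xml) VALUES (?, ?, ?, ?)'
);
$stmt->execute([
    (string) $xml['folio'],   // hypothetical attribute name
    (string) $xml['fecha'],   // hypothetical attribute name
    (string) $xml['total'],   // hypothetical attribute name
    $xml->asXML(),            // keep the original document next to the parsed fields
]);
```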
You'll also want to shard your database and spread it across a cluster of servers and many tables. MySQL does support this, but I've only bootstrapped my own sharding mechanism before.
Do not create a table per XML document, as that is overkill.
Now, why do you need MySQL for this? Are you querying the data in the XML? If you're storing this data simply for archival purposes, you don't need MySQL; you can instead compress the files into, say, a tarball and store them directly on disk. Your website can easily fetch the files this way.
If you do need a big data store that can handle 5M transactions with as much data as you're describing, you might also want to look into something like Hadoop and store the data in a distributed file system. If you want to query your data more easily, look into HBase, which can run on top of Hadoop.
Hope this helps.

How to import data to analyse it the fastest way

I just have a question about which way gives me more performance and would be easier to get done. We have over 120,000 rows of data stored in a database. This data is currently exported as a CSV file to an FTP location.
Now a web form should be created to filter the datasets from this CSV file. What would you recommend regarding performance and the work involved? Should I parse the CSV file and get the information out to the webpage, or should I re-import the CSV file into a DB (MySQL) and use SQL queries to filter the data? (Note: the original DB and export are on a different server than the webpage/web form.)
A direct connection to the DB on the original server is not possible.
I prefer re-importing it into a DB, because it makes development easier: I simply need to create the SQL query from the filter criteria entered in the web form and run it.
Any ideas?
Thanks...
WorldSignia
The database is undoubtedly the best answer. Since you are looking to use a web form to analyse the results and perform complex queries, the other alternative may prove VERY expensive in terms of server processing time, and considerably more difficult to implement. After all, on the one hand you have SQL handling all the filtering details for you, and on the other you would have to implement something yourself.
Performance-wise, I would advise that you create indexes for all fields that you know you will be using as criteria, and that you display results partially, say 50 per page, to minimize load times.
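In practice that advice might look roughly like this; the table, filter columns and page size of 50 are placeholders:

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// One-off: index the columns the web form filters on.
$pdo->exec('CREATE INDEX idx_records_country_status ON records (country, status)');

// Per request: apply the filter and show 50 rows per page.
$page   = max(1, (int) ($_GET['page'] ?? 1));
$offset = ($page - 1) * 50;

$stmt = $pdo->prepare(
    'SELECT * FROM records WHERE country = ? AND status = ? ORDER BY id LIMIT 50 OFFSET ?'
);
$stmt->bindValue(1, $_GET['country'] ?? '');
$stmt->bindValue(2, $_GET['status'] ?? '');
$stmt->bindValue(3, $offset, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```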
This data is currently exported as a CSV file to an FTP location.
There are so many things wrong in that one sentence.
Should I parse the CSV file and get the information out to the webpage
Definitely not.
While it is technically possible, and will probably be faster given the number of rows if you use the right tools, this is a high-risk approach which gives a lot less clarity of code. And while it may meet your immediate requirement, it is rather inflexible.
Since the only sensible option is to transfer the data to another database, perhaps you should think about how you can do this:
without using FTP
without using CSV
What happens to the data after it has been filtered?
I think the DB with indexes may be the better solution if you need to filter the data. Actually, this is the whole idea of a DB: to optimize your work with the data. But you could profile your workload and measure the performance, and then just choose.
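If you do re-import, MySQL's bulk loader is usually the quickest route. A hedged sketch, assuming LOCAL INFILE is enabled on both client and server and that the CSV has a header row; the path and table name are placeholders:

```php
<?php
$pdo = new PDO(
    'mysql:host=localhost;dbname=app',
    'user',
    'pass',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true]   // allow LOAD DATA LOCAL from this client
);

$sql = <<<'SQL'
LOAD DATA LOCAL INFILE '/tmp/export.csv'
INTO TABLE datasets
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
SQL;
$pdo->exec($sql);   // IGNORE 1 LINES skips the header row
```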
Hmm, good question.
I would think the analysis with a DB is faster. You can set indexes and optimize the analysis.
But it could take some time to load the CSV into the database.
Analysing the CSV without a DB could also take some time: you have to come up with a concrete algorithm, and that may be a lot of work :)
So I think you have to try both and take the one with the best performance... evaluate them ;-)
