I have an Excel spreadsheet with data I would like to use to populate a variety of tables in MySQL. To ensure that business logic is adhered to, I have developed a series of stored procedures. Each row may require calling one or more of these procedures, depending on its contents.
I have thought of two possible solutions - Either
a) Write a PHP script to do it;
or
b) Write an Excel Macro to do it.
Note that the data may still be edited before going live.
So my question is, what is the best solution? Any possible advantages/disadvantages with either one? Any possible pitfalls? Are there any other possible solutions?
Go with PHP for populating the DB with data from Excel. I recommend http://phpexcel.codeplex.com/ for the data manipulation.
I recommend using a PHP script. That way the logic is externalized and can easily be reused later. There are a lot of classes/libraries (PHP Excel Reader, for example) that you can use to parse the spreadsheet, so the script only needs to contain the content-related logic.
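A rough sketch of what that script could look like, assuming PHPExcel is installed; the spreadsheet path, column positions and the stored procedure name sp_import_row are all placeholders to adapt to your own schema:

```php
<?php
// Sketch only: PHPExcel reads the workbook, and each row is passed to a
// stored procedure. sp_import_row and the column indexes are made up here.
require_once 'PHPExcel/IOFactory.php';

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$workbook = PHPExcel_IOFactory::load('data.xlsx');
$sheet    = $workbook->getActiveSheet();

// toArray() returns the sheet as a plain nested array, one entry per row.
foreach ($sheet->toArray() as $i => $row) {
    if ($i === 0) {
        continue; // skip the header row
    }
    // Decide which stored procedure(s) this row needs, then call them.
    $stmt = $pdo->prepare('CALL sp_import_row(?, ?, ?)');
    $stmt->execute(array($row[0], $row[1], $row[2]));
    $stmt->closeCursor(); // needed before the next CALL on the same connection
}
```

You would put your "which procedure does this row need" business rules where the CALL is prepared, which keeps the parsing and the content logic cleanly separated.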
I have a web page field that shows results as you type. Currently I am using an XML file stored on the server as the source for these results. Would it be faster to query the database directly as letters are being typed? What is the best way to accomplish this?
I am using PHP and AJAX with MS SQL Server. Thank you.
XML files are just text files stored on your machine or another one; they need to be read, parsed and written to, and only your program can do that. They're also really inefficient because of their text nature: reading and parsing a text file is slow, and modifying it is even worse.
XML files are good for storing configuration settings and passing data between different systems, but data storage and processing should definitely live in a proper DBMS.
Hope this answers your question.
A more scalable solution would be to use a de-normalized table to store the results. You can use a Memory Optimized table for this if you are using SQL Server 2014 or later.
To speed up the querying, you can also create a Natively Compiled Stored Procedure, but this is also only supported in SQL Server 2014 or later.
Further, try building a Full Text Search index over the searched data, which will speed up the querying as well as give you more functionality.
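For the PHP side, a minimal sketch of the AJAX endpoint is below. It assumes the PDO_SQLSRV driver and uses invented table/column names; it shows a parameterised LIKE prefix search, and with a Full Text index you could swap the WHERE clause for a CONTAINS() predicate:

```php
<?php
// Minimal autocomplete endpoint sketch: the 'products'/'name' names are
// placeholders. The query is parameterised, so user input is never
// concatenated into the SQL.
$pdo = new PDO('sqlsrv:Server=localhost;Database=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$term = isset($_GET['term']) ? $_GET['term'] : '';

$stmt = $pdo->prepare(
    'SELECT TOP 10 name FROM products WHERE name LIKE ? ORDER BY name'
);
$stmt->execute(array($term . '%'));

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_COLUMN));
```

Your existing AJAX code would then simply request this script with the typed prefix and render the returned JSON array.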
I'm currently building a web-app which displays data from .csv files for the user, where they are edited and the results stored in a MySQL database.
For the next phase of the app I'm looking at implementing the functionality to write the results into **existing .DBF** files using PHP, as well as to the MySQL database.
Any help would be greatly appreciated. Thanks!
Actually there's a third route which I should have thought of before, and is probably better for what you want. PHP, of course, allows two or more database connections to be open at the same time. And I've just checked, PHP has an extension for dBase. You did not say what database you are actually writing to (several besides the original dBase use .dbf files), so if you have any more questions after this, state what your target database actually is. But this extension would probably work for all of them, I imagine, or check the list of database extensions for PHP at http://php.net/manual/en/refs.database.php. You would have to try it and see.
Then to give an idea on how to open two connections at once, here's a code snippet (it actually has oracle as the second db, but it shows the basic principles):
http://phplens.com/adodb/tutorial.connecting.to.multiple.databases.html
There's a fair bit of guidance and even tutorials on the web about multiple database connections from PHP, so take a look at them as well.
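To give an idea of what the dBase-extension route might look like (assuming the extension is available; the file path, table and field order below are placeholders you would need to match to your real .dbf structure):

```php
<?php
// Sketch: read edited rows from MySQL and append them to an existing .dbf file.
// Requires PHP's dbase extension; names and paths here are invented.
$mysql = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Mode 2 opens the .dbf file for reading and writing.
$dbf = dbase_open('/path/to/existing.dbf', 2);
if ($dbf === false) {
    die('Could not open the .dbf file');
}

foreach ($mysql->query('SELECT name, qty, price FROM results') as $row) {
    // dbase_add_record() expects a plain indexed array, one value per field,
    // in the same order as the .dbf field definitions.
    dbase_add_record($dbf, array($row['name'], $row['qty'], $row['price']));
}

dbase_close($dbf);
```

Whether this works for your particular flavour of .dbf is something you would have to test against your existing files.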
This is a standard kind of situation in data migration projects - how to get data from one database to another. The answer is you have to find out what format the target files (in this case the format of .dbf files) need to be in, then you simply collect the data from your MySQL file, rearrange it into the required format, and write a new file using PHP's file writing functions.
I am not saying it's easy to do; I don't know the format of .dbf files (it was a format used by dBase, but has been used elsewhere as well). You not only have to know the format of the .dbf records, but there will almost certainly be header info if you are creating new files (you say the files are pre-existing, so that shouldn't be a problem for you). But each record may also carry a small amount of header data, which you would need to work out and write in the required form for each one.
So you need to find out the exact format of .dbf files - no doubt Googling will find you info on that. But I understand even .dbf files can vary, in which case you would need to look at the structure of your existing files to resolve any differences.
The alternative solution, if you don't need instant copying to the target database, is that it may have an option to import data from CSV files, which is much easier - and you have CSV files already. But presumably the order of data fields in those files is different from the order of fields in the target database (unless they came from the target database, but then you wouldn't, presumably, be trying to write the data back unless they are archived records). The point I'm making, though, is that you can write the data into CSV files from the PHP program, in the field order required by your target database, then read them into the target database as a separate step - a two-stage process, in other words. This is particularly suitable for migrations where you are doing a one-off transfer to the new database.
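The first stage of that two-step approach is very little code. A sketch, with invented table and column names - the only important thing is that the SELECT lists the columns in exactly the order the target expects:

```php
<?php
// Export MySQL rows to a CSV file in the column order the target database
// expects, so the target's own import tool can load it. Names are examples.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$out = fopen('/tmp/export_for_target.csv', 'w');

$stmt = $pdo->query('SELECT code, description, amount FROM results ORDER BY code');
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    fputcsv($out, $row);   // handles quoting and escaping for you
}

fclose($out);
```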
All in all you have a challenging but interesting project!
Sorry if this question might sound stupid to you guys, but I am a total newbie to programming apart from knowing SQL. I have been given a MySQL database containing various information about kids' diseases, and a web interface written in PHP to create reports from the database that can be accessed via the interface. There are almost 25 different variables that need to be computed in the report. I have written SQL queries to compute all these values, but I don't know anything about PHP. If I have all these queries, isn't there a way for me to combine them and display the results on a webpage to produce this report without writing PHP code?
Thanks for your help.
Again, very sorry if this is too basic.
As mr_jp suggests, phpMyAdmin provides a simple front end for running queries, but also for changing the data and modifying the schema. Although you can restrict named users to only the SELECT privilege, they'll still need to know SQL to run the queries.
It's not that hard to build a front end to take a set of parameters, substitute them into a SELECT statement and send the output to a formatted table. There are lots of datagrid tools (e.g. phplens, phpgrid, have a google for 'mysql datagrid' for more) which will handle the formatting of a MySQL resultset (or just download it as CSV - your browser should be able to transfer the data into your spreadsheet program automatically).
There are a couple of report generators for PHP - but the last time I looked at this in any depth, I wasn't overly impressed.
Your web host probably has phpMyAdmin installed. Try getting access from the web host.
You can enter your queries there and export the results as HTML, CSV, Excel and other formats.
You could write Python. Or Ruby. Or something you know. ;-)
But you need something to output your queried data.
If you just want to check the results yourself without needing to publish them directly, you might use a MySQL query browser or administration tool like phpMyAdmin or MySQL Workbench. Those tools allow you to query the database but display the returned data only as raw tables. If you need some styling or your own layout, you'll have to use your own application or edit the exported data manually (e.g. using a CSV export and re-opening it in a spreadsheet application like Excel or Calc).
The combination PHP + MySQL is a very popular one and it's highly recommended that you use them together.
The code that you will need to write in order to display that information using PHP is pretty straightforward and not very hard. If you do know some basic programming concepts, you can learn to do that in a matter of hours. PHP is well known for its extremely accessible learning curve. There are thousands of code samples online that you can look at to see how this is done.
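To give a feel for how little code it takes, here is a sketch that runs one query and prints the result as an HTML table. The connection details, table and columns are placeholders; you would drop in one of your existing 25 queries:

```php
<?php
// Minimal report page sketch: execute a query and render it as an HTML table.
// Database, table and column names below are invented for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=kids_diseases', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->query('SELECT disease, COUNT(*) AS cases FROM reports GROUP BY disease');

echo '<table border="1">';
echo '<tr><th>Disease</th><th>Cases</th></tr>';
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    echo '<tr><td>' . htmlspecialchars($row['disease']) . '</td><td>'
       . (int) $row['cases'] . '</td></tr>';
}
echo '</table>';
```

Repeating that pattern for each of your queries is essentially the whole report page.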
I'm managing a website that is built from the ground-up in PHP and uses AJAX extensively to load pages within the site.
Currently, the 'admin' interface I created stores the HTML for each page, the page's title, custom CSS for that page, and some other attributes in JSON in a file directory (e.g. example.com/hi would have its data stored in example.com/json/hi/data.json).
Is this more efficient (performance-wise) than using a database (MySQL) to store all of the content?
EDIT: I understand that direct filesystem access is more efficient than using a database, but I'm not just getting the contents of a file; I need to use json_decode to parse the JSON into an object/array first to work with it. Is the performance hit from that negligible?
Your approach doesn't sound bad; in fact I used a similar solution before and it did the job pretty well. But you might want to consider more powerful storage as soon as your admin area grows or the need to separate things arises. There are two ways:
SQL: using a relational database like PostgreSQL or MySQL
NoSQL: which seems closer to what you're after, and here is why.
Considering you're using AJAX for communication and JSON for storage, there's MongoDB: a NoSQL approach to storage that uses JSON syntax for querying. There is even an interactive tutorial for it!
When it comes to performance, neither one is clearly faster. Both engines are usually written in C or C++ for best performance. Nevertheless, for a structure as simple as the one you describe, there's no faster way than direct file access.
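On the json_decode concern from the question's edit: the easiest way to settle it for your own data is to time both paths on a representative page. A quick-and-dirty sketch (file path, table and column names are placeholders):

```php
<?php
// Time "read file + json_decode" against a single MySQL SELECT for one page.
// Paths and schema below are invented; adjust to your own setup.
$start  = microtime(true);
$page   = json_decode(file_get_contents('json/hi/data.json'), true);
$fileMs = (microtime(true) - $start) * 1000;

$pdo   = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$start = microtime(true);
$stmt  = $pdo->prepare('SELECT html, title, css FROM pages WHERE slug = ?');
$stmt->execute(array('hi'));
$row   = $stmt->fetch(PDO::FETCH_ASSOC);
$dbMs  = (microtime(true) - $start) * 1000;

printf("file+json_decode: %.3f ms, mysql: %.3f ms\n", $fileMs, $dbMs);
```

For small per-page JSON documents the decode cost is usually negligible compared with network or disk latency, but measuring beats guessing.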
That depends on what you use that data for.
If you only serve 'static' files, with almost the same data all the time, then even static HTML files are recommended for caching.
If, on the other hand, you process the data and display it in multiple forms (searches, custom statistics, etc.), then it is much better to store it in some kind of DB.
I am trying to import various pipe-delimited files into a MySQL database using PHP 5.2. I am importing various formats of piped data, and my end goal is to put the different data into a suitably normalised data structure, but I need to do some post-processing on the data to fit it into my model correctly.
I thought the best way to do this is to import into a table called "buffer", map out the data, and then import into the various tables. I am planning to create the buffer table with fields that represent each column (there will be up to 80 columns), then apply some data transforms/mapping to get the data into the right tables.
My planned approach is to create a base class that generically reads the pipe data into the buffer table, then extend this class with functions that contain various prepared statements to do the SQL magic. That gives me the flexibility to check the format is the same by reading the headers on the first row, and to adjust for each format.
My questions are:
What's the best way to do step one, reading the data from a local file into the table? I'm not too sure if I should use MySQL's LOAD DATA (as suggested in Best Practice : Import CSV to MYSQL Database using PHP 5.x) or just fopen and then insert the data line by line.
Is this the best approach? How have other people approached this?
Is there anything in the Zend Framework that may help?
Additional: I am planning to run this as a scheduled task.
You don't need any PHP code to do that, IMO. Don't waste time on classes. MySQL's LOAD DATA INFILE statement supports a lot of ways to import data, covering 95% of your needs: whatever delimiters, whatever columns to skip or pick. Read the manual attentively; it's worth knowing what you CAN do with it. After importing, the data can already be in good shape if you write the statement properly. The buffer table can be a temporary one. Then normalize or denormalize it and drop the initial table. Save the script in a file so you can reproduce the sequence of steps if there's a mistake.
The best way is to write a SQL script, test whether the data ends up in the proper shape, look for mistakes, modify, and re-run the script. If there's a lot of data, do your tests on a smaller set of rows.
[added] Another reason for the SQL-mostly approach: if you're not fluent in SQL but are going to work with a database, it's better to learn SQL sooner. You'll find a lot of uses for it later and will avoid the common pitfalls of programmers who only know it superficially.
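Since the question mentions a scheduled task in PHP, the LOAD DATA route can still be driven from a short PHP script. A sketch, with placeholder file and table names; note that LOCAL needs the PDO::MYSQL_ATTR_LOCAL_INFILE option here and local_infile enabled on the server, and you can drop LOCAL entirely if the file sits on the MySQL server itself:

```php
<?php
// Sketch: bulk-load a pipe-delimited file into the buffer table via LOAD DATA.
// File path and table name are examples only.
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb',
    'user',
    'pass',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true)
);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec("
    LOAD DATA LOCAL INFILE '/path/to/import.txt'
    INTO TABLE buffer
    FIELDS TERMINATED BY '|'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
");
```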
I personally use the free ETL software Kettle by Pentaho (this bit of software is commonly referred to as Kettle). While this software is far from perfect, I've found that I can often import data in a fraction of the time I would have to spend writing a script for one specific file. You can select a text file input and specify the delimiters, fixed width, etc., and then simply export directly into your SQL server (it supports MySQL, SQLite, Oracle, and much more).
There are dozens and dozens of ways. If you have local filesystem access to the MySQL instance, use LOAD DATA. Otherwise you can just as easily transform each line into SQL (or a VALUES line) for periodic submission to MySQL via PHP.
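The line-by-line alternative looks roughly like this (a sketch with invented table and column names; a multi-row VALUES batch would also work):

```php
<?php
// Read the pipe-delimited file line by line and insert each row with a
// prepared statement inside one transaction. Table/columns are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$fh = fopen('/path/to/import.txt', 'r');
fgets($fh); // skip the header line

$stmt = $pdo->prepare('INSERT INTO buffer (col_a, col_b, col_c) VALUES (?, ?, ?)');
$pdo->beginTransaction();          // one transaction keeps this reasonably fast
while (($line = fgets($fh)) !== false) {
    $fields = explode('|', rtrim($line, "\r\n"));
    $stmt->execute(array($fields[0], $fields[1], $fields[2]));
}
$pdo->commit();
fclose($fh);
```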
In the end I used LOAD DATA and also modified this http://codingpad.maryspad.com/2007/09/24/converting-csv-to-sql-using-php/ for different situations.