I have a webpage field where it shows results as you type. Currently I am using an XML file stored on the server as the source for these results. Would it be faster if I directly query the database as letters are being typed? What is the best way to accomplish this?
I am using PHP and AJAX with MS SQL Server. Thank you.
XML files are just text files stored on your machine or another one; they need to be read, parsed and written in full, and your program has to do all of that work itself. They're also really, really inefficient because of their text nature: reading and parsing a text file is slow, and modifying it is even worse.
XML files are good for storing configuration settings and passing data between different systems, but data storage and processing should definitely live in a proper DBMS.
Hope this answers your question.
A more scalable solution would be to use a de-normalized table to store the results. You can use a Memory Optimized table for this if you are using SQL Server 2014 or later.
To speed up the querying, you can also create a Natively Compiled Stored Procedure, but this too requires SQL Server 2014 or later.
Further, try to build a Full-Text Search index over the searched columns; it will speed up the querying as well as give you more functionality.
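As a rough illustration of that last point, here is a minimal sketch of an autocomplete endpoint doing a full-text prefix search from PHP. The pdo_sqlsrv driver, the connection details, and the Products/Name table and column are all assumptions, not part of the original setup:

```php
<?php
// Minimal autocomplete sketch: full-text prefix search against SQL Server.
// Connection details and the Products/Name schema are hypothetical.
$pdo = new PDO('sqlsrv:Server=localhost;Database=shop', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$term = isset($_GET['term']) ? trim($_GET['term']) : '';
header('Content-Type: application/json');
if ($term === '') {
    echo json_encode(array());
    exit;
}

// CONTAINS with a quoted "prefix*" term does a prefix match on the full-text index.
$stmt = $pdo->prepare('SELECT TOP 10 Name FROM Products WHERE CONTAINS(Name, ?)');
$stmt->execute(array('"' . str_replace('"', '', $term) . '*"'));

echo json_encode($stmt->fetchAll(PDO::FETCH_COLUMN));
```

Called from the AJAX handler as the user types, this stays fast because the work is done by the index rather than by re-reading an XML file on every keystroke.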
I need to load XML data from an external server/url into my MySQL database, using PHP.
I don't need to save the XML file itself anywhere, unless this is easier/faster.
The problem is, I will need this to run every hour or so as the data will be constantly updated, therefore I need to replace the data in my database too. The XML file is usually around 350mb.
The data in the MySQL table needs to be searchable - I will know the structure of the XML so can create the table to suit first.
I guess there are a few parts to this question:
What's the best way to automate this whole process to run every hour?
What's the best (fastest?) way of downloading/parsing the XML (~350MB) from the URL, in a way that I can load it into a MySQL table of my own, maintaining columns/structure?
1) A PHP script can keep running in the background all the time, but that is not the best approach. Instead, schedule something like php -q /dir/to/php.php with cron (if running on Linux) or another scheduling technique so the server does the work for you. (You still need access to the server.)
2) There are several approaches; the most linear and least RAM-consuming one, whether you end up writing to files or into MySQL, is to open the TCP connection yourself, download the file in small chunks (16KB will be fine), and stream each chunk straight to disk or to the other connection instead of holding the whole 350MB in memory (see the sketch after this list).
3) Moving data this big is not difficult, but storing it in MySQL is wasteful, searching it there is even worse, and re-importing it every hour is a good way to kill the MySQL server.
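For item 2, here is a minimal sketch of that chunked copy; the feed URL and the local target path are made up for illustration:

```php
<?php
// Chunked download sketch: memory use stays flat no matter how big the file is.
// The URL and target path are placeholders; allow_url_fopen must be enabled.
$src = fopen('https://example.com/feed.xml', 'rb');
$dst = fopen('/tmp/feed.xml', 'wb');
if ($src === false || $dst === false) {
    exit("Could not open source or destination stream\n");
}

while (!feof($src)) {
    fwrite($dst, fread($src, 16384)); // copy 16 KB at a time
}

fclose($src);
fclose($dst);
```

From there, something like PHP's XMLReader can walk the saved file node by node, so the 350MB document never has to fit in memory either.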
Suggestions:
From what I can see, you are trying to synchronize or back up data from another server. If there is just one file, then write a local .xml using PHP and you are done. If there is more than one, I would still suggest writing local files, because you are most probably working with unstructured data, and that is not what MySQL is for. If you work with hundreds of files and need to search them fast, run statistics over them and much more, consider changing approach and read about Hadoop.
MySQL BLOB and TEXT columns top out at 64KB (larger variants like MEDIUMTEXT exist, but I would never suggest stuffing the raw XML into them). If you are doing it just to be able to use SQL search commands, you have taken the wrong path.
I just have a question about which way gives me more performance and would be easier to get done. We have over 120,000 rows of data stored in a database. This data is currently exported as a CSV file to an FTP location.
Now a web form should be created on top of this CSV file to filter the datasets. What would you recommend regarding performance and the work involved? Should I parse the CSV file and get the information out to the webpage, or should I re-import the CSV file into a DB (MySQL) and use SQL queries to filter the data? (Note: the original DB and export are on a different server than the webpage/webform.)
A direct connection to the DB on the original server is not possible.
I prefer re-importing it into a DB, because it makes the development easier: I simply need to build the SQL query from the filter criteria entered in the web form and run it.
Any ideas?
Thanks...
WorldSignia
The database is undoubtedly the best answer. Since you are looking to use a web form to analyze the results and perform complex queries, the other alternative may prove VERY expensive in terms of server processing time, and considerably more difficult to implement. After all, on the one hand you have SQL handling all the filtering details for you, and on the other you would have to implement something yourself.
Performance-wise, I would advise that you create indexes on all fields you know you will be using as criteria, and that you display results in pages of, say, 50 rows to keep load times down.
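A minimal sketch of that advice, assuming the rows land in a hypothetical datasets table with a city column used as one of the filter criteria:

```php
<?php
// Pagination over an indexed filter column; `datasets` and `city` are made-up names.
$pdo = new PDO('mysql:host=localhost;dbname=imports;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// One-off, after the import: index the columns the web form filters on.
// $pdo->exec('CREATE INDEX idx_datasets_city ON datasets (city)');

$page   = isset($_GET['page']) ? max(0, (int)$_GET['page']) : 0;
$offset = $page * 50;                      // 50 rows per page

$stmt = $pdo->prepare(
    "SELECT * FROM datasets WHERE city = :city ORDER BY id LIMIT 50 OFFSET $offset"
);
$stmt->execute(array(':city' => isset($_GET['city']) ? $_GET['city'] : ''));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```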
This data is currently exported as a CSV file to an FTP location.
There are so many things wrong in that one sentence.
Should I parse the csv file and get the information out to the webpage
Definitely not.
While it is technically possible, and would probably be fast enough given the number of rows if you use the right tools, this is a high-risk approach that leaves you with much less clarity of code. And while it may meet your immediate requirement, it is rather inflexible.
Since the only sensible option is to transfer to another database, perhaps you should think about how you can do this:
without using FTP
without using CSV
What happens to the data after it has been filtered?
I think the DB with indexes is the better solution if you need to filter the data; optimizing this kind of work is exactly what a database is for. But you could also profile both approaches, measure the performance, and then just choose.
Hmm, good question.
I would think the analysis with a DB is faster: you can set indexes and optimize the queries.
But it could take some time to load the CSV into the database.
Analysing the CSV without a DB could also take some time; you have to write a concrete algorithm yourself, and that may be a lot of work :)
So I think you have to try both and take the best performance... evaluate them ;-)
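If you do want to evaluate it, a rough sketch of that comparison could look like the following; the CSV path, the export_rows table and the status column are made up:

```php
<?php
// Rough benchmark: scan the CSV vs. query an indexed table, timing both.
// '/data/export.csv', `export_rows` and `status` are placeholder names.

$t0 = microtime(true);
$matches = array();
$fh = fopen('/data/export.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    if ($row[2] === 'open') {              // filter on the third column, for example
        $matches[] = $row;
    }
}
fclose($fh);
printf("CSV scan:  %.3f s, %d rows\n", microtime(true) - $t0, count($matches));

$t1 = microtime(true);
$pdo  = new PDO('mysql:host=localhost;dbname=imports', 'user', 'pass');
$stmt = $pdo->prepare('SELECT * FROM export_rows WHERE status = ?');
$stmt->execute(array('open'));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
printf("SQL query: %.3f s, %d rows\n", microtime(true) - $t1, count($rows));
```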
We are building a website which would serve about 30k unique visitors a day.
Currently we use a simple MySQL connect > a simple query > MySQL close.
I'm afraid that with a dual-core server running 2GB of RAM we would only be able to open about 1k MySQL connections tops. Is 1k a good estimate?
Is it better to have a cron job output XML files and let our PHP files grab the data from them?
Typically XML will never be faster than MySQL for searching data (i.e. performing queries).
I don't know what kind of data you have, but XML will only be faster if you have a bunch of simple files and don't need to search, just load the files and format them.
If you need to search, then use MySQL.
MySQL does all sorts of optimizations. For example, it keeps indexes (KEY columns) in a separate structure, allowing for a much faster search.
I would suggest using Zend cache for caching MySQL query results for data that doesn't change frequently.
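A minimal sketch of that caching idea, assuming Zend Framework 1's Zend_Cache is available on the include path, an existing $pdo connection, and a made-up products table:

```php
<?php
// Cache a query result on disk so repeated page views skip MySQL entirely.
// The cache directory, cache id and `products` table are placeholders.
require_once 'Zend/Cache.php';

$frontend = array('lifetime' => 600, 'automatic_serialization' => true);
$backend  = array('cache_dir' => '/tmp/cache/');
$cache = Zend_Cache::factory('Core', 'File', $frontend, $backend);

$cacheId = 'popular_products';
if (($rows = $cache->load($cacheId)) === false) {
    // Cache miss: hit the database once, then reuse the result for 10 minutes.
    $stmt = $pdo->query('SELECT id, name FROM products ORDER BY views DESC LIMIT 20');
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    $cache->save($rows, $cacheId);
}
```

With caching in place, most of those 30k visitors never open a MySQL connection at all, which matters more than raising the connection limit.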
My question is fairly simple; I need to read out some templates (in PHP) and send them to the client.
For this kind of data, specifically text/html and text/javascript: is it more expensive to read it out of a MySQL database or out of files?
Kind regards
Tom
inb4 security; I'm aware.
PS: I read other topics about similar questions but they either had to do with other kind of data, or haven't been answered.
Reading from a database is more expensive, no question.
Where do the flat files live? On the file system. In the best case, they've been recently accessed so the OS has cached the files in memory, and it's just a memory read to get them into your PHP program to send to the client. In the worst case, the OS has to copy the file from disc to memory before your program can use it.
Where does the data in a database live? On the file system. In the best case, they've been recently accessed so MySQL has that table in memory. However, your program can't get at that memory directly, it needs to first establish a connection with the server, send authentication data back and forth, send a query, MySQL has to parse and execute the query, then grab the row from memory and send it to your program. In the worst case, the OS has to copy from the database table's file on disk to memory before MySQL can get the row to send.
As you can see, the scenarios are almost exactly the same, except that using a database involves the additional overhead of connections and queries before getting the data out of memory or off disc.
There are many factors that would affect how expensive both are.
I'll assume that since they are templates, they probably won't be changing often. If so, flat-file may be a better option. Anything write-heavy should be done in a database.
Reading a flat-file should be faster than reading data from the database.
Having them in the database usually makes it easier for multiple people to edit.
You might consider using memcache to store the templates after reading them, since reading from memory is always faster than reading from a db or flat-file.
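A minimal sketch of that memcache idea, assuming the Memcached extension, a local memcached on the default port, and a made-up template path and key:

```php
<?php
// Serve a template from memcached, falling back to the flat file on a miss.
// '/var/www/templates/header.html' and the key name are placeholders.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key  = 'tpl:header';
$html = $mc->get($key);

if ($html === false) {
    // Cache miss: read the file (or DB row) once, then keep it in memory.
    $html = file_get_contents('/var/www/templates/header.html');
    $mc->set($key, $html, 300);            // cache for 5 minutes
}

echo $html;
```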
It really doesn't make enough difference to worry you. What sort of volume are you working with? Will you have over a million page views a day? If not, I'd say pick whichever one is easiest for you to code with and maintain, and don't worry about the expense of the alternatives until it becomes a problem.
Specifically, if your templates are currently in file form I would leave them there, and if they are currently in DB form I'd leave them there.
Let's assume the same environment: PHP5 working with MySQL 5 and CSV files, with MySQL on the same host as the hosted scripts.
Will MySQL always be faster than retrieving/searching/changing/adding/deleting records in a CSV?
Or is there some amount of data below which PHP+CSV performance is better than using a database server?
CSV won't let you create indexes for fast searching.
If you always need all data from a single table (like for application settings), CSV is faster, otherwise not.
I don't even consider SQL queries, transactions, data manipulation or concurrent access here, as CSV is certainly not for these things.
No, MySQL will probably be slower for inserting (appending to a CSV is very fast) and table-scan (non-index based) searches.
Updating or deleting from a CSV is nontrivial - I leave that as an exercise for the reader.
If you use a CSV, you need to be really careful to handle multiple threads / processes correctly, otherwise you'll get bad data or corrupt your file.
However, MySQL has other advantages too. Care to work out how you would do ALTER TABLE on a CSV?
Using a CSV is a very bad idea if you ever need UPDATEs, DELETEs, ALTER TABLE or to access the file from more than one process at once.
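If you do stick with a CSV, at the very least guard every write with a lock. A minimal sketch of a locked append; the file path and row contents are made up:

```php
<?php
// Append one record to a shared CSV without letting concurrent writers interleave.
// '/data/records.csv' and the row values are placeholders.
$fh = fopen('/data/records.csv', 'ab');
if ($fh === false) {
    exit("Could not open CSV file\n");
}

if (flock($fh, LOCK_EX)) {                 // exclusive lock for the duration of the write
    fputcsv($fh, array(42, 'widget', '2023-01-01'));
    fflush($fh);
    flock($fh, LOCK_UN);
}

fclose($fh);
```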
As a person coming from the data industry, I've dealt with exactly this situation.
Generally speaking, MySQL will be faster.
However, you don't state the type of application that you are developing. Are you developing a data warehouse application that is mainly used for searching and retrieval of records? How many fields are typically present in your records? How many records are typically present in your data files? Do these files have any relational properties to each other, i.e. do you have a file of customers and a file of customer orders? How much time do you have to develop a system?
The answer will depend on the answers to the questions listed above. However, you can generally use the following as guidelines:
If you are building a data warehouse application with records exceeding one million, you may want to consider ditching both and moving to a Column Oriented Database.
CSV will probably be faster for smaller data sets. However, rolling your own insert routines in CSV could be painful and you lose the advantages of database indexing.
My general recommendation would be to just use MySQL; as I said previously, in most cases it will be faster.
From a pure performance standpoint, it completely depends on the operation you're doing, as @MarkR says. Appending to a flat file is very fast. As is reading in the entire file (for a non-indexed search or other purposes).
The only way to know for sure what will work better for your use cases on your platform is to do actual profiling. I can guarantee you that doing a full table scan on a million row database will be slower than grep on a million line CSV file. But that's probably not a realistic example of your usage. The "breakpoints" will vary wildly depending on your particular mix of retrieve, indexed search, non-indexed search, update, append.
To me, this isn't a performance issue. Your data sounds record-oriented, and MySQL is vastly superior (in general terms) for dealing with that kind of data. If your use cases are even a little bit complicated by the time your data gets large, dealing with a 100k line CSV file is going to be horrific compared to a 100k record db table, even if the performance is marginally better (which is by no means guaranteed).
Depends on the use. For example, for configuration or language files CSV might do better.
Anyway, if you're using PHP5, you have a third option: SQLite, which comes embedded in PHP. It gives you the ease of use of regular files, but the robustness of an RDBMS.
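A minimal sketch of that SQLite option through PDO; the database path and the settings table are made up:

```php
<?php
// SQLite through PDO: a single file on disk, but with real SQL on top of it.
// The path and schema below are placeholders.
$db = new PDO('sqlite:/var/www/data/app.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$db->exec('CREATE TABLE IF NOT EXISTS settings (name TEXT PRIMARY KEY, value TEXT)');

$stmt = $db->prepare('INSERT OR REPLACE INTO settings (name, value) VALUES (?, ?)');
$stmt->execute(array('theme', 'dark'));

$value = $db->query("SELECT value FROM settings WHERE name = 'theme'")->fetchColumn();
```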
Databases are for storing and retrieving data. If you need anything more than plain line/entry addition or bulk listing, why not go for the database way? Otherwise you'd basically have to code the functionality (incl. deletion, sorting etc) yourself.
CSV is an incredibly brittle format and requires your app to do all the formatting and calculations. If you need to update a specific record in a CSV you will have to first read the entire file, find the entry in memory, change it, then write the whole file out again. This gets very slow very quickly. CSV is only useful for write-once, read-once type apps.
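To make that cost concrete, here is a sketch of what updating a single record in a CSV actually involves; the file path and column layout are made up:

```php
<?php
// "Update one record" in a CSV means: read every row, change one, rewrite the file.
// '/data/records.csv' and the column positions are placeholders.
$rows = array();
$fh = fopen('/data/records.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    if ($row[0] === '42') {                // found the record to update
        $row[1] = 'new value';
    }
    $rows[] = $row;
}
fclose($fh);

$fh = fopen('/data/records.csv', 'w');     // truncate and write everything back
foreach ($rows as $row) {
    fputcsv($fh, $row);
}
fclose($fh);
```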
If you want to import swiftly like a thief in the night, use SQL format.
If you are working on a production server, the CSV route is slow but it is the safest.
Just make sure the CSV file doesn't contain a primary key that will overwrite your existing data.