Sorry if this has been asked before, but I can't find anything about it on the forum.
I am making a shipping calculator in PHP, and I get CSV files from my courier containing the rate and the destination. My question is: is it better to read the CSV file into an array, or to import the CSV into a MySQL database and read the data from there?
If anyone has experience with this type of situation and wouldn't mind telling me the best way to go about it, that would be great.
I have not tried anything yet because I would like to know the best approach first.
Thanks for reading.
Won't this depend on how many times a day you need to access the data, and how often the shipping data is updated?
For example, if the shipping data is updated daily and you access it 10,000 times per day, then yes, it would be worth importing it into a database so you can do your lookups there.
(This is the kind of job SQLite was designed for, by the way.)
If the shipping data is updated every minute, then you'd be best off grabbing it fresh every time.
If the shipping data is updated daily and you only access it 10 times a day, then I wouldn't worry too much: just grab it, cache the file, and access it as a PHP array.
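For the grab-and-cache option, a minimal sketch might look like the following; the rates.csv file, the place/rate column layout, the cache path and the loadShippingRates helper are all illustrative assumptions, not details from the question.

```php
<?php
// Sketch only: file names, column layout and the helper name are assumptions.
function loadShippingRates(string $csvPath, string $cachePath): array
{
    // Reuse the cached PHP array if it is newer than the courier's CSV.
    if (is_file($cachePath) && filemtime($cachePath) >= filemtime($csvPath)) {
        return include $cachePath;
    }

    $rates = [];
    if (($handle = fopen($csvPath, 'r')) !== false) {
        fgetcsv($handle); // skip the header row, e.g. "place,rate"
        while (($row = fgetcsv($handle)) !== false) {
            $rates[$row[0]] = (float) $row[1]; // place => rate
        }
        fclose($handle);
    }

    // Cache as a plain PHP file so later requests skip the CSV parsing.
    file_put_contents($cachePath, '<?php return ' . var_export($rates, true) . ';');
    return $rates;
}

$rates = loadShippingRates(__DIR__ . '/rates.csv', __DIR__ . '/rates.cache.php');
echo $rates['Cape Town'] ?? 'unknown destination';
```

If the lookups ever get more complicated than a single key, that is the point where importing the same rows into SQLite or MySQL starts to pay off.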
Sorry, but I am not familiar with the data feed in question.
I have a website (Laravel + MySQL on a dedicated server) where I record every page that every user views, for reporting purposes.
The site gets 10,000 visits a day, and this logging has made the database grow considerably over a few months; the 'visits' table now takes up 85% of the whole database!
What is the best way to handle this kind of page-view logging?
I have not encountered this problem before, but I think it's better to take logging with this much write load out of the primary database; you can move it to the file system or to a logging service (read this).
Alternatively, you can have a job (background process) remove the logs you no longer need, such as logs older than a month; this will lighten the load on the database a little.
Read up on some best practices.
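For the cleanup-job route, a minimal sketch as a Laravel console command might look like this; the visits table name, the created_at column, the 30-day retention and the chunk size are all assumptions for illustration.

```php
<?php
// Sketch only: table/column names and the 30-day retention are assumptions.
// Schedule it in app/Console/Kernel.php with:
//   $schedule->command('visits:prune')->daily();

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;

class PruneVisits extends Command
{
    protected $signature = 'visits:prune';
    protected $description = 'Delete visit logs older than 30 days';

    public function handle(): int
    {
        // Delete in chunks so the visits table is not locked for long periods.
        do {
            $deleted = DB::table('visits')
                ->where('created_at', '<', now()->subDays(30))
                ->limit(10000)
                ->delete();
        } while ($deleted > 0);

        return self::SUCCESS;
    }
}
```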
My situation is as follows.
Every day I receive 256 GB of product information from different online shops and content providers (e.g. the CNET datasource).
This information arrives as CSV, XML and TXT files. The files are parsed and stored in MongoDB.
Later the information is transformed into a searchable form and indexed into Elasticsearch.
The 256 GB is not entirely different every day: roughly 70% of the information stays the same, while a few fields such as Price, Size and Name change frequently.
I am processing the files using PHP.
My problems are:
Parsing the huge amount of data.
Mapping the fields inside the DB (e.g. "title" is not called "title" by every online shop; some send the field as "Short-Title" or some other name).
The volume of information growing by gigabytes every day. How do I store and process it all? (Maybe big data tooling, but I'm not sure how to use it.)
Searching the information quickly despite its size.
Please suggest a suitable database for this problem.
Parsing huge data: Spark is the fastest distributed solution for your need. Even though 70% of the data is the same each day, you still have to process it just to detect the duplicates, and you can do the field mapping there as well.
Data store: if you are doing any aggregation, I would recommend HBase/Impala; if each individual product row matters to you, use Cassandra.
For searching, nothing is faster than Lucene, so use Solr or Elasticsearch, whichever you are comfortable with; both are good.
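To illustrate the mapping and duplicate-detection steps in PHP (which the question already uses for processing), here is a minimal sketch; the mapping table, the field names and the hash-based change check are assumptions for illustration, not something prescribed by the answer above.

```php
<?php
// Sketch only: the mapping table, field names and hash-based change check are
// illustrative assumptions, not part of the original question or answer.
$fieldMap = [
    'Short-Title' => 'title',
    'Title'       => 'title',
    'Price'       => 'price',
    'Size'        => 'size',
];

// Normalise one source record so every shop's fields end up under one name.
function normalizeRecord(array $raw, array $fieldMap): array
{
    $record = [];
    foreach ($raw as $field => $value) {
        $record[$fieldMap[$field] ?? strtolower($field)] = $value;
    }
    return $record;
}

// Compare a content hash against the previously stored one so the ~70% of
// unchanged records can be skipped instead of being re-stored and re-indexed.
function hasChanged(array $record, array $previousHashes): bool
{
    ksort($record); // make the hash independent of field order
    $hash = md5(json_encode($record));
    return ($previousHashes[$record['id']] ?? null) !== $hash;
}

$raw    = ['id' => 'sku-1', 'Short-Title' => 'USB cable', 'Price' => '3.99'];
$record = normalizeRecord($raw, $fieldMap);
var_dump($record, hasChanged($record, []));
```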
I'm currently working on a project that uses MySQL for configuration, but now I'm starting to think it could slow down page loads.
So my question is: would it be better to store configuration options (that are read on almost every page load) in an XML/JSON file, or in a MySQL database?
Thanks.
One thing to consider is how much config data there is, and perhaps how often it is likely to change. If the amount of data is small, then saving it in a database (if you're not already using a DB for anything else) would be overkill; equally, maintaining a DB for something that gets changed once every six months would probably be a waste of resources.
I think this depends on your project. If you want someone else to configure the application through a UI, you can put the configuration into the database.
If it's just you and some developers, and changes are not made frequently, put it in a file.
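If the file route wins, one common pattern is to keep the configuration as a plain PHP array rather than XML/JSON, since OPcache then keeps the compiled array in memory between requests. A minimal sketch, with an illustrative file name and keys:

```php
<?php
// config.php - sketch only; the keys are illustrative assumptions.
return [
    'site_name'      => 'My Site',
    'items_per_page' => 20,
    'cache_enabled'  => true,
];
```

and wherever a page needs a value:

```php
<?php
// Loading an ordinary PHP file means OPcache serves the compiled array,
// so there is no per-request parsing as there would be with XML or JSON.
$config = require __DIR__ . '/config.php';

echo $config['site_name'];
```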
I have a system where users can 'like' content. There will likely be many hundreds of these likes happening at once. I'd like it to be AJAX-driven so you get an immediate response.
At the moment I have a MySQL table of likes which contains the post_id and user_id, and I have a 'cached' counter on the posts table with the total number of likes - simple so far.
Would I benefit in any way from storing any of this information in MongoDB to take the load off of MySQL?
At the moment, when I click like, two MySQL queries run: an INSERT into likes and an UPDATE on posts. If I'm in a large-scale environment with heavy read/write traffic, what would be the best way to go?
Thanks in advance :)
MySQL isn't a good option for something like this, as a large number of writes will cause scaling issues. I believe MongoDB's real advantage is schemaless, JSON document-oriented storage, and while it should perform better than MySQL (if set up correctly), I think you should look at using Redis to store counters like this (the single INCR command for increasing a numeric value is the cream on top of the cake). In my experience it can handle writes much more efficiently than any other database.
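A minimal sketch of that Redis approach using the phpredis extension; the key names and the use of a set to prevent double-liking are assumptions for illustration, not part of the original setup.

```php
<?php
// Sketch only: key names and the duplicate-like guard are assumptions.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

function likePost(Redis $redis, int $postId, int $userId): int
{
    // SADD returns 1 only when the user was not already in the set,
    // so the counter is incremented at most once per user and post.
    if ($redis->sAdd("post:{$postId}:likers", $userId) === 1) {
        return $redis->incr("post:{$postId}:likes");
    }
    return (int) $redis->get("post:{$postId}:likes");
}

echo likePost($redis, 42, 7); // current like count for post 42
```

The MySQL likes table can still be kept as the permanent record by writing to it asynchronously, or by flushing the Redis counters back to MySQL periodically.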
I've created an application that has an autosave feature. So instead of the user having to click a save button, their settings are automatically saved with every interaction with the app. Behind the scenes, every time the app changes, I POST the data to PHP and update a MySQL database table.
It's working well, but nobody's using it yet, so my question is: if I'm updating a MySQL database table with this save data (which could be the equivalent of a 100 KB XML file) every couple of seconds, could I experience performance issues? It should be noted that there could be hundreds or thousands of users using the app at the same time.
Any tips or advice would be appreciated.
Bundle up and serialize all your data changes into a single JSON object before POSTing it (as a single field). Fewer, larger interactions will offer better performance than constant tiny ones.
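On the PHP side, a minimal sketch of accepting that single JSON field and writing it with one query; the autosaves table, its columns and the way the user is identified are assumptions for illustration.

```php
<?php
// Sketch only: table name, column names and user identification are
// illustrative assumptions, not taken from the original question.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$userId  = 123;                       // however you identify the logged-in user
$payload = $_POST['state'] ?? '';     // the single serialized JSON field

if (json_decode($payload) === null) { // reject anything that is not valid JSON
    http_response_code(400);
    exit('invalid JSON');
}

// One UPSERT per autosave instead of many small INSERT/UPDATE pairs.
$stmt = $pdo->prepare(
    'INSERT INTO autosaves (user_id, data) VALUES (:uid, :data)
     ON DUPLICATE KEY UPDATE data = VALUES(data)'
);
$stmt->execute([':uid' => $userId, ':data' => $payload]);
```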