I have a MySQL database whose size is growing very fast due to the number of inserts per minute.
Since a larger DB slows down performance, and the old data is only used for reports, I was thinking of moving that data out of the DB.
So I have some questions about it:
Is it a good idea?
How can I do the archiving?
How can I run reports on the archived data?
Thank you so much!
How much performance degrades as a database grows is very relative. When your tables are properly indexed and your report queries are efficient, database size can have an almost negligible effect on performance.
My suggestion is that rather than moving data from your production database to an archive database, you build your reporting to only ask for 'recent' records. For instance, instead of bumping off all records older than a year, modify your report queries with:
WHERE date >= DATE_SUB(NOW(), INTERVAL 1 YEAR);
That way you still have access to all your data if you need it, and your reporting suffers very minimally from the database size. Also, making sure your date column is indexed will help.
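For example, here is a minimal sketch, assuming a hypothetical reporting table named orders with a created_at date column (the table and column names are made up for illustration):

-- Index the date column so the range filter stays fast as the table grows
CREATE INDEX idx_orders_created_at ON orders (created_at);

-- Report query that only touches the last year of data
SELECT COUNT(*)
FROM orders
WHERE created_at >= DATE_SUB(NOW(), INTERVAL 1 YEAR);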
I am currently developing an application in which users vote for an image and a message "You are the Nth voter" is displayed. Suppose 100 or even 1,000 users vote for a particular image within a span of 2-3 seconds. How can I make sure that each of those users is shown the correct value of N, with N being incremented correctly in the database?
I am using MySQL and PHP.
In general, relational databases implement the ACID properties (you can read more about them on Wikipedia or in some other source).
These guarantee that transactions do not interfere with each other and that the database remains consistent. So, if a bunch of users vote at around the same time and a bunch of queries run concurrently, each query will see a view of the data that is consistent as of the time it runs. Of course, the results might change over time.
I should also add this: enforcing the ACID properties adds overhead to databases. Not all databases are 100% ACID compliant all the time, so this also depends on your database setup (and, in the case of MySQL, on the storage engine). In general, though, you don't have to worry about things being "incremented correctly" in the database, as long as the code is properly written.
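As one concrete illustration (not part of the original answer), MySQL's LAST_INSERT_ID(expr) idiom can both increment a counter and hand each connection its own value atomically. This is a sketch assuming hypothetical InnoDB tables votes and images, the latter with a votes counter column:

-- Record the vote and bump the counter as one atomic unit
START TRANSACTION;
INSERT INTO votes (image_id, user_id) VALUES (42, 7);
-- LAST_INSERT_ID(expr) stashes the new counter value per-connection,
-- so concurrent voters each read back their own N
UPDATE images SET votes = LAST_INSERT_ID(votes + 1) WHERE id = 42;
SELECT LAST_INSERT_ID() AS your_vote_number;
COMMIT;

Because the UPDATE takes a row lock, concurrent voters are serialized on that row and each one reads back a distinct N.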
I have a list of about 20,000 zip codes that I need to check against. Should I store them in a PHP file as an array? How much memory would that occupy?
Or should I call MySQL every time and check against a database table to see if the code exists? Which way is faster? I assume the first option should be faster, although the database connection alone may slow down the MySQL option quite significantly. I'm just a bit concerned about the memory cost if I include the PHP file on every call.
Databases are specifically designed to store and search through large amounts of data efficiently and quickly. If you were to put a 20,000-element array in every PHP file, it would drastically slow down every page load, even when the array wasn't being used.
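As a rough sketch of the database approach (table and column names are made up), a keyed table turns the existence check into a single index lookup:

-- One-time setup: the primary key makes lookups an index probe
CREATE TABLE zip_codes (
    zip CHAR(5) NOT NULL,
    PRIMARY KEY (zip)
);

-- Per-request check: returns 1 if the code exists, 0 otherwise
SELECT EXISTS(SELECT 1 FROM zip_codes WHERE zip = '90210');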
I want to create a website with Laravel based on a database (MySQL or MongoDB) that has almost 500,000 records. The main problem so far is that I want to update the database records daily with cron jobs. What is the best database solution to use in order to keep those updates flowing smoothly? The database has only a few tables, and there are not really relationships between the records. Can you advise me which database to choose, MySQL or MongoDB? Also, is it possible to host that website on shared hosting, or do I need to move to a dedicated server? As I said before, there are 500,000 records, and around 5% of them will be added (new), deleted (trash), or updated (existing) daily.
MySQL or MongoDB: we can't answer without knowing how the data are structured and what usage you are planning. If you require relations and cross-references between records, OR if data consistency is vital to your application, then go with MySQL. Otherwise, if the data are not related and speed matters more than guaranteed consistency, MongoDB is the choice.
Yes, you can host it on a shared hosting service.
25k records to process over a whole day requires few computational resources; it should not be a big problem.
I just want to advise you to keep further development of your application in mind: are you completely sure that the records will always be around 500k and will not grow to millions or even more?
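For the daily cron job itself, a bulk upsert keeps the MySQL side simple. A minimal sketch, assuming a hypothetical records table whose external_id column has a unique key:

-- Insert new rows and update existing ones in a single statement
INSERT INTO records (external_id, payload, updated_at)
VALUES ('a1', 'new data', NOW()),
       ('b2', 'changed data', NOW())
ON DUPLICATE KEY UPDATE payload = VALUES(payload),
                        updated_at = VALUES(updated_at);

-- Remove the rows flagged as trash in the daily feed
DELETE FROM records WHERE external_id IN ('c3', 'd4');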
I have a system where users can 'like' content. There will likely be many hundreds of these likes going on at once. I'd like it to be AJAX driven so you get an immediate response.
At the moment I have a MySQL table of likes which contains the post_id and user_id, and I have a 'cached' counter on the posts table with the total number of likes - simple so far.
Would I benefit in any way from storing any of this information in MongoDB to take the load off of MySQL?
At the moment, I click like and two MySQL queries run - an INSERT into likes and an UPDATE on posts. If I'm in a large-scale environment with a heavy read/write load, what would be the best way to go?
Thanks in advance :)
MySQL isn't a good option for something like this, as a large number of writes will cause scaling issues. I believe MongoDB's real advantage is its schemaless, JSON-document-oriented storage, and while it should perform better than MySQL (if set up correctly), I think you should look at using Redis to store counters like this (the single INCR command to increment a numeric value is the cream on top of the cake). It can handle writes much more efficiently than any other database, in my experience.
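If you do stay on MySQL for now, wrapping the two statements the question mentions in one transaction at least keeps the likes table and the cached counter consistent. A sketch assuming the tables described in the question (the like_count column name is made up):

-- Both writes succeed or fail together, keeping the cached count honest
START TRANSACTION;
INSERT INTO likes (post_id, user_id) VALUES (123, 456);
UPDATE posts SET like_count = like_count + 1 WHERE id = 123;
COMMIT;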
I've created an application that has an autosave feature. So instead of the user having to click a save button, their settings are automatically saved with every interaction with the app. Behind the scenes, every time the app changes, I POST the data to PHP and update a MySQL database table.
It's working well but nobody's using it yet so my question is: If I'm updating a MySQL database table with this save data (and the saved data could be the equivalent of a 100kb XML file) every couple of seconds, could I experience performance issues? It should be noted that there could be hundreds or thousands of users using the app at the same time.
Any tips or advice would be appreciated.
Bundle up and serialize all your data changes into a single JSON object before POSTing it (as a single field). Fewer large interactions will offer better performance than constant tiny ones.
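On the storage side, one way to apply that advice is a single upsert per autosave, with the whole serialized state in one column. A sketch assuming MySQL 5.7+ (for the JSON type) and hypothetical names:

-- One row per user; each autosave replaces the whole serialized state
CREATE TABLE autosaves (
    user_id  INT NOT NULL,
    state    JSON NOT NULL,
    saved_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
                       ON UPDATE CURRENT_TIMESTAMP,
    PRIMARY KEY (user_id)
);

INSERT INTO autosaves (user_id, state)
VALUES (7, '{"theme": "dark", "fontSize": 14}')
ON DUPLICATE KEY UPDATE state = VALUES(state);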