I'm trying to create a database that stores nearly 5,000 records per day (500 per user), and I was wondering whether that could slow down the website.
My idea is to store each user's 500 records in a text file and save that .txt file in a single row. That way, 5,000 rows are reduced to 10 rows.
Will this work?
PS: I will use PHP's implode() and explode() functions to assemble and disassemble the .txt contents.
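Roughly, the round-trip I have in mind looks like this (the delimiter and variable names are just placeholders):

// Assemble: pack one user's 500 records into a single string for one row.
$packed = implode("\n", $records); // $records = array of 500 record strings

// ... store $packed in a single TEXT column ...

// Disassemble: unpack the stored row back into an array.
$records = explode("\n", $packed);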
If you can suggest a better approach, that would be amazing.
5,000, 50,000, 500,000, 5,000,000, or even 50,000,000 records won't slow your website if you set your indexes correctly, and .txt files are definitely not the best choice for storing the data. Just set up your table the right way: correct column data types, indexes, and well-optimized queries.
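For example, here is a minimal sketch of what a properly set-up table might look like (table and column names are hypothetical), created from PHP via PDO:

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// One row per record; the composite index keeps per-user lookups fast
// even with millions of rows.
$pdo->exec("
    CREATE TABLE user_records (
        id         INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        user_id    INT UNSIGNED NOT NULL,
        payload    VARCHAR(255) NOT NULL,
        created_at DATETIME     NOT NULL,
        INDEX idx_user_created (user_id, created_at)
    ) ENGINE=InnoDB
");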
I have a list of about 20,000 zip codes that I need to check against. Should I store them in a PHP file as an array? How much memory would that occupy?
Or should I query MySQL every time to check whether a zip code exists in a database table? Which way is faster? I assume the first option should be faster, since the connection to the database alone may slow down the database-call option quite significantly. I'm just a bit concerned about the memory problem if I include the PHP file on every call.
Databases are specifically designed to store and search through large amounts of data efficiently and quickly. If you were to put a 20,000-element array in every PHP file, it would drastically slow down every page load, even when the array wasn't being used.
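For comparison, a minimal sketch of the database approach (table and column names are assumptions), using a prepared statement against an indexed column:

// Assumes a table like: CREATE TABLE zip_codes (zip CHAR(5) PRIMARY KEY)
// A primary-key lookup over 20,000 rows is effectively instant.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT 1 FROM zip_codes WHERE zip = ? LIMIT 1');
$stmt->execute([$zip]);
$exists = (bool) $stmt->fetchColumn();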
I have a table with 9 columns and 400,000 records. I am using PHP and MySQL for the database. The problem I am facing is that it takes quite a long time to fetch particular data or search the records. Can anyone suggest whether I should use a different database or apply some tweaks to this one, and also suggest the best hosting to handle this many records on my site?
This many records is not considered large. What you need to do is make sure you have proper indexes on your table columns and, most importantly, load only the data that is required, i.e. implement paging.
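A minimal paging sketch (table, column, and page-size values are placeholders); the column in ORDER BY should be indexed:

$perPage = 50;
$page    = max(1, (int) ($_GET['page'] ?? 1));
$offset  = ($page - 1) * $perPage;

// Fetch only one page of rows instead of all 400,000.
$stmt = $pdo->prepare('SELECT id, name, created_at FROM items ORDER BY id LIMIT ? OFFSET ?');
$stmt->bindValue(1, $perPage, PDO::PARAM_INT);
$stmt->bindValue(2, $offset, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);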
Does a huge database table affect page load speed?
For example: say I have a database with a table named "Images" that has 3 columns: ID, Name, and src.
If I have 100 image records in that table and use PHP to select from the database ordered by ID,
will it take the same time as it would with 1,000,000 image records?
If it does make a difference, how long would it take to sort 1 million image records?
I tried to compare 100 images against 1,000 and the time was the same, but I am afraid that once the table gets big it will slow the website down.
I am using this query to get results from the database:
SELECT * FROM 4images_images ORDER BY image_id DESC
The size of the database table will probably only matter if your query forces it to look through a lot of rows. Practice 3rd Normal Form.
Do you mean 'page' as in a web page? A large database could affect how fast a query completes, which can delay a page.
Queries can be optimized, and so can tables, which lets a query complete more quickly. So it is not strictly true that a large database will cause a page to load slowly; however, it is a likely cause.
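To make that concrete: the query above sorts every row and returns all of them. Here is a hedged sketch of a cheaper variant (assuming image_id is the primary key; the other column names are guesses). Because the sort column is indexed, MySQL can read the index in descending order and stop after 20 rows instead of sorting the whole table:

$stmt = $pdo->query('SELECT image_id, image_name
                     FROM 4images_images
                     ORDER BY image_id DESC
                     LIMIT 20');
$images = $stmt->fetchAll(PDO::FETCH_ASSOC);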
I have a system where users can 'like' content. There will likely be many hundreds of these likes happening at once. I'd like it to be AJAX-driven so you get an immediate response.
At the moment I have a MySQL table of likes, which contains the post_id and user_id, and I have a 'cached' counter on the posts table with the total number of likes - simple so far.
Would I benefit in any way from storing any of this information in MongoDB to take the load off of MySQL?
At the moment, I click like and two MySQL queries run: an INSERT into likes and an UPDATE on posts. If I'm in a large-scale environment with a heavy read/write load, what would be the best way to go?
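For concreteness, the current flow looks roughly like this (table and column names are simplified, and I've wrapped the two statements in a transaction so the counter stays consistent):

$pdo->beginTransaction();
$pdo->prepare('INSERT INTO likes (post_id, user_id) VALUES (?, ?)')
    ->execute([$postId, $userId]);
$pdo->prepare('UPDATE posts SET like_count = like_count + 1 WHERE id = ?')
    ->execute([$postId]);
$pdo->commit();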
Thanks in advance :)
MySQL isn't a good option for something like this, as a large number of writes will cause scaling issues. MongoDB's real advantage is schemaless, JSON-document-oriented storage, and while it should perform better than MySQL (if set up correctly), I think you should look at using Redis to store counters like this (the single INCR command for incrementing a numeric value is the cream on top of the cake). In my experience it handles writes more efficiently than any other database.
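A minimal sketch of such a counter with the phpredis extension (the key naming scheme is just an assumed convention):

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// INCR is atomic, so concurrent likes never lose updates.
$likes = $redis->incr("post:{$postId}:likes");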
Sorry if this has been asked before, but I can't find anything about this on the forum.
I am making a shipping calculator in PHP, and I get CSV files from my courier with rates and places. My question is: is it better to read the CSV file in as an array, or to import the CSV into a MySQL database and read the data that way?
If anyone has experience with this type of situation and wouldn't mind telling me the best way to go about it, that would be great.
I have not tried anything yet because I would like to know the best way to go about this first.
Thanks for reading.
Won't this depend upon how many times a day you need to access the data, and how often the shipping data is updated?
E.g. if the shipping data is updated daily and you access it 10,000 times per day, then yes, it would be worth importing it into a database so you can do your lookups there.
(This is the kind of job SQLite was designed for, by the way.)
If the shipping data is updated every minute, then you'd be best off grabbing it fresh every time.
If the shipping data is updated daily and you only access it 10 times, then I wouldn't worry too much - just grab it, cache the file, and access it as a PHP array.
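For the array route, a minimal sketch (the file name and column layout are assumptions):

// Load the courier's CSV once and keep it as a PHP array.
$rates = [];
if (($fh = fopen('courier_rates.csv', 'r')) !== false) {
    fgetcsv($fh); // skip the header row
    while (($row = fgetcsv($fh)) !== false) {
        [$place, $rate] = $row;
        $rates[$place] = (float) $rate;
    }
    fclose($fh);
}

// Lookup: $destination is whatever place the user selected.
$cost = $rates[$destination] ?? null;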
Sorry, but I am not familiar with the data feed in question.