Does a huge database table affect the speed of the page load? [closed] - php

Does a huge database table affect the speed of the page load?
For example: I have a database with a table named "Images", and this table has 3 columns: ID, Name and src.
If I have info for 100 images in that table and I use PHP to select from the database ordered by ID, will it take the same time as it would for 1,000,000 images?
If it does have an effect, how long would it take to sort 1 million rows of image info?
I tried to see the difference between 100 images and 1,000 but it was the same. I am afraid that if the table becomes big it will slow down the website.
I am using this code to get results from the database:
SELECT * FROM 4images_images ORDER BY image_id DESC

The size of the database table will probably only matter if your query forces it to look through a lot of rows. Practice 3rd Normal Form.

Do you mean 'page' as in web page? A large database can affect how fast a query completes, which in turn can delay a page.
Queries can be optimized, and tables can be indexed and optimized, which lets a query complete more quickly. So it is not accurate to say that a large database will necessarily cause a page to load slowly; however, it is a likely cause.
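As a concrete illustration, the query in the question pulls every column of every row and sorts the whole table. A minimal sketch of a friendlier version via PDO (the connection details and the names used for the Name and src columns are assumptions; only image_id comes from the question):

<?php
// Fetch only the columns and rows the page actually displays. Query time
// stays roughly constant as the table grows, because MySQL can walk the
// image_id primary-key index backwards instead of sorting the whole table.
$pdo = new PDO(
    'mysql:host=localhost;dbname=gallery;charset=utf8mb4',
    'user', 'pass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

$stmt = $pdo->query(
    'SELECT image_id, image_name, image_src
     FROM 4images_images
     ORDER BY image_id DESC
     LIMIT 20'
);
$images = $stmt->fetchAll(PDO::FETCH_ASSOC);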

Related

What happens in case of multiple requests to the database at the same time? [closed]

I am currently developing an application in which there is a part where users vote for an image and a message "You are the Nth voter" is displayed. Suppose, 1000 or even 100 users vote for a particular image in a span of 2-3 seconds. How will I make sure that each of those users is displayed the correct value of N with N being incremented correctly in the database?
I am using MySQL and PHP.
In general, relational databases implement the ACID properties (read up more about them on Wikipedia or in some other source).
These guarantee that transactions do not interfere with each other and that the database remains consistent. So if a group of users vote at around the same time and a group of queries run concurrently, each query will see a view of the data that is consistent as of the moment it runs. Of course, the results may change over time.
I should also add this: enforcing the ACID properties adds overhead to databases. Not all databases are 100% ACID compliant all the time, so this also depends on your database setup (and, in the case of MySQL, on the storage engine: InnoDB supports transactions, while MyISAM does not). In general, however, you don't have to worry about things being "incremented correctly" in the database, as long as the code is properly written.
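A minimal sketch of the "Nth voter" pattern with PDO and InnoDB (the images table, the votes column, and the connection details are assumptions): the UPDATE takes a row lock that is held until COMMIT, so concurrent voters are serialized on that row and each transaction reads back its own distinct count.

<?php
$pdo = new PDO(
    'mysql:host=localhost;dbname=app;charset=utf8mb4',
    'user', 'pass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

$imageId = (int) $_POST['image_id'];

$pdo->beginTransaction();

// Atomic increment; InnoDB holds the row lock until COMMIT.
$stmt = $pdo->prepare('UPDATE images SET votes = votes + 1 WHERE image_id = ?');
$stmt->execute([$imageId]);

// Still inside the same transaction, so this reads the value we just wrote.
$stmt = $pdo->prepare('SELECT votes FROM images WHERE image_id = ?');
$stmt->execute([$imageId]);
$n = $stmt->fetchColumn();

$pdo->commit();

echo "You are the {$n}th voter";

If you are stuck on MyISAM (no transactions), MySQL's documented LAST_INSERT_ID(expr) trick is an alternative: UPDATE ... SET votes = LAST_INSERT_ID(votes + 1), then SELECT LAST_INSERT_ID() to read back the value your own statement produced.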

PHP & MySQL: Good/efficient of showing statistics from thousands of rows on each page load [closed]

I know this is the wrong way to go about it, but I have a database filled with stats about vehicles that are imported from excel files.
For each vehicle (about 100 currently, updated every three days) I have from 500 to 2000 rows of data, which are used to build graphs of fuel consumption, distance driven, etc.
This is fairly simple and takes from 1 to 3 seconds to load, but I also need the total stats to compare against each car.
So if I build the graph for car id 1, I want to see the difference between its fuel consumption and the global fuel consumption (of all existing cars).
Is there a way of doing this without querying both the single car and all cars on every page load?
Thank you
It sounds like you need to pre-compute your stats into a summary table. Write a function that takes one vehicle as a parameter, compiles all its stats, and saves them to a dedicated summary table. Then write a background script that calls that function for each vehicle in turn. You can run the background script as often as you feel the stats need updating, leaving the web interface with very little computing/IO to do.
This type of thing has saved me quite a bit of headache over the years.
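A minimal sketch of such a background script (the table and column names — vehicle_data for the raw rows, vehicle_stats_summary keyed by vehicle_id, fuel_used, distance — are illustrative assumptions):

<?php
// Pre-compute per-vehicle stats into a summary table, so page loads read
// one row per vehicle instead of aggregating thousands of raw rows.
$pdo = new PDO(
    'mysql:host=localhost;dbname=fleet;charset=utf8mb4',
    'user', 'pass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

$upsert = $pdo->prepare(
    'INSERT INTO vehicle_stats_summary (vehicle_id, avg_fuel, total_distance)
     SELECT vehicle_id, AVG(fuel_used), SUM(distance)
     FROM vehicle_data
     WHERE vehicle_id = ?
     GROUP BY vehicle_id
     ON DUPLICATE KEY UPDATE
         avg_fuel = VALUES(avg_fuel),
         total_distance = VALUES(total_distance)'
);

// Refresh every vehicle in turn, e.g. from a cron job every few hours.
foreach ($pdo->query('SELECT DISTINCT vehicle_id FROM vehicle_data') as $row) {
    $upsert->execute([$row['vehicle_id']]);
}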

Archive data and run reports on it [closed]

I have a MySQL database whose size is growing very fast due to the number of inserts per minute.
Since a larger DB slows down performance, and the old data is only used for reports, I was thinking of moving that data out of the DB.
So I have some questions about it
Is it a good idea?
How can I do the archiving?
How can I run reports on the archived data?
Thank you so much!
The level of performance decrease due to database size is very relative. When your tables are properly indexed and your report queries are efficient, the database size can have almost negligible effect on its performance.
My suggestion is rather than moving data from your production database to an archive database, instead build your reporting to only ask for 'recent' records. For instance, instead of bumping off all records older than a year, modify your report queries with:
where date >= DATE_SUB(NOW(),INTERVAL 1 YEAR);
That way you still have access to all your data if you need it, and your reporting suffers only minimally from the database size. Making sure your date column is indexed will also help.
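For illustration, a short sketch of such a report query via PDO (the readings table and its columns are assumptions; $pdo is an existing PDO connection):

<?php
// An index on the date column lets MySQL seek to the first row in the
// one-year window instead of scanning the full table; create it once with:
//   CREATE INDEX idx_readings_date ON readings (date);
$stmt = $pdo->prepare(
    'SELECT COUNT(*) AS rows_in_window, AVG(value) AS avg_value
     FROM readings
     WHERE date >= DATE_SUB(NOW(), INTERVAL 1 YEAR)'
);
$stmt->execute();
$report = $stmt->fetch(PDO::FETCH_ASSOC);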

Fetching data from database for front-end [closed]

I'm working on a front-end for a catalog database. So far it is going pretty well, but by the end of this I'll probably have a million rows of items in my database. I know this is more opinion-based, but I wanted feedback on the best way of doing this.
I was wondering whether I should pull all records at once and then filter client-side, so the list shows only items starting with A (or top-rated items), and clicking B narrows it to items starting with B; that would be option 1. Option 2 would be fetching new data from the database on each request: by default, items starting with A are shown, and when the link/button for B is clicked, it connects to the database and fetches those items.
Thoughts and options please!
PS I'm working with bootstrap/php/mysql/jquery/javascript.
Well, don't pull all million records every request.
See LIMIT and OFFSET http://dev.mysql.com/doc/refman/5.0/en/select.html
You should probably use an ORM or library that helps you with pagination
Make sure you understand SQL parameterization (for security)
I find that lists like catalogs are best served from search servers (like Solr) rather than directly from databases, especially if you want search facets. MySQL does have full-text search, though.
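A minimal sketch of option 2 with PDO pagination (the catalog_items table and its columns are assumptions; $pdo is an existing PDO connection):

<?php
// Fetch only one page of rows per request; the user-supplied letter is
// validated before it is used in the LIKE pattern, and all values are
// bound as parameters rather than concatenated into the SQL.
$perPage = 50;
$page    = max(1, (int) ($_GET['page'] ?? 1));
$letter  = preg_match('/^[A-Za-z]$/', $_GET['letter'] ?? '') ? $_GET['letter'] : 'A';

$stmt = $pdo->prepare(
    'SELECT id, name
     FROM catalog_items
     WHERE name LIKE ?
     ORDER BY name
     LIMIT ? OFFSET ?'
);
$stmt->bindValue(1, $letter . '%');
$stmt->bindValue(2, $perPage, PDO::PARAM_INT);
$stmt->bindValue(3, ($page - 1) * $perPage, PDO::PARAM_INT);
$stmt->execute();
$items = $stmt->fetchAll(PDO::FETCH_ASSOC);

One design note: for very deep pages, OFFSET still has to skip all the earlier rows, so keyset pagination (WHERE name > the last value seen) scales better on a million-row table.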

Read / write load balancing with php - mongodb vs mysql [closed]

I have a system where users can 'like' content. There will likely be many hundreds of these likes going on at once. I'd like it to be AJAX driven so you get an immediate response.
At the moment I have a mysql table of likes which contains the post_id and user_id and I have a 'cached' counter on the posts table with the total number of likes - simple so far.
Would I benefit in any way, from storing any of this information in mongodb to take the load off of mysql?
At the moment, when I click like, two MySQL queries run: an INSERT into likes and an UPDATE on posts. In a large-scale environment with heavy read/write traffic, what would be the best way to go?
Thanks in advance :)
MySQL isn't a good option for something like this, as a large number of writes will cause scaling issues. I believe MongoDB's real advantage is schemaless, JSON-document-oriented storage, and while it should perform better than MySQL (if set up correctly), I think you should look at Redis for storing counters like this (the single INCR command to increment a numeric value is the cream on top of the cake). In my experience it handles writes much more efficiently than any other database.
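A minimal sketch of that counter with the phpredis extension (assumed to be installed; the key naming is illustrative). The MySQL likes table can still record who liked what for durability, while Redis holds the hot counter:

<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$postId = (int) $_POST['post_id'];

// INCR is atomic, so concurrent likes each receive a distinct running
// total, ready to send straight back to the AJAX caller.
$total = $redis->incr("post:{$postId}:likes");

header('Content-Type: application/json');
echo json_encode(['likes' => $total]);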
