MySQL query to reduce server load - php

I have a MySQL server running that will be queried regularly through a PHP front end. I'm slightly worried about server load, as there will be a fair number of people accessing the webpage, with each session querying the database regularly. The results of the query, and in essence the webpage itself, will be the same for all users.
Is there a way of querying the database once and outputting the data/results to a webpage which all users then connect to and view? Basically, running the query once for all users that connect to the webpage, rather than having each user query the database.
Any suggestions appreciated.
Thanks

You don't have to worry.
Databases are intended for exactly this.
Most sites in the world run exactly the same way: a MySQL server that is queried regularly through a PHP front end. There is nothing wrong with it.
A well-tuned SQL server and properly designed queries will serve far more traffic than you think.
You would need exceptionally high traffic before such things become a concern.
Don't forget that MySQL has its own query cache.
Also note that there are no users "connected" to the webpage. They connect, get the page contents and disconnect.

You should give the server a try first. If it turns out to be overloaded,
you can always add Memcached. It can be used from PHP or by MySQL directly, and it will save you from hitting the DB server with identical queries, i.e. the load on the server will drop drastically.
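For illustration, a minimal sketch of that pattern using the PECL Memcached extension, assuming a local memcached instance on the default port; the cache key, credentials, query and 60-second TTL are all placeholders:

    <?php
    // Check the cache first; only hit MySQL when the entry is missing or expired.
    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);

    $rows = $cache->get('homepage_rows');          // hypothetical cache key
    if ($rows === false) {
        $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $rows = $pdo->query('SELECT id, title FROM articles ORDER BY id DESC LIMIT 10')
                    ->fetchAll(PDO::FETCH_ASSOC);
        $cache->set('homepage_rows', $rows, 60);   // expire after 60 seconds
    }

    foreach ($rows as $row) {
        echo htmlspecialchars($row['title']), "<br>\n";
    }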

If the webpage will be the same for all users, why do you even need to have a MySQL backend?
I think the best solution would be to have a standalone script running periodically (e.g. as a cron job) which generates the static HTML for your web pages. That way, there is no need for users to query the database when they are just going to end up with the exact same page anyway.
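A rough sketch of that cron approach, assuming a placeholder query and output path; the script would be scheduled with something like */5 * * * * php /path/to/build_page.php:

    <?php
    // build_page.php - run from cron; queries MySQL once and writes a static page
    // that the web server then serves to every visitor without touching the DB.
    $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $rows = $pdo->query('SELECT title FROM articles ORDER BY created DESC LIMIT 20')
                ->fetchAll(PDO::FETCH_ASSOC);

    $html = "<html><body><ul>\n";
    foreach ($rows as $row) {
        $html .= '<li>' . htmlspecialchars($row['title']) . "</li>\n";
    }
    $html .= "</ul></body></html>\n";

    // Write to a temp file and rename so visitors never see a half-written page.
    file_put_contents('/var/www/html/index.html.tmp', $html);
    rename('/var/www/html/index.html.tmp', '/var/www/html/index.html');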

If it's a large query with joins, you could create a view in MySQL for the queried data and query the view instead, updating it if the underlying data changes.
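As a hedged sketch of that idea (table and column names invented); note that a plain MySQL view is not materialized, so it mainly simplifies the query the page has to run rather than caching its result:

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

    // One-time setup: wrap the expensive join behind a view.
    $pdo->exec('CREATE OR REPLACE VIEW report_summary AS
                SELECT o.id, c.name, SUM(i.amount) AS total
                FROM orders o
                JOIN customers c ON c.id = o.customer_id
                JOIN order_items i ON i.order_id = o.id
                GROUP BY o.id, c.name');

    // The page then queries the view like an ordinary table.
    $rows = $pdo->query('SELECT * FROM report_summary LIMIT 50')
                ->fetchAll(PDO::FETCH_ASSOC);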

Related

PHP with MSSQL performance

I'm developing a project where I need to retrieve HUGE amounts of data from an MSSQL database and process that data. The retrieval comes from 4 tables, 2 of them with 800-1000 rows, but the other two with 55000-65000 rows each.
The execution time wasn't tolerable, so I started to rewrite the code, but I'm quite inexperienced with PHP and MSSQL. At the moment I'm running PHP's built-in server on localhost:8000, started with "php -S localhost:8000".
I think this is one of my problems: a poor server for a huge amount of data. I thought about XAMPP, but I need a server where I can install the MSSQL drivers without problems so I can use those functions.
I cannot swap MSSQL for MySQL or make other changes like that; the company wants it that way...
Can you give me some advice on how to improve the performance? Is there any server I can use to improve the PHP execution? Thank you very much in advance.
The PHP execution should be the least of your concerns. If it is a concern, you are most likely going about things in the wrong way. All the PHP should be doing is running the SQL query against the database. If you are not using PDO, consider it: http://php.net/manual/en/book.pdo.php
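A minimal connection sketch, assuming the pdo_sqlsrv driver is installed (server, database and credentials are placeholders); the important parts are exception mode and prepared statements:

    <?php
    $pdo = new PDO(
        'sqlsrv:Server=myserver;Database=mydb',    // placeholder DSN
        'username',
        'password',
        array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
    );

    // Prepared statement: the database does the filtering, PHP only receives the rows it needs.
    $stmt = $pdo->prepare('SELECT id, name FROM big_table WHERE status = ?');
    $stmt->execute(array('active'));
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);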
First look to the way your SQL query is structured, and how it can be optimised. If in complete doubt, you could try posting the query here. Be aware that if you can't post a single SQL query that encapsulates your problem you're probably approaching the problem from the wrong angle.
I am assuming from your post that you do not have recourse to alter the database schema, but if you do, that would be the second course of action.
Try to do as much data processing in SQL Server as possible. Don't do data joining or other kinds of data processing in PHP that can be done in the RDBMS.
I've seen PHP code that retrieved data from multiple tables and matched rows based on several conditions. This is just one example of such misuse.
Also try to handle data in sets in SQL (be it MS* or My*) and avoid line-by-line processing if possible. The optimizer will produce a much better plan.
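As a hedged illustration of doing the joining in the RDBMS (table and column names invented), a single set-based query like the one below replaces PHP code that fetches several tables and matches rows in a loop:

    <?php
    $pdo = new PDO('sqlsrv:Server=myserver;Database=mydb', 'username', 'password');

    // One set-based query instead of pulling orders and customers separately
    // and matching them row by row in PHP.
    $stmt = $pdo->prepare(
        'SELECT c.name, o.id AS order_id, o.total
         FROM customers c
         JOIN orders o ON o.customer_id = c.id
         WHERE o.created_at >= ?'
    );
    $stmt->execute(array('2015-01-01'));
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);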
This is a small database, really. My advice:
- Use paging for the tables and fetch the data in portions (a sketch follows below)
- Add indexes to the tables
- Try to find a more powerful server. Hosting companies often put thousands of users' databases on one database server and the speed suffers badly. I suffered from this and finally bought a dedicated server.
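A rough paging sketch for SQL Server using OFFSET/FETCH (available from SQL Server 2012); table, page size and connection details are placeholders:

    <?php
    $pdo = new PDO('sqlsrv:Server=myserver;Database=mydb', 'username', 'password');

    $pageSize = 500;
    $page     = 0;   // first page

    // %d forces integers into the SQL, so this stays injection-safe.
    $sql = sprintf(
        'SELECT id, name FROM big_table ORDER BY id
         OFFSET %d ROWS FETCH NEXT %d ROWS ONLY',
        $page * $pageSize,
        $pageSize
    );
    $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);
    // Process this chunk, then increment $page and repeat until no rows come back.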

CakePHP and connection pooling

I am working on a website that needs to serve multiple requests from the same table simultaneously. We made a simple index page in CakePHP which draws some data from the database (10 rows, to be precise), and a colleague executed a test simulating 1000 users viewing the same page at the same time, meaning that 1000 identical requests would be issued to the database. The thing is that at around 500 requests, the database stopped being responsive, everything just froze and we had to kill the processes.
What comes to mind is that each and every request is executed on its own connection, and this would explain why the MySQL server was overwhelmed. From a few searches online, and on SO, I can see that PHP does not support connection pooling natively, as can be done in a Java application, for instance. Having based our app on CakePHP 2.5.3, however, I would like to think that there is some underlying mechanism that overcomes these limitations. Perhaps I am not doing something right?
Any suggestion is welcome, I just want to make sure to exhaust every possible solution.
If the result is going to be the same for every request, you can cache the query result; then the page will not send a query to the database on every hit.
Try this plugin:
https://github.com/ndejong/CakephpAutocachePlugin
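Alternatively, CakePHP 2.x ships with a Cache class you can use directly; a hedged sketch for a controller action (the Article model, cache key and 'default' config are assumptions):

    <?php
    // Serve repeated identical reads from cache instead of MySQL.
    App::uses('Cache', 'Cache');

    $articles = Cache::read('index_articles', 'default');
    if ($articles === false) {
        // Only the first request after the cache expires actually hits the database.
        $articles = $this->Article->find('all', array('limit' => 10));
        Cache::write('index_articles', $articles, 'default');
    }
    $this->set('articles', $articles);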

Reduce MySQL queries by saving result to textfile

I have an app that posts data from Android to some MySQL tables through PHP at a 10-second interval. The same PHP file runs a lot of queries on some other tables in the same database, and the result is downloaded and processed in the app (with DownloadWebPageTask).
I usually have between 20 and 30 clients connected this way. Most of the data each client queries for is the same as for all the other clients. If 30 clients run the same query every 10 seconds, that is 180 queries per minute. In fact, every client runs several queries, some of them in a loop (looping through the results of another query).
My question is: if I somehow produce a text file containing the same data, update this file every x seconds, and let all the clients read the file instead of running the queries themselves - would that be a better approach? Will it reduce server load?
In my opinion you should consider using memcache.
It will let you store your data in memory, which is even faster than files on disk or MySQL queries.
What it will also do is reduce load on your database so you will be able to serve more users with the same server/database setup.
Memcache is very easy to use and there are lots of tutorials on the internet.
Here is one to get you started:
http://net.tutsplus.com/tutorials/php/faster-php-mysql-websites-in-minutes/
What you need is caching. You can either cache the data coming from your DB or cache the page itself. Below are a few links on how to do this in PHP:
http://www.theukwebdesigncompany.com/articles/php-caching.php
http://www.addedbytes.com/articles/for-beginners/output-caching-for-beginners/
And yes, this will reduce the DB server load drastically.
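A minimal sketch of the file-based variant the question describes (path, query and refresh interval are placeholders); memcache follows the same check-then-regenerate pattern:

    <?php
    // Serve a cached JSON snapshot if it is fresher than 10 seconds,
    // otherwise rebuild it from MySQL and overwrite the file.
    $cacheFile = '/tmp/app_data.json';   // placeholder path
    $maxAge    = 10;                     // seconds

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        echo file_get_contents($cacheFile);
        exit;
    }

    $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $rows = $pdo->query('SELECT id, value FROM readings ORDER BY id DESC LIMIT 100')
                ->fetchAll(PDO::FETCH_ASSOC);

    $json = json_encode($rows);
    file_put_contents($cacheFile, $json, LOCK_EX);   // LOCK_EX avoids concurrent half-writes
    echo $json;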

Using PHP to separate read / writes

We have decided that we are going to move from a single database to a replicated database in a master-slave architecture and are going to get all of our reads to go to the slave and writes to the master.
The reason we are going down this route is that an addition to our product means we are getting a large increase in database connections, which leads to performance problems with our reporting suite.
We are using MySQL(5.1.55) and the application is developed in PHP.
A couple of general queries on this:
How would you tell the application which DB to read from? Would you do it within the PHP, or use something like mysqlnd_ms or MySQL Proxy?
Where would AJAX requests read from? We have a page which allows users to flag a record. The flag is then saved in the database, and users can see which records have been flagged.
Thanks for any advice.
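One way to do the routing purely in PHP, as a hedged sketch (hostnames and credentials are placeholders): keep two PDO handles and send anything that is not a plain SELECT to the master; mysqlnd_ms or a proxy does essentially the same thing transparently. For the flagging page, reads that must immediately see a just-written flag may still need to go to the master to avoid replication lag.

    <?php
    // Hand-rolled read/write splitting: writes go to the master, reads to the slave.
    $master = new PDO('mysql:host=master.db.local;dbname=app', 'user', 'pass');
    $slave  = new PDO('mysql:host=slave.db.local;dbname=app', 'user', 'pass');

    function dbFor($sql, PDO $master, PDO $slave) {
        // Anything that is not a plain SELECT is routed to the master.
        return preg_match('/^\s*SELECT\b/i', $sql) ? $slave : $master;
    }

    $sql  = 'SELECT id, name FROM records WHERE flagged = 1';
    $rows = dbFor($sql, $master, $slave)->query($sql)->fetchAll(PDO::FETCH_ASSOC);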

How to keep database connect alive?

I'm using AJAX, and quite often, if not all the time, the first request times out. In fact, if I wait several minutes before making a new request, I always have this issue, but the subsequent requests are all OK. So I'm guessing that the first request used a database connection that is dead. I'm using MySQL.
Any good solution?
Can you clarify:
are you trying to make a persistent connection?
do basic MySQL queries work (e.g. SELECT 'hard-coded' FROM DUAL)?
how long does the MySQL query behind your AJAX call take (e.g. if you run it from a mysql command-line or GUI client)?
how often do you write to the MySQL tables used in your AJAX query?
Answering those questions should help rule-out other problems that have nothing to do with making a persistent connection: basic database connectivity, table indexing / slow-running SQL, MySQL cache invalidation etc.
Chances are that your problem is NOT opening the connection, but actually serving the request.
Subsequent calls are fast because of the MySQL query cache.
What you need to do is look for slow MySQL queries, for example by turning on the slow query log, or by watching the server in real time with mytop or "SHOW PROCESSLIST" to see if there is a query that takes too long. If you find one, use EXPLAIN to make sure it's properly indexed.
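For reference, a hedged sketch of running EXPLAIN on the suspect query from PHP (connection details and query are placeholders); the slow query log itself is enabled in my.cnf or via SET GLOBAL, not from PHP:

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

    // Inspect the execution plan of the query the AJAX endpoint runs.
    $plan = $pdo->query('EXPLAIN SELECT * FROM records WHERE user_id = 42')
                ->fetchAll(PDO::FETCH_ASSOC);

    // "key" = NULL together with a large "rows" estimate usually means a missing index.
    print_r($plan);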
