Maximum databases on MySQL server and security - PHP

I have 9 databases on a MySQL server right now. I configured them from the command line, adding a user for each and giving that user full permissions on a specific database only. The MySQL server has 512 MB of RAM right now, and I don't know if I should be worried, security-wise, about any problems that might arise with 9 databases on one server. Should I split it up into two servers, each with 4 to 5 databases at most? I have 2 other app servers running to handle the load of the websites, but those 2 servers hit the database server for everything. So far, no problems. I have all 3 servers set up with IP restrictions (iptables and another firewall), so hacking from elsewhere isn't possible, only from the apps themselves.
Since I created the users each with a restriction to a specific database, a hacker who hacks one can't get to the rest, I assume?
Thanks!

The MySQL server has 512 MB of RAM right now, and I don't know if I should be worried, security-wise, about any problems that might arise with 9 databases on one server.
Is the server CPU running at 100% all the time?
Are there a lot of slow queries?
This could indicate that the server needs more resources.
You can also check the InnoDB buffer pool usage; increasing the buffer pool size is often a good way to relieve some pressure.
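For example, a few quick checks from the MySQL client (a hedged sketch; these status and variable names are standard MySQL, but the healthy thresholds depend on your workload):

```sql
-- Queries slower than long_query_time since the server started
SHOW GLOBAL STATUS LIKE 'Slow_queries';

-- Buffer pool pressure: a high ratio of Innodb_buffer_pool_reads
-- (disk reads) to read requests suggests the pool is too small
SHOW GLOBAL STATUS LIKE 'Innodb_buffer_pool_read%';

-- The current buffer pool size in bytes (set via my.cnf;
-- only MySQL 5.7+ can resize it without a restart)
SHOW VARIABLES LIKE 'innodb_buffer_pool_size';
```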
I have 2 other app servers running to handle the load of the websites, but those 2 servers hit the database server for everything. So far, no problems. I have all 3 servers set up with IP restrictions (iptables and another firewall), so hacking from elsewhere isn't possible, only from the apps themselves.
This is good: that way no one can access your DB server directly, only through the app servers.
Since I created the users each with a restriction to a specific database, a hacker who hacks one can't get to the rest, I assume?
Correct.
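For illustration, grants of this shape are what keep each user confined to a single schema (a sketch; the user, host, and database names are placeholders):

```sql
-- Each application gets its own account, scoped to one schema
CREATE USER 'app1_user'@'10.0.0.%' IDENTIFIED BY 'strong-password-here';
GRANT ALL PRIVILEGES ON app1_db.* TO 'app1_user'@'10.0.0.%';
```

Because no global privileges are granted, a compromised app1_user cannot read or modify the other 8 databases; by default it cannot even see them in SHOW DATABASES.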

Related

Zend Framework multiple databases stop working with slow query

I'm running a big Zend Framework web application with 5 databases, independent of each other, distributed across 2 database servers running MySQL 5.6.36 on CentOS 7, each with 16 GB RAM and an 8-core processor. However, if one of the 2 database servers stops responding because of slow queries, users on the other server cannot access the web application. The only way to bring the application back is to restart MySQL on that server. I have tried different things without success. The strange thing is that if I turn off one of the servers, the system continues to work correctly.
It's hard to offer a meaningful answer, because you've given us no information about your tables or your queries (in fact, you haven't asked a question at all, you've just told us a story! :-).
I will offer a guess that you are using MyISAM for one or more of your tables. This means a query from one client locks the table(s) it queries, and blocks concurrent updates from other clients.
To confirm you have this problem, use SHOW PROCESSLIST on each of your two database servers at the time you experience the contention between web apps. You might see a bunch of queries stuck waiting for a lock (it may appear in the processlist with the state of "Updating").
If so, you might have better luck if you alter your tables' storage engine to InnoDB. See https://dev.mysql.com/doc/refman/5.7/en/converting-tables-to-innodb.html
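A hedged sketch of both the diagnostic and the fix (the table name is a placeholder):

```sql
-- Look for threads stuck waiting on table locks; a pile-up in the
-- "Updating" or "Locked" state points to MyISAM table-level locking
SHOW PROCESSLIST;

-- Convert a suspect table to InnoDB for row-level locking;
-- the ALTER rebuilds the table, so test on a copy first
ALTER TABLE mydb.orders ENGINE=InnoDB;
```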

How is MySQL's 200 connection limit applied?

I am building an ecommerce site hosted on GoDaddy shared hosting, and I saw that only 200 database connections are allowed.
I am using the CodeIgniter framework for the site, and I have 2 databases for the project.
Database 1 is for storing sessions only, with a user that has read, write, update and delete privileges.
Database 2 holds the rest of the tables needed for the site, with a read-only user.
Since 1 website visitor will be connecting to 2 databases, does this mean that I can only have 100 visitors at a time, since each one will be using 2 connections?
Or can someone explain the 200 connection limit, please?
As Drew said, it depends. Limits exist everywhere (hardware, software, bandwidth, etc.). GoDaddy has its own limitations (not only on database connections). Optimizing your code can help you get the most out of the web server and database servers.
For example, if your code holds a connection to each database for 1 second per request, you can serve 100 visitors per second. If you hold it for 0.2 seconds, you can serve 500 visitors per second.
Optimization is necessary, especially in heavy web applications. Maybe you could organize your app so it does not need to connect to both databases on every request (this would double the available connections per time fraction). Optimizing your SQL queries and minimizing JOINed tables will help your app too (it will also make it run faster).
Finally, you can use caching, so your server does not construct the same content again and again. This is not a full list of all the optimizations you can do, but a starting point for your own research and planning. Hope you find it helpful.
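To see how close the site actually gets to the limit, MySQL keeps counters you can inspect (a sketch; on shared hosting you can usually observe these but not change them):

```sql
-- The configured ceiling
SHOW VARIABLES LIKE 'max_connections';

-- Connections open right now
SHOW GLOBAL STATUS LIKE 'Threads_connected';

-- The high-water mark since the server started
SHOW GLOBAL STATUS LIKE 'Max_used_connections';
```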

MySQL server configuration in the cloud

I wonder what my optimal MySQL server configuration would look like when I am using an Amazon cloud server that is scaled from 8 GB to 144 GB of RAM according to traffic peaks. I don't want to change the MySQL configuration every time I change the hardware resources of my server. Is there a suitable configuration for cases like this?
Further, I wonder which configuration options are especially important in my case:
It is a forum with 400-2,500 live users online (according to Analytics real-time analysis).
We have an authentication and session check (two PHP files, which run 1-2 small SQL queries) which is called by 40,000 users x 4 times per hour (I mention this because I worry about cached results - could the session check for a user fail?).
In the past, our database server went offline at peak times before all of its resources (CPU/RAM/disk I/O, bandwidth) were fully used. Thus I think we currently have a bad configuration. What about the property max_connections, which is currently 500? Would it help to set it to something like 2000 for servers of my size?
Should I use a persistent PDO connection for the session check / authentication?
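For reference, max_connections can be inspected and, on a server you control, raised at runtime (a hedged sketch; the value is illustrative, and a SET GLOBAL change is lost on restart unless it is also written to my.cnf):

```sql
-- Check the observed peak before raising the limit
SHOW GLOBAL STATUS LIKE 'Max_used_connections';

-- Raise the ceiling at runtime (requires the SUPER privilege);
-- each connection costs memory, so size it against available RAM
SET GLOBAL max_connections = 2000;
```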

Apache (PHP) & MySQL - Ideal 2-Server Setup

My client currently has only one server with both MySQL and Apache running on it, and at busy times of the year they occasionally see Apache fall over because it has so many connections.
They run two applications: their busy public ecommerce PHP-based website, and their internal order-processing application (busy during working hours only) with 15-20 concurrent users.
I've managed to get them to increase their budget enough to get two servers. I'm considering either:
A) one server running Apache/PHP and the other as a dedicated MySQL server, or
B) one running their public website only, and the other running MySQL and the internal application.
The benefit I see in A) is that MySQL's my.cnf can be tuned to use all of the resources of that server, but it has the drawback of only having one Apache instance running.
B) would spread the Apache load across both servers, but would limit MySQL's resources on its server, even outside working hours when the internal application isn't used.
I just can't decide which way to go with this and would be grateful for any feedback you may have.
Both approaches are wrong.
You have 2 goals here: availability and performance (I'm considering capacity to be an aspect of performance in this context).
To improve availability, you should be ensuring that there is no single point of failure in your architecture. But with the models you propose, you're actually creating multiple single points of failure - hence your 2 server models are less available than your single server.
From a performance point of view, you want to spread the workload across the available resources. You can't move CPU and memory between the servers but you can move the traffic.
Hence the optimal solution is to run both applications on both servers. Setting up MySQL clustering is a bit more complex, but the out-of-the-box asynchronous replication will probably be adequate, with the nodes configured as master-master (but with writes from the 2 applications targeted sensibly).
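A hedged sketch of what pointing each node at the other looks like on MySQL 5.x (hostnames, credentials, and log coordinates are placeholders; each server also needs a unique server_id in my.cnf):

```sql
-- Run on server A, pointing it at server B (mirror this on B, pointing at A)
CHANGE MASTER TO
    MASTER_HOST = 'server-b.example.com',
    MASTER_USER = 'repl',
    MASTER_PASSWORD = 'repl-password',
    MASTER_LOG_FILE = 'mysql-bin.000001',
    MASTER_LOG_POS = 4;
START SLAVE;

-- Avoid auto-increment collisions between the two writable nodes
SET GLOBAL auto_increment_increment = 2;
SET GLOBAL auto_increment_offset = 1;   -- use 2 on the other node
```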
There's probably a lot of scope for increasing the capacity of the system further, but without a lot more detail (more than is appropriate in this forum, and possibly more than your client is comfortable paying for) it is hard to advise.

Website slow response (all other users) when large MySQL query running

This may seem like an obvious question, but we have a PHP/MySQL app that runs on a Windows 2008 server. The server hosts about 10 different sites in total. Admin options on the site in question allow an administrator to run reports (through the site) which are huge and can take about 10 minutes in some cases. These reports are huge MySQL queries that display the data on screen. When these reports are running, the entire site goes slow for all users. So my questions are:
Is there a simple way to allocate server resources so if a (website) administrator runs reports, other users can still access the site without performance issues?
Even though running the report kills the website for all users of that site, it doesn't affect other sites on the same server. Why is that?
As mentioned, the report can take about 10 minutes to generate - is it bad practice to make these kinds of reports available on the website? Would these typically be generated by overnight scheduled tasks?
Many thanks in advance.
The load you're putting on the server most likely has nothing to do with the applications, but with the MySQL tables you are probably slamming. Most people get around this by generating reports during downtime, or by using MySQL replication to maintain a second database that is used purely for reporting.
I recommend setting up some server monitoring to see what is actually going on. I think New Relic just released Windows versions of its platform, and you can try it out for free for 30 days.
There's the LOW_PRIORITY flag, but I'm not sure whether it would have any positive effect, since it's most likely a table/row locking issue that you're experiencing. You can get an idea of what's going on by running a SHOW PROCESSLIST query.
If the other websites run fine, it's even more likely that this is due to database locks (causing your web processes to wait for the locks to be released).
Lastly, it's always advisable to run big reporting queries overnight (or when the server load is minimal). Having a read-replicated slave would also help.
I strongly suggest you set up a replicated MySQL server and run the large administrator queries (SELECT only, naturally) on it, to avoid having your website blocked!
If there aren't too many transactions per second, you could even run the replica on a desktop computer away from your production server, and thus have an off-site backup of your DB!
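For example, the reporting account on the replica can be locked down to reads only (a sketch; the names are placeholders):

```sql
-- On the replica: a read-only account for the reporting screens
CREATE USER 'reporting'@'10.0.0.%' IDENTIFIED BY 'reporting-password';
GRANT SELECT ON mydb.* TO 'reporting'@'10.0.0.%';
```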
Are you 100% sure you have added all the necessary indexes?
You would need an insanely large website to have these kinds of problems unless you are missing indexes.
Make sure you have the right indexing, and make sure you are not joining on VARCHAR columns - they are not very fast.
I have a database with quite a few large tables and millions of records that runs 24/7.
It has loads of activity and automated services processing it without issues, thanks to proper indexing.
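A quick way to check is to run the report's queries through EXPLAIN and add indexes where it shows full table scans (a sketch; table and column names are placeholders):

```sql
-- "type: ALL" with a large "rows" estimate means a full table scan
EXPLAIN SELECT o.id, o.total
FROM orders AS o
JOIN customers AS c ON c.id = o.customer_id
WHERE o.created_at >= '2013-01-01';

-- Index the columns the report filters and joins on
ALTER TABLE orders ADD INDEX idx_created_at (created_at);
ALTER TABLE orders ADD INDEX idx_customer_id (customer_id);
```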
