I've got two servers at different locations and I need a secure way to do this.
SERVER1 shows the latest entries for the web application on SERVER2. The app runs on a subdomain so that it's not on the same server as the main website, for security reasons.
The problem: the main site on SERVER1 pulls from the database of that web app, which is now on SERVER2. I can't use a remote SQL connection, as that is too slow.
Is there an ideal way to code or set this up?
If I understand your question correctly, you are looking for a way to query the SERVER2 database from SERVER1 without using a remote SQL connection, because it's too slow.
Based on that, any kind of remote operation will be too slow (e.g. an SSH tunnel isn't going to speed things up, since it just adds encryption to the process).
Personally, I would set up some kind of database replication: each time a record is inserted/changed/deleted on SERVER2, that change is pushed to SERVER1. Then you can query the data on SERVER1 as if it were local, it'll always be up to date, and it should be sufficiently fast for your needs.
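To make that concrete, here is a minimal sketch of classic MySQL master/slave replication with SERVER2 as the master (where the web app writes) and SERVER1 as the slave (where the main site reads). All hostnames, credentials, and log coordinates below are placeholders; the real coordinates come from SHOW MASTER STATUS on the master.

    # --- my.cnf on SERVER2 (the master, where the web app writes) ---
    [mysqld]
    server-id = 1
    log_bin   = mysql-bin

    # --- my.cnf on SERVER1 (the slave the main site reads from) ---
    [mysqld]
    server-id = 2

    -- On the master: create a replication account (placeholder credentials)
    CREATE USER 'repl'@'%' IDENTIFIED BY 'secret';
    GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';
    -- Then note the file/position reported by SHOW MASTER STATUS

    -- On the slave: point it at the master and start replicating
    CHANGE MASTER TO
      MASTER_HOST='server2.example.com',
      MASTER_USER='repl',
      MASTER_PASSWORD='secret',
      MASTER_LOG_FILE='mysql-bin.000001',  -- from SHOW MASTER STATUS
      MASTER_LOG_POS=4;                    -- from SHOW MASTER STATUS
    START SLAVE;

Once START SLAVE succeeds, the main site on SERVER1 reads the replicated tables over a local connection, and changes made by the web app on SERVER2 flow across automatically.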
A remote SQL connection is the way to go.
The other option would be to set up replication between SERVER1 and SERVER2, so each query is local.
Looking for suggestions on the best way to implement an offsite backup of the data in my PHP app, and whether it's even possible.
We have a PHP app that runs on the client's local site and dumps the MySQL data to a datefile.sql each night. What's the best way to get this moved to an external server?
We host a server that we currently FTP files to manually each morning. How best can we automate this? Would we need to hard-code FTP credentials? And if we had multiple clients, how could we separate this out so that no hard-coded credentials are needed?
The ideal situation would be a MySQL instance running on the external server that the local client's server replicates the data across to on the fly, and back if required. I'm not even sure that's possible.
Happy to try and explain further if needed.
Thanks. Any ideas?
You could create a bash script on your server, called by a cron job at night, that uses rsync to fetch the SQL file from the clients' servers (if you have an SSH connection with them) and restores it on your own machine.
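A rough sketch of what that script might look like; the hostnames, paths, and database names are placeholders, and it assumes key-based SSH authentication so no passwords are hard-coded anywhere:

    #!/bin/bash
    # Sketch only: pull last night's dump from each client over SSH
    # and restore it locally. Assumes MySQL credentials live in
    # ~/.my.cnf, so nothing sensitive is hard-coded in the script.

    CLIENTS="client1.example.com client2.example.com"

    for HOST in $CLIENTS; do
        mkdir -p "/srv/backups/${HOST}"

        # rsync only transfers what changed since the last run
        rsync -az "backup@${HOST}:/var/backups/datefile.sql" "/srv/backups/${HOST}/"

        # Restore into a per-client database on this machine
        mysql "clientdb_${HOST%%.*}" < "/srv/backups/${HOST}/datefile.sql"
    done

A crontab entry such as 0 3 * * * /usr/local/bin/pull_client_dumps.sh would then run it at 03:00 each night.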
You can achieve this using cron. Just create a cron job and schedule it to run when you need it to. For all the file-transfer hassle, you may use rsync (which also provides ways to transfer only changed data, etc.).
Also, I think MySQL has a built-in feature for replication and backups, but I'm not sure about the details or how to configure it.
I have a simple MySQL database (one table with 12 rows of data and 5 columns) sitting on the web server of my hosting provider.
On my home PC I create the data programmatically and store it in a free version of SQL Server (on my home PC). I would like to "upload" this data to the MySQL db in real time (or as close as I can get) over the internet (I'm assuming this is the only way to connect the pipes).
I know that opening up a MySQL database to a remote internet connection probably is not a "secure" thing to do, but the resulting data table will be publicly available anyway via an "app", so I'm not too worried about that. I suppose a hacker could "overwrite" my data with their own if they were both industrious and inclined, but the risk/reward is so small that it's not a major concern.
Anyway, what is the easiest way to do this with some semblance of security? I only know how to program in VB (I did a little HTML and ASP back in the day, but that was a long time ago). I could learn a few lines of code in another language if need be.
I do not have a static IP, and I've never actually interacted with a MySQL database before (only SQL Server, so my MySQL knowledge/familiarity is zero... but a db is a db, so how hard can it be?). Because of my home network firewall, I can't allow connections "in". I will have to make the connection "out" from my home PC to the hosted database.
OK, this problem is not actually super simple.
What you will find is that most shared hosting providers do not allow just any IP to access their databases.
Solution? Whitelist the IP of your computer, of course! But... you are probably on a home internet connection, so your IP address can change (if you have a static IP, you are a lucky person!).
So the best way: create a mini-API!
Basically, you want to post your data to a script (with some security, of course) that then inserts the data into the database.
It is a lot of work, but having done all this before, it seems to be the only way unless you have a dedicated server or advanced access privileges!
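As a rough sketch of the receiving end of such a mini-API (the shared secret, database credentials, and table/column names are all placeholders):

    <?php
    // receive.php - a sketch of the "mini-API" idea above.
    // Secret, DB credentials, and table/columns are placeholders.

    $secret = 'change-me';  // also known to the sending machine
    $token  = isset($_POST['token']) ? $_POST['token'] : '';

    // Constant-time comparison (PHP 5.6+) so the secret can't be
    // guessed byte by byte through timing differences
    if (!hash_equals($secret, $token)) {
        http_response_code(403);
        exit('Forbidden');
    }

    // Insert into the local MySQL database with a prepared statement
    // to avoid SQL injection from the posted values
    $db   = new mysqli('localhost', 'appuser', 'apppass', 'appdb');
    $stmt = $db->prepare('INSERT INTO readings (name, value) VALUES (?, ?)');
    $stmt->bind_param('sd', $_POST['name'], $_POST['value']);
    $stmt->execute();

    echo 'OK';

The home PC then just POSTs name, value, and token to this script over HTTPS. Since the connection is made outbound from the home PC, no inbound firewall hole is needed, and the changing home IP doesn't matter.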
You could take a look at WAMP for your home PC. It's simple to use.
And then you should take a look at MySQL remote connections (some details here).
I would try this:
On your local computer, install MySQL Server; there's a free Community Edition available for download. Try the web installer, since it's more lightweight. Use the custom installation and make sure MySQL Workbench is selected too.
Workbench has a migration tool for the most common databases. Try this locally, so you can tell whether all your data is correctly migrated from your local SQL Server to a MySQL db and there are no data losses in the process.
Then you could probably connect through Workbench to your online MySQL db and migrate your data to it directly from the local db you just created. In case you cannot connect, make a backup of your local db and send the files to your server by FTP or a similar process. Then simply restore the db from the backup file on your online server.
Hope this helps!
I am planning to improve my site's performance by adding another MySQL server alongside the current one, because the current server is too busy.
Is it possible to scale a PHP application with MySQL replication without changing the PHP code? I mean that all queries would be sent to the master, and the master would distribute the load between itself and the slave.
Is there any easy way to send all write queries to the master and distribute read queries between the master and the slave?
I think you need to put a load balancer / proxy between your db servers and clients (your code). Example solutions are:
HAProxy: http://haproxy.1wt.eu/
MySQL Proxy: https://launchpad.net/mysql-proxy
If you don't want to do the "load balancing" manually, you might want to look into MySQL Proxy.
I think you should optimize your application's code (PHP) first, and then optimize your architecture.
First of all, check your MySQL queries; the MySQL slow query log can help you. If you have connection issues ("MySQL server has gone away", "too many connections", etc.), you should manage your application's connection pooling mechanism.
As for the other steps, and (I think) the actual answer to your question: you can set up MySQL master-master replication. Once replication is set up properly, you can put a load balancer (HAProxy) in front of it.
You have two MySQL nodes (server A and server B, both of them masters).
You can configure HAProxy so that server A is the primary and server B is the backup server. All your MySQL operations go to server A via HAProxy, and your data is automatically synced to server B.
When server A is down, HAProxy sends all queries to server B automatically.
You can also configure HAProxy so that server A gets all write queries and server B gets all read queries.
In all of these cases, your code should connect to MySQL via HAProxy.
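For illustration, a minimal haproxy.cfg for the failover variant might look something like this (the IPs are placeholders, and it assumes a passwordless haproxy_check MySQL user exists on both nodes for health checks):

    # haproxy.cfg sketch - active/backup MySQL failover
    listen mysql-cluster
        bind 0.0.0.0:3306
        mode tcp
        option mysql-check user haproxy_check
        server serverA 10.0.0.1:3306 check
        server serverB 10.0.0.2:3306 check backup

The backup keyword means server B only receives traffic when server A's health check fails. For the read/write split variant, you could add a second listen block on another port whose server lines point at server B, and have the application use one port for writes and the other for reads.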
I have a local intranet application which runs off a basic WAMP server in our offices. Every morning, one of our team members manually syncs our internal MySQL db with our external MySQL db (where our online enrollments occur). If a change is made during the day in the intranet application, it is not reflected in the external db until the following day.
I am wondering if it is possible to (essentially) tunnel to an external MySQL connection from, say, a WAMP or XAMPP server within our offices and work in real time.
Anybody had any luck or advice?
Yes
Replication enables data from one MySQL database server (the master) to be replicated to one or more MySQL database servers (the slaves). Replication is asynchronous: slaves need not be connected permanently to receive updates from the master. This means that updates can occur over long-distance connections and even over temporary or intermittent connections such as a dial-up service. Depending on the configuration, you can replicate all databases, selected databases, or even selected tables within a database.
If you use the external server directly, performance is likely to suffer. A Gigabit LAN might be a thousand times faster than your Internet connection - particularly the upload speed of an ADSL connection.
Just make your internal application use the database from the external one. You may need to add permission on the external server to allow connections from your internal server's IP (see the sketch below), but otherwise this is just like having a web server and a separate db server that need to access each other.
Can't really tell you exactly how to do this here: it all depends on your specific configuration, something that I would think is a little complicated (and too specialized) to figure out on SO.
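For the permissions step mentioned above, the change on the external server usually looks something like this (the user name, password, IP, and database name are placeholders):

    -- On the external MySQL server: allow connections from the
    -- office's public IP only.
    CREATE USER 'appuser'@'203.0.113.10' IDENTIFIED BY 'strong-password';
    GRANT SELECT, INSERT, UPDATE, DELETE ON enrolldb.* TO 'appuser'@'203.0.113.10';

You may also need to make mysqld listen on a public interface (the bind-address setting in my.cnf) and open port 3306 in the firewall for that one IP only.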
I have a PHP application running on a local XAMPP server, and the application has data that needs to be synced with a website database. The application and website are both PHP/MySQL... My question is: how can I sync them up, keeping security issues in mind?
Thanks.
A clever and quasi-secure method would be to use SSL and HTTP auth to post PHP-serialized data to the website, with some sort of query/post params that the website checks for. Obviously not impenetrable, but it would deter the casual exploiter and be relatively simple to implement.
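A sketch of what the sending side of that could look like in PHP; the URL, credentials, and payload fields are placeholders:

    <?php
    // push.php - sketch of the "SSL + HTTP auth" idea above.
    // URL, credentials, and payload fields are placeholders.

    $payload = serialize(array('name' => 'reading-1', 'value' => 42.0));

    $ch = curl_init('https://example.com/sync.php');
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => array('data' => $payload),
        CURLOPT_USERPWD        => 'syncuser:syncpass',  // HTTP Basic auth
        CURLOPT_HTTPAUTH       => CURLAUTH_BASIC,       // sent over SSL only
    ));

    $response = curl_exec($ch);
    curl_close($ch);

    // The receiving sync.php would check the auth, unserialize()
    // the data, and insert it into the website's MySQL database.

One caveat: unserialize() on untrusted input can be dangerous, so if both ends can use it, json_encode()/json_decode() is a safer interchange format.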
If, as you say, both applications rely on MySQL, you can consider using MySQL database replication: set one server as the master and the second as a slave. That way the database on the slave server will always be up to date.
You can read more about MySQL database replication here: http://dev.mysql.com/doc/refman/5.1/en/replication.html