I have a PHP application running on a local XAMPP server, and its data needs to be synced with a website database. Both the application and the website are built on PHP/MySQL. My question is: how can I sync them while keeping security issues in mind?
Thanks.
A clever and quasi-secure method would be to use SSL and HTTP auth to POST PHP-serialized data to the website, with some sort of query/POST parameters that the website checks for. Obviously not impenetrable, but it would deter the casual exploiter and be relatively simple to implement.
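As a minimal sketch of that idea (the endpoint URL, credentials, and the `token` parameter name below are all invented for illustration), the local application could serialize its rows and POST them over HTTPS with basic auth:

```php
<?php
// Build the payload the answer describes: PHP-serialized data plus a
// shared token that the receiving script checks before accepting the sync.
function build_sync_payload(array $rows, string $token): array
{
    return [
        'token' => $token,           // checked server-side (assumed parameter name)
        'data'  => serialize($rows), // PHP-serialized rows
    ];
}

// Push the payload over SSL with HTTP basic auth (hypothetical URL/credentials).
function push_sync(array $rows): bool
{
    $ch = curl_init('https://example.com/sync.php');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => build_sync_payload($rows, 'shared-secret'),
        CURLOPT_USERPWD        => 'syncuser:syncpass', // HTTP auth credentials
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_SSL_VERIFYPEER => true,                // keep certificate checks on
    ]);
    $ok = curl_exec($ch) !== false
        && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
    curl_close($ch);
    return $ok;
}
```

The receiving script would verify the token and HTTP auth before calling `unserialize()` on the data.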
If, as you say, both applications rely on MySQL, you can consider using MySQL database replication: set one server as the master and the other as a slave. That way the database on the slave server will always be up to date.
You can read more about MySQL database replication here: http://dev.mysql.com/doc/refman/5.1/en/replication.html
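As a rough sketch of what that setup involves (hostname, credentials, and log coordinates below are placeholders; the real coordinates come from `SHOW MASTER STATUS` on the master), each server gets a distinct `server-id`, binary logging is enabled on the master, and the slave is then pointed at it:

```sql
-- Run on the slave, after setting server-id=1 and log-bin=mysql-bin
-- on the master and server-id=2 on the slave (placeholder values):
CHANGE MASTER TO
    MASTER_HOST     = 'master.example.com',
    MASTER_USER     = 'repl',
    MASTER_PASSWORD = 'secret',
    MASTER_LOG_FILE = 'mysql-bin.000001',  -- from SHOW MASTER STATUS
    MASTER_LOG_POS  = 4;                   -- from SHOW MASTER STATUS
START SLAVE;
```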
I have a system in which there are several copies of a MySQL DB schema running on remote servers (and it's not possible to consolidate them all into one DB schema in the cloud).
However, this has proven troublesome because whenever the master DB schema is updated, I have to then remotely log into all the other servers and manually update the schemas using the sync tool in MySQL Workbench, which honestly doesn't work very well (i.e., it doesn't catch changes to views, etc.).
As such, I would like to come up with a way to have the master DB schema stored somewhere in AWS and have all the other, remote instances do something like a daily check for anything that's different between the schema locally installed on that server and the master schema in AWS.
Are there tools out there for this sort of thing, and what are they called? Also, because the application itself is written in PHP, using a tool that's easy to use in PHP would be ideal.
Thank you.
Also, I should note that a lot of the remote schemas are stored on servers behind very secure firewalls, so I don't think that pushing the master DB schema to the remote instances will work. Instead, I think that the request for the schema update has to originate from each of the remote servers to the master schema on AWS, if that makes a difference at all.
The db-sync project for MySQL sounds like the tool for you. Here is the Git repo: https://github.com/mrjgreen/db-sync
I have a simple MySQL database (one table with 12 rows of data and 5 columns) sitting on the web-server of my host provider.
On my home PC I create the data programmatically and store it in a free version of SQL Server (on my home PC). I would like to "upload" this data to the MySQL db in real time (or as close as I can get) over the internet (I'm assuming this is the only way to connect the pipes).
I know that opening up a MySQL database to a remote internet connection probably isn't a "secure" thing to do, but the resulting data table will be publicly available anyway via an "app", so I'm not too worried about that. (I suppose a hacker could overwrite my data with their own if they were both industrious and so inclined, but the risk/reward is so small it's not a major concern.)
Anyway, what is the easiest way to do this with some semblance of security? I only know how to program in VB (I did a little HTML and ASP back in the day, but that was a long time ago). I could learn a few lines of code in another language if need be.
I do not have a static IP, and I've never actually interacted with a MySQL database before (only SQL Server, so my MySQL knowledge/familiarity is zero... but a db is a db, so how hard can it be?). Because of my home network firewall, I can't allow connections in; I will have to make the connection out from my home PC to the hosted database.
OK, this problem is not actually super simple.
What you will find is that most shared hosting providers do not allow just any IP to access their databases.
The solution? Whitelist the IP of your computer, of course! But... you are probably on a home internet connection, so your IP address can change (if you have a static IP, you are a lucky person!).
So the best way is to create a mini-API!
Basically, you want to POST your data to a script (with some security, of course) that then inserts the data into the database.
It is a lot of work, but having done all this before, it seems to be the only way unless you have a dedicated server or advanced access privileges!
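A minimal sketch of such a mini-API endpoint (the token scheme, table, and column names are invented for illustration; a real version should also run over HTTPS):

```php
<?php
// Hypothetical receiving script: checks a shared token, then inserts
// the posted rows using a prepared statement.
const SYNC_TOKEN = 'change-me'; // shared secret (assumed scheme)

function handle_sync(PDO $db, array $request): int
{
    // Constant-time comparison of the shared token.
    if (!hash_equals(SYNC_TOKEN, $request['token'] ?? '')) {
        http_response_code(403);
        return 0;
    }

    $rows = json_decode($request['data'] ?? '[]', true) ?: [];
    $stmt = $db->prepare('INSERT INTO readings (name, value) VALUES (?, ?)');

    $inserted = 0;
    foreach ($rows as $row) {
        $stmt->execute([$row['name'], $row['value']]);
        $inserted++;
    }
    return $inserted;
}

// In the real endpoint on the host, you would call something like:
//   $db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
//   echo handle_sync($db, $_POST);
```

Your home PC then only ever makes outbound HTTP POSTs to this script, which fits the "connections out only" constraint.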
You could take a look at WAMP for your home PC. It's simple to use.
And then you should take a look at MySQL remote connections (some details here).
I would try this:
On your local computer, install MySQL Server; there's a free Community Edition available for download. Try the web installer, since it's more lightweight. Use the custom installation and make sure MySQL Workbench is selected too.
Workbench has a migration tool for the most common databases. Try this locally first, so you can tell whether all your data is correctly migrated from your local SQL Server to a MySQL db with no data loss in the process.
Then you could probably connect through Workbench to your online MySQL db and migrate your data to it directly from your just-created local db. In case you cannot connect, make a backup of your local db and send the files to your server by FTP or a similar process. Then simply restore the DB from the backup file on your online server.
Hope this helps!
Situation:
PHP application with a MySQL database running on 2 sites:
online (static IP X.X.X.X)
localhost (not online most of the time, dynamic IP)
Application traffic is usually low (<10 users).
What I need is that whenever a change is made to the online database, it is pushed to localhost, if it's online or whenever it becomes available, and vice versa (any change made locally is uploaded to the online database whenever there is a connection).
Is it possible to set up such replication with MySQL, or do I need to write a custom PHP script that pings the master server and syncs once it's available?
Thanks very much :).
Yes, you can do this with replication. Just pick which server you want to be the master and have the second one send all of its changes to the main one; then the main one can send its changes back.
Replication can be a bit daunting to set up, but once it's up and running it's great. http://dev.mysql.com/doc/refman/5.0/en/replication-howto.html
Let's first analyze your question:
The problem of accessing MySQL with a dynamic ip.
This is very easy. Once you've installed MySQL on a server with an ever-changing IP, go to No-IP, DynDNS or any other dynamic DNS service and register with them for free. Once registered, you get a client for your operating system; install that, and then you can access your MySQL server using a domain name.
Example:
Instead of having to access your server at 127.0.0.1, you can access it as mysql-server.easynet.net etc.
Now the second, and admittedly more complex, part of your question: how to do lazy, when-available replication.
This is a bit more complex than the previous step. What you have to do is choose a replication scheme. Basically, what you are looking for here is MASTER-MASTER replication, since changes can happen on both MySQL servers; the updates therefore need to be bi-directional, and that's exactly what this scheme does. How to do it? Here are the links which I've found easiest to follow:
Master-Master Replication
Step-by-step MySQL Master Replication
I hope that would ease your plight and answer your question!
Cheers!
Sure, you can.
You need to set up both MySQL servers as master and slave at the same time:
Configure the online server as master and the localhost server as slave, and once replication is OK,
configure localhost as master and the online server as slave.
I have already done this on two servers.
About the dynamic IP on localhost: you can simply use any dynamic DNS service, such as No-IP, and use the DNS name instead of the IP.
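A rough sketch of the my.cnf settings involved (server IDs and offsets are placeholders): giving each server a distinct ID and staggering the auto-increment values avoids primary-key collisions when both sides accept writes.

```ini
# Server A (online), my.cnf -- placeholder values
[mysqld]
server-id                = 1
log-bin                  = mysql-bin
auto_increment_increment = 2   ; both servers step auto IDs by 2...
auto_increment_offset    = 1   ; ...server A generates odd IDs

# Server B (localhost) would use instead:
#   server-id                = 2
#   auto_increment_offset    = 2   ; server B generates even IDs
```

Each server is then pointed at the other with `CHANGE MASTER TO`, exactly as in a normal master/slave setup, just done in both directions.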
Here's a post I've written (in French, but you can take the configuration snippets from it) on setting up MASTER-MASTER replication with a load balancer (MySQL Proxy) to balance SQL queries between both nodes.
I've got 2 servers at different locations and I need a secure way to do this.
SERVER1 shows the latest entries from the web application on SERVER2. This app runs on a subdomain so that it's not on the same server as the main website, for security reasons.
Problem: the main site on SERVER1 pulls from the database of that web app, which is now on SERVER2. I can't use a remote SQL connection, as that is too slow.
Is there an ideal way to code this or do this?
If I understand your question correctly, you are looking for a way to query the server1 database from server2, without using a remote SQL connection because it's too slow.
Based on that, any kind of remote operation will be too slow (e.g. an SSH tunnel isn't going to speed things up, since it just adds encryption to the process).
Personally, I would set up some kind of database replication: each time a record is inserted/changed/deleted on server1, that change is pushed to server2. Then you are able to query the server1 data as if it were local (i.e. query it on server2), where it will always be up to date and should be sufficiently fast for your needs.
A remote SQL connection is the way to go.
The other option would be to do replication across SERVER1 and SERVER2, so each connection is local.
I am currently planning a web application and I want to design it so it can eventually run on a cluster later.
The cluster would be made of a PHP web cluster, a MySQL cluster, and a standalone storage unit (maybe a cluster of that too; I really don't know how that works :s).
I want to know if the code will be different than when PHP and MySQL are on the same machine, and if so, what would be different?
The fact that the web and database servers are on different physical machines barely changes your code at all. The only place you'd need to change anything is where you connect to the database: replace the localhost reference with the IP address or hostname of the database server.
A clustered web server may need a different approach for storing sessions. If you have multiple web servers behind a load balancer, consecutive requests from the same session may end up on different servers. You should store the session data in a central place, like a shared memcache.
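As a sketch of that (the host is a placeholder, and this assumes the memcached PHP extension is installed on every web node), PHP can be pointed at a central memcached instance for session storage with two ini settings:

```ini
; php.ini on every web server: store sessions centrally
; instead of on the local disk (host below is a placeholder)
session.save_handler = memcached
session.save_path    = "central-cache.internal:11211"
```

With this in place, it no longer matters which web server behind the load balancer handles a given request.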
Apart from a few of those issues, you should be fine regarding the web server.
As far as I know, MySQL and clustering are not the best of friends. Although I wasn't really involved in the process, I know there has been a lot of trouble getting two database servers to run together in our environment, and even now they are not really clustered. They synchronize, but only one is actively used while the other is a fallback server.