I already read a few threads here and I also went through the MySQL Replication documentation (incl. Cluster Replication), but I think it's not what I really want, and it's probably too complicated, too.
I have a local and a remote DB that might both get accessed/manipulated by 2 different people at the same time. What I'd like to do is sync them as soon as possible (meaning instantly, or as soon as the local machine goes online). Both DBs only get manipulated by my own PHP scripts.
My approach is the following:
If the local machine is online:
1. Let my PHP script on the local machine always send its SQL queries to the remote DB too.
2. Let my PHP script on the remote machine always store its queries and...
3. ...let the local machine ask the remote DB every x minutes for new queries and apply them locally (roughly like the sketch below).

If the local machine is offline:
Do step 2 on both machines and send the stored queries to the remote DB as soon as the local machine goes online again. Also pull the queries from the remote machine, of course.
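Something like this is what I have in mind for step 3; the query_log and applied_log tables, hosts and credentials are only placeholders for the idea:

<?php
// Pull the queries the remote side has stored and replay them locally.
$remote = new PDO('mysql:host=remote.example.com;dbname=mydb', 'user', 'secret');
$local  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'secret');

// Highest remote query id we have already applied locally.
$lastApplied = (int) $local->query('SELECT COALESCE(MAX(remote_id), 0) FROM applied_log')->fetchColumn();

$stmt = $remote->prepare('SELECT id, sql_text FROM query_log WHERE id > ? ORDER BY id');
$stmt->execute([$lastApplied]);

foreach ($stmt as $row) {
    $local->exec($row['sql_text']);                                  // replay the remote query locally
    $local->prepare('INSERT INTO applied_log (remote_id) VALUES (?)')
          ->execute([$row['id']]);                                   // remember what was applied
}

The same pattern in the other direction would push the queries stored while the local machine was offline.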
My questions are:
Did I just misunderstand Replication or am I right that my way would be easier in my case? Or is there any other good solution for what I'm trying to accomplish?
Any idea how I could check whether my local machine is online/offline? I guess I'd have to use JavaScript, but I don't know how. The browser/my script would always be running on the local machine.
What you're describing is master-master or multi-master replication. There are plenty of tutorials on how to set this up across the web. Definitely do your research before putting a solution like this into production as replication in MySQL isn't exactly elegant -- you need to know how to recover if (when?) something goes wrong.
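As for checking whether the local machine is online: you don't necessarily need JavaScript. Since the PHP script is what needs the remote DB, it can simply try to connect with a short timeout and treat a failure as "offline". A rough sketch (hostname, credentials and timeout are placeholders):

<?php
function remoteDbIsReachable(): bool
{
    $mysqli = mysqli_init();
    // Give up quickly so the page isn't blocked while the link is down.
    $mysqli->options(MYSQLI_OPT_CONNECT_TIMEOUT, 3);
    $ok = @$mysqli->real_connect('remote.example.com', 'user', 'secret', 'mydb');
    if ($ok) {
        $mysqli->close();
    }
    return (bool) $ok;
}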
I'm doing a group project and we're creating an online game. We're about halfway done, and now it's time to implement a database to store our records/data and make the website go live on the internet.
I'm just confused about how PSQL works exactly. My understanding is that PSQL needs to be running on some server in order for us to access it. For previous assignments, I downloaded Postgres for my Mac and ran it on localhost. The PHP code was something along the lines of:
$dbconn = pg_connect("host=localhost port=5432 dbname=mydbname");
So, if we intend to use PSQL, where would the server be? Does one of us have to host the server? Can we use some sort of free online server? How do we connect to that server with PHP?
In summary, I have two main questions:
How do we make our code go live on the internet for free? (It's just a temporary website and will only be up for a few weeks at most)
How can we all access a shared PSQL database?
Sorry for the noob questions, I just got started with web development and am still learning.
So, if we intend to use PSQL, where would the server be? Does one of us have to host the server? Can we use some sort of free online server? How do we connect to that server with PHP?
PostgreSQL is going to have to run on some machine visible to anyone who needs to access it. If only your web server (i.e., the machine running PHP and your website) needs to talk to the database, then PostgreSQL can be installed on your web server. This is a very common configuration.
The server might also run on the LAN where your web server is running or it might be running on an entirely different network on a different continent. The most important thing is that any machine which must connect directly to the database can actually connect to it. If you're building a website, this means you have a web server. Your web server will need to connect to the PGSQL server. The second most important thing is that your web server and the PGSQL server should share a very fast connection for the sake of performance and efficiency.
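For illustration, connecting PHP to a PostgreSQL server on a different machine only changes the connection parameters; the hostname and credentials below are placeholders:

$dbconn = pg_connect("host=db.example.com port=5432 dbname=mydbname user=myuser password=secret");
if ($dbconn === false) {
    die('Could not connect to the remote PostgreSQL server');
}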
It's probably most common for your web server to also host the database. On an Ubuntu machine, installing a PostgreSQL server is as easy as running a few commands. A quick search yields many examples like this one.
How do we make our code go live on the internet for free? (It's just a temporary website and will only be up for a few weeks at most)
I don't know anyone who is in the habit of offering free web hosting or DBMS services. You could ask a friend. Or put an ad on Craigslist or something. Or, if you are tech-savvy (it doesn't sound like you are), you could configure a high-end router at your home to use Dynamic DNS to point some domain at a machine running at your house.
How can we all access a shared PSQL database?
I have no experience with Heroku, but you might sniff around there. PostgreSQL's website also maintains a list of hosting companies. Amazon offers RDS instances running PostgreSQL. DigitalOcean has a variety of tutorials and how-tos on dealing with Postgres. You could probably fire up a 'droplet' server for super cheap and install it yourself without too much effort.
Amazon offers a free tier database solution for Postgres. Something like 300 hours (don't quote me on it) for a low-level setup.
They have tutorials on doing this here:
https://aws.amazon.com/rds/?nc2=h_m1
Once set up, you get the endpoint and your connection string becomes something like:
db_connect ("host=[URLENDPOING] user=postgres dbname=postres")
I have a simple MySQL database (one table with 12 rows of data and 5 columns) sitting on the web-server of my host provider.
On my home PC I create the data programmatically and store it in a free version of SQL Server (on my home PC). I would like to "upload" this data to the MySQL db in real time (or as close as I can get) over the internet (I'm assuming this is the only way to connect the pipes).
I know that opening up a MySQL database to a remote internet connection probably isn't a "secure" thing to do, but the resulting data table will be publicly available anyway via an "app", so I'm not too worried about that (I suppose a hacker could "overwrite" my data with their own if they were both industrious and so inclined), but I think the risk/reward is so small it's not a major concern.
Anyway, what is the easiest way to do this with some semblance of security? I only know how to program in VB (I did a little HTML and ASP back in the day, but that was a long time ago). I could learn a few lines of code in another language if need be.
I do not have a static IP, and I've never actually interacted with a MySQL database before (only SQL Server, so my MySQL knowledge/familiarity is zero... but a db is a db, so how hard can it be?). Because of my home network firewall, I can't allow connections "in". I will have to make the connection to the MySQL db "out" from my home PC --> to the hosted database.
OK, this problem is not actually super simple.
What you will find is that most shared hosting providers do not allow just any IP to access their databases.
Solution? Set the IP for your computer, of course! BUT... you are probably on a home internet connection, so your IP address can CHANGE (if you have a static IP, you are a lucky person!)
So the best way: create a mini-API!
Basically, you want to post your data to a script (with some security, of course) that then inserts this data into the database.
It is a lot of work, but having done all this before, it seems to be the only way unless you have a dedicated server / advanced access privileges!
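To make that concrete, here is a rough sketch of what such a receiving script might look like, assuming the shared host can run PHP; the token, table and column names are made-up examples, not a finished design:

<?php
// Mini-API endpoint: accepts POSTed rows and inserts them into MySQL.
const API_TOKEN = 'change-me';   // shared secret between home PC and this script

if (($_POST['token'] ?? '') !== API_TOKEN) {
    http_response_code(403);
    exit('Forbidden');
}

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Prepared statement so posted values can't inject SQL.
$stmt = $pdo->prepare('INSERT INTO quotes (symbol, price) VALUES (?, ?)
                       ON DUPLICATE KEY UPDATE price = VALUES(price)');
$stmt->execute([$_POST['symbol'] ?? '', (float) ($_POST['price'] ?? 0)]);

echo 'ok';

The VB program on the home PC then just issues an HTTPS POST to that script's URL, so the connection always goes "out" from your PC to the host and no static IP or inbound firewall rule is needed.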
You could take a look at WAMP for your home PC. It's simple to use.
And then you should take a look at MySQL remote connections (some details here).
I would try this:
1. On your local computer, install MySQL Server; there's a free Community Edition available for download (try the web installer since it's more lightweight). Use the custom installation and make sure MySQL Workbench is selected too.
2. Workbench has a migration tool for the most common databases. Try this locally, so you can tell whether all your data is correctly migrated from your local SQL Server to a MySQL db with no data losses in the process.
3. Then you could probably connect through Workbench to your online MySQL db and migrate your data to it directly from the local db you just created. In case you cannot connect, make a backup of your local db and send the files to your server by FTP or a similar process. Then simply restore the DB from the backup file on your online server.
Hope this helps!
I have been looking into creating a custom MySQL Point of Sale system so that there is one centralised database for inventory levels across multiple stores and online, etc.
The biggest problem I see is the unlikely event that the internet drops out in the bricks-and-mortar stores. If this were to happen, could it be set up so that the POS system runs off a local MySQL database on that computer (using MAMP or something similar) and then, once the internet is available again, automatically syncs the databases to update sales and inventory levels?
In regard to "how is the actual POS system going to be accessed without internet": I was thinking that the POS system would be run on the server when the internet is available, and then when the net drops out it would be run from files stored on the machine, pointing to the local database on the machine.
Yes, a minimal & viable solution would just be to have all of the POS data entered locally as well as on the remote database, then it serves as a sort of backup in case anything happens to the central DB.
As far as automating the 'fix' of the central DB after an outage, maybe the best way is to have the central system request sales data from the local DBs of each store. If the workflow is setup like this, then you don't really have to do anything 'special' about internet outages.
The problem here is obviously writes. You can use replication to always have a local readable copy of the database, but it's tricky to have multiple masters when using replication. I haven't used MySQL Cluster, but it may be what you need.
But since the problem is writes you can possibly implement the writing part of the POS system as a service you send messages to. When the network is down, queue the messages and send them when online.
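As a rough illustration of that idea (the central host, the sales_messages table and the local outbox table are assumptions, not a finished design):

<?php
// Every POS write goes through one function that tries the central DB first
// and, if the network is down, parks the message locally to be sent later.
function recordSale(PDO $local, array $sale): void
{
    $payload = json_encode($sale);
    try {
        $central = new PDO(
            'mysql:host=central.example.com;dbname=pos', 'user', 'secret',
            [PDO::ATTR_TIMEOUT => 3, PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
        );
        $central->prepare('INSERT INTO sales_messages (payload) VALUES (?)')->execute([$payload]);
    } catch (PDOException $e) {
        // Offline: queue the message locally; a cron job can push it when the link is back.
        $local->prepare('INSERT INTO outbox (payload) VALUES (?)')->execute([$payload]);
    }
}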
An easier solution may actually be to always ensure network stability. Set up some mobile (GSM/3G) connection for failover and possibly even a standard POTS telephone line as well.
MySQL Master/Slave Replication would seem the logical approach.
Your MAMP code works directly against the local (slave) database; and when the MAMP server has access to the master database, the databases stay synchronised. If the slave loses access to the Internet, it's still working locally; and when the internet connection is restored it will synchronise again with the master.
With a little care in the database design (particularly autoincrement pks), you can run multiple local/slave servers all with their own local store data, and the master is a repository of data across all stores.
I'm using a self-made customer system in PHP running with a local MySQL database.
Now I have a second computer in a different location which has to use this database too. So I put this MySQL database on a server reachable through the internet.
My problem now is that the first one often has problems with the internet connection, and then the program will not work. But it has to work every time!
Now I do not know how I should handle this problem.
A local database and one on the internet, but how should I merge them?
Should I make a local DB per computer and merge them together into one?
I also want to change the framework behind this system to Symfony2, so is there a way to solve this problem with this framework (e.g. Doctrine)?
Thanks for your help!
Update:
My limitation is the internet connection on the first computer, which cannot be eliminated.
If you really have the limitations of (1) not being able to move the database off the machine with a bad connection and (2) not being able to fix the bad connection, you are going to have to keep some sort of local instance on the second machine.
I would try to set up master-master replication from the first machine with the bad connection to the second machine. I'm not sure how reliable this will be, considering the replication will fail often due to the first machine's bad connection. This problem may be exacerbated if one or both machines are using old versions of MySQL. MySQL 5.5, for example, can be configured to actively monitor replication connectivity.
If the majority of your application does READS instead of WRITES, perhaps you could install Memcached (or something similar) on the second machine so that the application can pull data from local memory without requiring a connection to the MySQL server.
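Something along these lines, as a sketch (assumes the php-memcached extension; the key, TTL and query are just examples):

<?php
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);   // local memory on the second machine

function getCustomer(PDO $db, Memcached $cache, int $id): ?array
{
    $key = "customer:$id";
    $row = $cache->get($key);
    if ($row !== false) {
        return $row;                      // served locally, no MySQL connection needed
    }
    $stmt = $db->prepare('SELECT * FROM customers WHERE id = ?');
    $stmt->execute([$id]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC) ?: null;
    $cache->set($key, $row, 300);         // cache for 5 minutes
    return $row;
}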
There are a few ways to achieve what you want (although maybe not exactly how you described), but the best way is definitely to host the database on a server that doesn't have internet connectivity problems. Look for hosting that allows remote MySQL connections.
I am currently planning a web application and I want to plan for it to eventually run on a cluster.
The cluster would be made of a PHP web cluster, a MySQL cluster, and a standalone storage unit (maybe a cluster of that too; I really don't know how that works :s).
I want to know if the code will be different than when PHP and MySQL are on the same machine, and what would be different.
The fact that the web and database servers are on different physical machines wouldn't change your code at all. The only place you'd need to change code is where you connect to the database - replacing the localhost reference with the IP address or hostname of the database server.
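For example (hostname and credentials are placeholders):

// web server and MySQL on the same machine:
$db = new mysqli('localhost', 'appuser', 'secret', 'appdb');
// MySQL on its own machine in the cluster -- only the host changes:
$db = new mysqli('db01.example.com', 'appuser', 'secret', 'appdb');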
A clustered web server may need a different approach for storing sessions. If you have multiple web servers behind a load balancer, consecutive requests from the same session may end up on different servers. You should store the session data in a different place, like a central memcache.
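For instance, with the php-memcached extension you can point PHP's session handler at a central memcached instance (hostname and port are placeholders):

ini_set('session.save_handler', 'memcached');
ini_set('session.save_path', 'sessions.example.com:11211');
session_start();
$_SESSION['user_id'] = 42;   // now visible from every web node behind the load balancer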
Apart from a few of those issues, you should be fine regarding the web server.
As far as I know, MySQL and clustering are no friends. Although I wasn't really involved in the process, I know there has been a lot of trouble getting two database servers to run together in our environment, and even now they are not really clustered. They synchronize, but only one is actively used while the other is a fallback server.