Looking for suggestions on the best way to implement an offsite backup of the data in my PHP app, and whether it's even possible.
We have a PHP app that runs on the client's local site and dumps the MySQL data to a datefile.sql each night. What's the best way to get this moved to an external server?
We host a server that we currently FTP the files to manually each morning. How can we best automate this? Would we need to hard-code FTP credentials, and if we had multiple clients, how could we separate things out so that no hard-coded credentials are needed?
The ideal situation would be a MySQL instance running on the external server that the local client's server replicates data to on the fly, and back again if required. I'm not even sure that's possible?
Happy to try and explain further if needed.
Any ideas? Thanks!
You could create a bash script on your server, called by a cron job at night, that uses rsync to fetch the SQL file from the clients' servers (if you have an SSH connection to them) and restores it on your own machine.
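A minimal sketch of that approach, assuming key-based SSH access from your server to the client machine (the hostname, paths, and database name below are placeholders):

    #!/usr/bin/env bash
    # Nightly pull-and-restore sketch; placeholder names throughout.
    set -euo pipefail

    CLIENT_HOST="client1.example.com"        # hypothetical client hostname
    REMOTE_DUMP="/var/backups/datefile.sql"  # where the app writes its dump
    LOCAL_DIR="/srv/backups/client1"

    mkdir -p "$LOCAL_DIR"

    # Fetch the dump over SSH; rsync only transfers the parts that changed
    rsync -az -e ssh "backup@${CLIENT_HOST}:${REMOTE_DUMP}" "$LOCAL_DIR/"

    # Restore into a local MySQL instance (assumes credentials in ~/.my.cnf)
    mysql client1_db < "$LOCAL_DIR/datefile.sql"

With one SSH key pair per client, nothing needs hard-coded FTP credentials, and each client's dumps stay in their own directory.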
You can achieve this using cron. Just create a cron job and schedule it to run when you need it to. For all the file-transferring hassle, you may use rsync (which also provides ways to transfer only the data that has changed, etc.).
Also, I think that MySQL has a built-in feature for replication and backups, but I'm not sure about this or how to configure it.
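For the scheduling part, a crontab entry like the following (the time, script name, and log path are just examples) would run such a script every night:

    # m  h  dom mon dow  command
    30   2  *   *   *    /usr/local/bin/pull_client_backup.sh >> /var/log/backup.log 2>&1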
My plan is to store data in my local MySQL database and, after a time interval, push the updates to my live MySQL database remotely. Is that possible?
I'm planning an inventory management script in PHP/MySQL. I will install the web application locally for my clients, and it will back up the local data to a live server via an API or some library.
Can anyone suggest a library for all this?
Thanks in advance.
You can achieve this with different methods:
Database replication (the recommended solution): it handles almost everything for you. There are plenty of tutorials available online on how to set it up.
A scheduled PHP script to sync data at whatever times you specify. There are several packages available for this, e.g. https://github.com/mrjgreen/db-sync. For scheduling you can use crontab, supervisor, etc.
Personally, I would recommend replication, since it is a native DBMS solution for exactly this scenario; a rough sketch of the setup follows below.
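For reference, a classic MySQL source/replica setup boils down to roughly the following. The server IDs, hostnames, user names, and credentials are all placeholders, and the exact steps vary by MySQL version, so treat this as a sketch and follow the official documentation for your version:

    # my.cnf on the source server (placeholder values)
    [mysqld]
    server-id = 1
    log-bin   = mysql-bin

    # my.cnf on the replica
    [mysqld]
    server-id = 2

    -- On the source: create a user the replica connects as
    CREATE USER 'repl'@'%' IDENTIFIED BY 'choose-a-password';
    GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

    -- On the replica: point it at the source and start replicating
    -- (take the real log file/position from SHOW MASTER STATUS on the source)
    CHANGE MASTER TO
        MASTER_HOST='source.example.com',
        MASTER_USER='repl',
        MASTER_PASSWORD='choose-a-password',
        MASTER_LOG_FILE='mysql-bin.000001',
        MASTER_LOG_POS=4;
    START SLAVE;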
I am working on a site that is hosted on GoDaddy, through cPanel.
The client wants to transition from their old PHP server to a node.js system.
They would like to implement new code in phases while leaving the old site up and running. The old and new code would be running on the same server.
I have a good break point for phase 1, but am not sure how to have the PHP and Node code running simultaneously, listening for requests on the same server. I am familiar with Node, but not as much with PHP.
In short- Can I have PHP and Node.js running simultaneously on the same server? If so, what considerations need to be made?
Thank you in advance!
You will most likely want to migrate to the Node.js service one endpoint at a time. That way you can test, debug, and fix things quickly without too much work. I recommend Express for your router and whatever database connector you prefer. You will also want to canary test between the two.
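One way to let the two coexist, assuming the host runs Apache (typical for cPanel) with mod_rewrite and mod_proxy available, and the Node app listening on a local port (3000 here is just a placeholder), is a proxy rule in .htaccess. Whether GoDaddy permits this on your plan is worth confirming first:

    # .htaccess in the site root (assumes mod_rewrite + mod_proxy)
    RewriteEngine On
    # Send anything under /api/ to the Node.js app on port 3000...
    RewriteRule ^api/(.*)$ http://127.0.0.1:3000/$1 [P,L]
    # ...everything else keeps being served by the existing PHP site.

Migrating one endpoint at a time then just means moving routes under the proxied prefix as each one is ported.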
I am trying to make a complete file & MySQL backup of my site each night.
I was thinking the best way of doing it would be to have a cron job run each night that logs in to a remote server and replicates all of the local files to it.
Then, I need to figure out a way to take backups of all the mysql databases (currently there are three) and upload them all to the remote server as well.
This sounds like a huge project and I don't know whether to reinvent the wheel here, or if there is some script out there which basically does the same thing already.
Use a cron job to run a bash script that will:
mysqldump the databases
tar -cvf the files
wput it all to your remote server
You can also set a variable like now=$(date +"%Y_%m_%d") to use in your file names; a sketch follows below.
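Put together, such a script might look like this. The database names, paths, and FTP URL are placeholders, and since wput embeds the credentials in the URL, keep the script readable only by its owner:

    #!/usr/bin/env bash
    # Nightly backup sketch: dump, archive, upload. Placeholder names throughout.
    set -euo pipefail

    now=$(date +"%Y_%m_%d")
    backup_dir="/tmp/backup_$now"
    mkdir -p "$backup_dir"

    # Dump each database (assumes credentials in ~/.my.cnf)
    for db in db_one db_two db_three; do
        mysqldump "$db" > "$backup_dir/${db}_$now.sql"
    done

    # Archive the site files together with the dumps
    tar -cvf "/tmp/site_backup_$now.tar" /var/www/mysite "$backup_dir"

    # Upload the archive to the remote server over FTP
    wput "/tmp/site_backup_$now.tar" "ftp://user:password@backup.example.com/backups/"

    # Clean up the local temporary files
    rm -rf "$backup_dir" "/tmp/site_backup_$now.tar"

The now variable gives each archive a dated name, so a night's backup never overwrites the previous one.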
You can use the mysqldump command to back up the database to a file and then upload it to a different server:
http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
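For example (the database name, file names, and host are placeholders, and credentials are assumed to live in ~/.my.cnf), you can compress the dump on the fly and copy it over SSH:

    mysqldump mydb | gzip > mydb_backup.sql.gz
    scp mydb_backup.sql.gz backup@remote.example.com:/srv/backups/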
Have you thought about MySQL replication? Maybe that fits your needs better, and you wouldn't need PHP to do it:
http://dev.mysql.com/doc/refman/5.5/en/replication.html
I've got 2 servers at different locations and I need a secure way to do this.
SERVER1 shows the latest entries for the web application on SERVER2. This app runs on a subdomain so that it's not on the same server as the main website, for security reasons.
The problem: the main site on SERVER1 pulls from the database of that web app, which is now on SERVER2. I can't use a remote SQL connection, as that is too slow.
Is there an ideal way to code this or do this?
If I understand your question correctly, you are looking for a way to query the SERVER2 database from SERVER1 without using a remote SQL connection, because it's too slow.
Based on that, any kind of remote operation will be too slow (e.g. an SSH tunnel isn't going to speed things up, since it just adds encryption to the process).
Personally, I would set up some kind of database replication: each time a record is inserted/changed/deleted on SERVER2, that change is pushed to SERVER1. Then you are able to query the data as if it were local (i.e. query it on SERVER1), where it'll always be up to date and should be sufficiently fast for your needs.
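If you go the replication route, it's worth checking how far behind the replica is before trusting its answers; with classic MySQL replication you can do that from a shell on SERVER1 (assuming it is the replica):

    mysql -e "SHOW SLAVE STATUS\G" | grep Seconds_Behind_Master

A value of 0 means the replica is effectively caught up with everything the source has sent it.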
A remote SQL connection is the way to go.
The other option would be to do replication across SERVER1 and SERVER2, so each connection is local.
I've already read a few threads here and I also went through the MySQL replication documentation (incl. cluster replication), but I don't think it's what I really want, and it's probably too complicated anyway.
I have a local and a remote DB that might both be accessed/manipulated by two different people at the same time. What I'd like to do is sync them as soon as possible (meaning instantly, or as soon as the local machine goes online). Both DBs are only manipulated by my own PHP scripts.
My approach is the following:
If the local machine is online:
1. Have my PHP script on the local machine always send each SQL query to the remote DB too.
2. Have my PHP script on the remote machine always store its queries, and...
3. ...have the local machine ask the remote DB every x minutes for new queries and apply them locally.
If the local machine is offline:
Do step 2 on both machines, and send the stored queries to the remote DB as soon as the local machine goes online again. Also pull the queries from the remote machine, of course.
My questions are:
Did I just misunderstand replication, or am I right that my way would be easier in my case? Is there any other good solution for what I'm trying to accomplish?
Any idea how I could check whether my local machine is online/offline? I guess I'd have to use JavaScript, but I don't know how. The browser/my script would always be running on the local machine.
What you're describing is master-master or multi-master replication. There are plenty of tutorials on how to set this up across the web. Definitely do your research before putting a solution like this into production as replication in MySQL isn't exactly elegant -- you need to know how to recover if (when?) something goes wrong.
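As one concrete example of the research involved: with two writable masters, you normally stagger auto-increment values so simultaneous inserts on both sides can't collide on primary keys (a my.cnf sketch with placeholder server IDs):

    # my.cnf on master A
    [mysqld]
    server-id                = 1
    auto_increment_increment = 2   # both masters step IDs by 2...
    auto_increment_offset    = 1   # ...A gets 1, 3, 5, ...

    # my.cnf on master B
    [mysqld]
    server-id                = 2
    auto_increment_increment = 2
    auto_increment_offset    = 2   # ...B gets 2, 4, 6, ...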