I am trying to make a complete file and MySQL backup of my site each night.
I was thinking the best way of doing it would be to have a cron job run each night that would log in to a remote server and replicate all of the local files.
Then, I need to figure out a way to take backups of all the MySQL databases (currently there are three) and upload them to the remote server as well.
This sounds like a huge project and I don't know whether to reinvent the wheel here, or if there is some script out there that basically does the same thing already.
Use a cron job to run a bash script that will:
- mysqldump the databases
- tar -cvf the files
- wput it all to your remote server
You can also set a variable like now=$(date +"%Y_%m_%d") to use in your file names.
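A minimal sketch of such a script, assuming three databases and an FTP-only remote (paths, database names, credentials, and hosts are placeholders):

#!/bin/bash
# Nightly backup: dump each database, archive the site files, push everything offsite.
# All names below (databases, paths, users, hosts) are placeholders.
now=$(date +"%Y_%m_%d")
backup_dir="/tmp/backup_$now"
mkdir -p "$backup_dir"

# Dump each database to its own file
for db in db_one db_two db_three; do
    mysqldump -u backup_user -pSECRET "$db" > "$backup_dir/${db}_$now.sql"
done

# Archive the site files together with the dumps
tar -cvf "/tmp/site_backup_$now.tar" /var/www/mysite "$backup_dir"

# Upload to the remote server (wput speaks FTP; scp or rsync work if you have SSH)
wput "/tmp/site_backup_$now.tar" "ftp://ftpuser:ftppass@backup.example.com/backups/"

Then a crontab entry runs it each night, for example:

0 2 * * * /home/user/nightly_backup.sh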
You can use the mysqldump command to back up a database to a file and then upload it to a different server.
http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
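For example (database name, credentials, and remote details are placeholders; scp assumes you have SSH access to the other server):

# dump one database to a file, then copy the file to the other server
mysqldump -u backup_user -p my_database > my_database.sql
scp my_database.sql deploy@remote.example.com:/backups/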
Have you thought about MySQL replication? Maybe that fits your needs better, and you don't need PHP to do it:
http://dev.mysql.com/doc/refman/5.5/en/replication.html
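A rough sketch of what the setup involves (server IDs, host names, the replication user, and the log coordinates are placeholders; see the manual above for the full procedure):

# master my.cnf
[mysqld]
server-id = 1
log-bin   = mysql-bin

# on the master, create a replication user:
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%' IDENTIFIED BY 'repl_password';

# slave my.cnf
[mysqld]
server-id = 2

# on the slave, point it at the master and start replicating:
CHANGE MASTER TO
  MASTER_HOST='master.example.com',
  MASTER_USER='repl',
  MASTER_PASSWORD='repl_password',
  MASTER_LOG_FILE='mysql-bin.000001',
  MASTER_LOG_POS=4;
START SLAVE;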
I have two databases for my website. One runs my live website and the other runs a development website. I need to copy the live database to the dev database every day.
Is there some XML API to achieve this that will copy my database to another one with whatever name I want, and then let me rename it further according to my needs?
I have to achieve this entirely using PHP (no phpMyAdmin interface).
I tried BigDump.php, but since my database is more than 100 MB the script breaks.
For this task, I suggest you set up a cron job through cPanel, use the mysqldump command for the backup, and restore it with the mysql command.
You can create a simple bash script for this task and schedule it in cron to run every day.
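A minimal sketch of such a script, assuming both databases live on the same host (database names, credentials, and paths are placeholders):

#!/bin/bash
# Copy the live database into the dev database.
# live_db, dev_db, and the credentials are placeholders.
mysqldump -u db_user -pSECRET live_db > /home/user/backups/live_db.sql
mysql -u db_user -pSECRET dev_db < /home/user/backups/live_db.sql

Then schedule it through cPanel's cron interface to run once a day, for example:

0 3 * * * /home/user/copy_live_to_dev.sh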
Looking for some suggestions on the best way (or the possibility) of implementing offsite backup of the data in my PHP app.
We have a PHP app that runs on the client's local site, which dumps the MySQL data to a datefile.sql each night. What's the best way to get this moved to an external server?
We host a server that we currently FTP files to manually each morning. How best can we automate this? Would we need to hard-code FTP credentials? If we had multiple clients, how could we separate this out so no hard-coded credentials are needed?
The ideal situation would be a MySQL instance running on the external server that the local client's server replicates the data to on the fly, and back again if required. I'm not even sure that's possible.
Happy to try and explain further if needed.
Thanks,
Any ideas?
You could create a bash script on your server, called by cron at night, that uses rsync to fetch the SQL file from the clients' servers (if you have an SSH connection to them) and restores it on your own machine.
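A rough sketch, assuming key-based SSH access to each client (hostnames, paths, database names, and credentials are placeholders):

#!/bin/bash
# Pull last night's dump from each client and load it into a per-client database.
for client in client1.example.com client2.example.com; do
    rsync -az -e ssh "backup@$client:/var/backups/datefile.sql" "/srv/backups/$client/"
    mysql -u backup_user -pSECRET "backup_${client%%.*}" < "/srv/backups/$client/datefile.sql"
done

Pulling over SSH keys like this also avoids hard-coding FTP credentials anywhere.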
You can achieve this using cron. Just create a cron job and schedule it to run when you need it to. For the file-transfer hassle, you can use rsync (which also provides ways to transfer only the data that has changed, etc.).
Also, I think that MySQL has a built-in feature for replication and backups, but I'm not sure about this or how to configure it.
OK, I have searched high and low and have been unable to find an answer that works for what I am trying to do. I have two databases, we'll call them DB1 and DB2. I have a cron job that runs every night at 4am that backs up DB1 and stores its data in a SQL file archive, we'll call this file db_backup.sql. The file is stored in a folder on the server, we'll call it ROOT/backups/db_backup.sql.
Info
database names: DB1 and DB2
backup filename: db_backup.sql
backup file path: ROOT/backups/db_backup.sql
What I'm trying to do:
I want to use the db_backup.sql file to build DB2. I am basically trying to set up database replication where I replicate DB1 out to DB2. I don't know of any other way to do this on shared hosting servers than what I'm trying to explain. I am trying to use PHP to import the db_backup.sql file into DB2.
My Environment:
The website and databases are on a shared hosting account with GoDaddy (yes, I would love to get dedicated servers to set up real replication, but can't afford it for now). The databases are MySQL, managed through phpMyAdmin.
Is this something that is possible? Any help would be greatly appreciated! Let me know if you have any questions as well.
I'm not sure if I understand your problem. You need to copy the db_backup file to the second host, where you have access to the database, and load the SQL file.
From a shell:
mysql -hhost -uusername -ppassword databasename < db_backup.sql
This will restore the tables on the second machine.
It should be as simple as setting up a cron job on server 2 to call a script on server 1 that dishes out the SQL file; that cron job script would then import/rebuild the DB.
Make sure to require a hash (via GET or POST) before giving out the DB SQL on server 1, or else anyone could read it and have your database dump.
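A minimal sketch of the server 2 side, assuming server 1 exposes the dump at a token-protected URL (the URL, the token, and the MySQL credentials are all placeholders):

#!/bin/bash
# Fetch the protected dump from server 1 and rebuild DB2 on server 2.
# The URL, SECRET_TOKEN, and credentials below are placeholders.
curl -s "https://server1.example.com/backups/db_backup.php?token=SECRET_TOKEN" -o /tmp/db_backup.sql
mysql -u db2_user -pSECRET DB2 < /tmp/db_backup.sql

Scheduled shortly after the 4am backup runs on server 1, for example:

30 4 * * * /home/user/pull_db_backup.sh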
Can you not avoid PHP altogether and connect directly to the database remotely to perform the backup either via the command line or using a scripting language more suitable for long running processes?
I have a backup of my MySQL database (vBulletin forum v3.8) created every day. It's about 360 MB in size. It's stored as one text file in a secure folder.
I'm thinking of getting another server, through a different host, and somehow automatically transferring the backup to my second server every day.
Any ideas on how I could automate this process? I'm thinking PHP and a cron job.
Cron, definitely. PHP if you like it, but bash with mysqldump combined with gzip works wonders.
Schedule rsync to transfer the files (over ssh) with cron (if you're on Linux).
Cron + rsync may be your best bet. If the file is text as you say and the changes are "diff"able then rsync can be used to transfer only the updates to that file. For example, the crontab could look something like this:
20 4 * * * rsync -a --delete source/ username@remotemachine.com:/path/to/destination/
This will sync the remote machine once a day deleting any files in the remote copy that no longer exist on the source machine.
As a note, I just read again and noticed this is a MySQL backup, so the output of the dump could eventually contain binary data; in that case you probably want to just use a replication server or copy the whole file each day. rsync could be used for the copy as well...
A slightly different approach, and technically not a backup solution, but you might want to consider running MySQL in replication mode and replicating changes live. That way, if the worst happens, you will be up to date with the data and not a day behind.
I have a fairly small MySQL database (a Textpattern install) on a server that I do not have SSH access to (I have FTP access only). I need to regularly download the live database to my local dev server on demand; i.e., I would like to either run a script and/or have a cron job running. What are some good ways of doing this?
Some points to note:
Live server is running Linux, Apache 2.2, PHP 5.2 and MySQL 4.1
Local server is running the same (so using PHP is an option), but the OS is Windows
Local server has Ruby on it (so using Ruby is a valid option)
The live MySQL db can accept remote connections from different IPs
I cannot enable replication on the remote server
Update: I've accepted BlaM's answer; it is beautifully simple. Can't believe I didn't think of that. There was one problem, though: I wanted to automate the process, but the proposed solution prompts the user for a password. Here is a slightly modified version of the mysqldump command that passes in the password:
mysqldump -u USER --password=MYPASSWORD DATABASE_TO_DUMP -h HOST > backup.sql
Since you can access your database remotely, you can use mysqldump from your Windows machine to fetch the remote database. From the command line:
cd "into mysql directory"
mysqldump -u USERNAME -p -h YOUR_HOST_IP DATABASE_TO_MIRROR >c:\backup\database.sql
The program will ask you for the database password and then generate a file c:\backup\database.sql that you can import on your Windows machine to load the data.
With a small database that should be fairly fast.
Here's what I use. This dumps the database from the live server and pipes it straight into the local server.
mysqldump -hlive_server_addresss -ulive_server_user -plive_server_password --opt --compress live_server_db | mysql -ulocal_server_user -plocal_server_password local_server_db
You can run this from a bat file. You can even use a scheduled task.
Is MySQL replication an option? You could even turn it on and off if you didn't want it constantly replicating.
This was a good article on replication.
I would create a (Ruby) script to do a SELECT * FROM ... on all the databases on the server and then do a DROP DATABASE ... followed by a series of new INSERTs on the local copy. You can do a SHOW DATABASES query to list the databases dynamically. Now, this assumes that the table structure doesn't change; if you want to support table changes as well, you could add a SHOW CREATE TABLE ... query and a corresponding CREATE TABLE statement for each table in each database. To get a list of all the tables in a database, you do a SHOW TABLES query.
Once you have the script you can set it up as a scheduled job to run as often as you need.
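If you'd rather not hand-build the SELECT/INSERT statements, a rough bash equivalent of the same loop, swapping mysqldump in for the row-by-row copy (host, credentials, and the schema filter are placeholders; it assumes a bash-capable environment such as Cygwin on the Windows side, otherwise the loop translates directly to Ruby):

#!/bin/bash
# Mirror every database from the live server onto the local one.
# REMOTE/LOCAL connection settings are placeholders.
REMOTE="-h live.example.com -u remote_user -pREMOTE_PASS"
LOCAL="-u local_user -pLOCAL_PASS"

for db in $(mysql $REMOTE -N -e "SHOW DATABASES" | grep -vE '^(mysql|information_schema)$'); do
    mysql $LOCAL -e "DROP DATABASE IF EXISTS \`$db\`; CREATE DATABASE \`$db\`"
    mysqldump $REMOTE "$db" | mysql $LOCAL "$db"
done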
@Mark Biek
Is MySQL replication an option? You could even turn it on and off if you didn't want it constantly replicating.
Thanks for the suggestion, but I cannot enable replication on the server. It is a shared server with very little room for maneuver. I've updated the question to note this.
Depending on how often you need to copy down live data and how quickly you need to do it, installing phpMyAdmin on both machines might be an option. You can export and import DBs, but you'd have to do it manually. If it's a small DB (and it sounds like it is), and you don't need live data copied over too often, it might work well for what you need.