MySQL database backup (Linux server to client machine) automatically - PHP

How can we store a MySQL database backup from a Linux server onto a client Windows machine automatically, at a fixed interval of time?

You cannot "guarantee" that the backup will be saved within a fixed amount of time. If you set up a share to a folder on the Windows machine, you can have a cron job back up the database and save it to the share at a specific start time, but how long it takes to create and copy/move the backup to the share depends on the size of the backup.
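For illustration, a crontab entry along those lines might look like this (database name, credentials, share mount point, and the 02:00 start time are all placeholders; note that % must be escaped as \% inside crontab):

0 2 * * * mysqldump -u backupuser --password=secret mydb > /mnt/winshare/mydb_$(date +\%F).sql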

I wrote these scripts to back up a MySQL database:
http://codingsnippets.com/category/mysql/
They'll have to be called from a cron job to do it regularly.
http://adminschoice.com/crontab-quick-reference
If you create a Windows share on the Windows box, you can mount it from Linux with a command like this:
mount -t smbfs -o username=bjones //192.168.1.100/"Storage (H)" /mnt/windows-storage/
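Once the share is mounted, a cron entry can write the backup straight to it; a sketch, with the script name and paths as placeholders:

30 1 * * * /usr/local/bin/mysql_backup.sh && cp /var/backups/mysql/latest.sql.gz /mnt/windows-storage/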
Good luck

Related

Eclipse auto-copy via SSH from one server to another upon save, PHP

I'm writing a PHP app, and our team has our servers set up so that I have to copy code onto a second server every time I edit a file. Naturally this is against the DRY concept, and simply a monotonous chore.
When I make a change to the file, and go to refresh the webpage to see the changes, both of these servers have to be updated, or else there is a chance that I will be seeing the old version on the server that hasn't been updated yet. So I'm copying/pasting/saving hundreds of times a day while I develop this app!
I'm using Eclipse Luna 4.4.1. In the Remote Systems Explorer, the system type is "SSH Only." This is the same for server 2.
I would like there to be some kind of functionality where each time I save sample.php on server 1, the same file sample.php will be automatically updated and saved with the same code on server 2. I don't want to have to do anything else other than just save the file on server 1; after doing that I'd like for the file on server 2 to be automatically updated.
Both servers are running nginx.
I've been reading a lot about this here on SE, but many of the question/answers are 5+ years old, and I wonder if there is an optimal way to do this today. I don't have much experience with Java, but I'm happy to do this any way that will work. Any plugin or other way would be great.
Take a look at running rsync over ssh, and put it in a cron job to run the update at whatever interval you desire.
Here is an example of an rsync process I run to copy database backups (type / contents of the file are mostly irrelevant here):
rsync -ar -e "ssh -l <username>" <local_path_of_files_to_watch> hostname:<target_path_where_files_land>
This is what my command looks like to rsync databases from a remote system that monitors radio traffic and records decoded messages to a daily log file:
rsync -av --remove-source-files --progress --stats -e "ssh -l stratdb" 192.168.0.23:/var/log/stratux.sqlite.*.gz /Users/stratdb/sqlite-dbs/.
If you want this process to run without having to define a password, you'll want to set up .ssh with shared keys to allow passwordless login from server 1 -> server 2 for this process (ideally from a user that has limited capabilities on both servers).
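A typical key setup looks like this, run on server 1 (the user and host simply reuse the example above):

ssh-keygen -t rsa                  # generate a key pair; leave the passphrase empty
ssh-copy-id stratdb@192.168.0.23   # install the public key on server 2
ssh stratdb@192.168.0.23           # verify: should log in without a password prompt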
There are some continuous deployment processes you could use that would do builds, copies, and the like (e.g. Jenkins) if you need a really robust and highly configurable solution... but if you just need to brute-force copy files from one server to another with a single easy command (ad hoc) or automagically, I'd investigate rsync.

Backing up uploaded document on Linux

So I have a PHP application running on a Linux machine which uses a MySQL database. I have managed to back up my MySQL database every day by adding an entry to the crontab. In my application, clients are able to upload documents, which are saved in a directory in the application folder, i.e. /myapp/uploaded_documents/, and I am looking at backing up this directory.
My question is: how do I back up a directory to a certain remote location at a certain time every day? Is it possible to also password-protect this directory in my app folder?
Thank you
As told in a previous answer, to back up periodically to a remote machine you can use rsync + ssh + crontab. Just set up ssh to access the remote machine without a password by following (for the Ubuntu distro) https://help.ubuntu.com/community/SSH/OpenSSH/Keys, then add to crontab an rsync job at the time and days you want (check man crontab to understand how to do this), telling rsync to back up over ssh to the remote machine, something like 0 2 * * * rsync -ae ssh dir_to_bkp name@host:dir_where_bkp to back up "dir_to_bkp" each day at 02:00 to the "host" machine, logging in as user "name" with "dir_where_bkp" as the destination. The -e ssh option tells rsync to use ssh.
The best way is to use rsync, as you would (most likely) be uploading changes only.
http://linux.die.net/man/1/rsync
Additionally you can create incremental backup:
http://www.mikerubel.org/computers/rsync_snapshots/
So my suggested solution would be rsync + crontab
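As a sketch of what that can look like using the snapshot approach from the article above (user, host, and backup paths are placeholders; it assumes a "yesterday" symlink is maintained on the backup host, and % must be escaped in crontab):

0 3 * * * rsync -a -e ssh --link-dest=/backups/yesterday /myapp/uploaded_documents/ user@backuphost:/backups/$(date +\%F)/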

Replicate / Backup Entire Site & SQL Databases To Remote Server With PHP Cronjob

I am trying to make a complete file & MySQL backup of my site each night.
I was thinking the best way of doing it would be to have a cronjob run each night that would log in to a remote server and replicate all of the local files.
Then, I need to figure out a way to take backups of all the MySQL databases (currently there are three) and upload them all to the remote server as well.
This sounds like a huge project and I don't know whether to reinvent the wheel here, or if there is some script out there which basically does the same thing already.
Use a cronjob to run a bash script that does the following (see the sketch below):
- mysqldump the databases
- tar -cvf the files
- wput it all to your remote server
Also, you can set a variable like now=$(date +"%Y_%m_%d") to use in your file names.
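A minimal sketch of such a script, assuming three databases and an FTP-reachable remote server; all names, credentials, and paths are placeholders:

#!/bin/bash
# nightly site + database backup (sketch; placeholder names throughout)
now=$(date +"%Y_%m_%d")
backupdir=/tmp/backup_$now
mkdir -p "$backupdir"

# dump each database to its own file
for db in db1 db2 db3; do
    mysqldump -u backupuser --password=secret "$db" > "$backupdir/$db.sql"
done

# bundle the site files and the dumps into one archive
tar -cvf /tmp/site_backup_$now.tar /var/www/mysite "$backupdir"

# upload the archive to the remote server over FTP
wput /tmp/site_backup_$now.tar ftp://user:password@remote.example.com/backups/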
You can use the mysqldump command to back up the database to a file and then upload it to a different server:
http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
Have you thought about MySQL replication? Maybe that fits your needs better, and you wouldn't need PHP to do it.
http://dev.mysql.com/doc/refman/5.5/en/replication.html

Automatically backing up a file to another server?

I have a backup of my MySQL database (vBulletin forum v3.8) created every day. It's about 360 MB in size and stored as one text file in a secure folder.
I'm thinking of getting another server, through a different host, and somehow automatically transferring the backup to my second server every day.
Any ideas on how I could automate this process? I'm thinking PHP and a cron job.
Cron, definitely. PHP if you like it, but using bash with mysqldump combined with gzip works wonders.
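For example (database name, credentials, and path are placeholders):

mysqldump -u forumuser --password=secret vbulletin | gzip > /secure/backups/vbulletin_$(date +%F).sql.gz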
Schedule rsync to transfer the files (over ssh) with cron (if you're on Linux).
Cron + rsync may be your best bet. If the file is text as you say and the changes are "diff"-able, then rsync can be used to transfer only the updates to that file. For example, the crontab could look something like this:
20 4 * * * rsync -a --delete source/ username@remotemachine.com:/path/to/destination/
This will sync the remote machine once a day, deleting any files in the remote copy that no longer exist on the source machine.
As a note, I just read again and noticed this is a MySQL backup, so the output of the dump could eventually contain binary data; in that case you probably want to just use a replication server or copy the whole file each day. rsync could be used for the copy as well...
A slightly different approach, and technically not a backup solution, but you might want to consider running MySQL in replication mode and replicating changes live. That way, if the worst happens, you will be up to date with the data and not a day behind.

How to download a live MySQL db into a local test db on demand, without SSH?

I have a fairly small MySQL database (a Textpattern install) on a server that I do not have SSH access to (I have FTP access only). I need to regularly download the live database to my local dev server on demand; i.e., I would like to either run a script and/or have a cron job running. What are some good ways of doing this?
Some points to note:
Live server is running Linux, Apache 2.2, PHP 5.2 and MySQL 4.1
Local server is running the same (so using PHP is an option), but the OS is Windows
Local server has Ruby on it (so using Ruby is a valid option)
The live MySQL db can accept remote connections from different IPs
I cannot enable replication on the remote server
Update: I've accepted BlaM's answer; it is beautifully simple. Can't believe I didn't think of that. There was one problem, though: I wanted to automate the process, but the proposed solution prompts the user for a password. Here is a slightly modified version of the mysqldump command that passes in the password:
mysqldump -u USER --password=MYPASSWORD DATABASE_TO_DUMP -h HOST > backup.sql
Since you can access your database remotely, you can use mysqldump from your Windows machine to fetch the remote database. From the command line:
cd "into mysql directory"
mysqldump -u USERNAME -p -h YOUR_HOST_IP DATABASE_TO_MIRROR >c:\backup\database.sql
The program will ask you for the database password and then generate a file c:\backup\database.sql that you can import on your Windows machine to insert the data.
With a small database that should be fairly fast.
Here's what I use. This dumps the database from the live server and loads it straight into the local server:
mysqldump -hlive_server_addresss -ulive_server_user -plive_server_password --opt --compress live_server_db | mysql -ulocal_server_user -plocal_server_password local_server_db
You can run this from a bat file. You can even use a scheduled task.
Is MySQL replication an option? You could even turn it on and off if you didn't want it constantly replicating.
This was a good article on replication.
I would create a (Ruby) script to do a SELECT * FROM ... on all the databases on the server, then do a DROP DATABASE ... followed by a series of new INSERTs on the local copy. You can do a SHOW DATABASES query to list the databases dynamically. Now, this assumes that the table structure doesn't change; if you want to support table changes as well, you could add a SHOW CREATE TABLE ... query and a corresponding CREATE TABLE statement for each table in each database. To get a list of all the tables in a database, do a SHOW TABLES query.
Once you have the script you can set it up as a scheduled job to run as often as you need.
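For what it's worth, here is the same per-database loop sketched as shell commands instead of Ruby, leaning on mysqldump rather than hand-built INSERT statements (hosts and credentials are placeholders, and it assumes a Unix-style shell; on the Windows box the same logic would go in a batch or Ruby script):

#!/bin/sh
for db in $(mysql -h live.example.com -u liveuser --password=livepass -N -e "SHOW DATABASES"); do
    case "$db" in mysql|information_schema) continue;; esac   # skip system databases
    # drop and recreate the local copy, then reload structure and data
    mysql -u localuser --password=localpass -e "DROP DATABASE IF EXISTS $db; CREATE DATABASE $db;"
    mysqldump -h live.example.com -u liveuser --password=livepass "$db" | mysql -u localuser --password=localpass "$db"
done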
@Mark Biek
Is MySQL replication an option? You could even turn it on and off if you didn't want it constantly replicating.
Thanks for the suggestion, but I cannot enable replication on the server. It is a shared server with very little room for maneuver. I've updated the question to note this.
Depending on how often you need to copy down live data and how quickly you need to do it, installing phpMyAdmin on both machines might be an option. You can export and import DBs, but you'd have to do it manually. If it's a small DB (and it sounds like it is), and you don't need live data copied over too often, it might work well for what you need.
