Can anybody help me set up an automatic backup of a MySQL database and then email it to me daily?
Thank You.
AutoMySQLBackup gets great reviews. I have not used it, but it seems to do exactly what you are looking for. Also, here is another link with different ways to back up, including emailing.
If you're looking to automate your MySQL database backups, say with a cron job, and you host your databases with cPanel web hosting, then there's a PHP script you can use.
You can pick it up here: http://www.hostliketoast.com/2011/10/cpanel-hosting-full-database-backup-script-free-download/
Simple. Nothing more needed.
#!/bin/sh
# mutt must be installed on your box
mydate=$(date +"%Y%m%d%H%M")
host=$(hostname)
databases='' #'database1 database2 databaseN'
mysqldump --opt --all-databases > /tmp/$host.$mydate.sql
# to dump only selected databases instead: mysqldump --opt --databases $databases > /tmp/$host.$mydate.sql
gzip --best /tmp/$host.$mydate.sql
echo "Backup MySQL" | mutt -a /tmp/$host.$mydate.sql.gz -s "Backup MySQL $mydate" -- mail@mail.to
Good day!
I have a simple PHP script that exports a database:
exec("E:\wamp\bin\mysql\mysql5.6.12\bin\mysqldump --user=root --password= --host=localhost accounts > E:\database_backup\backup.sql");
This code works, but I'm accessing the application from another computer on the local network via IP (192.168.1.32/filename/backup.php), and then it doesn't save.
How can I make it work locally? Thank you in advance.
You need to specify the database host IP in your mysqldump command, and you also need to grant the required privileges in the DB (e.g. for root@anothercomputerIP).
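A rough sketch of both steps, with made-up addresses (database server at 192.168.1.50, the other computer at 192.168.1.32) and a placeholder password; the GRANT ... IDENTIFIED BY form shown is pre-MySQL-8 syntax:
# on the database server: allow root to connect from the other computer
mysql -u root -p -e "GRANT ALL PRIVILEGES ON accounts.* TO 'root'@'192.168.1.32' IDENTIFIED BY 'secret'; FLUSH PRIVILEGES;"
# then point mysqldump's --host at the database server instead of localhost
mysqldump --user=root --password=secret --host=192.168.1.50 accounts > backup.sql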
Is it possible to create a backup of a specific MySQL database table using PHP?
Additionally, I would like it gzipped.
I would like the file to be ready to load directly using the standard mysql import command, i.e.
mysql -uuser -ppass mydb < mybackupfile.sql
The current solutions I am looking at involve iterating over each row and appending it to a file; however, this doesn't seem right.
You could use php to execute a mysqldump command. Like so...
exec("mysqldump -uuser -ppass --compact mydb mytable > filename.sql");
Then run another command to compress it however you like.
The best tool for this job is mysqldump and gzip combined:
mysqldump ...options... | gzip -9 > backup.sql.gz
As a note, you'll need to reverse this process before restoring:
gunzip -dc backup.sql.gz | mysql ...options...
PHP itself might be able to initiate this process but it's not an exceptionally good tool on its own. You could try phpMyAdmin and see if that works for you.
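Putting the pieces from these answers together for the single table in the question (same placeholder credentials and names):
mysqldump -uuser -ppass --compact mydb mytable | gzip -9 > mybackupfile.sql.gz
# and to restore it later:
gunzip -dc mybackupfile.sql.gz | mysql -uuser -ppass mydb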
I'm using a PHP script that utilizes mysqldump to back up my SQL databases remotely: http://www.dagondesign.com/files/backup_dbs.txt
I tried to add --lock-tables=false since I'm using MyISAM tables, but I still got an error.
exec( "$MYSQL_PATH/mysqldump --lock-tables=false $db_auth --opt $db 2>&1 >$BACKUP_TEMP/$db.sql", $output, $res);
error:
mysqldump: Couldn't execute 'show fields from `advisory_info`': Can't create/write to file 'E:\tmp\#sql_59c_0.MYD' (Errcode: 17) (1)
Someone told me this file was the lock file itself, and I was able to find it on the server that I wanted to back up.
So is this the lock file? Does mysqldump lock the database when run remotely, regardless of my passing --lock-tables=false? Or should the file not be there at all, given that a lot of people work on the server and someone else might have created it?
It's likely --lock-tables=false isn't doing what you think it's doing. Since you're passing --lock-tables, mysqldump is probably assuming you do want to lock the tables (even though this is the default), so it's locking them. On Linux, flags are normally not negated by appending something like =false or =0, but by a separate --skip-X or --no-X option.
You might want to try --skip-opt:
--skip-opt Disable --opt. Disables --add-drop-table, --add-locks,
--lock-tables, --set-charset, and --disable-keys.
Because --opt is enabled by default, you can pass --skip-opt and then add back any flags you want.
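A sketch of that approach; the re-added flags below mirror the --help text above minus the locking ones, and $db_auth / $db are the variables from the question's script:
mysqldump --skip-opt --quick --add-drop-table --create-options --disable-keys --extended-insert --set-charset $db_auth $db > backup.sql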
On Windows 7 using WAMP, the option is --skip-lock-tables.
Taken from this answer.
I am using PHP and MySQL, and my server is Windows Server 2003 with IIS6.
I am planning to back up my database on an hourly basis. I can run the cron job; I tested it by writing the date and time into a log file, and it works perfectly.
Now I want to use the cron job to back up my databases. How do I do that? Create a PHP script and let it run every hour?
Use mysqldump. Example (with the options that I usually use):
mysqldump --single-transaction --hex-blob --opt -e --quick --quote-names -r put-your-backup-filename-here put-your-database-name-here
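If the hourly schedule is driven by cron (or a Windows cron port, as in the question), one way is a crontab entry that wraps the command with a timestamped filename; paths and names here are placeholders:
# hourly dump to a timestamped file; note '%' is special in crontab and must be escaped as '\%'
0 * * * * mysqldump --single-transaction --hex-blob --opt -e --quick --quote-names -r /backups/mydb-$(date +\%Y\%m\%d\%H).sql mydb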
As others have written, the mysqldump tool is the simplest solution; however, if you have a large database, even with the --quick setting, I'd recommend you do some experimenting. With the old MyISAM engine, table-level locking means queries are processed one at a time. Although this is less of an issue with InnoDB, I'm not sure whether MySQL now supports fully concurrent queries while dumping. Your backup may affect the transactional processing.
If it does prove to be a problem, then a simple solution would be to configure a slave instance of the database running on the same machine, then either run mysqldump against that, or shut down the slave temporarily while backing up the raw data files.
Alternatively you could push the mirroring down to the OS level and perform a disk level snapshot - but you'd need to stop the transactions on the database and flush it before creating the snapshot (or breaking the mirror).
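A minimal sketch of the dump-from-a-slave variant, assuming a slave is already replicating on the same machine (database name and path are placeholders):
# pause the SQL thread so the slave's data stops changing
mysql -e "STOP SLAVE SQL_THREAD;"
# dump from the now-quiescent slave
mysqldump --opt mydb > /backups/mydb.sql
# resume; the slave catches up from its relay log
mysql -e "START SLAVE SQL_THREAD;"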
C.
Use the mysqldump tool to dump your databases to a file.
I think you can use a cron job to copy the raw DB files; I don't think a cron job running a PHP script to make the backup is a good choice (it's complex without need).
I also think mysqldump is a good choice, but I can't help you with it.
Edit: This script is intended for Unix (or variants).
I wanted to do the same (including sending an email of the backup and archiving my code/pages as well) and wrote something like this:
#! /bin/bash
NOW=`date +"%Y-%m-%d"`
MAIL_TO="email@example.com";
MAIL_SUBJECT="Hourly backup"
MAIL_MESSAGE="mail-message";
DB_FILE="backup-database-$NOW.sql.gz"
SITE_FILE="website-$NOW.tar.gz"
echo "Database dump:" >> $MAIL_MESSAGE
mysqldump --defaults-extra-file=.mysql-pwd --add-drop-table -C my_databse 2>> $MAIL_MESSAGE | gzip > $DB_FILE 2>> $MAIL_MESSAGE
echo "Site dump (www and php-include):" >> $MAIL_MESSAGE
tar -zcf $SITE_FILE /path/to/www/ /path/to/php-include/ 2>> $MAIL_MESSAGE
echo >> $MAIL_MESSAGE
echo >> $MAIL_MESSAGE
echo "Done" >> $MAIL_MESSAGE
mutt -s "$MAIL_SUBJECT" -a $DB_FILE -a $SITE_FILE -- $MAIL_TO < $MAIL_MESSAGE  # newer mutt versions need '--' before addresses when using -a
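For completeness, the .mysql-pwd file passed via --defaults-extra-file is a standard MySQL options file, along these lines (the user name is a placeholder; keep the file readable only by its owner):
[client]
user=backup_user
password=put-your-password-here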
Update: I finally got this thing working, but I'm still not sure what the problem was. I am using a WAMP server that I access through a networked folder.
The problem that still exists is that to execute the mysqldump I have to access the php file from the actual machine that is being used to host the WAMP server.
End of update
I am running a WAMP server and trying to use mysqldump to back up a MySQL database I have. The following is the PHP code I am using to run mysqldump.
exec("mysqldump backup -u$user -p$pass > $sql_file");
When I run the script the page just loads infinitely and the backup is not created.
A blank file is being created so I know something is happening.
Extra info:
* exec() is not disabled
* PHP is not running in safe mode
Any ideas??
Win XP, WAMP, MYSQL 5.0.51b
mysqldump is likely to exceed the maximum time PHP is allowed to run on your system. Try running the command in cmd, or increase max_execution_time in your php.ini.
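For example, in php.ini (300 seconds is an arbitrary figure; size it to however long your dump takes):
max_execution_time = 300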
Are you sure $pass is defined and doesn't have a space character at the start?
If it weren't, mysqldump would be waiting for the password to be entered on the command line.
I had the same thing happen a while back. A co-worker pointed me to the MySQL GUI tools and I have been making backups with that. The Query Browser that comes with it is nice, too.
MySQL GUI tools
It might help to look at the stderr output from mysqldump:
$cmd = "mysqldump backup -u$user -p$pass 2>&1 > $sql_file";
exec($cmd, $output, $return);
if ($return != 0) { //0 is ok
die('Error: ' . implode("\r\n", $output));
}
Also you should use escapeshellarg() if $user or $pass are user-supplied.
I've also struggled with the mysqldump utility. A few things to check/try based on my experience:
* Is your server set up to allow PHP to run external programs via an exec command? (My web host's server won't let me.) Test with a different, harmless command first.
* Is the mysqldump utility installed? Check with whereis mysqldump (see the quick checks below).
* Try adding the optimize argument --opt.
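Two quick shell checks for the first two points (run them on the server itself):
whereis mysqldump # is the utility installed, and where?
mysqldump --version # does it run at all?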