I have a pretty large database in MySQL, and I need to take backups of it every day or so.
I need to be able to take backups from any computer, so I thought about making a PHP script to do this and putting it online (of course with password protection and authorization etc. so that only I can access it).
I wonder, however: how is this done properly?
What commands should I use, and is it possible to change the settings of the backup (for instance, Add AUTO_INCREMENT value = true)?
I would appreciate examples...
Also, if this is a bad method (unsafe, or prone to producing bad backups with broken SQL files), what other method would be preferred?
I have shell access and a VPS (Ubuntu server).
My MySQL version is 5.1.
Thanks
There's no need to involve PHP in the database backup. You just need a script that uses mysqldump to back up the database, and a cron job set up to execute the script periodically:
mysqldump db_name > backup-file.sql
...will back up your database to a file, by redirecting the output of mysqldump to the specified file name.
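For example (the user name, password, and paths below are assumptions, not part of the original answer), a slightly fuller dump command and the cron entry that would run a backup script nightly:
# --single-transaction gives a consistent snapshot of InnoDB tables
# without locking them while the dump runs.
mysqldump --user=backup_user --password=secret --single-transaction \
    db_name > /home/backups/backup-file.sql

# Crontab entry (added via `crontab -e`) to run the script every night at 02:00:
# 0 2 * * * /home/backups/mysql-backup.sh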
Peter brought up a good point: the command above keeps only a single backup--each run overwrites the previous day's file. The following would give you a rolling set of backups going back seven days:
#!/bin/sh
# Name the dump after the day of the week (1 = Monday ... 7 = Sunday),
# so each file is only overwritten once a week.
CURRENT_DAY_OF_WEEK=$(date '+%u')
FILENAME="mysqlbackup_${CURRENT_DAY_OF_WEEK}.sql"
mysqldump db_name > "$FILENAME"
Also be aware that file permissions apply: the script can't write the file if the user executing it doesn't have write permission on the target folder.
I agree with OMG Ponies: mysqldump + script is the way to go.
The only other option that I use is to set up a slave server. This provides an almost instant backup against hardware failure and can be located in a different building from your main server. Unless you have a large number of writes to the database, you don't necessarily need a very powerful machine, as the slave is not processing queries, only replaying database updates.
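For reference, a minimal sketch of pointing a freshly seeded slave at the master (the host, credentials, and log coordinates below are placeholders; the master also needs binary logging enabled and both servers need distinct server-id values in my.cnf):
# Run on the slave after loading a consistent dump of the master.
# Take the real MASTER_LOG_FILE / MASTER_LOG_POS values from
# SHOW MASTER STATUS on the master.
mysql -u root -p <<'SQL'
CHANGE MASTER TO
  MASTER_HOST='master.example.com',
  MASTER_USER='repl',
  MASTER_PASSWORD='secret',
  MASTER_LOG_FILE='mysql-bin.000001',
  MASTER_LOG_POS=106;
START SLAVE;
SQL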
For development I have a set of tables in a database. I want to modify the tables during development and testing, but be able to go back to the original state I started with every time I run the application. Is there a way to achieve this without actually backing up and restoring every time?
OS: Win10. Software: XAMPP with MySQL (MariaDB) and PHP.
You can make a backup and then restore the backup with a different name. Then point the dev/test environment of your application (you do have a different "test" copy of your application as well, I hope?) at the new copy of the database.
Each time you want to "go back to the start", just restore the backup (with the alternative name) again.
Backup/restore is scriptable, so you can automate it if you need to.
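A minimal sketch of that workflow (the names myapp, myapp_test, and baseline.sql are assumptions; the mysqldump/mysql/mysqladmin commands ship with XAMPP and work the same from a Windows command prompt):
# Take the baseline once and create the scratch copy from it:
mysqldump -u root -p myapp > baseline.sql
mysqladmin -u root -p create myapp_test
mysql -u root -p myapp_test < baseline.sql

# To "go back to the start", drop and reload the scratch copy:
mysqladmin -u root -p --force drop myapp_test
mysqladmin -u root -p create myapp_test
mysql -u root -p myapp_test < baseline.sql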
If your data really is being consumed or changed as you say, there's no way around returning to the original state, but you can look for ways to make that faster and easier. If the database is big and you want to shorten the restore time, look at the suggestions here:
https://dba.stackexchange.com/questions/83125/mysql-any-way-to-import-a-huge-32-gb-sql-dump-faster
You can also write a shell script that wraps the restore operations suggested in one of the solutions from the link above into a single command, as sketched below.
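For instance, a minimal wrapper (the database name, dump file, and credentials are assumptions) that applies one common speed-up, disabling foreign key checks during the import:
#!/bin/sh
# restore.sh - recreate the test database from baseline.sql in one step.
# (-p prompts for the password on each invocation.)
DB=mydb_test
DUMP=baseline.sql

mysql -u root -p -e "DROP DATABASE IF EXISTS $DB; CREATE DATABASE $DB;"
# Wrapping the dump in SET foreign_key_checks=0/1 speeds up large imports.
( echo "SET foreign_key_checks=0;"; cat "$DUMP"; echo "SET foreign_key_checks=1;" ) \
    | mysql -u root -p "$DB"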
I need to preprocess some statistics data before writing it to the main database. My php-cli script retrieves data every 10 minutes and should store it somewhere. Every hour all the saved data is preprocessed again and then written to the main DB.
I thought SQLite would be a nice solution if I kept it in memory. I don't have very big amounts of data (I am able to keep it all in RAM).
Actually, I am new to SQLite (previously I worked only with MySQL).
I found here that I can use :memory: instead of a file name to work with memory only. But after the client disconnects the database is destroyed, and if I try to connect to :memory: again from my script, it will be a different (new, empty) database.
Is that right? If it is, how can I work with the same database across different PHP script calls if SQLite is stored in memory?
P.S. Perhaps a solution is to put the SQLite file in some "magic" directory? (I am using Linux.)
First, Linux is pretty smart about using a filesystem cache. Thus, reading data from disk is often surprisingly fast, and you should measure whether the performance gain is worth it.
If you want to go ahead, one method you might consider is using a ramdisk. Linux provides a way to create a filesystem in memory, and you can place your SQLite file there.
# mkdir -p /mnt/ram
# mount -t tmpfs -o size=20m tmpfs /mnt/ram
# touch /mnt/ram/mydb.sql
# chown apache:apache /mnt/ram/mydb.sql
... or something similar. (tmpfs is used here rather than ramfs because ramfs ignores the size limit, while tmpfs enforces it.)
An SQLite database created in :memory: is automatically destroyed when you disconnect from the database handle or when the process that was using it exits or is terminated.
If you want multiple PHP invocations to have access to the database, you cannot use :memory:. Since you don't have big amounts of data, store your intermediate results in helper tables in a persistent database - MySQL, or SQLite using a real file on disk.
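As a quick illustration (the file path and table are just examples), a file-backed SQLite database keeps its contents across separate processes, so each php-cli run can simply reopen the same file:
# One process creates the table and inserts a row...
sqlite3 /tmp/stats.db "CREATE TABLE IF NOT EXISTS samples (ts INTEGER, value REAL);"
sqlite3 /tmp/stats.db "INSERT INTO samples VALUES (strftime('%s','now'), 42.0);"

# ...and a later, separate process still sees the data.
sqlite3 /tmp/stats.db "SELECT COUNT(*) FROM samples;"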
Is it possible to back up an Access database? I have done research on how to back up an Access database through PHP but I wasn't able to get a good answer. Most of the results that came up are about backing up MySQL databases. Can anyone help me :) thanks
re: actually performing the backup
Backing up a native Access database is simply a matter of copying the entire database file (.mdb for Access 2003 and earlier, .accdb for Access 2007 and later). You could use PHP for that, but any scripting language would work, even a simple Windows batch file that does something like
copy /Y d:\apps\databases\mydatabase.accdb z:\backups\databases\*.*
If you're really set on using PHP then you'll likely end up using the copy() function.
re: automatic scheduling of the backup
The Task Scheduler in Windows could take care of that for you. Once you've created your script to copy the database file(s) you can create a scheduled task to run it periodically. See the MSDN article Using the Task Scheduler (Windows) for more information.
I am currently wondering how I can back up a folder which contains 8000+ images without the script timing out; the folder contains around 1.5 GB of data in all, which we need to back up ourselves every so often.
I have tried the zip functionality provided in PHP, however it simply times out the request due to the huge number of files to back up; it does, however, work with smaller amounts.
I am trying to run this script through an HTTP request; would running it through a cron job avoid the timeout?
Does anyone have any recommendations?
I would not use PHP for that.
If you are on Linux I would set up a cron job to run a program like rsync periodically, as in the sketch below.
A nice introduction to rsync.
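For example (source and destination paths are assumptions), an incremental mirror that only transfers new or changed files, plus the cron entry to run it nightly:
# Mirror the image folder; --delete also removes files that no longer
# exist in the source, so the destination stays an exact copy.
rsync -av --delete /var/www/images/ /backup/images/

# Crontab entry (via `crontab -e`) to run it every night at 02:00:
# 0 2 * * * rsync -av --delete /var/www/images/ /backup/images/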
Edit: If you do want or need to go the PHP way, you can also consider just copying instead of using zip. zip normally doesn't gain much on images, and if you already have a database of the files, you can check the current directory against the database and do a differential backup (just copy the new files). That way only your initial backup would take a long time.
You could post the code so we can optimize it; other than that, you should change your php.ini (configuration file) and increase or remove the timeout via max_execution_time (the longest time your script is allowed to run on your server).
I've built a simple CMS with an admin section. I want to create something like a backup system which would take a backup of the information on the website, and also a restore system which would restore the backups taken. I'm assuming backups would be SQL files generated from the tables used.
I'm using PHP and MySQL here - any idea how I can do this?
One simple solution for backup:
Call mysqldump from PHP (http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html), and save the backup file somewhere convenient.
This is a safe solution: mysqldump is available on any MySQL installation, and it is a reliable way to generate a standard SQL script.
It is also safe to dump the whole database this way.
Be careful with BLOBs, like images saved inside the database.
Be careful with UTF-8 data.
(Both caveats are addressed in the sketch below.)
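For example, the command a PHP script might run via exec() (user, password, and paths are assumptions): --hex-blob dumps binary columns in a safe hexadecimal form, and --default-character-set=utf8 keeps UTF-8 data intact:
mysqldump --user=backup_user --password=secret --hex-blob \
    --default-character-set=utf8 db_name > /path/to/backup.sql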
One simple solution for restore:
The restore operation can be done with the saved script.
First disable access to the site/app.
Take down the database.
Restore the database by piping the script into the mysql command-line client, as shown below. (Note that mysqlimport, http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html, is meant for delimited data files rather than SQL dumps.)
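A sketch of that restore step (user, database name, and path are assumptions):
mysql --user=backup_user --password=secret db_name < /path/to/backup.sql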
Calling external applications from php:
http://php.net/manual/en/book.exec.php
If on Linux/Unix, check out "man pg_dump". It dumps whole databases (with all tables and table definitions), e.g. "pg_dump mydb > db.sql" - though note that pg_dump is PostgreSQL's tool; mysqldump is the MySQL counterpart.
Create a script that calls mysqldump
Is it just for yourself or do you want it as a feature of the CMS?
I would advise using MySQL Administrator from www.mysql.com to create backups. It's a very useful tool which you can use to schedule backups of external databases you have access to. It also lets you restore backups and be selective about which tables to include.
If it's for a feature of the CMS then a) I'm not sure that's a great plan and b) one of the above should do it!
For the backup part you can use a ready-made solution: check out astrails-safe for MySQL/PostgreSQL/plain-file/Subversion backups with encryption and upload to S3/SFTP.
It shouldn't be too hard to add the restore capability (basically just decrypt, decompress, and pipe the data to the appropriate command). For MySQL (without the encryption):
zcat mysqldump-blog.090820-0000.sql.gz | mysql database_name