Using PHP, is it possible to create a backup of a specific MySQL database table?
Additionally, I would like the output gzipped.
I would like the file to be ready to load directly using the standard mysql import command, i.e.
mysql -uuser -ppass mydb < mybackupfile.sql
The current solutions I am looking at involve iterating over each row and appending it to a file, but that doesn't seem right.
You could use PHP to execute a mysqldump command, like so:
exec("mysqldump -uuser -ppass --compact mydb mytable > filename.sql");
Then run another command to compress it however you like.
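For instance, a minimal sketch that pipes the dump straight into gzip in one call (the user, password, and file names are placeholders):

exec("mysqldump -uuser -ppass --compact mydb mytable | gzip -9 > filename.sql.gz", $output, $status);
if ($status !== 0) {
    // something in the pipeline failed; handle the error
    die('Backup failed');
}

Note that in a shell pipeline the exit status is gzip's, not mysqldump's, so this check is only a rough guard.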
The best tool for this job is mysqldump and gzip combined:
mysqldump ...options... | gzip -9 > backup.sql.gz
As a note, you'll need to reverse this process before restoring:
gunzip -dc backup.sql.gz | mysql ...options...
PHP itself might be able to initiate this process but it's not an exceptionally good tool on its own. You could try phpMyAdmin and see if that works for you.
Related
I have a .sql file that contains a handful of TRUNCATE, DROP TABLE IF EXISTS and CREATE TABLE statements as well as thousands of INSERT and UPDATE statements.
I'm trying to use PHP to execute all the statements in the file in batch as follows:
$sql = file_get_contents('path/to/sql/file.sql');
$link->query($sql);
Where $link is a valid MySQLi DB link. I've also tried swapping out $link->query($sql) for $link->multi_query($sql), but neither seems to work.
If it makes any difference, I'm attempting to do this in an XAMPP localhost environment on Windows 10 with PHP 7.1 and MariaDB 15.1.
Does anyone know why this would cause problems and not work?
What's the best way in PHP to execute a SQL file with thousands of statements in it?
Thanks.
Ryan, thanks again for the advice. As you suggested, running the mysql command directly on the SQL file is the way to go.
The following worked for me:
shell_exec('C:\\xampp\\mysql\\bin\\mysql.exe -h localhost -u root -ppassword -D db-name -e "SELECT * FROM table-name;"');
Please note that you don't want a space between the -p option and the password itself. If you don't have a password, then just remove the -p option altogether.
Edit: The sample command above is a very arbitrary one to demonstrate running the mysql command. If you want to read and execute a bunch of SQL statements in a file, the following is more what you need:
shell_exec('C:\\xampp\\mysql\\bin\\mysql.exe -h localhost -u root -ppassword -D db-name < C:\\xampp\\htdocs\\path\\to\\file.sql');
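If you would rather stay in PHP instead of shelling out, multi_query() can work, but you have to drain every pending result set or the remaining statements are silently skipped. A minimal sketch, assuming $link is the mysqli connection from the question:

$sql = file_get_contents('path/to/sql/file.sql');
if ($link->multi_query($sql)) {
    do {
        // free each result set so the next statement can run
        if ($result = $link->store_result()) {
            $result->free();
        }
    } while ($link->more_results() && $link->next_result());
}
if ($link->error) {
    echo 'SQL error: ' . $link->error;
}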
I need to save data from a relational database record set in MySQL (composed of multiple tables and multiple sub-records for each master record) to an XML file. My questions are:
Is there a standard XML format for defining the relation between the database structure and the XML output?
Are there recommended open-source libraries in PHP that would do the job?
It is possible to export and import XML data using the mysql monitor.
$ mysql -uroot -p --xml -e 'SELECT * FROM tablename' databasename > /tmp/tablename.xml
The mysql monitor has an --xml option, which enables us to dump data in XML format. The -e option executes a statement and quits the monitor.
For the whole database you can use the mysqldump tool:
mysqldump --xml databasename > databasename.xml
or
mysqldump -X databasename > databasename.xml
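Since the question asked about PHP, one option is simply to drive mysqldump from PHP; a rough sketch with placeholder credentials:

exec("mysqldump --xml -uuser -ppass databasename > databasename.xml", $output, $status);
if ($status !== 0) {
    // dump failed; check the credentials and that mysqldump is on the PATH
}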
Hope it helps!!
I have a 28 MB SQL file that I need to import into MySQL.
First, I tried importing through XAMPP, and it failed, so I changed max_file_uploads and post_size (something like that) to 40 MB in php.ini-development and php.ini-production, but it still showed "max: 2048kb" and the import failed again.
From research, I learned to import using mysql.exe, so I opened mysql.exe and typed the command line (MS-DOS) below:
-u root -p dbname < C:\xxx\xxx\sqlfile.sql
but it still failed, again and again...
What is the problem? XAMPP, or my MySQL settings?
Try this:
mysql -uroot -p --max_allowed_packet=24M dbname
Once you log into the database:
source C:\xxx\xxx\sqlfile.sql
I think you should then be able to load your file.
How large is your file? You might as well do it from a console:
mysql -u##USER## -p ##YOUR_DATABASE## < ##PATH_TO_YOUR_FILE##
Do that without opening your mysql.exe file first: just "cd" right into the directory and try the command.
It will ask for your password and start importing right away. Don't forget to create the database or delete all tables if it's already there.
I always found this approach quick, painless and easier than rolling around with PHP directives, phpMyAdmin configuration or external applications. It's right there, built into the MySQL core.
You should increase max_allowed_packet in MySQL.
Just execute this command before importing your file:
set global max_allowed_packet=1000000000;
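Note that SET GLOBAL requires administrative (SUPER) privileges and only affects connections opened after the change, so reconnect before importing. You can verify the new value with:

show variables like 'max_allowed_packet';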
I also ran into a similar problem, and concluded that a large SQL file would never import that way; it always gave a timeout error.
Then I found a solution.
There is a tool called HeidiSQL.
Follow these steps:
1) Download the software.
2) Install it.
3) Create a new session in HeidiSQL and open the session.
4) Go to Tools -> Load SQL file -> Browse.
That's it. This solution works best for me.
I found the only solution was to log in to MySQL from the command line and use the 'source' command:
1) cd to the directory containing your SQL file for import, then log into MySQL:
#> mysql -u YOURUSERNAME -p -h localhost
2) use MySQL commands to import the data:
mysql> use NAMEOFYOURDB;
mysql> source NAMEOFFILETOIMPORT.sql
This also feeds back info about progress to your terminal, which is reassuring.
How do I create a schema from existing tables? I have 100 tables, and I want to generate a schema for them; can you tell me the procedure, or any tools?
I tried TOAD, but it shows the result in HTML format. I want the result in SQL.
How can I create a schema using PHP?
A single show create table <tablename> statement in the mysql command line tool will suffice if you want the schema only and no data. The result will be the full CREATE TABLE DDL.
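If you want to do the same from PHP for all 100 tables, here is a minimal sketch (placeholder credentials, assuming mysqli) that loops over SHOW TABLES and writes each table's DDL to one schema.sql file:

$link = new mysqli('localhost', 'user', 'pass', 'mydb');
$out = fopen('schema.sql', 'w');
$tables = $link->query('SHOW TABLES');
while ($row = $tables->fetch_row()) {
    // the second column of SHOW CREATE TABLE holds the full DDL
    $ddl = $link->query("SHOW CREATE TABLE `{$row[0]}`")->fetch_row();
    fwrite($out, $ddl[1] . ";\n\n");
}
fclose($out);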
Take a dump of all tables to get the schema:
mysqldump -u root -p[root_password] [database_name] > dumpfilename.sql
mysqldump --user MYSQL_USER --password=MYSQL_PASSWORD --add-locks --flush-privileges \
--add-drop-table --complete-insert --extended-insert --single-transaction \
--database DB_TO_BACKUP > FILENAME_FOR_SQL_STATEMENTS
This will give you a file full of SQL CREATE and INSERT statements that can be used to recreate the DB. The backslashes "\" continue the command onto the next line; you can drop them and write the whole command on one line.
Now, I am not sure whether you wanted the whole DB with data or just the tables... you may want to use --no-data to get rid of the INSERT statements and keep just the CREATE and DROP (and LOCK) statements, e.g.
mysqldump --user MYSQL_USER --password=MYSQL_PASSWORD --add-locks --flush-privileges \
--add-drop-table --no-data --single-transaction \
--database DB_TO_BACKUP > FILENAME_FOR_SQL_STATEMENTS
http://dev.mysql.com/doc/refman/5.5/en/mysqldump.html#option_mysqldump_no-data
That is:
mysqldump --no-data -u <username> -p --database <dbname> <further options> > <dbname>.sql
from the command line, on the server (perhaps via ssh).
...can you tell me the procedure, or any tools?
You can use 'Generate Schema Script' or 'Backup' GUI tool in dbForge Studio for MySQL.
A limited Backup feature is available in the free Express version; the size of backup files is limited to 1 MB, which is enough for 100 table schemas.
You can use MySQL Workbench: connect to your existing database there, and you can import and export the schema in SQL format.
Can anybody help me set up an automatic backup of a MySQL database and then email it to me daily?
Thank you.
AutoMySQLBackup gets great reviews. I have not used it, but it seems to do exactly what you are looking for. There are also other ways to handle the backup, including emailing the dump from a script.
If you're looking to automate your MySQL database backups, say with a cron job, and you host your databases with CPanel web hosting, then there's a PHP script you can use.
You can pick it up here: http://www.hostliketoast.com/2011/10/cpanel-hosting-full-database-backup-script-free-download/
Simple. No need for more.
#!/bin/sh
# mutt must be installed on your box
mydate=$(date +"%Y%m%d%H%M")
host=$(hostname)
databases='' #'database1 database2 databaseN'
mysqldump --opt --all-databases > /tmp/$host.$mydate.sql
# to dump only specific databases instead:
# mysqldump --opt --databases $databases > /tmp/$host.$mydate.sql
gzip --best /tmp/$host.$mydate.sql
echo "Backup MySQL" | mutt -a /tmp/$host.$mydate.sql.gz -s "Backup MySQL $mydate" -- mail@mail.to