I want to export a table's data into a CSV file using mysqldump.
I want to run something like:
mysqldump --compact --no_create_info --tab=testing --fields-enclosed-by=\" --fields-terminated-by=, -uroot -proot mydatabase mytable
but I keep getting this error:
(Errcode: 13) when executing 'SELECT INTO OUTFILE'
I made my testing folder writable (I'm using Ubuntu as my environment). Can someone explain how to export a table to a CSV file, or how to modify my shell command so that it works? Thanks!
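For what it's worth, a hedged sketch of the usual fixes: Errcode 13 is a server-side permission error, because the files behind --tab are written by the mysqld process (via SELECT ... INTO OUTFILE), not by your shell user, so the target directory has to be writable by that process, permitted by secure_file_priv on newer servers, and not blocked by AppArmor on Ubuntu. The path below is only an example:
sudo mkdir -p /tmp/testing
sudo chown mysql:mysql /tmp/testing                                # assumes the server runs as the "mysql" user
mysql -uroot -proot -e "SHOW VARIABLES LIKE 'secure_file_priv';"   # if set, the server will only write under that directory
mysqldump --compact --no-create-info --tab=/tmp/testing \
  --fields-enclosed-by='"' --fields-terminated-by=, -uroot -proot mydatabase mytable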
The trouble with all these INTO OUTFILE or --tab=tmpfile answers is that they require running mysqldump on the same server as the MySQL server.
My solution was simply to use mysql (NOT mysqldump) with the -B parameter, inline the SELECT statement with -e, then massage the ASCII output with sed, and wind up with CSV including a header field row:
Example:
mysql -B -u username -ppassword database -h dbhost -e "SELECT * FROM accounts;" |sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g"
"id","login","password","folder","email"
"8","mariana","57d40c8a954bc9e65f401592062019b1457be359","mariana",""
"3","squaredesign","b01134fac01dab765bcb55ab0ca33f9ec2885a7b","squaredesign","mkobylecki#squaredesign.com"
"4","miedziak","601f1889667efaebb33b8c12572835da3f027f78","miedziak","miedziak#mail.com"
"5","Sarko","480225f1ac707031bae300f3f5b06dbf79ed390e","Sarko",""
"6","Logitrans
Poland","9033b6b3c4acbb27418d8b0b26f4c880ea6dea22","LogitransPoland",""
"7","Amos","50f5604164db2d5ff0e984f973d2202d5358b6a6","Amos",""
"9","Annabelle","07e832cb64e66fa03c13498a26a5f8e3bdebddf1","Annabelle",""
"11","Brandfathers and
Sons","f08b194668249e4cb81fbb92df846e90569f5c01","BrandfathersAndSons",""
"12","Imagine
Group","e86f1645504c7d5a45ed41d16ecc39ed19181773","ImagineGroup",""
"13","EduSquare.pl","80c3c099f4ecc043a19f1527400d63588567a0ad","EduSquare.pl",""
"101","tmp","b01134fac01dab765bcb55ab0ca33f9ec2885a7b","_","WOBC-14.squaredesign.atlassian.net#yoMama.com"
Add > outfile.csv at the end of that one-liner to get your CSV file.
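For example, a slightly simplified variant of the one-liner with the sed program split into separate expressions and the redirect added (hypothetical credentials, GNU sed assumed, and only the comma/quoting steps kept):
mysql -B -u username -ppassword database -h dbhost -e "SELECT * FROM accounts;" \
  | sed -e 's/\t/","/g' -e 's/^/"/' -e 's/$/"/' \
  > outfile.csv
Here -B makes mysql print tab-separated rows, so the first expression turns each tab into ",", and the other two wrap every line in opening and closing quotes.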
I'm using Laravel with MySQL as the database, and I want to back up the database on my hosting with a one-click button in HTML. I'm wondering what the best approach is, instead of querying all the data in every table with a loop and then inserting it row by row on the host with another loop. Thanks in advance.
Use mysqldump
$ mysqldump -h <host> -u <user> -p<password> <database_name> > backup.sql
This will create an SQL file, backup.sql, with all your data.
Note the missing space between -p and <password>.
It's okay to give the password interactively (in which case you'll have to skip the <password> part), but since you mentioned Laravel in the question, I assume you want to script this and make it non-interactive.
EDIT:
Using code
Execute the command with exec():
$cmd = "mysqldump -h <host> -u <user> -p<password> <database_name> > backup.sql";
exec($cmd, $output, $return_value);
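If putting the password on the command line is a concern (it can show up in the server's process list), a hedged alternative is to keep the credentials in an option file and point mysqldump at it with --defaults-extra-file; the path and credentials below are placeholders:
cat > /home/webuser/.mysqldump.cnf <<'EOF'
[client]
user=dbuser
password=dbpass
EOF
chmod 600 /home/webuser/.mysqldump.cnf                 # keep the credentials readable only by this user
mysqldump --defaults-extra-file=/home/webuser/.mysqldump.cnf <database_name> > backup.sql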
I need to restore just a single table in my database.
I have a .sql file that has all the info I need for one table, but the rest of it would overwrite important information in other tables.
Instead of using the solution linked here, which uses a tool I've never heard of, I figured it would be more surefire to do it manually.
Unfortunately, mysqldump generated a giant INSERT line that is too long to paste into the mysql command line...
What should I do?
Should I use sed like the link above describes?
Or could I copy and paste the statements for that specific table from the mysqldump .sql file into a new .sql file and then call:
mysql -u root -p -h localhost < copyPasteFile.sql
You can try this:
mysql -u root -p databasename -h localhost < copyPasteFile.sql
mysql -uuser -ppassword -e "create database temporary"
mysql -uuser -ppassword temporary < copyPasteFile.sql
mysqldump -uuser -ppassword temporary yourtable > onlythattable.sql
mysql -uuser -ppassword therealdb < onlythattable.sql
mysql -uuser -ppassword -e "drop database temporary"
Make sure that copyPasteFile.sql does not have a "USE somedatabase;" statement. If it was exported with phpMyAdmin, it probably does; if it was exported with mysqldump, it won't.
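As for the sed route mentioned in the question, a hedged sketch for pulling one table's section straight out of the dump file: mysqldump normally brackets each table with "-- Table structure for table `name`" comment lines (the exact markers depend on the version and options, and --skip-comments removes them), so something like this can work, with mytable and therealdb standing in for your names:
sed -n '/^-- Table structure for table `mytable`/,/^-- Table structure for table `/p' mysqldump.sql > onlythattable.sql
mysql -u root -p therealdb < onlythattable.sql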
I am not sure if it's the best way, but I just spin up a new schema, restore the backup into it, dump the data I need, and import that into the production database.
I use MySQL Workbench, which makes it easier, but the steps below can probably be reproduced at the command line, as sketched below:
Create a new MySQL schema
Import the self-contained backup
Set "Target Schema" to the new empty database (the .sql dump must not contain schema-creation code)
Restore the backup
Dump the table to CSV and import it into the live production database
Delete the restore schema
To restore to a schema at the command line, it looks like you use:
mysql -u <user name> -p<password> <database name> < sqlfilename.sql
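A command-line sketch of the same workflow, with hypothetical names (backup.sql, restore_tmp, mytable, production_db) standing in for yours, and dumping the table to SQL rather than CSV since it goes straight back into MySQL:
mysql -u root -p -e "CREATE DATABASE restore_tmp;"           # new empty schema
mysql -u root -p restore_tmp < backup.sql                    # restore the self-contained backup into it
mysqldump -u root -p restore_tmp mytable > mytable_only.sql  # dump only the table you need
mysql -u root -p production_db < mytable_only.sql            # load it into production
mysql -u root -p -e "DROP DATABASE restore_tmp;"             # drop the scratch schema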
I'm trying to use phpMyAdmin v4.5.3.1 to access a DB on localhost and export a table, but it is not working.
I can access the DB, insert, search, etc., but when I click on the "Export" tab it gives me this message:
I don't have this issue with phpMyAdmin 4.2.6 using the same WAMP stack.
Does anyone know how to fix it?
Thank you!
I think you should use mysqldump instead when exporting data. From the command line:
mysqldump -uMYSQL-USER -h server -pMYSQL-PASSWORD database_name > /path-to-export/file.sql
Or from a script:
$command = "mysqldump -uMYSQL-USER -h server -pMYSQL-USER database_name > /path-to-export/file.sql";
exec($command, $output, $return_var);
This can easily be automated.
You could fix this error by increasing the memory limit to what you need and restarting the httpd/apache service. I have sometimes fixed it by increasing memory_limit, but these days I prefer to handle it with terminal commands only. It's better to get into the habit of using the terminal for big MySQL operations like this: you get speed and more control, since you are not dependent on GUI-based tools.
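A hedged sketch of the memory_limit route on an Ubuntu/Apache setup; the php.ini path, the new limit, and the service name vary by installation, so treat these as assumptions:
grep -n "memory_limit" /etc/php/7.0/apache2/php.ini                                   # check the current value
sudo sed -i 's/^memory_limit = .*/memory_limit = 512M/' /etc/php/7.0/apache2/php.ini  # raise it
sudo service apache2 restart                                                          # or: sudo systemctl restart apache2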
Use mysqldump in the terminal to export data:
mysqldump -u root -p db_name > /home/dump.sql
Use mysqldump in the terminal to export only the schema, without data:
mysqldump -u root -p db_name --no-data > /home/dump.sql
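Since the point of moving to the terminal is handling large dumps, a compressed variant may also help (a sketch; assumes gzip is available):
mysqldump -u root -p db_name | gzip > /home/dump.sql.gz     # export compressed
gunzip < /home/dump.sql.gz | mysql -u root -p db_name       # restore from the compressed dump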
I am using LAMP stack on Ubuntu.
I am using the 'mysqldump' command to dump 'stack_db' into a file.
While dumping the database to a file, I don't want column comments in the dump.
I tried the --skip-comments and --compact options.
I am looking for a way to skip column comments while using mysqldump.
Thanks in advance.
Try your mysqldump statement as below:
mysqldump --no-create-info -uroot -p ....
Or:
--skip-comments to skip comments
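A hedged usage sketch combining the two options mentioned in the question; note that --skip-comments only drops the dump file's own informational comment lines, while column COMMENT clauses are part of the CREATE TABLE statements themselves (which --no-create-info omits entirely):
mysqldump --skip-comments --compact -u root -p stack_db > /home/stack_db_dump.sql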
I have an alldatabase.sql file.
How can I restore data from it?
My console command is not working:
mysqldump -u root -p < alldatabase.sql
Do I have to create the databases first? But all my databases are in this one file. What should I do?
You use mysqldump to take a snapshot, but you restore using the mysql command:
mysql -u root -p < alldatabase.sql
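A hedged note: if alldatabase.sql was made with mysqldump --all-databases, it already contains the CREATE DATABASE / USE statements, so nothing has to be created by hand first. To restore only one database from such a dump, the mysql client's --one-database option can help (it filters based on the dump's USE statements, so check the result); mydb below is a hypothetical name:
mysql -u root -p < alldatabase.sql                       # restore everything in the file
mysql -u root -p --one-database mydb < alldatabase.sql   # replay only the statements for mydb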