Restore large databases in MySQL via phpMyAdmin - php

I already saved a backup of all my databases in phpMyAdmin, about 4.5 MB, in gzip format.
Now I want to restore it in phpMyAdmin via the Import tab. But the maximum allowed upload size is 2,048 KiB, and when I try to upload I get this error message:
No data was received to import. Either no file name was submitted, or the file size exceeded the maximum size permitted by your PHP configuration
I used the MySQLDumper script to restore, but that script only lets me choose one database at a time, while I want to restore all the databases I had.
How can I do this? I use WAMP on Windows 8.
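The 2,048 KiB cap the error refers to comes from PHP's upload settings. Under WAMP these live in php.ini; a sketch of the relevant directives (the values below are illustrative, not a recommendation - pick limits larger than your backup and restart Apache afterwards):

```ini
; php.ini -- illustrative values only
upload_max_filesize = 16M
post_max_size = 16M        ; must be >= upload_max_filesize
max_execution_time = 600   ; seconds, for long-running imports
memory_limit = 256M
```

phpMyAdmin reads these limits from PHP, so the Import tab's maximum size changes accordingly after the restart.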

You can restore a large database using MySQLDumper (http://www.mysqldumper.net). If you want to restore it on your hosting server, first upload the SQL file via FTP and then restore it using MySQLDumper.

You can bypass phpMyAdmin entirely by using the command line.
Unzip your mysqldump first.
Get the DOS command prompt by typing "cmd" under Start -> Search programs and files.
Navigate to the directory where your mysqldump is, and then run (for a single database):
mysql -uusername -ppassword database_name < dump_file.sql
If you want to restore all the databases (it sounds like you dumped using --all-databases), the dump already contains the CREATE DATABASE and USE statements, so you don't name a database at all:
mysql -uusername -ppassword < dump_file.sql
Also note that there is no space between -u and the username, or between -p and the password.
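Since the backup is gzipped, it has to be decompressed before feeding it to mysql. A minimal sketch (the file name all_databases.sql.gz is a placeholder for your own backup):

```shell
# Decompress the gzipped dump; this replaces all_databases.sql.gz
# with all_databases.sql in the same directory.
gunzip all_databases.sql.gz
```

The resulting .sql file can then be passed to mysql with < as shown above.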

Related

How to divide an exported MySQL database file into parts and import it?

I have a 200 MB exported database file and I can't import it into my hosting database with phpMyAdmin because my internet connection is slow. The database is from a WordPress site. How can I divide it and import it into the database?
You can try gzipping the MySQL dump; since it is mostly text, this reduces the size to roughly 10-15% of the original, so possibly around 20-30 MB.
Do this on your local machine on the command line:
mysqldump -u root -p db | gzip > /tmp/db.sql.gz
then copy db.sql.gz to your remote server using SSH (or whatever is available to you) and run the command below:
zcat db.sql.gz | mysql -u user -p db
If you don't have SSH access, there is no way around splitting your file up so that phpMyAdmin can import each piece without hitting max_execution_time.
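If splitting is the only option, a simple line-based split can be done with the standard split tool. This is a sketch, not a robust solution: it assumes the dump has one statement per line (naive splitting can cut a multi-line INSERT in half, which is why statement-aware tools like BigDump or SQL Dump Splitter exist), and the piece size here is arbitrary:

```shell
# Split db.sql into numbered pieces of at most 5000 lines each:
# db_part_00, db_part_01, ... Each piece can then be imported
# separately in phpMyAdmin, in order.
split -l 5000 -d db.sql db_part_
```

The -d flag (numeric suffixes) is a GNU split option; on other systems drop it and you get alphabetic suffixes instead.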

Why doesn't phpMyAdmin notify me that it has reached the maximum PHP execution time?

I'm using phpMyAdmin to export one of my databases.
It is huge (larger than 1 GiB).
The export procedure ended because PHP hit its 360-second maximum execution time and was killed, while I thought the backup file was correct and complete.
Later, I wanted to roll back.
I dropped the old database.
And I imported the backup SQL file.
And
my data was partially lost.
Why didn't phpMyAdmin warn me???
I'm really ******* angry.
For databases that huge, phpMyAdmin is not the best solution. Better to use shell commands to dump and import the database. Under Linux (most likely you are on Linux):
mysqldump -u username -p database_name > dump_file_name.sql
Copy it, e.g. with FTP, to the other server and then import it like:
mysql -u username -p database_name < dump_file_name.sql
After "-p" you can put your password immediately (with no space), or you can skip it and you'll be asked to enter it.

Uploading a MySQL database

I've got a problem with my MySQL server. Basically I cannot upload my database. I have already tried:
mysql -u username -p database_name > file.sql
mysqldump -p -u username database_name > dbname.sql
Nothing works. The first method gets stuck after I enter the password for the MySQL server, and the second method does nothing; after mysqldump, nothing happened.
I also tried phpMyAdmin, but during the upload I get a blank screen after 13 seconds, with no error. I set memory_limit and upload_max_filesize to 128M but I still get this problem.
Thanks
You can upload SQL files compressed as .zip or .gz archives via phpMyAdmin. That way the file size is much smaller and you shouldn't face that issue again.
You want to read from the SQL file, not write to it.
Use < not > in the first command.
You may have to get a clean copy of file.sql, as you've probably overwritten it.
(The second command is for copying the database into a file, not the other way around.)
The first command only connects to the database. You are not seeing anything because all the output is being redirected into file.sql. Try removing the > file.sql from the command and you should see either errors or the mysql prompt.
The second command looks OK, assuming you want to save data TO the file, not load data FROM the file.
To load data, use
mysql -h hostname -u user --password=password databasename < filename

How can I convert MySQL database to SQLite in PHP?

I need PHP code to convert the database. I tried "How to convert mysql to SQLite using PHP" but it didn't have an answer.
I finally found a solution.
Save the shell script available at
https://gist.github.com/943776
and execute
./mysql2sqlite.sh DBNAME --databases DBNAME -u DB_USERNAME -pDB_PASSWORD | sqlite3 database.sqlite
from your PHP file (wrapped in backticks, without quotes).
Save both files (the script and the PHP file) in one folder.
Go to phpMyAdmin,
click export database,
download it as an .sql file,
then go to your SQLite management software
and import the .sql file.
Done.
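The last step, importing the exported .sql file, can also be done with the sqlite3 command-line tool. A sketch, assuming the dump has already been converted to SQLite-compatible SQL (a raw MySQL dump usually has not, which is what the mysql2sqlite.sh script above is for; export.sql and converted.sqlite are placeholder names):

```shell
# Pipe the SQLite-compatible dump into a new database file.
sqlite3 converted.sqlite < export.sql
# Quick sanity check: list the tables that were created.
sqlite3 converted.sqlite '.tables'
```

This requires the sqlite3 CLI to be installed and on your PATH.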

Importing a huge database into a local server

Is there any way I can import a huge database into my local server?
The database is 1.9 GB and importing it locally is causing me a lot of problems.
I have tried an SQL dump and was not successful in getting it into my local server, and I have also tried changing the php.ini settings.
I have used BigDump and also SQL Dump Splitter but am still unable to find a solution.
Please let me know if there is any other way of getting this done.
mysql -u #username# -p #database# < #dump_file#
Navigate to your MySQL bin directory and log in to MySQL,
select the database,
then use the source command to import the data:
[user@localhost] mysql -uroot -hlocalhost    (assuming no password)
mysql> use mydb;                             (mydb is the database name)
mysql> source /home/user/datadump.sql
Restoring a backup of that size is going to take a long time. There's some great advice here: http://vitobotta.com/smarter-faster-backups-restores-mysql-databases-with-mysqldump/ which essentially gives you some additional options you can use to speed up both the initial backup and the subsequent restore.
