Can't import Magento SQL database - PHP

I'm trying to install Magento on localhost, but when I try to import the SQL database, it freezes. I also can't access phpMyAdmin anymore, nor any other website hosted on localhost.
I thought the file was too large, so I increased post_max_size and upload_max_filesize. I also increased max_execution_time to a higher value.
I've noticed that after I restart XAMPP, the database is partially uploaded, sometimes with 143 queries, sometimes with 200+.
What should I do to upload the entire database?

Try using a command-line MySQL import:
mysql -p -u username database_name < file.sql
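On a default XAMPP install on Windows, the mysql client ships with XAMPP itself; a minimal sketch, assuming the default install path and a hypothetical database name magento_db:
cd C:\xampp\mysql\bin
mysql.exe -p -u root magento_db < C:\path\to\file.sql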

Related

How to make a copy of a large database from phpMyAdmin?

I want to create a dev environment of my website on the same server, but I have a 7 GB database containing 479 tables that I want to copy to the new DB.
I have tried this with the phpMyAdmin >> Operations >> "Copy database to" functionality, but every time it fails and returns the error:
Error in processing request Error code: 500 Error text: Internal Error.
Please let me know if there is any other method/solution to copy this database to a new database from cPanel.
Create an export of your database. This should be easily done through the phpMyAdmin interface. Once you have downloaded the DB export, create a new DB where you will put your exported data. This, too, should be easily done through the phpMyAdmin user interface.
To upload it, we cannot use Import -> Browse your computer, because it has a limit of 2MB. One solution is to use Import -> Select from the web server upload directory /var/lib/phpMyAdmin/upload/. Upload your exported file into this directory. After that, it should be listed in the dropdown next to that option.
If this fails too, you can use the command-line import.
mysql -u user -p db_name < /path/to/file.sql
Limited to phpMyAdmin? Don't do it all at once
Large data sets shouldn't be dumped (unless it's for a backup). Instead, export the database without data, then copy one table at a time (DB to DB directly).
Export/Import Schema
First, export only the database schema via phpMyAdmin (uncheck data in the export options). Then import that schema under the new database name.
Alternatively, once you've created the DB, you could generate statements like the one below. The catch with this method is that you're likely to lose constraints, stored procedures, and the like.
CREATE TABLE `devDB`.`table` LIKE `prodDB`.`table`;
Copy data, one table at a time.
Use a good editor to create the 479 INSERT statements you need. Start with a list of table names and use good old find-and-replace.
INSERT INTO `devDB`.`table` SELECT * FROM `prodDB`.`table`;
This may choke, depending on your environment. If it does, drop and recreate the dev database (or empty all tables via phpMyAdmin), then run the INSERT commands a few tables at a time. A query to generate the statements automatically is sketched below.
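If hand-editing 479 statements is unappealing, here is a sketch of a query that generates them from information_schema (assuming the source database is named prodDB and the target devDB); run it, then execute its result set:
SELECT CONCAT('INSERT INTO `devDB`.`', table_name, '` SELECT * FROM `prodDB`.`', table_name, '`;') AS stmt
FROM information_schema.tables
WHERE table_schema = 'prodDB' AND table_type = 'BASE TABLE';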
Database Administration requires CLI
The real problem you're facing here is that you're trying to do database administration without access to the command-line interface. There are significant complications in migrating large data sets efficiently, most of which can only be solved using tools like mysqldump; the usual CLI shape of this task is sketched below for reference.
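A minimal sketch with placeholder names (user, prodDB, devDB), streaming the dump straight into the new database with no intermediate file:
mysqldump -u user -p prodDB | mysql -u user -p devDB
Note that two interactive -p prompts on one pipe interleave awkwardly, so in practice you would supply the password inline (visible in the process list) or via ~/.my.cnf.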
NOTE: I have just read your comment, and as I understand it you don't have access to the command line. Please check Solution Two; that will definitely work.
The only solution that will work for you (and which worked for me on a 12GB database) is directly from the command line:
Solution One
mysql -u root -p
set global net_buffer_length=1000000; -- Set network buffer length to a large byte number
set global max_allowed_packet=1000000000; -- Set maximum allowed packet size to a large byte number
SET foreign_key_checks = 0; -- Disable foreign key checking to avoid delays, errors and unwanted behavior
-- Import your SQL dump file (the comment must stay on its own line: the client treats everything after "source" as the filename)
source file.sql
SET foreign_key_checks = 1; -- Remember to re-enable foreign key checks when the procedure is complete!
If you have root access, you can create a bash script:
#!/bin/sh
# store start date to a variable
imeron=`date`
echo "Import started: OK"
dumpfile="/home/bob/bobiras.sql"
ddl="set names utf8; "
ddl="$ddl set global net_buffer_length=1000000;"
ddl="$ddl set global max_allowed_packet=1000000000; "
ddl="$ddl SET foreign_key_checks = 0; "
ddl="$ddl SET UNIQUE_CHECKS = 0; "
ddl="$ddl SET AUTOCOMMIT = 0; "
# if your dump file does not create a database, select one
ddl="$ddl USE jetdb; "
ddl="$ddl source $dumpfile; "
ddl="$ddl SET foreign_key_checks = 1; "
ddl="$ddl SET UNIQUE_CHECKS = 1; "
ddl="$ddl SET AUTOCOMMIT = 1; "
ddl="$ddl COMMIT ; "
echo "Import started: OK"
time mysql -h 127.0.0.1 -u root -proot -e "$ddl"
# store end date to a variable
imeron2=`date`
echo "Start import:$imeron"
echo "End import:$imeron2"
Solution Two
Also, there is another option which is very good for those who are on shared hosting and don't have command-line access. This solution worked for me on 4-5GB files:
MySQL Dumper: you can back up and restore SQL files directly from MySQLDumper; you don't need phpMyAdmin anymore.
Big Dump: restores from a compressed or plain SQL file. For big imports you need to edit the BigDump PHP file, as shown below.
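The BigDump edit is a one-line change in its script (a sketch; $linespersession controls how many dump lines each staggered import session processes):
// in bigdump.php
$linespersession = 30000; // raised from the default of 3000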
Solution Three:
This solution definitely works; it is slow, but it works.
Download the trial version (32- or 64-bit) of Navicat for MySQL, version 12.
Install it and run it as a trial.
After that, add your computer's IP (Internet IP, not local IP) to the "Remote MySQL" section in cPanel (on the new database/hosting). You can use a wildcard IP in cPanel to access MySQL from any IP.
Go to Navicat: click on Connection and enter a connection name.
In the next field, "Hostname/IP", add your hosting IP address (don't use localhost).
Leave the port as it is (if your hosting defined a different port, put that one here).
Add your database username and password.
Click Test Connection; if it's successful, click "OK".
Now on the main screen you will see, in the left-side column, all the databases connected with that username.
Double-click on the database into which you want to import the SQL file:
the icon color of the database will change and you will see "Tables/Views/Functions etc.".
Now right-click on the database and select "Execute SQL file" (http://prntscr.com/gs6ef1).
Choose the file, choose "continue on error" if you want, and finally run it. It takes some time, depending on your network connection speed and computer performance.
The easiest way is to try exporting the data from phpMyAdmin. It will create a backup of your data.
But sometimes, transferring a large amount of data via import/export results in errors.
You can try mysqldump to back up the data as well; the mysqldump documentation covers the available backup options.
Hope it helps. :D
You can use mysqldump as follows (angle-bracket placeholders are yours to fill in):
mysqldump --user=<user> --password=<password> --default-character-set=utf8 <database_name> > backup.sql
You can also make use of my shell script, which I actually wrote long ago for creating backups of a MySQL database on a regular basis using a cron job.
#!/bin/sh
now="$(date +'%d_%m_%Y_%H_%M_%S')"
filename="db_backup_$now".gz
backupfolder="/path/to/backup/folder" # set this to your backup directory
fullpathbackupfile="$backupfolder/$filename"
logfile="$backupfolder/"backup_log_"$(date +'%Y_%m')".txt
echo "mysqldump started at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
mysqldump --user=<user> --password=<password> --default-character-set=utf8 <database_name> | gzip > "$fullpathbackupfile"
echo "mysqldump finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
chown "$fullpathbackupfile"
chown "$logfile"
echo "file permission changed" >> "$logfile"
find "$backupfolder" -name db_backup_* -mtime +2 -exec rm {} \;
echo "old files deleted" >> "$logfile"
echo "operation finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
echo "*****************" >> "$logfile"
exit 0
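To schedule it, a sketch of a crontab entry (hypothetical paths; adjust to wherever you saved the script) that runs the backup nightly at 02:30:
30 2 * * * /home/bob/db_backup.sh >> /home/bob/db_backup_cron.log 2>&1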
I have already written an article on scheduling MySQL database backups on cPanel or Linux.
Here's how I handled that problem when I faced it... Unfortunately this only works for macOS.
Download Sequel Pro - completely free, and it has worked really well for me for over a year now.
Remotely connect to your server's database. You will probably need to add your IP address to the "Remote MySQL" section in cPanel. If you don't have the credentials, you can probably get them from your website's config file.
Once you're in the server, you can select all of your tables, secondary-click, and select Export > As SQL Dump. You probably won't need to edit any of the settings. Click "Export".
Log in to your local server's database and select "Query" from the top menu.
Drag and drop the file that was downloaded from the export, and it will automatically set up the database from the SQL dump.
I hope this helps. It's a bit of a workaround, but it's worked really well for me, especially when PMA has failed.
Since the requirements include phpMyAdmin, my suggestion is to:
select the database you need
go to the "Export" tab
click the "Custom - display all possible options" radio button
in the "Save output to a file" radio button options, select "gzipped" for "Compression:"
Remove the "Display comments" tick (to save some space)
Finish the export
Then try to import the generated file into the new database you have (if you have sufficient resources, this should be possible).
Note: My previous experience shows that using compression allows larger DB export/import operations, but I have not tested the upper limit in shared hosting environments (which I assume you are on, given your comment about cPanel).
Edit: When your export file is created, select the new database (assuming it has already been created), go to the "Import" tab, select the file created from the export, and start the import process.
If you have your database on your local server, you can export it and use BigDump to insert it into the new database on the remote server.
I doubt that phpMyAdmin will handle databases of that size (PHP upload/download limits, memory constraints, script execution time).
If you have access to the console, I would recommend doing the export/import via the mysql command line:
Export:
$ mysqldump -u <user> -p<pass> <liveDatabase> | gzip > export.sql.gz
And Import:
$ gunzip < export.sql.gz | mysql -u <user> -p<pass> <devDatabase>
after you have created the new dev database, e.g. in phpMyAdmin or via the command line.
Otherwise, if you only have access to an Apache/PHP environment, I would look for an export utility that splits the export into smaller chunks. MySQLDumper comes to mind, but it's a few years old and AFAIK it is no longer actively maintained and is not compatible with PHP 7+.
But I think there is at least a pull request out there that makes it work with PHP 7 (untested).
Edit based on your comment:
If the export already exists and the error occurs on import, you could try to increase the limits on your PHP environment, either via entries in .htaccess, by changing php.ini, or via ini_set, whatever is available in your environment. The relevant settings are, for example when set via .htaccess (keep in mind this only works for Apache environments with mod_php, and may also be restricted by your hoster):
php_value max_execution_time 3600
php_value post_max_size 8000M
php_value upload_max_filesize 8000M
php_value max_input_time 3600
This may or may not work, depending on x32/x64 issues and/or your hoster's restrictions.
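If PHP runs as FastCGI/FPM rather than mod_php (common on shared hosts), php_value lines in .htaccess produce a 500 error instead; the equivalent settings, as a sketch, go into a .user.ini file in the document root (picked up after the user_ini.cache_ttl window, 300 seconds by default):
max_execution_time = 3600
post_max_size = 8000M
upload_max_filesize = 8000M
max_input_time = 3600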
Additionally, you need to adjust the phpMyAdmin setting ExecTimeLimit - usually configured in config.inc.php of your phpMyAdmin installation (the default lives in config.default.php):
Replace
$cfg['ExecTimeLimit'] = 300;
with
$cfg['ExecTimeLimit'] = 0;
And finally, you probably need to adjust your MySQL config to allow larger packets and get rid of the 'lost connection' error. In my.ini:
[mysqld]
max_allowed_packet=256M
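After restarting MySQL, you can confirm that the new value took effect (it is reported in bytes, so 256M shows up as 268435456):
SHOW VARIABLES LIKE 'max_allowed_packet';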

How to upload a large SQL file in phpMyAdmin on a server

How can I upload a large SQL file in phpMyAdmin on a server? I'm trying to migrate my WordPress website from localhost to online, but when I try to upload my SQL file, which is larger than 50 MB, it shows an error. I only have cPanel access.
Modify two limit options in php.ini:
; larger than 50M
upload_max_filesize = 100M
post_max_size = 100M
You can upload a large .sql file in two ways.
1. Edit php.ini:
post_max_size = 800M
upload_max_filesize = 800M
max_execution_time = 5000
max_input_time = 5000
memory_limit = 1000M
2. In most cases, though, the execution time will still run out and the upload will fail, so use the command line instead if you have back-end access:
mysql -p -u your_username your_database_name < file.sql
mysql -p -u example example_live < /home2/example/public_html/file.sql
(where /home2/example/public_html/file.sql is the path to the uploaded .sql file)
Given the limitations your host places on your account, I think your best solution is to use the phpMyAdmin "UploadDir" feature.
You'll need to be able to control the configuration of your phpMyAdmin, which probably means you'll need to install your own copy in your web space, but that's relatively easy. Then you'll have to modify your config.inc.php to add a line like $cfg['UploadDir'] = 'upload'; (you can use any directory name here). Next, create the directory and upload your SQL file there using SFTP, SSH, or whatever other means your host gives you of interacting with your files. The file should then appear in a dropdown on the Import tab.
You need to break your SQL file into smaller SQL files.
To do this when you export from localhost, only choose a few tables to export at a time (you may have to choose the Custom export method).
Basically you need to make every SQL file smaller than 50MB. You can then import each SQL file, one by one.
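One way to automate the splitting, as a sketch: phpMyAdmin-style dumps mark each table's section with a '-- Table structure for table' comment, so GNU csplit can cut the file at those markers into one piece per table (verify the marker appears in your own dump first; backup.sql is a placeholder name):
csplit -z backup.sql '/^-- Table structure for table/' '{*}'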
You can compress your SQL file, then upload the zipped SQL file to the server. If the file size is still higher than the limit stated by your server, you may need to increase the limits as explained by @DongHyun.
However, most shared hosting plans don't allow php.ini editing. In that case, contacting customer support might be helpful.
Try to import it from the mysql console:
mysql -u {DB-USER-NAME} -p {DB-NAME} < {db.file.sql path}
or if it's on a remote server use the -h flag to specify the host.
mysql -u {DB-USER-NAME} -h {MySQL-SERVER-HOST-NAME} -p {DB-NAME} < {db.file.sql path}

Importing a 500MB .db file into phpMyAdmin using a SQL dump

I have a ~450MB .db (SQLite3) file that I need to import into my local phpMyAdmin server on XAMPP. I went through the following steps to make this happen, but all of them failed.
- I used sqlitebrowser to create a SQL dump of the .db file.
- When directly importing into phpMyAdmin did not work, I edited php.ini to have the following properties:
upload_max_filesize = 500MB
post_max_size = 500MB
memory_limit = 512MB
max_execution_time = 3600
then tried again, without success. Even though phpMyAdmin says I can import files of 500MB, it still gives me an error that my file might be too large.
- I tried using both BigDump and a manual CLI script for importing the dump, but both of them return errors stating 'the system cannot find the file specified' and 'Can't open eve.sql for import'.
At this point I am out of ideas. I would prefer, however, to make the changes to my local server so that it can import files of this size in the future, as I plan to parse some YAML files and use them to update the database with a PHP script.
How about using the mysql command-line utility:
mysql -p -u user -h 127.0.0.1 database < data.sql
It usually handles big imports without problems.
If you just have one big database, use an SQL splitter, or use MySQLDumper for import, export and queries. It works on local and live environments without the need for shell (SSH) access.

How can I solve a permissions error when importing MySQL databases with phpMyAdmin?

I'm developing a web site that is based around a MySQL DB.
Because I wanted to do it offline first, I installed WAMP server on my machine. This has Apache, a MySQL server, and PHP.
I created a database on my offline machine. The database is called OffDB, created by user OffUser, with password OffPwd. I have exported it to an SQL file (less than one MB).
I want to import this DB into my online DB, which is called OnDB, created by user OnUser, with password OnPwd. When I try to import in phpMyAdmin, I get the following error:
"User OnUser doesn't have permissions to access database OffDB"
In the offline phpMyAdmin, I added a new user, OnUser, with password OnPwd, and granted it all permissions to OffDB. Then I exported OffDB again and attempted to import again. It failed with the same error. Same thing if I add a new offline user OnUser with password OffPwd and grant it permission - upon import I always get the error.
How can I solve that?
You need to add the OffUser and OffDB to the live server, not the other way around.
Alternatively, make a backup that specifies neither the database name nor the users (for the latter, look for options involving GRANTs, and disable those). mysqldump should have options for that, at least if you use the mysqldump <database_name> variant, as should phpMyAdmin. A sketch of that variant follows.
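As a sketch using the question's names: naming a single database without the --databases flag makes mysqldump omit the CREATE DATABASE and USE statements, so the resulting file imports cleanly into a database with a different name:
mysqldump -u OffUser -p OffDB > offdb_dump.sql
mysql -u OnUser -p OnDB < offdb_dump.sql
(Run the first command against the offline server; the second, or a phpMyAdmin import of the file, against the online one.)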

Empty SQL Files - phpMyAdmin

So I'm trying to export my e-commerce database via phpMyAdmin and it keeps downloading empty SQL files. I'm 100% sure I'm selecting all my tables, and I'm using the default settings when opening the export tab. It keeps coming out to 0 bytes. If needed, I'll detail all the checkboxes I have checked in phpMyAdmin, but I'm not sure that's necessary.
Is my database too big? What's another way I can back up my database?
You can use mysqldump in your terminal.
Might be a timeout or the memory limit being exceeded. Try MySQLDumper. Works fine for me.
I had the same issue as the OP - the exported .sql backup was always empty. Selecting gzipped compression did the trick - hope this helps somebody else!
This could be a memory issue. Increase the memory_limit in your php.ini file to 512M temporarily (assuming it is currently much less) and restart the required services for this change to take effect, and I am willing to bet this will now work for you... don't forget to change your memory_limit back when you are done.
Using mysqldump through SSH worked for me.
Example
user: admin
password: pass123
database name: mydatabase
mysqldump --add-drop-database --add-drop-table --user=admin --password=pass123 mydatabase > backup.sql
Then I downloaded the file via FTP
