Cannot import a large SQL file into MySQL - PHP

I have a 28 MB SQL file that I need to import into MySQL.
First, I tried importing it with XAMPP, but it failed, so I changed max_file_uploads and post_max_size (something like that) in php.ini-development and php.ini-production to 40 MB, but it still shows "max: 2048kb" and the import fails again.
From research, I learned I can import using mysql.exe, so I opened mysql.exe and typed the command line (MS-DOS) below:
-u root -p dbname < C:\xxx\xxx\sqlfile.sql
but it still fails again and again...
What is the problem? XAMPP, or my MySQL settings?

Try this:
mysql -uroot -p --max_allowed_packet=24M dbname
Once you log into the database:
source C:\xxx\xxx\sqlfile.sql
I think you should be able to load your file.

How large is your file? You might as well do it from a console:
mysql -u##USER## -p ##YOUR_DATABASE## < ##PATH_TO_YOUR_FILE##
Do that without running your mysql.exe first: just "cd" into the directory and try the command.
It will ask for your password and start importing right away. Don't forget to create the database first, or to drop all its tables if it already exists.
I always found this approach quick, painless and easier than wrestling with PHP directives, phpMyAdmin configuration or external applications. It's right there, built into the MySQL core.
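For example, on Windows with XAMPP the full sequence might look roughly like this (the database name and file path are placeholders):
cd C:\xampp\mysql\bin
mysql -u root -p -e "CREATE DATABASE mydb"
mysql -u root -p mydb < C:\path\to\sqlfile.sql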

You should increase max_allowed_packet in MySQL.
Just execute this command before importing your file:
set global max_allowed_packet=1000000000;
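For illustration, a typical session might look like this (database name and path are placeholders; note that the new global value only applies to connections opened after the change):
mysql -u root -p
mysql> SET GLOBAL max_allowed_packet=1000000000;
mysql> SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';
mysql> exit
mysql -u root -p dbname < C:\path\to\sqlfile.sql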

I also faced a similar problem. I had concluded that a large SQL file could never be imported into MySQL - it would always give a timeout error.
Then I found a solution.
There is a tool called HeidiSQL.
Follow the steps below:
1) Download the software.
2) Install it.
3) Create a new session in HeidiSQL and open the session.
4) Go to Tools -> Load SQL File -> Browse.
That's it. This solution worked best for me.

I found the only solution was to log in to MySQL from the command line and use the 'source' command:
1) cd to the directory containing your SQL file, then log into MySQL:
#> mysql -u YOURUSERNAME -p -h localhost
2) Use MySQL commands to import the data:
mysql> use NAMEOFYOURDB;
mysql> source NAMEOFFILETOIMPORT.sql
This also feeds back info about progress to your terminal, which is reassuring.


Not able to import any sql file using command line in windows xampp

I am trying to import a .sql file (which contains my database) into the XAMPP MySQL database, but with no luck.
Here are my attempts:
Step 1: I changed into c:\xampp\mysql\bin>.
Step 2: I then used this command to import my SQL file into my local database (I placed 'raj-db.sql' in the 'c:\xampp\mysql\bin' folder).
Step 3: After pressing Enter it asks 'Enter password:' and immediately returns to the 'c:\xampp\mysql\bin>' prompt, without letting me enter any password.
Please can you help me with this? I also installed a fresh XAMPP, but it still behaves the same.
It looks like this:
c:\xampp\mysql\bin>mysql -u root -p my-db-name < file-name.sql
Enter password:
c:\xampp\mysql\bin>
If I run 'c:\xampp\mysql\bin>mysql' the prompt shows as 'MariaDB [(none)]>', but usually it should show as 'mysql>'.
How to Load a MySQL Database Dump Using XAMPP
Step 1: Start XAMPP.
Step 2: Start your MySQL server (if not already started).
Step 3: Launch CLI mode by clicking the CMD button (the black button).
Step 4: Copy your database backup into the same directory.
Step 5: Load the backup into your already created database using the following command:
mysql -u root -p DatabaseName < DatabaseDump.sql
You may use the attached screenshots as your guide.
Hope this helps.
Is the target database already created? The import needs it to exist (and be empty) for it to work:
CREATE DATABASE db_name;
If it is already created, run this first and then import with your command:
USE db_name;
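Putting it together with the command from the question (note that a hyphenated database name needs backticks inside SQL, though not on the command line), the sequence would look roughly like this:
c:\xampp\mysql\bin>mysql -u root -p -e "CREATE DATABASE `my-db-name`"
c:\xampp\mysql\bin>mysql -u root -p my-db-name < file-name.sql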

How to dump a MySQL DB from the terminal on a MacBook

I'm a new MacBook Pro user.
Because the file is so large, I need to import it from the terminal.
I already know how to import and export MySQL data using the terminal on Linux,
but since I'm a newbie in the macOS environment, I'm lost.
I think I'm missing something, maybe the path or something else; I just don't know.
I'm using XAMPP. I access my htdocs folder from the terminal with this:
cd /Applications/xampp/xamppfiles/htdocs/abcFolder
and then I try to import my db with this:
mysql -u root mir_existing < mirdb_21_november_2016\ \(1\).sql
Since I have no password, I removed the -p flag.
But when I press Enter to run the command, the result is "command not found".
Many of you referred me to this page:
How can I access the mysql command line tool when using XAMPP in OS X?
I already did that, but I still don't know which path to use to import the db. It's different.
For example, I need to run this command to import the db, right?
mysql -u mysql_user -p DATABASE < backup.sql
For example, my backup.sql is at htdocs/abcFolder/backup.sql.
How can I access it?
Should I try this?
mysql -u mysql_user -p DATABASE < htdocs/abcFolder/backup.sql
I already tried that.
Nothing happened. Sigh.
How do I import my db?
If doing
which mysql
doesn't yield any results, you'll need to add the /path/to/mysql to your PATH variable and put it in your .bash_profile or .bashrc for future use so you don't have to keep adding it. After adding it to one of these files, just do
source .bash_profile
or
source .bashrc
depending on where you put it.
E.g. on one of my Macs I use MAMP and need access to the binaries it provides (mysql among them), so this is in my .bash_profile:
export PATH="$PATH:/Applications/MAMP/Library/bin"
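Since this question is about XAMPP rather than MAMP, the equivalent line would presumably point at XAMPP's bin directory instead (check the exact path on your install):
export PATH="$PATH:/Applications/XAMPP/xamppfiles/bin"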

MySQL Workbench giving version error on exporting database

When I try to export my database remotely through MySQL Workbench from my local server,
I get the version error below:
mysqldump Version Mismatch [Content] mysqldump.exe is version 5.5.16, but the MySQL Server to be dumped has version 5.6.10-log. Because the version of mysqldump is older than the server, some features may not be backed up properly. It is recommended you upgrade your local MySQL client programs, including mysqldump to a version equal to or newer than that of the target server. The path to the dump tool must then be set in Preferences -> Administrator -> Path to mysqldump Tool
I am trying to find a solution - I searched Google but couldn't find a good answer to this issue.
Does anyone know how to fix this in MySQL Workbench?
Go to: Edit -> Preferences -> Administrator -> Path to mysqldump Tool.
Look for the file mysqldump.exe in your MySQL Server installation folder (it could be in mysql/bin/).
Select it, click OK, and then try the backup again.
Fortunately, although it is not obvious, there is a fairly straightforward solution: you just need to point Workbench at an up-to-date mysqldump.exe, which ships with your MySQL Server installation. To solve the issue, go to Edit -> Preferences -> Administrator and browse to the following path:
C:\Program Files\MySQL\MySQL Server 5.6\bin\mysqldump.exe
Select this file in the 'Path to mysqldump Tool' textbox.
A path like the following may already be set there; just replace it with the newer one:
C:\Program Files\MySQL\MySQL Workbench CE 5.2.47\mysqldump.exe
The paths may be slightly different for you, but the solution remains the same.
On some 64-bit systems there are two folders,
C:\Program Files (x86)\MySQL
and
C:\Program Files\MySQL
but you have to use the one under C:\Program Files\MySQL.
Hope it will help :)
On Linux-based systems such as Ubuntu, go to Edit > Preferences... > Administration (tab) and set "Path to mysqldump Tool" to /usr/bin/mysqldump (most likely that is where it is by default).
If you're not sure, you can find where mysqldump is located by running the following command in a terminal:
locate mysqldump
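If locate doesn't return anything (its file database may not have been built yet), checking the binary directly also works:
which mysqldump
mysqldump --version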
The message says you need a newer mysqldump tool - one that matches the server you want to dump from. So, depending on the platform you are running on, get a copy of the mysqldump tool from a server installation with a recent enough version. Each server ships with a mysqldump tool, so it should be easy to get a copy.
Put the tool in a location where it has a persistent home but does not conflict with other instances, and point MySQL Workbench at it (as the message says).
This occurs when the version of mysqldump bundled with your MySQL Workbench differs from that of your MySQL server. The solution is to use a mysqldump.exe with the same version as your server to take the export/dump.
Steps:
1) Download the MySQL zip of the same version as your server (e.g. mysql-5.7.25-winx64.zip).
2) Inside this zip you will find mysqldump.exe under the bin folder.
3) Open MySQL Workbench and go to Edit -> Preferences -> Administration.
4) In 'Path to mysqldump Tool:', give the path of this downloaded mysqldump.exe.
I was trying to solve this issue with the default mysqldump via Edit > Preferences... > Administration (tab), setting the path to /usr/bin/mysqldump, which did not work.
I noticed that the XAMPP server ships a mysqldump binary too, and that one worked fine! You can generally find it at /opt/lampp/bin/mysqldump (on Debian and similar), so you can use that path in your Workbench preferences.
I downloaded an archive of the MySQL version I needed from https://downloads.mysql.com/archives/community/
and used the mysqldump from there. That worked for me.
On my Mac (running the latest macOS Sierra), I changed the path of mysqldump to /Applications/XAMPP/xamppfiles/bin/mysqldump, and that solved the problem. Previously, the path was set to a different (older) version of mysqldump, so you need to point it at the newest version of mysqldump.
The solution that worked for me is the following:
Go to the page https://www.pconlife.com/viewfileinfo/mysqldump-exe/
There is a list of mysqldump.exe files; download the version that matches the one that appears in the error.
Then go to the folder where MySQL Workbench is installed, usually
C:\Program Files\MySQL\MySQL Workbench 8.0 CE
Put the downloaded file there and let it replace the existing one.
Now open MySQL Workbench and change the path that appears in
Preferences > Administration > Path to mysqldump tool:
to the location of the replaced file.
That's all.
As these answers are not totally clear for Mac users, this is where I found my mysqldump file:
Applications > MAMP > Library > bin > mysqldump
A quick search for 'mysqldump' should locate it.
I followed the answers above and went to:
Preferences > Administration > Path to mysqldump Tool:
The path in there is now: /Applications/MAMP/Library/bin/mysqldump
wb_admin_export.py (the script MySQL Workbench uses to run mysqldump) looks at the PATH variable to find mysqldump and get its version number. Make sure it picks up the mysqldump from the MySQL bundle, not the one shipped with MySQL Workbench...
Mac user here: I had this problem after updating MySQL Workbench. I tried everything... in the end, I downloaded the old version again and downgraded MySQL Workbench. It worked flawlessly.
For Mac users: it only works after restarting MySQL Workbench once you have set the Path to mysqldump Tool in the settings (Edit -> Preferences -> Administrator).
None of the other answers here worked for me, so I'll post another way that fixed it for me (I'm using Windows WSL with Ubuntu 18.04).
TL;DR: check whether you have the line local_infile=1 in your MySQL configuration file; change it to loose-local-infile=1, or comment it out altogether if you don't need it right now, and then restart MySQL Workbench.
Further explanation: I closed MySQL Workbench, opened my terminal and ran mysqldump --version, which gave me this error: mysqldump: [ERROR] unknown variable 'local_infile=1'. I realized I had previously added that line to /etc/my.cnf in order to import some data from a local file, but as it turns out, some other MySQL tools (mysqldump, apparently) do not understand it well. I commented it out, after which mysqldump --version worked fine and reported the correct version number without any other issues. When I opened MySQL Workbench again, it worked.
HTH.
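For reference, the relevant part of the config file would look roughly like this (the group name below is illustrative - it is whichever section you originally added the line to):
[client]
# local_infile=1          <-- mysqldump rejects this as an unknown variable
loose-local-infile=1      # the loose- prefix makes tools that don't recognise the option warn instead of fail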
If none of the above solutions worked, the version of mysqldump can be hardcoded in wb_admin_export.py:
def get_mysqldump_version():
    #path = get_path_to_mysqldump()
    #if not path:
    #    log_error("mysqldump command was not found, please install it or configure it in Edit -> Preferences -> Administration")
    #    return None
    #
    #output = StringIO.StringIO()
    #rc = local_run_cmd('"%s" --version' % path, output_handler=output.write)
    #output = output.getvalue()
    #
    #if rc or not output:
    #    log_error("Error retrieving version from %s:\n%s (exit %s)"%(path, output, rc))
    #    return None
    #
    #regexp = ".*Ver ([\d.a-z]+).*"
    #if ("Distrib" in output):
    #    regexp = ".*Distrib ([\d.a-z]+).*"
    #
    #s = re.match(regexp, output)
    #
    #if not s:
    #    log_error("Could not parse version number from %s:\n%s"%(path, output))
    #    return None
    #
    #version_group = s.groups()[0]
    #major, minor, revision = [int(i) for i in version_group.split(".")[:3]]
    #return Version(major, minor, revision)
    return Version(5, 7, 30)
Only this worked for me: Workbench on Windows and the MySQL server on a remote Linux box.
I had to make a local copy of my remote database and was facing MySQL Workbench version problems. To avoid reinstalling MySQL Workbench to match the remote database version, I did the following:
I exported my database on the remote server into its /home/my-user/ folder using ssh:
root@bananapi# mysqldump -u root -p my-incredible-password > /home/my-user/database-dump-18-set-2020.sql
With the SQL script in the remote /home/my-user/ directory, I downloaded it into my local folder using the scp command:
my-user % scp root@remote-server-ip-address:/home/my-user/database-dump-18-set-2020.sql /Users/my-mac-user/tmp/
Then I just had to open the SQL script file with MySQL Workbench and import the data into my local database. I hope this can help somebody.
A possible solution is to create a script that runs mysqldump with the flag --column-statistics=0, then configure Workbench to point to the script:
@ECHO OFF
"C:\Program Files\MySQL\MySQL Workbench 8.0 CE\mysqldump.exe" %* --column-statistics=0
For WordPress data dumps (in my case the site uses MySQL 5.7.39) I downloaded the corresponding version of Workbench (v6.3.10) and installed it in a different directory. Then I configured MySQL Workbench 8.0.28 to point the paths for the mysqldump tool and the mysql tool at the internal paths of the Workbench v6.3.10 app, by copying/pasting those routes.
Exporting then succeeded.

MySQL table lock while using mysqldump remotely?

I'm using a PHP script that utilizes mysqldump to back up my SQL databases remotely: http://www.dagondesign.com/files/backup_dbs.txt
I tried to add --lock-tables=false since I'm using MyISAM tables, but I still got an error.
exec( "$MYSQL_PATH/mysqldump --lock-tables=false $db_auth --opt $db 2>&1 >$BACKUP_TEMP/$db.sql", $output, $res);
Error:
mysqldump: Couldn't execute 'show fields from `advisory_info`': Can't create/write to file 'E:\tmp\#sql_59c_0.MYD' (Errcode: 17) (1)
Someone told me this file is the lock file itself, and I was able to find it on the server I wanted to back up.
So is this the lock file? And does mysqldump lock the database when run remotely regardless of whether I pass --lock-tables=false? Or should the file not be there at all, given that a lot of people work on the server and someone else might have created it?
It's likely that --lock-tables=false isn't doing what you think. Since you're passing --lock-tables, mysqldump probably assumes you do want to lock the tables (even though that is already the default), so it locks them. On Linux we don't usually negate flags by appending something like =false or =0, but by using a --skip-X or --no-X form.
You might want to try --skip-opt:
--skip-opt Disable --opt. Disables --add-drop-table, --add-locks,
--lock-tables, --set-charset, and --disable-keys.
Because --opt is enabled by default, you can use --skip-opt and then add back any flags you want.
On Windows 7 using WAMP, the option is --skip-lock-tables.
Taken from this answer.
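In the context of the script above, the mysqldump invocation would then look something like this (credentials and database name are placeholders; --skip-lock-tables comes after --opt so it overrides the locking that --opt turns on):
mysqldump --opt --skip-lock-tables -u USER -pPASSWORD dbname > backup.sql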

Importing huge Database into local server

Is there any way I can import a huge database into my local server?
The database is 1.9 GB, and importing it into my local server is causing me a lot of problems.
I have tried an SQL dump but was not successful in getting it onto my local server, and I have also tried changing the php.ini settings.
Please let me know if there is any other way of getting this done.
I have used BigDump and also SQL Dump Splitter, but I am still unable to find a solution.
mysql -u #username# -p #database# < #dump_file#
Navigate to your MySQL bin directory and log into MySQL,
select the database,
then use the source command to import the data:
[user@localhost] mysql -uroot -hlocalhost   # assuming no password
mysql> use mydb;                            -- mydb is the database name
mysql> source /home/user/datadump.sql
Restoring a backup of that size is going to take a long time. There's some great advice here: http://vitobotta.com/smarter-faster-backups-restores-mysql-databases-with-mysqldump/ which essentially gives you some additional options you can use to speed up both the initial backup and the subsequent restore.
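I can't vouch for the exact flags that article recommends, but commonly used options for this kind of speed-up look along these lines (database name is a placeholder):
mysqldump -u root -p --single-transaction --quick --extended-insert mydb > mydb.sql
mysql -u root -p mydb < mydb.sql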
