I was trying to upload a large .sql file that has more than 14,000 rows, which caused a "maximum execution time exceeded" error when importing the .sql data file. I searched Google to resolve the issue and followed the answers to this question, so I made a change in the C:\xampp\phpMyAdmin\libraries\config.default.php file by setting $cfg['ExecTimeLimit'] = 0;.
Now the issue is that phpMyAdmin is not opening on localhost. How can I resolve this?
Remove the $cfg['ExecTimeLimit'] line you changed, then restart XAMPP and try again.
A better way to import a large DB:
Copy the file into the mysql -> bin folder, then run the command below from within that folder:
mysql -u root -p -v databasename < dbfiletoimport.sql
This problem caught me out too. I was able to start phpMyAdmin after opening config.default.php in WordPad and saving it again; it seems that editing it with Notepad stops it from working. I hope this saves someone hours of frustration!
I tried to import a large .sql file, but it is not getting imported, and it shows the following error:
phpMyAdmin - Error
Incorrect format parameter
I have been using XAMPP with PHP 5.6 on Ubuntu 16.04.
I have already tried the links given below, but none of them worked.
https://www.webtrickshome.com/forum/how-to-fix-phpmyadmin-error-incorrect-format-parameter-that-appeared-while-importing-a-database
importing db phpMyAdmin - Error Incorrect format parameter
Importing large database file in MAMP phpMyAdmin
You can easily import using the command prompt:
mysql -u <phpmyadmin username> -p database_name < file_path/file_name.sql
You can also make changes in the php.ini file to increase phpMyAdmin's upload size.
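As a rough sketch of that php.ini change, you could bump the limits with sed; the file name, values, and pattern below are illustrative assumptions (the sketch works on a sample copy), so edit your real php.ini by hand if unsure:

```shell
# Work on a sample copy so nothing real is touched; the 500M value is only an example.
printf 'upload_max_filesize = 2M\npost_max_size = 8M\n' > php.ini.sample

# Raise both limits in place (GNU sed; the .bak suffix keeps a backup).
sed -i.bak -E 's/^(upload_max_filesize|post_max_size) *=.*/\1 = 500M/' php.ini.sample

cat php.ini.sample
```

After editing the real php.ini, restart your web server so the new limits take effect.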
In the php.ini of your PHP installation (note: depending on whether you want it for CLI, Apache, or Nginx, find the right php.ini to edit):
post_max_size=500M
upload_max_filesize=500M
memory_limit=900M
or set other values.
Restart/reload Apache if you have Apache installed, or php-fpm if you use Nginx.
Remote server?
Increase max_execution_time as well, as it will take time to upload the file.
Nginx installation?
You will have to add client_max_body_size 912M; to the http {...} block in /etc/nginx/nginx.conf.
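A minimal sketch of where that directive lives in nginx.conf (the surrounding layout is a typical default; the 912M value comes from the answer above and is not a recommendation):

```nginx
# /etc/nginx/nginx.conf -- illustrative excerpt
http {
    # allow request bodies (uploads) up to ~912 MB
    client_max_body_size 912M;

    # ... the rest of your existing http block stays as-is ...
}
```

Reload nginx afterwards (for example with nginx -s reload) so the change is picked up.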
I was facing the same problem when I was trying to import a 90 MB database. You can solve this problem by opening
xampp -> phpMyAdmin -> libraries -> config.default.php
After opening the file, search for $cfg['ExecTimeLimit'] = 300;
Here you can set the time after the equals sign to 1000 or whatever you want. The default value is 300.
The file format can also be .sql.zip, as shown in the attached picture. It happened to me.
When I try to import a CSV file, it gives me the error "No data was received to import. Either no file name was submitted, or the file size exceeded the maximum size permitted by your PHP configuration".
I tried to change the size limits in php.ini, but it did not help. :(
Please help!
If you are using a Linux OS, please try it from the terminal:
cd /your/file/path
mysql -u root -p123456
use database_name;
source filename.sql;
I think this will help you solve it.
I have a ~450MB .db (SQLite3) file that I need to import into my local phpMyAdmin server on XAMPP. I went through the following steps to make this happen, but all of them failed.
- I used sqlitebrowser to create a SQL dump of the .db file.
- When directly importing into phpMyAdmin did not work, I edited php.ini to have the following properties:
upload_max_filesize = 500MB
post_max_size = 500MB
memory_limit = 512MB
max_execution_time = 3600
Then I tried again, without success. Even though phpMyAdmin says I can import files of 500MB, it still gives me an error that my file might be too large.
- I tried using both BigDump and a manual CLI script for importing the dump, but both of them return errors stating 'the system cannot find the file specified' and 'Can't open eve.sql for import'.
At this point I am out of ideas. I would prefer, however, to make the changes to my local server so that it can import files of such size in the future, as I plan to parse some YAML files and use them to update the database with a PHP script.
How about using the mysql command-line utility:
mysql -p -u user -h 127.0.0.1 database < data.sql
It usually handles big imports without problems.
If you just have one big database, use a SQL splitter, or use MySQLDumper for import, export, and queries. It works on local and live environments without needing shell (SSH) access.
This question already has answers here:
What is the easy way to import large database to mysql using phpmyadmin?
(4 answers)
Closed 8 years ago.
I am trying to import a 5MB file in phpMyAdmin.
When I follow the normal procedure of importing, it just makes me wait and nothing happens. That file has been importing since yesterday morning, and it still shows the spinning circle above the Browse button.
Now when I followed the other method:
1. Open C:\wamp\apps\phpmyadmin3.2.0.1\config.inc.php
2. Find the line with $cfg['UploadDir'] on it and update it to:
3. $cfg['UploadDir'] = 'upload';
4. Create a directory called 'upload' within the phpmyadmin directory.
5. Place the file in C:\wamp\apps\phpmyadmin3.2.0.1\upload\
I got this error:
Fatal error: Maximum execution time of 300 seconds exceeded in C:\wamp\apps\phpmyadmin3.5.1\libraries\import\sql.php on line 135
I checked the C:\wamp\apps\phpmyadmin3.5.1\libraries\import\sql.php file but couldn't find anything like 300 or an execution time setting.
Please help me import this 5MB file quickly so that I can proceed with my tool development!
Try this solution, using the commands below:
mysql -u root -p dbname
mysql> SHOW DATABASES;
mysql> USE dbname;
mysql> SHOW TABLES;
mysql> SOURCE D:/your_file_name.sql; -- please give the correct path
Use the MySQL console to import a large SQL file.
Type the command below in the MySQL console:
SOURCE followed by the path of your file, including the file name
For example: SOURCE C:\Users\test\Desktop\.sql
OR: update these values in php.ini:
upload_max_filesize=128M
post_max_size = 128M
max_execution_time = 300
max_input_time = 300
I think you are using Windows, so the following steps may solve your problem:
Open cmd.
Navigate to the MySQL installation directory [type the command cd "C:\wamp\bin\mysql\mysql5.6.12(may be a different version)\bin"]
Now run the command mysql -u <user> -p<password> <dbname> < C:/path/of/file.sql (note there is no space between -p and the password)
This trick worked for me.
Ensure that these options in your php.ini file match your needs:
upload_max_filesize=2M
max_execution_time=300
post_max_size=8M
I am trying to upload a .sql file of size 150MB using the terminal or phpMyAdmin, but it gives errors.
This is what I did before importing the file:
In the php.ini file:
1) post_max_size = 20000M
2) upload_max_filesize = 20000M
3) max_execution_time = 50000
4) max_input_time = 50000
5) memory_limit = 20000M
and in /etc/mysql/my.cnf:
6) max_allowed_packet = 2G
and in /usr/share/phpmyadmin/libraries/config.default.php:
$cfg['ExecTimeLimit'] = '0'; // to make it unlimited; this was 300 by default
Even after all these settings I am getting errors.
When I tried from the terminal with
mysql -u root -p dbname < mydbfile.sql
and entered the password, I got the
error: ERROR 2006 (HY000) at line 23: MySQL server has gone away
When I tried to import the database file using phpMyAdmin, after taking 3-4 hours it also resulted in errors
like: No data received
Is there any other way, like reading the .sql file using PHP and inserting the rows into the database one by one? Is that a good approach?
Any idea what the problem could be?
Thanks in advance!
I think you should split the .sql INSERT commands and import them in multiple phases. Also check the "max connection time" setting.
You can also try http://www.ozerov.de/bigdump/
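A minimal sketch of the splitting idea with standard tools (the file names and the 5000-line chunk size are arbitrary; a naive line-based split like this only works for dumps that have one complete statement per line):

```shell
# Generate a toy one-statement-per-line dump just for illustration.
seq 1 12000 | sed 's/^/INSERT INTO t VALUES (/; s/$/);/' > dump.sql

# Split it into 5000-line chunks named chunk_aa, chunk_ab, ...
split -l 5000 dump.sql chunk_

# Each chunk could then be imported separately, e.g.:
#   mysql -u root -p dbname < chunk_aa
ls chunk_*
```

For real dumps where statements span multiple lines, a dedicated tool like BigDump is the safer option, since it splits on statement boundaries rather than line counts.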
This may be because of max_allowed_packet.
Change it in the my.ini/my.cnf file. Include this single line under [mysqld]:
max_allowed_packet=500M
Now restart the MySQL service once you are done.
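For example, the relevant section of the file might look like this (the 500M value is the example from above, not a recommendation):

```ini
# my.cnf / my.ini -- illustrative excerpt; the line must sit in the [mysqld] section
[mysqld]
max_allowed_packet=500M
```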
You can see its current value in MySQL like this:
SHOW VARIABLES LIKE 'max_allowed_packet';
You can try to change it like this, but it's unlikely this will work on shared hosting:
SET GLOBAL max_allowed_packet=16777216;
You can read about it here: http://dev.mysql.com/doc/refman/5.1/en/packet-too-large.html