I am exporting the Magento database from one installation via phpMyAdmin, and I have tried exporting it as .sql and as tar.gz.
When I import that database into another installation, I get no errors, but tables are missing: I only have about half of the tables, and I can see that after the letter L (the log_visitors table) there are no tables at all.
It's very strange. Could someone give me an idea of what is going on?
On my version of phpMyAdmin, there is a checkbox with the following text on the Import page:
Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. (This might be a good way to import large files, however it can break transactions.)
Make sure that checkbox is unchecked (by default it is checked), else it may be causing your problem.
So we turned on the "enclose in transaction" and "disable foreign keys" checkboxes in phpMyAdmin while exporting, and also split the export into two .sql files. It worked.
Note that the problem wasn't the standard PHP timeout or phpMyAdmin's usual limits on uploading big files, as we had previously also tried BigDump and SSH commands for the import and they didn't work.
As Willem stated, you could have that option checked when you import the SQL file.
But if you're still not able to import it after unchecking that option, the file you are importing is probably too big.
In that case you can use the mysql command-line utility if you have SSH access to your server, or you can try BigDump, a PHP script that lets you import a database by splitting the SQL file into chunks!
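In case it helps, here is a minimal PHP sketch of the chunked idea behind BigDump. It assumes the dump file holds plain SQL statements whose terminating ";" falls at the end of a line, and the file name and credentials are placeholders:

<?php
// Minimal sketch of a chunked SQL import -- the same basic idea BigDump uses.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
if ($mysqli->connect_error) {
    die('Connect failed: ' . $mysqli->connect_error);
}

$handle = fopen('dump.sql', 'r');
$query  = '';
while (($line = fgets($handle)) !== false) {
    // Skip comment lines and blank lines
    if (substr($line, 0, 2) === '--' || trim($line) === '') {
        continue;
    }
    $query .= $line;
    // Run the statement once its terminating semicolon is reached
    if (substr(rtrim($line), -1) === ';') {
        if (!$mysqli->query($query)) {
            die('Error: ' . $mysqli->error . "\nQuery: " . $query);
        }
        $query = '';
    }
}
fclose($handle);
$mysqli->close();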
Hope it helps :)
I'm trying to install MediaWiki (1.29.1 or 1.27.3) locally with a large Wiktionary dump (3GB).
After converting the XML dump into an SQL file and importing the latter into my DB, which I created with this script, I followed the MediaWiki installation instructions in the browser to generate my specific "LocalSettings.php". I get the message:
"There are MediaWiki tables in this database. To upgrade them to MediaWiki 1.29.1, click Continue."
After clicking the "Continue" button, the browser stays in a loading state forever.
My understanding is that my DB containing the Wiktionary dump has some tables that are not compatible with the version of Wikimedia that I'm using. Therefore, an update of the DB is required.
I tried to run install.php from the command line to avoid the browser timing out. The command didn't return anything, even after waiting more than 2 hours.
I tried as well a workaround:
Create my DB with empty Tables
Generate "LocalSettings.php" from the browser (that was fast since the DB is small)
Import the wiki sql dump to my DB
Refresh the index.php page
I then got a blank page with this message:
Exception caught inside exception handler. Set $wgShowExceptionDetails = true; and $wgShowDBErrorBacktrace = true; at the bottom of LocalSettings.php to show detailed debugging information.
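For reference, enabling that detailed output just means appending these two standard MediaWiki settings to LocalSettings.php:

// At the bottom of LocalSettings.php: show full exception details and the
// database error backtrace so the underlying error becomes visible.
$wgShowExceptionDetails = true;
$wgShowDBErrorBacktrace = true;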
All the examples and tutorials that I found online about this matter assume or use a small or newly created DB.
Any idea what's wrong? Has anyone really tried to use an existing Wikimedia dump and run it locally? Why is there no such advanced example?
You wrote "I'm trying to install Wikimedia (1.29.1 or 1.27.3)". I suppose that you are talking about MediaWiki, not Wikimedia. Am I right?
1) You can try a parsed version of Wiktionary. It is a little bit old (2014): http://whinger.krc.karelia.ru/soft/wikokit/index.html
2) You can try my tutorial about downloading the Wiktionary dump, uploading it to MySQL, and converting and parsing it into something more useful to work with: Getting started Wiktionary parser.
See: MySQL import
At the first level, the issue originates from mwdumper, which seems to be outdated. The SQL DB I generated using mwdumper is missing some tables, which should nonetheless have been created by running update.php. It was not possible for me to run any PHP file, either from the shell or from the browser, and I suspect the size of the dump is the cause.
The workaround which, by some magic, helped to overcome this issue was:
Run update.php from the shell with missing DB credentials. This somehow enables logs and makes it possible to execute index.php through the browser.
Manually add the missing table columns claimed in the error messages (the column types should be respected here); see the hypothetical sketch after this list.
Place a LocalSettings.php file, easily generated from a Wiktionary DB with empty tables, in the right directory of the MediaWiki installation.
Run index.php from the browser
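To make step 2 concrete, here is a purely hypothetical example; the table name, column name, and type are made up, so use exactly what your own error messages report. It is written with PHP/mysqli for consistency with the rest of this page, but the same ALTER TABLE can just as well be run in phpMyAdmin or the mysql client:

<?php
// Hypothetical: add a column that an error message complains about.
// Replace the table, column, and type with what your errors actually report.
$mysqli = new mysqli('localhost', 'wiki_user', 'wiki_pass', 'wiktionary');
$mysqli->query('ALTER TABLE page ADD COLUMN page_lang VARBINARY(35) DEFAULT NULL');
echo $mysqli->error ? $mysqli->error . "\n" : "Column added\n";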
Et voilà! The huge Wiktionary MySQL dump is now queryable through the MediaWiki interface. I'm not sure whether such a trick can be called a solution, but it solved the problem in my case. An explanation of what could have happened in the background would definitely be helpful.
I am running Magento 1.6.2.
I built a dataflow profile to import my customers/logins. All goes well there. I need to be able to run this every day, and I have seen some code out there to trigger dataflow profiles to run from the command line, but I can't figure out how to run the profile with a new CSV each time. Heck, I can't even find where it put the CSV that I uploaded! Any ideas where it might be? And does anybody know the best way to run a profile from the CLI?
Thanks
Well, I'll answer my own question:
The directory where it puts files uploaded interactively is /tmp/magento/var/import/
Turns out you don't need that information, because you don't want the interactive upload; you want the hard-coded path set in the "remote/local server" setting.
I got the command-line import working by using the Ho_ShellImport extension found here: https://github.com/ho-nl/Ho_ShellImport. It's free and easy (no affiliation).
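For completeness, a dataflow profile can also be triggered without an extension by bootstrapping Mage from a small CLI script. This is only a rough sketch: the profile ID and the assumption that the script lives in the Magento root are mine, so adjust both for your installation.

<?php
// run_profile.php -- rough sketch of running a Magento 1.x dataflow profile
// from the command line. Assumes the script sits in the Magento root;
// the profile ID below is hypothetical.
require_once 'app/Mage.php';
Mage::app('admin');

$profileId = 8; // hypothetical ID of the customer import profile
$profile   = Mage::getModel('dataflow/profile')->load($profileId);
if (!$profile->getId()) {
    exit("Profile $profileId not found\n");
}

// Some adapters expect the current profile to be registered before running
Mage::register('current_convert_profile', $profile);
$profile->run();

echo "Profile $profileId finished\n";

Point the profile's "remote/local server" path at the file your nightly job drops in place, and a cron entry calling this script covers the daily run.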
I have data that is added monthly; every month I must import more than 30 XLSX files! When I use Navicat, I must map each XLSX column to a MySQL table column, and I have to do it again and again. It's wasting my time ...
I was wondering whether I can import from XLSX to MySQL automatically, using PHP, Navicat, or maybe something else? Please help me.
SOLVED:
I am using SimpleXLSX; it's fast and efficient. Highly recommended for everyone!
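For anyone who wants a starting point, here is a rough sketch of such an import. It assumes the SimpleXLSX::parse()/rows() API of the shuchkin/simplexlsx library, placeholder credentials, and made-up table and column names, so map those to your own schema:

<?php
// Rough sketch of an XLSX-to-MySQL import using SimpleXLSX.
require_once 'SimpleXLSX.php'; // adjust the path to your copy of the library

$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');

if ($xlsx = SimpleXLSX::parse('monthly_data.xlsx')) {
    // Hypothetical target table and columns -- change these to your schema
    $stmt = $mysqli->prepare('INSERT INTO monthly_data (col_a, col_b, col_c) VALUES (?, ?, ?)');
    foreach ($xlsx->rows() as $i => $row) {
        if ($i === 0) {
            continue; // skip the header row
        }
        $stmt->bind_param('sss', $row[0], $row[1], $row[2]);
        $stmt->execute();
    }
    $stmt->close();
} else {
    echo SimpleXLSX::parseError();
}

Looping over the 30 monthly files with glob('*.xlsx') then removes the repeated manual mapping entirely.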
You can write a simple Perl script that will do the following:
Open your Excel file(s) for reading using the DBD::Excel driver.
Open a MySQL server connection using the DBD::mysql driver.
Read data from Excel using SELECT and insert data into MySQL using INSERT.
Profit! :-)
To make it work on Windows, you should install the following free software:
ActiveState Perl from http://activestate.com/perl, then DBD::mysql and DBD::Excel as follows:
ppm install DBD::Excel
ppm install DBD::mysql
And finally, you need to write your Perl script. It should not be very difficult if you follow the documentation at the links above.
I would go for PHPExcel
Good examples and docs are supplied.
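As a taste of what that looks like (class names as in the PHPExcel library; the file name and require path are placeholders):

<?php
// Read all rows of the active worksheet into a plain array with PHPExcel.
require_once 'PHPExcel/IOFactory.php'; // adjust to where PHPExcel is installed

$workbook = PHPExcel_IOFactory::load('monthly_data.xlsx');
$rows     = $workbook->getActiveSheet()->toArray();

foreach ($rows as $row) {
    // $row is an array of cell values; insert into MySQL as needed
    print_r($row);
}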
Try the Data Import tool in dbForge Studio for MySQL.
Customize the data import once and save a template file (import options, field mapping, etc.), then reuse it many times. It is also possible to run the data import in command-line mode.
We need to import a MySQL dump from another website into ours. We've been trying SimpleXML, XML DOM, etc., but the file is so huge it's crashing our server. We looked into BigDump, but that doesn't handle XML imports. Every tag in our XML file is called <field name="something">, inside <table_something> tags, which I haven't seen before; usually it's descriptive custom tags. This is probably because I haven't done much database importing before now.
What we would like is some way to make our PHP import this huge file. It needs to be refreshed every night, so I'm thinking of dropping the tables and importing from scratch, unless there's a way to detect differences, but I wouldn't know how.
Can anyone help with this? What would be the standard procedure for achieving these results?
Moving a database is usually done outside of PHP, via a command-line script:
Dump db to a file
tar the file
FTP to new server
Then on the new server
untar the file
Import to mysql
Do you have shell access?
If you have to do it via PHP, you'll need to split up the dump into lots of tiny files and import them one at a time. Depending on the size of your database, this could take a really long time.
When doing the dump, minimize the file size by not using XML, stripping comments, etc.
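If you do end up going the PHP route with the XML dump, a streaming parser such as XMLReader avoids loading the whole file into memory, which is what makes SimpleXML and DOM crash on a huge file. A rough sketch, assuming the rows look like <row><field name="col">value</field></row> and using a placeholder file name, credentials, and table name; check the structure against your actual dump:

<?php
// Stream a huge XML dump with XMLReader so it is never held in memory at once.
$reader = new XMLReader();
$reader->open('dump.xml');

$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
$doc    = new DOMDocument();

while ($reader->read()) {
    if ($reader->nodeType == XMLReader::ELEMENT && $reader->name == 'row') {
        // Expand only this <row> element and read its <field> children
        $row  = simplexml_import_dom($doc->importNode($reader->expand(), true));
        $cols = array();
        $vals = array();
        foreach ($row->field as $field) {
            $cols[] = '`' . (string) $field['name'] . '`';
            $vals[] = "'" . $mysqli->real_escape_string((string) $field) . "'";
        }
        $mysqli->query(
            'INSERT INTO `imported_table` (' . implode(', ', $cols) . ') ' .
            'VALUES (' . implode(', ', $vals) . ')'
        );
    }
}
$reader->close();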
Is there a way to export privileges/users out of phpMyAdmin version 3.3.9? And yes, in a format that could later be imported into a new installation.
It would be good if database relations and so on were kept as well.
If phpMyAdmin cannot handle it, a MySQL command-line solution will work too.
Thanks in advance!
Basically, you want to dump some tables from the mysql database, such as columns_priv, db, tables_priv, and user.
As far as I remember, phpMyAdmin has a configuration option to hide some databases, but you can still access the mysql database by adding ?db=mysql to the URL.
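A complementary approach, not the table dump described above but often handier for moving accounts between servers, is to generate re-runnable GRANT statements. A rough PHP sketch with placeholder credentials (the connecting account needs read access to the mysql database):

<?php
// Print one GRANT statement per privilege so the output can be replayed
// on the new server. Credentials below are placeholders.
$mysqli = new mysqli('localhost', 'root', 'root_pass', 'mysql');

$users = $mysqli->query('SELECT User, Host FROM mysql.user');
while ($u = $users->fetch_assoc()) {
    $grants = $mysqli->query(
        "SHOW GRANTS FOR '" . $mysqli->real_escape_string($u['User']) .
        "'@'" . $mysqli->real_escape_string($u['Host']) . "'"
    );
    while ($g = $grants->fetch_row()) {
        echo $g[0] . ";\n"; // paste these lines into the new server
    }
}

The printed statements can be pasted straight into the mysql client or phpMyAdmin on the new installation.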