I have a problem when importing a table from SQL Server to MySQL.
This is what I do:
First, I export a CSV file with the data of the Clients table from SQL Server.
Then I import that CSV file into an auxiliary table in MySQL.
Finally, I execute a query to insert the new data into the Clients table in MySQL.
Is there any way to automate this process? I've tried several methods (such as linking both servers and configuring them), but without success.
English is not my first language.
Kind regards.
You should be able to create a linked server between them and execute your insert directly into MySQL. Once that code is working, put it into a scheduled job.
MySQL Linked Server Set Up
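A rough sketch of what the job could run, assuming a linked server named MYSQL_LINK built on a MySQL ODBC DSN called MySQL_DSN, and Clients tables on both sides with matching ClientID and ClientName columns (all of these names are placeholders):

-- one-time setup: register the MySQL server as a linked server via the ODBC provider
EXEC master.dbo.sp_addlinkedserver
    @server = 'MYSQL_LINK',
    @srvproduct = 'MySQL',
    @provider = 'MSDASQL',
    @datasrc = 'MySQL_DSN';

-- push rows from SQL Server into MySQL through the linked server
INSERT INTO OPENQUERY(MYSQL_LINK, 'SELECT ClientID, ClientName FROM Clients')
SELECT ClientID, ClientName
FROM dbo.Clients;

Once that works interactively, the INSERT can be wrapped in a SQL Server Agent job so the transfer runs on a schedule.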
I have a MySQL database into which I want to import data from multiple CSV files. For the data I created a table into which I want to merge (join) the several files. Unfortunately, my data is so big that it takes a very long time until everything is stored in the table. Hence the question: what is the best way to deal with a huge amount of data?
I went ahead and created a temporary table for each CSV file and loaded the data into it. Then I joined all the tables and wanted to insert the result of my query into the big table, and that is where I ran into the long waiting time. I would like to limit the solutions to the following languages: MySQL, PHP. So far I have used the DataGrip GUI and its SQL console for importing these files.
Use any data integration tool like Pentaho, then follow the steps below:
Pentaho has a CSV input step
You can join multiple CSV files using a join step
Select all the columns from the merged output
Then push them to MySQL using a DB connector output step
There is a pretty neat library that does exactly this. It helps you migrate data from one source to another, and it does so pretty quickly.
https://github.com/DivineOmega/uxdm
You could use a shell script to loop through the files (this one assumes they're in the current directory):
#!/bin/bash
for f in *.csv
do
    # LOCAL makes the mysql client send the file itself, so the path is relative to where the script runs;
    # FIELDS TERMINATED BY ',' is needed because LOAD DATA defaults to tab-separated input
    mysql -e "LOAD DATA LOCAL INFILE '$f' INTO TABLE my_table FIELDS TERMINATED BY ','" -u username --password=your_password my_database
done
You can achieve this easily with Pentaho Data Integration (an ETL tool).
It provides a CSV data input step in which you can specify your CSV file; then link it to a table output step in which you can use a JDBC or JNDI connection to your MySQL database.
I am using R for neural networks. I have a XAMPP server installed on Windows. I want to create a trigger on a table in MySQL which, after an insertion into that table, runs an R script stored on the server itself (along with data in an R object, i.e. an .RData file).
Any kind of help would be appreciated.
MySQL triggers run only SQL code. They don't have access to anything outside the MySQL daemon. The only way this could work is if there were a plugin for MySQL for communicating with R; plugins can make new functions available which could then be used, for example, in triggers. However, I am not aware of such a MySQL plugin for R.
Your other option is to periodically connect to MySQL from your R code to check if new data have been inserted.
If you need to know exactly which data was inserted, you can create an insert trigger on the data table which adds a new row to a separate newdata table containing the ID of each inserted row. Your R code then checks for rows in the newdata table, reads the IDs, fetches the corresponding rows from the data table, and finally removes those entries from newdata.
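A minimal sketch of that trigger, assuming the main table is called data and has an integer primary key id (adjust the names to your schema):

CREATE TABLE newdata (data_id INT NOT NULL);

CREATE TRIGGER data_after_insert
AFTER INSERT ON data
FOR EACH ROW
INSERT INTO newdata (data_id) VALUES (NEW.id);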
I don't know if you know how to access MySQL from R code. One option is to use the RMySQL package. A good tutorial is, for example, here.
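With that in place, the polling side comes down to a couple of queries that the R code runs on each check, using the same placeholder names as above:

-- fetch the rows that were flagged as new
SELECT d.* FROM data d JOIN newdata n ON n.data_id = d.id;
-- and clear the queue afterwards
DELETE FROM newdata;

In practice you would want to select and delete the same set of IDs (for example inside a transaction), so that rows inserted between the two statements are not lost.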
I have to transfer selected MySQL data from one server's DB to another server's DB, but the issue is that I can only use native PHP code.
I could do it using shell_exec or exec in PHP, but I am not allowed to use any Linux shell commands.
I have to fetch selected data from the table and then do the SQL dump; the amount of data is also huge, so I don't know whether it is possible with native PHP code, because the session may time out due to the large amount of data.
What I have tried:
https://github.com/2createStudio/shuttle-export
It supports PHP-based SQL dumps, but I need selected data from the table with conditions like
select * from customer where table_id=47
I have checked a few more examples but haven't found anything concrete.
Please suggest how I can proceed.
Thanks
I am currently trying to combine two MySQL database installations into a single installation. I have already used a batch script to export each individual database to SQL files so they can be imported into the MySQL instance that is being kept.
The problem is that each individual database has a unique user assigned to it which also needs to be brought over. When doing this in the past, I imported the "mysql" database along with the rest, and this caused corruption.
What is the best way to export ONLY the users from the "mysql" database and import them into a different MySQL instance?
Use the --no-create-info option to mysqldump to keep it from dropping and re-creating the old table on the target server.
If you have any overlap in the usernames on the two installations, use the --insert-ignore option so that the duplicates will be ignored when merging.
So the command is:
mysqldump --no-create-info --insert-ignore mysql user > user.sql
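On the target instance the dump can then be loaded and the privilege tables reloaded, for example:

mysql -u root -p mysql < user.sql
mysql -u root -p -e "FLUSH PRIVILEGES;"

FLUSH PRIVILEGES is needed because the dump modifies the grant tables directly, which bypasses the account-management statements MySQL normally uses to refresh its in-memory privileges.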
If you are using SQLyog, then:
go to the table which you need to export to the other host/database
right-click on the table
select "Copy Table To Different Host/Database"
Hope it is helpful
I'm building a big database locally using MySQL and phpMyAdmin, and I'm constantly adding a lot of data to it. I now have more than 10 MB of data and I want to export the database to the server, but I have a 10 MB file size limit in the Import section of my web host's phpMyAdmin.
So, the first question is: how can I split the data (or something like that) to be able to import it?
BUT, because I'm constantly adding new data locally, I also need to export the new data to the web host database.
So the second question is: how do I update the database if the newly added data is mixed in among all the 'old/already uploaded' data?
Don't use phpMyAdmin to import large files. You'll be way better off using the mysql CLI to import a dump of your DB. Importing is very easy: transfer the SQL file to the server and then execute the following on the server (you can launch this command from a PHP script using shell_exec or system if needed):
mysql --user=user --password=password database < database_dump.sql
Of course the database has to exist, and the user you provide should have the necessary privileges to update it.
As for syncing changes: that can be very difficult, and it depends on a lot of factors. Are you the only party providing new information, or are others adding new records as well? Are you going to modify the table structure over time as well?
If you're the only one adding data and the table structure doesn't vary, then you could use a boolean flag or a timestamp to determine which records need to be transferred. Based on that field you could create partial dumps with phpMyAdmin (by writing a SQL query and clicking Export at the bottom, making sure you only export the data) and import them as described above.
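For instance, a minimal sketch of such a partial dump query, assuming a table named clients with a DATETIME column updated_at (both placeholder names), exporting only rows changed since the last upload:

SELECT * FROM clients WHERE updated_at > '2015-06-01 00:00:00';

You would record the time of each successful export and plug that value in on the next run.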
BTW You could also look into setting up a master-slave scenario with MySQL, where your data is transferred automatically to the other server (just another option, which might be better depending on your specific needs). For more information, refer to the Replication chapter in the MySQL manual.
What I would do, in 3 steps:
Step 1:
Export your db structure, without content. This is easy to do on the export page of phpMyAdmin. After that, I'd insert it into the new db.
Step 2:
Add a new BOOL column to every table in your local db. Its purpose is to store whether a row is new or not. Because of this, set the default to true.
Step 3:
Create a PHP script which connects to both databases. The script needs to get the data from your local database and put it into the new one.
I would do this with the following MySQL statements: http://dev.mysql.com/doc/refman/5.0/en/show-tables.html, http://dev.mysql.com/doc/refman/5.0/en/describe.html, plus SELECT, UPDATE and INSERT.
Then you have to run your script every time you want to sync your local PC with the server.
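A sketch of the SQL such a script would run, assuming a table named clients and a flag column named is_new as described in step 2 (both names are placeholders):

-- step 2, run once per table on the local database
ALTER TABLE clients ADD COLUMN is_new BOOL NOT NULL DEFAULT TRUE;

-- step 3, on every sync run: read the pending rows locally ...
SELECT * FROM clients WHERE is_new = TRUE;
-- ... insert them into the server database, then mark them as transferred locally
UPDATE clients SET is_new = FALSE WHERE is_new = TRUE;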