I am writing to ask which solution you would recommend.
Here is, in a nutshell, what I have:
An Axapta database on MS SQL Server 2008 R2.
Several SQL queries that use this database's data through a PHP web application.
A web server running the web application.
No administrator privileges on the Axapta database (only SELECT permission).
Write and modify rights on the web server, which hosts a MySQL database.
A light computer with Windows permanently connected to the network, on which I have admin rights.
Here's what I want to achieve:
A replica (exact copy) of a few (about 10) tables on the web server as a separate database, which will sync as often as possible with the Axapta database.
Some dictionary tables and views added to the new database (to be able to join the dictionaries into the earlier SQL queries).
So far, I have come up with these solutions:
I tried adding 'replace' and 'case when' constructs to the SQL queries on the Axapta database itself, without any additional databases. However, with these large new dictionaries, query performance was poor and waiting for the results drove me crazy ;)
The only thing I could do was a manual export via ODBC to the MySQL database on the web server. Is there a free program to automate this process? I mean, e.g., hourly updates from the Axapta MSSQL database to the web server's MySQL database (with the help of the light computer I mentioned before)?
Please let me know if you see any other possibilities to expand the utility of the web application that uses the Axapta database.
I don't know whether there is an SQL sync agent from MS SQL Server to MySQL.
If you are writing your own tool, you can try to compute the diff yourself (a sketch follows this list):
delete MySQL records whose RecId is no longer present in MSSQL
insert new records for MSSQL records with an unknown RecId
update records whose ModifiedDateTime is greater than your last sync
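If you go the write-your-own-tool route, here is a minimal PHP sketch of those three steps, meant to run (e.g. hourly from Task Scheduler) on the light computer. The connection details and the CustTable example with its columns are placeholders I made up; the only real assumptions are the RecId primary key and ModifiedDateTime column that Axapta tables carry, and that $lastSync is persisted between runs.

```php
<?php
// One-table diff sync sketch: MSSQL (Axapta, read-only) -> MySQL replica.
// Table and column names besides RecId/ModifiedDateTime are placeholders.
$mssql = new PDO('odbc:Driver={SQL Server};Server=axapta-host;Database=AX', 'readonly_user', 'secret');
$mysql = new PDO('mysql:host=webserver-host;dbname=ax_replica;charset=utf8', 'web_user', 'secret');
$mssql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$lastSync = '2015-01-01 00:00:00'; // load/store this between runs, e.g. in a sync_log table

// 1) Delete MySQL rows whose RecId no longer exists in the source.
$sourceIds = $mssql->query('SELECT RecId FROM CustTable')->fetchAll(PDO::FETCH_COLUMN);
if ($sourceIds) {
    $idList = implode(',', array_map('intval', $sourceIds));
    $mysql->exec("DELETE FROM CustTable WHERE RecId NOT IN ($idList)");
}

// 2) + 3) Insert unknown RecIds and update changed rows in one pass:
// pull everything modified since the last sync and upsert it.
$src = $mssql->prepare(
    'SELECT RecId, AccountNum, Name, ModifiedDateTime
     FROM CustTable WHERE ModifiedDateTime > ?'
);
$src->execute([$lastSync]);
$upsert = $mysql->prepare(
    'INSERT INTO CustTable (RecId, AccountNum, Name, ModifiedDateTime)
     VALUES (?, ?, ?, ?)
     ON DUPLICATE KEY UPDATE AccountNum = VALUES(AccountNum),
                             Name = VALUES(Name),
                             ModifiedDateTime = VALUES(ModifiedDateTime)'
);
foreach ($src as $row) {
    $upsert->execute([$row['RecId'], $row['AccountNum'], $row['Name'], $row['ModifiedDateTime']]);
}
```

The upsert assumes RecId is the primary key (or a unique key) on the MySQL side, so ON DUPLICATE KEY UPDATE can cover both the insert and the update case.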
At last I found a powerful solution, and I recommend it to everyone who has a similar issue. PDI - Pentaho Data Integration (Community Edition) is a perfect way to do the dictionary transformations I mentioned above. It can even read from .xls or .txt files. You can get it from the link below, and don't forget to be an active developer community member.
http://community.pentaho.com/projects/data-integration/
I am trying to setup remote access to the data of Sage 100 Advanced ERP for use within a website running on a MySQL database. I only need to get inventory levels for products, so it's a read-only application.
As I understand it, Sage 100 comes with an ODBC driver that can allow remote access to Sage's flat-file data storage by creating a database view. What I need to do is copy a few fields from that data on the Sage server over to the web server hosting the website.
To automate this process, I assume I'll need to set up a cron job on the web server that runs a PHP script (preferred language) executing SQL queries that connect to the remote server, extract the needed data, and write it to the appropriate tables in the MySQL database. I'm fine with that last step, but I'm unsure how to connect to and retrieve data from the ODBC data source.
How can I connect to and extract Sage 100 data from an ODBC Data Source to write to a MySQL Database on another server?
Or, is there a way to sync/mirror the ODBC Data Source to a MySQL Database on a separate server that I could then use to copy data over to the website's database?
Note: MySQL has documentation on pulling data FROM MySQL using ODBC, but no info on how to import data TO MySQL using ODBC on an external server.
It's actually very easy. Just establish an ODBC connection to the SOTAMAS90 DSN. Your connection string looks like this:
"DSN=SOTAMAS90; UID=MyUserId; PWD=MyPassword; Company=ABC"
Note that by default Sage installs a 32-bit version of the driver, which means you must target your application to 32 bits. Or you can install the 64-bit version of the driver, which can be found in your Sage directory, in the WKSetup folder.
After that, just write code to SELECT * from each of the tables you need and write the rows into your MySQL database, for example as in the sketch below.
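A hedged PHP sketch of that loop, using the DSN above through PDO_ODBC; the CI_Item table and its columns here are illustrative, so substitute the actual Sage tables and fields you need:

```php
<?php
// Copy a few inventory fields from the Sage 100 ODBC DSN into MySQL.
// Table/column names are examples only - check your own Sage schema.
// Remember the default driver is 32-bit, so run this with a 32-bit PHP
// build (or install the 64-bit driver from the WKSetup folder).
$sage  = new PDO('odbc:DSN=SOTAMAS90;UID=MyUserId;PWD=MyPassword;Company=ABC');
$mysql = new PDO('mysql:host=localhost;dbname=webshop;charset=utf8', 'web_user', 'secret');
$mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$upsert = $mysql->prepare(
    'INSERT INTO inventory (item_code, qty_on_hand) VALUES (?, ?)
     ON DUPLICATE KEY UPDATE qty_on_hand = VALUES(qty_on_hand)'
);
foreach ($sage->query('SELECT ItemCode, QuantityOnHand FROM CI_Item') as $row) {
    $upsert->execute([$row['ItemCode'], $row['QuantityOnHand']]);
}
```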
I don't really know MySQL well, but in SQL Server you can set up a Linked Server, point it at SOTAMAS90, and then query the SQL Server database instead of the ODBC driver directly. But it's slow. You'll get much better performance if you run a nightly ETL to populate your MySQL database and query that. Be sure to define foreign keys and create indexes for them after creating the tables.
Hope that helps.
Aaron
Is there a way to sync specific tables from MySQL to MSSQL and vice versa?
For example:
-Sync tbl_employees(MySQL) to tbl_employees(MSSQL)
-Sync tbl_attendance(MSSQL) to tbl_attendance(MySQL)
I've read about MSSQL Linked Server, but I don't really understand how it works.
What I want is that whenever there are changes to the MySQL database (made through PHP), the changes are synced automatically to the MSSQL database, which is manipulated by a .NET application.
I don't know if I explained it well enough. If you have any questions, please ask in the comments. It would really help me a lot.
Set up a SQL Server Linked Server to MySQL,
then synchronize whichever tables you want.
More details are here:
http://www.ideaexcursion.com/2009/02/25/howto-setup-sql-server-linked-server-to-mysql/
You can manually create a middle-layer application with multiple connections and do the synchronization yourself, but that is only practical for small-scale purposes. Otherwise:
You can create a Linked Server between SQL Server and MySQL Server. You configure the Linked Server provider once; all subsequent connections require only a DSN and the Linked Server.
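For illustration, once such a linked server is configured (here called MYSQL_LINK; the name, the tables, and the connection details are made up for this sketch), a PHP application can read and write the MySQL side through its single SQL Server connection using OPENQUERY:

```php
<?php
// Sketch: two-way table sync through a SQL Server linked server to MySQL.
// MYSQL_LINK and all table names are illustrative.
$mssql = new PDO('sqlsrv:Server=mssql-host;Database=hr', 'app_user', 'secret');
$mssql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Read MySQL's tbl_employees from within SQL Server.
$employees = $mssql->query(
    "SELECT * FROM OPENQUERY(MYSQL_LINK, 'SELECT id, name FROM tbl_employees')"
)->fetchAll(PDO::FETCH_ASSOC);

// Push SQL Server's tbl_attendance rows over to MySQL.
$mssql->exec(
    "INSERT OPENQUERY(MYSQL_LINK, 'SELECT emp_id, checked_in FROM tbl_attendance')
     SELECT emp_id, checked_in FROM tbl_attendance"
);
```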
I have a website running PHP and MySQL on CentOS (an Amazon instance). The newly acquired ERP (to be integrated with the existing system) uses Oracle as its DB (located on a separate Windows server). Orders from the website are inserted into the master MySQL database and replicated to the slave MySQL database. These orders need to be pushed to the Oracle DB. I have arrived at 4 methods to do this:
Use a MySQL UDF for HTTP communication that would send the rows, on an INSERT trigger on the slave, to the Oracle web services on the Oracle server
Use cron jobs (polling at a short interval, maybe 5 minutes) with a PHP script that gets the new orders from MySQL and sends them to the Oracle DB via Oracle/PHP services on the Oracle-hosted server (a sketch of this approach follows the list)
Use a sys_exec() UDF to invoke a PHP script that inserts into the Oracle DB
Use memcached with MySQL and let PHP poll memcached to retrieve data and send it to the Oracle server, though I'm unsure whether we could migrate the existing MySQL installation to the new MySQL 5.6
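As a concrete illustration of method 2, here is a minimal PHP sketch for a 5-minute cron job; all table/column names, the PDO_OCI DSN, and the high-water-mark file are assumptions, not taken from the actual system:

```php
<?php
// Sketch of the cron/polling approach: read orders not yet pushed from the
// MySQL slave and insert them into Oracle via PDO_OCI. Names are illustrative.
$mysql  = new PDO('mysql:host=slave-host;dbname=shop;charset=utf8', 'reader', 'secret');
$oracle = new PDO('oci:dbname=//oracle-host:1521/ERP', 'erp_user', 'secret');
$oracle->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Keep the id of the last pushed order outside MySQL, since the slave is read-only.
$stateFile = '/var/lib/order-push/last_id';
$lastId = (int) @file_get_contents($stateFile);

$orders = $mysql->prepare('SELECT id, customer, total FROM orders WHERE id > ? ORDER BY id');
$orders->execute([$lastId]);

$push = $oracle->prepare('INSERT INTO web_orders (id, customer, total) VALUES (?, ?, ?)');
foreach ($orders as $o) {
    $push->execute([$o['id'], $o['customer'], $o['total']]);
    // Advance the high-water mark only after a successful insert, so a crash
    // re-sends at most the failing order instead of silently skipping it.
    file_put_contents($stateFile, $o['id']);
}
```

This also satisfies the constraint below about not touching the website's PHP code, since it is a standalone script on the cron side.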
I already have the UDFs in place and have tested them; they are good to go. But I'm still in a dilemma about data integrity and reliability when using UDFs with triggers.
Is there a better method for doing this? If not, which of these methods should I follow?
I am aware of the security risks of UDFs; you can't restrict them to a particular user.
One more thing: I am not allowed to introduce changes to the existing website's PHP code for this operation.
SymmetricDS replicates partial or entire database schemas across different vendors' products. It works by installing triggers, but it replicates on a transaction basis, ensuring that transactions get replayed on the target database(s) in the right order.
I'm planning a PHP based application and am hoping to get some advice. Here's the background:
The application will be getting some data from multiple MS SQL Server 2008 databases which are on one remote server (about 50 databases, looking at about 200,000 records across these per day). This data will then need to have various calculations run against each record, and the results displayed to the end user via a web browser.
I'll only have read-only access to the MS SQL Server 2008, and because of this, together with the fact that it's a remote server, I was thinking of putting the data into one or more local MySQL databases (which I have root access to), doing the calculations there, and having the web application query those directly.
My question is this:
In terms of performance, what would be the best way to handle the data transfer process? Should I:
Use PHP to pull the data out of MS SQL 2008 and drop it into MySQL, perhaps using a PDO connection (see the sketch after this question)
Use Linked Servers to do this? (And if so, can I do this with read-only access to the MSSQL server?)
Another method I've not thought of...
Thanks for any help you can give.
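To make option 1 concrete, here is a hedged PHP sketch of the pull: one PDO connection per source database (PDO_DBLIB/FreeTDS on Linux, or the sqlsrv driver on Windows), loading into a local MySQL staging table. All database, table, and column names are invented for the example:

```php
<?php
// Pull the day's records from each of the ~50 MSSQL databases on the remote
// server into a local MySQL staging table. Names are illustrative.
$mysql = new PDO('mysql:host=localhost;dbname=staging;charset=utf8', 'root', 'secret');
$mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$databases = ['client001', 'client002' /* ... ~50 names ... */];
$insert = $mysql->prepare(
    'INSERT INTO records (source_db, record_id, amount, created_at) VALUES (?, ?, ?, ?)'
);

foreach ($databases as $db) {
    // PDO_DBLIB (FreeTDS) suits Linux; on Windows the sqlsrv driver works too.
    $mssql = new PDO("dblib:host=remote-mssql;dbname=$db", 'readonly_user', 'secret');
    $rows = $mssql->query(
        "SELECT record_id, amount, created_at FROM daily_records
         WHERE created_at >= CAST(GETDATE() AS DATE)"
    );
    $mysql->beginTransaction(); // one transaction per source DB keeps each load atomic
    foreach ($rows as $r) {
        $insert->execute([$db, $r['record_id'], $r['amount'], $r['created_at']]);
    }
    $mysql->commit();
}
```

Read-only access on the MSSQL side is enough for this, since all writes happen locally in MySQL.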
I have inherited a web-application project (PHP/MySQL) which makes extensive use of stored procedures, and I am unsure how to transfer these to my local machine for development.
Specifically, it is the exporting and importing of these stored procedures that I am facing difficulty with (with the ultimate aim of being able to automate installation of a "base" version of the application database, complete with stored procedures).
I shall state the bits and pieces that I believe I understand:
I can export the information_schema.ROUTINES table, but cannot import it (phpMyAdmin) onto my local server (I believe that information_schema is a system database?)
As I do not have shell/command-line access to the "online" MySQL servers, tools such as mysqldump are only of very limited use (to port to my local testing machine, but not vice versa)
The command SHOW CREATE PROCEDURE procedure_name; is supposed to output SQL that can be used to create the procedure, but I can't seem to get this functioning correctly (ERROR 1305 (42000): PROCEDURE delAddress does not exist)
Assuming that the above works, the next step is to loop it to export all stored procedures (and then run the result via phpMyAdmin elsewhere to import them); a sketch of such a loop follows.
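Here is a minimal PHP sketch of such a loop, assuming PDO and a hypothetical myapp schema. Qualifying the procedure with its database name sidesteps the ERROR 1305 mentioned above, which typically just means the statement ran without the right default database selected:

```php
<?php
// Dump every stored procedure in one schema to a .sql file that can be
// imported elsewhere (e.g. via phpMyAdmin). Schema name 'myapp' is a placeholder.
$db = new PDO('mysql:host=localhost;dbname=myapp', 'root', 'secret');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$out = "DELIMITER $$\n";
foreach ($db->query("SHOW PROCEDURE STATUS WHERE Db = 'myapp'") as $proc) {
    // Qualify with the schema to avoid ERROR 1305 ("PROCEDURE ... does not exist").
    $create = $db->query("SHOW CREATE PROCEDURE `myapp`.`{$proc['Name']}`")
                 ->fetch(PDO::FETCH_ASSOC);
    $out .= "DROP PROCEDURE IF EXISTS `{$proc['Name']}`$$\n";
    $out .= $create['Create Procedure'] . "$$\n\n"; // column is literally named "Create Procedure"
}
$out .= "DELIMITER ;\n";
file_put_contents('procedures.sql', $out);
```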
Please correct anything that is incorrect, and provide guidance on how to automate porting between database servers.
Many thanks
Turns out the solution was ridiculously simple.
On the export page, there is an Add CREATE PROCEDURE / FUNCTION / EVENT statement option.
Checking this box also exports the SQL necessary to import the procedures at a later date/on another server.
Use MySQL Workbench's migration feature. The whole thing is free and it does an amazing job.