Moving data from SQL Server to MySQL for a PHP application

I'm planning a PHP based application and am hoping to get some advice. Here's the background:
The application will be getting data from multiple MS SQL Server 2008 databases on one remote server (about 50 databases, roughly 200,000 records per day across them). Various calculations then need to be run against each record, and the results displayed to the end user in a web browser.
I'll only have read-only access to the MS SQL Server 2008 instance, and because of that, and the fact that it's a remote server, I was thinking of copying the data into one or more local MySQL databases (to which I have root access), running the calculations there, and letting the web application query MySQL directly.
My question is this:
In terms of performance, what would be the best way to handle the data transfer process? Should I:
Use PHP to pull the data out of MS SQL Server 2008 and drop it into MySQL, perhaps over a PDO connection (a minimal sketch of this appears below)
Use Linked Servers to do this? (And if so, can I do that with read-only access to the MSSQL server?)
Another method I've not thought of...
Thanks for any help you can give.
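For what it's worth, option 1 can be fairly compact. A minimal sketch, assuming the pdo_sqlsrv driver is available (on Linux a dblib/FreeTDS DSN would replace it); the host names, credentials, and the orders table are all placeholders:

<?php
// Sketch of option 1: pull rows from the remote MS SQL Server via PDO
// and bulk-load them into a local MySQL database. All names below
// (hosts, credentials, the orders table) are placeholders.

$mssql = new PDO("sqlsrv:Server=remote-host;Database=SourceDb", "readonly_user", "secret");
$mysql = new PDO("mysql:host=localhost;dbname=local_db;charset=utf8mb4", "root", "secret");
$mssql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$select = $mssql->query("SELECT id, customer, amount, created_at FROM dbo.orders");
$insert = $mysql->prepare(
    "INSERT INTO orders (id, customer, amount, created_at)
     VALUES (:id, :customer, :amount, :created_at)
     ON DUPLICATE KEY UPDATE customer = VALUES(customer), amount = VALUES(amount)"
);

// One transaction around the whole batch; per-row autocommit is the usual
// performance killer at ~200,000 rows per day.
$mysql->beginTransaction();
while ($row = $select->fetch(PDO::FETCH_ASSOC)) {
    $insert->execute($row);
}
$mysql->commit();

Whichever route you pick, batching is what matters for performance: anything that avoids one round trip per row will dominate.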

Related

Is there any problem or consideration for creating 1000 SQLite databases on a PHP server?

In an Android application, users each have their own store, and I want to create a separate SQLite database for each of them on one PHP server. Is there any problem or consideration with creating, say, 1000 SQLite databases on a PHP server?
Having a lot of databases on a PHP server shouldn't be a problem. Even having a thousand databases open is more a matter of your server's resources (RAM/CPU) than a PHP limitation. You should, however, design the system to limit the number of users who can hold an open database connection at once; otherwise you are in danger that somebody could kill your server with a three-line script.
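The usual pattern for that is to open each user's database lazily and release it as soon as the request is done, so the number of open databases tracks active users rather than total users. A minimal sketch; the per-store file layout and the session field are assumptions:

<?php
// Open the per-user SQLite database only for the duration of the request.
// The data directory layout and the session field are assumptions.
session_start();

$userId = (int) $_SESSION["user_id"];
$path   = __DIR__ . "/data/store_{$userId}.sqlite";

$db = new PDO("sqlite:" . $path);
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// ... run this user's queries here ...

$db = null; // release the handle as soon as the work is done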

Sync Sage 100 Data from an ODBC Data Source to a MySQL Database

I am trying to set up remote access to the data of Sage 100 Advanced ERP for use within a website running on a MySQL database. I only need to get inventory levels for products, so it's a read-only application.
As I understand it, Sage 100 comes with an ODBC driver that can allow remote access to Sage's flat-file data storage by creating a database view. What I need to do is copy a few fields from that data on the Sage server over to the web server hosting the website.
To automate this process, I assume I'll need to set up a cron job on the web server that runs a PHP script (my preferred language) executing SQL queries that connect to the remote server, extract the needed data, and write it to the appropriate tables in the MySQL database. I'm fine with that last step, but I'm unsure how to connect to and retrieve data from the ODBC data source.
How can I connect to and extract Sage 100 data from an ODBC Data Source to write to a MySQL Database on another server?
Or, is there a way to sync/mirror the ODBC Data Source to a MySQL Database on a separate server that I could then use to copy data over to the website's database?
Note: MySQL has documentation on pulling data FROM MySQL using ODBC, but no info on how to import data TO MySQL using ODBC on an external server.
It's actually very easy. Just establish an ODBC connection to the SOTAMAS90 DSN. Your connection string looks like this:
"DSN=SOTAMAS90; UID=MyUserId; PWD=MyPassword; Company=ABC"
Note that by default Sage installs a 32-bit version of the driver, which means you must target your application to 32 bits. Or you can install the 64-bit version of the driver, which can be found in your Sage directory, in the WKSetup folder.
After that, just write code to SELECT * FROM each of the tables you need and write the rows into your MySQL database.
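In PHP that could look roughly like the sketch below, using the SOTAMAS90 DSN from above. The table and column names are placeholders (check the Sage data dictionary for the real ones), and depending on the driver the Company parameter may have to live in the DSN configuration rather than the connect call:

<?php
// Sketch: copy inventory levels from the Sage 100 ODBC source into MySQL.
// Table and column names are placeholders; see the Sage data dictionary.

$sage  = odbc_connect("SOTAMAS90", "MyUserId", "MyPassword");
$mysql = new PDO("mysql:host=localhost;dbname=shop;charset=utf8mb4", "web", "secret");
$mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$result = odbc_exec($sage, "SELECT ItemCode, QuantityOnHand FROM Inventory");
$stmt   = $mysql->prepare(
    "REPLACE INTO inventory (item_code, qty_on_hand) VALUES (?, ?)"
);

while ($row = odbc_fetch_array($result)) {
    $stmt->execute([$row["ItemCode"], $row["QuantityOnHand"]]);
}
odbc_close($sage);

Run from cron, that is essentially the whole nightly ETL.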
I don't really know MySQL well, but in SQL Server you can set up a Linked Server, point it at SOTAMAS90, and then query the SQL Server database instead of the ODBC driver directly. But it's slow. You'll get much better performance if you can run a nightly ETL to populate your MySQL database and query that instead. Be sure to define the tables first, then add foreign keys and create indexes for them.
Hope that helps.
Aaron

How to add additional information to Axapta Database without administrative privileges?

I am writing to ask for advice on which solution you would recommend.
Here, in a nutshell, is what I have:
An Axapta database on MS SQL Server 2008 R2.
Several SQL queries that use the database's data through a PHP web application.
A webserver, which runs the web application.
No administrator privileges on the Axapta database (only SELECT permission).
Write and modify rights on the webserver, which has a MySQL database.
A lightweight Windows machine permanently connected to the network, on which I have admin rights.
Here's what I want to achieve:
A replica (exact copy) of a few (about 10) tables on the webserver as another database, syncing with the Axapta database as often as possible.
Some dictionary tables and views added to the new database (so the dictionaries can be joined into the earlier SQL queries).
For now, I have come up with these approaches:
I tried adding REPLACE and CASE WHEN to the SQL queries against the Axapta database itself, without any additional databases. However, with these large dictionaries, query performance was poor and waiting for the results drove me crazy ;)
The only other thing I could manage is a manual export via ODBC to the MySQL webserver database. Is there a free program to automate this process? I mean, e.g., hourly updates from the Axapta MSSQL database to the webserver MySQL database (with the help of the light machine I mentioned before)?
Please let me know if you see any other possibilities for expanding the utility of the web application that uses the Axapta database.
I don't know whether there is an SQL sync agent from MS SQL Server to MySQL.
If you are writing your own tool, you can compute the diff yourself (a PHP sketch follows this list):
Delete MySQL records whose RecId is no longer present in MSSQLServer
Insert new records for MSSQLServer records with an unknown RecId
Update records whose ModifiedDateTime is greater than your last sync
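A rough sketch of those three steps; connection details, the CustTable example, and the hard-coded $lastSync (which would really come from wherever you persist the last sync time) are all placeholders:

<?php
// Three-step diff against Axapta, keyed on RecId and ModifiedDateTime.
$mssql = new PDO("sqlsrv:Server=ax-server;Database=Axapta", "reader", "secret");
$mysql = new PDO("mysql:host=localhost;dbname=mirror", "web", "secret");
$lastSync = "2015-01-01 00:00:00"; // in practice, read this from a sync-state table

// 1. Delete MySQL rows whose RecId no longer exists in MSSQLServer.
$ids = $mssql->query("SELECT RecId FROM CustTable")->fetchAll(PDO::FETCH_COLUMN);
if ($ids) {
    $in = implode(",", array_map("intval", $ids));
    $mysql->exec("DELETE FROM CustTable WHERE RecId NOT IN ($in)");
}

// 2 + 3. REPLACE covers both cases: unknown RecIds are inserted, and rows
// modified since the last sync are rewritten (this assumes ModifiedDateTime
// is also set when a record is created).
$changed = $mssql->prepare(
    "SELECT RecId, AccountNum, Name, ModifiedDateTime
       FROM CustTable WHERE ModifiedDateTime > ?"
);
$changed->execute([$lastSync]);

$upsert = $mysql->prepare(
    "REPLACE INTO CustTable (RecId, AccountNum, Name, ModifiedDateTime)
     VALUES (?, ?, ?, ?)"
);
while ($row = $changed->fetch(PDO::FETCH_NUM)) {
    $upsert->execute($row);
}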
In the end I found a powerful solution, and I recommend it to everyone who has a similar issue. PDI (Pentaho Data Integration, Community Edition) is a perfect way to do the dictionary transformations I mentioned above. It can even read from .xls or .txt files. You can get it from the link below, and don't forget to be an active developer community member.
http://community.pentaho.com/projects/data-integration/

MySQL to Oracle Communication

I have a website running on PHP and MySQL on CentOS (an Amazon instance). The newly acquired ERP (to be integrated with the existing system) uses Oracle as its DB (located on a separate Windows server). Orders from the website are inserted into the master MySQL database and replicated to the slave MySQL database. These orders need to be pushed to the Oracle DB. I have arrived at four methods to do this.
Use a MySQL UDF for HTTP communication that sends the rows, on an INSERT trigger on the slave, to the Oracle web services on the Oracle server
Use cron jobs (at a low interval, maybe 5 minutes, polling) with a PHP script that gets the new orders from MySQL and sends them to the Oracle DB via Oracle/PHP services on the Oracle-hosted server (a sketch of this follows the list)
Use a sys_exec() UDF to invoke a PHP script that inserts into the Oracle DB
Use memcached with MySQL and let PHP poll memcached to retrieve the data and send it to the Oracle server, though I'm unsure whether we could migrate the existing MySQL version to the newer MySQL 5.6
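A rough sketch of option 2, assuming the oci8 extension is available on the polling host; the table, column, and state-table names are all placeholders:

<?php
// Cron-driven poller: copy orders the slave has seen since the last run
// into Oracle. The sync_state table is an assumed high-water-mark store.

$mysql  = new PDO("mysql:host=slave-host;dbname=shop", "poller", "secret");
$oracle = oci_connect("erp_user", "secret", "//oracle-host:1521/ERPDB");

$lastId = (int) $mysql->query("SELECT last_order_id FROM sync_state")->fetchColumn();

$orders = $mysql->prepare("SELECT id, customer, amount FROM orders WHERE id > ? ORDER BY id");
$orders->execute([$lastId]);

$stmt = oci_parse($oracle,
    "INSERT INTO erp_orders (id, customer, amount) VALUES (:id, :customer, :amount)");

foreach ($orders as $row) {
    oci_bind_by_name($stmt, ":id", $row["id"]);
    oci_bind_by_name($stmt, ":customer", $row["customer"]);
    oci_bind_by_name($stmt, ":amount", $row["amount"]);
    oci_execute($stmt, OCI_NO_AUTO_COMMIT); // commit once at the end
    $lastId = (int) $row["id"];
}
oci_commit($oracle);

// Only advance the high-water mark after Oracle has committed.
$mysql->prepare("UPDATE sync_state SET last_order_id = ?")->execute([$lastId]);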
I already have the UDFs in place and have tested them; they are good to go. But I'm still in a dilemma regarding data integrity and reliability when using UDFs with triggers.
Is there a better method for doing this? If not, which of these methods should I follow?
I am aware of the security threats of UDFs: you can't restrict them to a particular user.
One more thing: I am not allowed to introduce changes to the existing website's PHP code for this operation.
SymmetricDS replicates parts of, or entire, database schemas across different vendors' products. It works by installing triggers, but it replicates on a transaction basis, ensuring that transactions get replayed on the target database(s) in the right order.

Automate data flow from MS Access to MySQL

I need to automate a data flow from MS-Access to MySQL for an insurance company.
Users will continue to work in MS Access, but changes made through the Access forms should take effect in MySQL as well.
This is the decision of our web analyst: the web system will not be able to replace all of the MS Access functionality currently in use, so data management will continue to be done in MS Access for now.
I see two options, and one of the options contains my question:
Automate the data upload through VBA (when changes occur via a form, an AJAX request sends the data via GET parameters to a PHP script on the server, which performs the necessary manipulations before storing the data in MySQL; an immediate upload into MySQL would have been possible too, but it was decided not to do that);
Send a simple request to a PHP script and perform all actions in PHP: get the records from MS Access in PHP, manipulate the data in PHP, and store it in MySQL from PHP. (My preference; I will need to create a script that runs automatic migrations every night anyway, so I will need to do this regardless.)
The MS Access database is stored on our local network, i.e. on a different server than the one the PHP script runs on (an external web server).
I would prefer to manage everything in PHP, but (my question):
Is it possible to connect to the host on our local network and the host of the external server simultaneously, querying data from MS Access, manipulating it with PHP, and inserting/updating it in MySQL at the same time, in a single script?
Yes, but you don't really need a PHP script: Access can work with MySQL through an ODBC connection or by using ADODB connections. It would be easiest to modify the forms to work with the MySQL back-end.
The advantage of the ODBC connection is that you can insert from the Access DB into the MySQL DB in a single query.
Using ADODB connections you can take an Access query and do a batch insert (you'll have to build it as a string), but that can also be relatively quick, since you only connect to the MySQL DB to send the INSERT command; the rest is string manipulation.
If you are going to go with an asynchronous bulk update through PHP:
<?php
// Build the filesystem path to the Access file. Backslashes need escaping
// (or use forward slashes) inside a double-quoted PHP string, and
// DOCUMENT_ROOT does not normally end with a separator.
$dbName = $_SERVER["DOCUMENT_ROOT"] . "\\products\\products.mdb";

if (!file_exists($dbName)) {
    die("Could not find database file.");
}

// Connect to the .mdb through the Microsoft Access ODBC driver.
$db = new PDO("odbc:DRIVER={Microsoft Access Driver (*.mdb)}; DBQ=$dbName; Uid=; Pwd=;");
That code will allow you to connect to the Access back-end (credit to http://www.sitepoint.com/using-an-access-database-with-php/).
Note that the queries will be parsed by Access, so you should write them in Access SQL syntax.
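To answer the "single script" question directly: yes, one PHP script can hold both connections at once. A sketch building on the snippet above, with illustrative paths, credentials, and a hypothetical policies table; note the Access ODBC driver needs filesystem access to the .mdb, so the script must run somewhere that can reach that file or share:

<?php
// One script, two connections: read from Access via ODBC, transform in
// PHP, write to MySQL. Paths, credentials, and the policies table are
// illustrative.

$dbName = "\\\\fileserver\\share\\products.mdb"; // UNC path to the Access file (placeholder)
$access = new PDO("odbc:DRIVER={Microsoft Access Driver (*.mdb)}; DBQ=$dbName; Uid=; Pwd=;");
$mysql  = new PDO("mysql:host=web-host;dbname=site;charset=utf8mb4", "web", "secret");
$mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$rows   = $access->query("SELECT PolicyId, Holder, Premium FROM policies");
$upsert = $mysql->prepare(
    "REPLACE INTO policies (policy_id, holder, premium) VALUES (?, ?, ?)"
);

foreach ($rows as $row) {
    // Any PHP-side manipulation happens here before the write.
    $upsert->execute([$row["PolicyId"], trim($row["Holder"]), $row["Premium"]]);
}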
