I am connecting to Oracle using PHP.
Now I have a file containing some data which I want to load into an Oracle database table. The easiest way to do this is to read the file and generate INSERT queries.
But can I load the data from this file on my server into an Oracle table without uploading the file to the Oracle server? I don't have FTP access to it.
Another option for bulk loading data into Oracle tables is the SQL*Loader utility.
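For illustration, a minimal SQL*Loader control file might look like the sketch below; the table name, column names, and file name are placeholders you would replace with your own:

-- cars.ctl: load comma-separated rows into an existing table
LOAD DATA
INFILE 'cars.csv'
APPEND
INTO TABLE cars
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(make, model, quote)

You would then run something like sqlldr userid=myuser/mypassword@//db-host/ORCL control=cars.ctl from your own machine. SQL*Loader is a client-side tool, so the data file never needs to be copied to the database server.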
You don't have to upload the file to the Oracle server in order to do your inserts.
I have a similar example:
I read .csv files containing a lot of records about used cars' makes, models, and quotes.
I have a class which reads these files (from my local PC), prepares INSERT, UPDATE, and DELETE statements, and performs all of these CRUD operations against a REMOTE Oracle server.
Your web server just needs access and authorization to connect to the Oracle instance; that's all.
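As a rough sketch of that approach, assuming the OCI8 extension is installed; the connection details, table, and column names below are made up for illustration:

<?php
// Connect from the web server straight to the remote Oracle instance.
$conn = oci_connect('myuser', 'mypassword', '//oracle-host:1521/ORCL');
if (!$conn) {
    $e = oci_error();
    die('Connection failed: ' . $e['message']);
}

// Prepare one INSERT and reuse it for every row of the local CSV file.
$stmt = oci_parse($conn, 'INSERT INTO cars (make, model, quote) VALUES (:make, :model, :quote)');

$fh = fopen('cars.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    list($make, $model, $quote) = $row;
    oci_bind_by_name($stmt, ':make', $make);
    oci_bind_by_name($stmt, ':model', $model);
    oci_bind_by_name($stmt, ':quote', $quote);
    oci_execute($stmt, OCI_NO_AUTO_COMMIT); // commit once at the end, not per row
}
fclose($fh);

oci_commit($conn);
oci_free_statement($stmt);
oci_close($conn);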
I am trying to set up remote access to the data of Sage 100 Advanced ERP for use within a website running on a MySQL database. I only need to get inventory levels for products, so it's a read-only application.
As I understand it, Sage 100 comes with an ODBC driver that can allow remote access to Sage's flat-file data storage by creating a database view. What I need to do is copy a few fields from that data on the Sage server over to the web server hosting the website.
To automate this process, I assume I'll need to set up a cron job on the web server that runs a PHP script (preferred language) executing SQL queries that connect to the remote server, extract the needed data, and write it to the appropriate tables in the MySQL database. I'm fine with that last step, but I'm unsure how to connect to and retrieve data from the ODBC data source.
How can I connect to and extract Sage 100 data from an ODBC Data Source to write to a MySQL Database on another server?
Or, is there a way to sync/mirror the ODBC Data Source to a MySQL Database on a separate server that I could then use to copy data over to the website's database?
Note: MySQL has documentation on pulling data FROM MySQL using ODBC, but no info on how to import data TO MySQL using ODBC on an external server.
It's actually very easy. Just establish an ODBC connection to the SOTAMAS90 DSN. Your connection string looks like this:
"DSN=SOTAMAS90; UID=MyUserId; PWD=MyPassword; Company=ABC"
Note that by default Sage installs a 32-bit version of the driver, which means you must build your application as 32-bit. Or you can install the 64-bit version of the driver, which can be found in your Sage directory, in the WKSetup folder.
After that, just write code to SELECT * FROM each of the tables you need and write the rows into your MySQL database.
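A minimal PHP sketch of that loop, assuming PDO_ODBC is available; the Sage table and column names (CI_Item, ItemCode, TotalQuantityOnHand) and the MySQL table here are examples to replace with your own schema:

<?php
// Source: the Sage 100 ODBC DSN. Destination: the website's MySQL DB.
$sage  = new PDO('odbc:DSN=SOTAMAS90;UID=MyUserId;PWD=MyPassword;Company=ABC');
$mysql = new PDO('mysql:host=localhost;dbname=website', 'webuser', 'webpass');

// REPLACE keeps the copy idempotent when the cron job reruns.
$insert = $mysql->prepare('REPLACE INTO inventory_levels (item_code, qty_on_hand) VALUES (?, ?)');

foreach ($sage->query('SELECT ItemCode, TotalQuantityOnHand FROM CI_Item') as $row) {
    $insert->execute(array($row['ItemCode'], $row['TotalQuantityOnHand']));
}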
I don't really know MySQL well, but in SQL Server you can set up a Linked Server, point it to SOTAMAS90, and then query the SQL Server database instead of the ODBC driver directly. But it's slow. You'll get much better performance if you run a nightly ETL to populate your MySQL database and query that instead. Be sure to define foreign keys and create indexes for them after you create the tables.
Hope that helps.
Aaron
I often work in MS Access, and I always create linked tables to CSV or TXT files so that when some data changes in a source file, the change appears in the database as well.
Is there a way to create linked tables in a MySQL database used for storing data for a PHP page?
Can phpMyAdmin in XAMPP do this?
MySQL supports a CSV storage engine.
Read the documentation here for more details:
http://dev.mysql.com/doc/refman/5.6/en/csv-storage-engine.html
So you can create a table that is linked to a CSV file, and if you modify the file, the new data will immediately become visible to SQL queries.
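For example (a sketch with made-up columns; note that the CSV engine requires every column to be NOT NULL and supports no indexes):

CREATE TABLE prices (
    sku   VARCHAR(20)   NOT NULL,
    price DECIMAL(10,2) NOT NULL
) ENGINE=CSV;

-- MySQL stores the rows as a plain prices.CSV file inside the
-- database's data directory; edit that file and the new contents
-- show up in subsequent SELECTs.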
Yes, you can create a linked table to MySQL in Access. You will have to install the MySQL ODBC driver on the user's machine and set up an ODBC DSN. Then you can create a linked table to that ODBC connection.
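For reference, a DSN-less connect string for an Access linked table typically looks like the line below; the driver name must match the Connector/ODBC version you installed, and the host, database, and credentials are placeholders:

ODBC;DRIVER={MySQL ODBC 8.0 Unicode Driver};SERVER=myhost;DATABASE=mydb;UID=myuser;PWD=mypassword;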
I need to automate a data flow from MS-Access to MySQL for an insurance company.
Users will continue to work in MS-Access, but the changes done through the Access forms should take effect in MySQL as well.
This is the decision of our web analyst: the web system will not be able to replace all MS-Access functionality that is currently used, so data management will continue to be done in MS-Access for now.
I see two options, and one of the options contains my question:
Automate the data upload through VBA: when changes occur via a form, a request sends the data via GET parameters to a PHP script on the server, which performs the necessary manipulations before storing the data in MySQL. (Uploading directly into MySQL would have been possible as well, but it was decided not to do this.)
Send a simple request to a PHP script and perform all actions in PHP: get the records from MS-Access, manipulate the data, and store it in MySQL, all in PHP. (My preference. I will need to create a script that runs automatic migrations every night anyway, so I will need to do this regardless.)
The MS-Access database is stored on our local network, i.e. on a different server than where the PHP script runs (an external web server).
I would prefer to manage everything in PHP, but (my question): is it possible to connect to the host on our local network and the host of the external server simultaneously, querying data from MS-Access, manipulating it with PHP, and inserting/updating it in MySQL, all in a single script?
Is it possible to connect to the host on our local network and the host of the external server simultaneously, querying data from MS-Access, manipulating it with PHP, and inserting/updating it in MySQL, all in a single script?
Yes, but you don't really need a PHP script: Access can work with MySQL through an ODBC connection or by using ADODB connections. It would be easiest to modify the forms to work with the MySQL back-end directly.
The advantage of the ODBC connection is that you can insert from the Access DB into the MySQL DB in a single query.
Using ADODB connections you can take an Access query and do a batch insert (you'll have to build it as a string), but that can also be relatively quick, since you only connect to the MySQL DB to send the INSERT command; the rest is string manipulation.
If you are going to go with an asynchronous bulk update through PHP:
<?php
// Path to the Access database, relative to the web root.
// Note the escaped backslashes in the Windows path (the original
// "products\products.mdb" was missing a separator after the root).
$dbName = $_SERVER["DOCUMENT_ROOT"] . "\\products\\products.mdb";
if (!file_exists($dbName)) {
    die("Could not find database file.");
}
// Connect through the Microsoft Access ODBC driver (*.mdb files).
$db = new PDO("odbc:DRIVER={Microsoft Access Driver (*.mdb)};DBQ=$dbName;Uid=;Pwd=;");
That code will allow you to connect to the Access back-end (credit to http://www.sitepoint.com/using-an-access-database-with-php/).
Note that the queries will be parsed by Access, so you should write them in Access SQL syntax.
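To answer the single-script question directly, here is a rough sketch that opens both connections at once; the table and column names and the MySQL credentials are invented for illustration:

<?php
// Access back-end on the local network (as above) ...
$access = new PDO("odbc:DRIVER={Microsoft Access Driver (*.mdb)};DBQ=$dbName;Uid=;Pwd=;");
// ... and the MySQL back-end on the external web server.
$mysql = new PDO("mysql:host=external-host;dbname=insurance", "webuser", "secret");

$insert = $mysql->prepare("INSERT INTO policies (policy_no, holder) VALUES (?, ?)");

// Access SQL on the Access handle, MySQL SQL on the MySQL handle.
foreach ($access->query("SELECT PolicyNo, Holder FROM Policies") as $row) {
    // Any data manipulation happens here, in PHP, between the two hosts.
    $insert->execute(array($row["PolicyNo"], $row["Holder"]));
}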
I want to write a web service which receives a SQLite database from a client device (an iPhone app) and syncs it with the server's MySQL DB. Both the SQLite and MySQL databases have the same table structure.
Should I accept a JSON string from the client for each table, parse it on the server end, and store the data in the MySQL DB? Is that the right solution, or is there another way to sync the client and server DBs?
SQLite has an option to dump the DB into a text file containing SQL commands. See Exporting from SQLite to SQL Server and http://sqlite.org/sqlite.html.
You can dump the SQLite DB into a file, have your program tweak the SQL a bit (remove table creation, triggers, and other irrelevant stuff), and run it against the MySQL DB.
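A rough sketch of that pipeline from the command line; file names and credentials are placeholders, and the "tweak" step stands in for whatever SQL rewriting your program does:

# Dump the SQLite DB as plain SQL statements.
sqlite3 app.db .dump > dump.sql
# Tweak dump.sql here: drop PRAGMA lines, transaction wrappers, and the
# CREATE TABLE statements (the tables already exist on the MySQL side).
# Then replay the remaining INSERTs against MySQL.
mysql -u user -p mydb < dump.sql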
A process running on my machine collects data from various websites and stores it in a local MySQL DB. The same data is exported using SELECT INTO OUTFILE and FTPed to the shared host every few hours, but my hosting provider doesn't allow LOAD DATA INFILE to be executed on the shared host. What are my other options for an automated/scheduled load into the MySQL DB on my shared host?
There are actually lots of different solutions. You could export the data as INSERT statements and import that SQL file on the server, import your current outfile dumps using a PHP script instead of LOAD DATA INFILE, or create a web service that lets you update the data instead of your current dump/FTP/import scenario.
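A sketch of the PHP-import option, assuming the dump is the tab-separated format SELECT ... INTO OUTFILE produces by default; the table, file name, and credentials are placeholders:

<?php
// Import the FTPed outfile dump row by row with prepared INSERTs.
$db = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$insert = $db->prepare('INSERT INTO prices (sku, price) VALUES (?, ?)');

$fh = fopen('dump.txt', 'r');
$db->beginTransaction(); // one transaction keeps the bulk insert fast
while (($row = fgetcsv($fh, 0, "\t")) !== false) {
    $insert->execute($row);
}
$db->commit();
fclose($fh);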
I found an excellent solution on this webpage.