I have inherited a web-application project (PHP/MySQL) which makes extensive use of stored procedures, and I am unsure how to transfer these to my local machine for development.
Specifically, it is the exporting and importing of these stored procedures that I am facing difficulty with (with the ultimate aim of being able to automate installation of a "base" version of the application database, complete with stored procedures).
I shall state the bits and pieces that I believe I understand:
I can export the information_schema.ROUTINES table, but cannot import it (phpMyAdmin) onto my local server (I believe that information_schema is a system database?)
As I do not have shell/command-line access to the "online" MySQL servers, tools such as mysqldump are only of very limited use (to port to my local testing machine, but not vice versa)
The command SHOW CREATE PROCEDURE procedure_name; is supposed to output SQL that can be used to create the procedure, but I can't seem to get this functioning correctly (ERROR 1305 (42000): PROCEDURE delAddress does not exist)
Assuming that the above works, the next step is to have it looped to export all stored procedures (..and then run via phpMyAdmin elsewhere, to import them)
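For reference, here is roughly what I have in mind for that loop (a hedged sketch only — it assumes PDO access to the source server and a hypothetical schema name myapp_db; I gather ERROR 1305 can occur when no default database is selected, so the procedure name is fully qualified here):

    <?php
    // Sketch only: substitute real credentials and your own schema name.
    $pdo = new PDO('mysql:host=localhost;dbname=myapp_db;charset=utf8', 'user', 'password');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $dump = "DELIMITER $$\n";

    // List every stored procedure in the schema via information_schema...
    $names = $pdo->query(
        "SELECT ROUTINE_NAME
           FROM information_schema.ROUTINES
          WHERE ROUTINE_TYPE = 'PROCEDURE'
            AND ROUTINE_SCHEMA = 'myapp_db'"
    )->fetchAll(PDO::FETCH_COLUMN);

    // ...then ask MySQL for the CREATE statement of each one.
    foreach ($names as $name) {
        $row = $pdo->query("SHOW CREATE PROCEDURE `myapp_db`.`$name`")
                   ->fetch(PDO::FETCH_ASSOC);
        $dump .= "DROP PROCEDURE IF EXISTS `$name`$$\n";
        $dump .= $row['Create Procedure'] . "$$\n\n";
    }

    $dump .= "DELIMITER ;\n";
    file_put_contents('procedures.sql', $dump);   // import this file elsewhere

(Note also that mysqldump has a --routines flag that includes stored procedures and functions in the dump, which helps for the porting-to-local direction mentioned above.)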
Please correct anything that is incorrect, and provide guidance on how to automate porting between database servers.
Many thanks
Turns out the solution was ridiculously simple.
On the export page, there is an Add CREATE PROCEDURE / FUNCTION / EVENT statement option.
By checking this box, it also exports the SQL necessary to import procedures at a later date/on another server.
Use MySQL Workbench's migration feature. The whole thing is free and it does an amazing job.
Related
I've recently taken over a WordPress site from another 'dev' company. The site is highly unstable and not delivering what the client needs, so I'm currently trying to export the site wholesale to one of our company servers.
Numerous backup plugins have failed for various reasons, so I'm now trying to get an export of the MySQL database via phpMyAdmin to import into our MySQL server. I've taken a full export from the old server and tried importing it into the new server via phpMyAdmin. However, this fails after a period of time, with no real indicator of why.
Next, I exported every table individually and tried to import them into the new server. The first 2/3 or so appear to work, but the latter 1/3 all fail to import, with the error message saying that the table has multiple primary keys declared.
I really need to export the database structure and data from the old server and transfer it to the new one, so I'm really perplexed as to what my next move could be. If these tables have multiple primary keys and this prevents an import, how were they created in the first place?
What can I do to remedy the situation and get the data migrated?
This could be a problem with the dump itself. If it is too big, phpMyAdmin will simply break down after a while. If you have SSH access to your server, you can upload the dump file and import it easily from the shell. If that is not possible, you could try BigDump (http://www.ozerov.de/bigdump/), which imports the dump in chunks so the server no longer times out.
It appears that this was related to some limiting factor with PHP / phpMyAdmin. I was able to import the generated SQL scripts via the SQL tool in Virtualmin / Webmin without any apparent issue.
I am writing for advice on which solution you would recommend.
Here, in a nutshell, is what I have:
An Axapta database on MS SQL Server 2008 R2.
Several SQL queries that use this data through a PHP web application.
A web server running the web application.
No administrator privileges on the Axapta database (only SELECT permission).
Write and modify rights on the web server with its MySQL database.
A lightweight Windows computer permanently connected to the network, on which I have admin rights.
Here's what I want to achieve :
A replica (exact copy) of a few (about 10) tables on the web server as a separate database, which will sync with the Axapta database as often as possible.
Some additional dictionary tables and views in the new database (so the dictionary can be joined into the earlier SQL queries).
For now, I have come up with these approaches:
I tried adding 'replace' and 'case when' to the SQL queries on the Axapta database, without any additional databases. However, with these new, large dictionaries, query performance was poor and waiting for the results drove me crazy ;)
The only thing I could do was a manual export via ODBC to the MySQL database on the web server. Is there a free program to automate this process, e.g. an hourly update of data from the Axapta MSSQL database to the web server's MySQL database (with the help of the light computer I mentioned before)?
Please let me know if you see any other possibilities for expanding the utility of the web application that uses the Axapta database.
I don't know of any SQL sync agent from MS SQL Server to MySQL.
If you are writing your own tool, you can compute the diff yourself (a rough sketch follows the list below):
delete MySQL records whose RecId is no longer present in MS SQL Server
insert new records for MS SQL Server rows whose RecId is not yet known locally
update records whose ModifiedDateTime is greater than your last sync time
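A minimal sketch of that diff for one table, assuming PDO connections on the Windows machine and hypothetical names throughout (a source table dbo.CUSTTABLE with columns RecId, Name, ModifiedDateTime, mirrored into custtable_copy with RecId as its primary key):

    <?php
    // Sketch only: adjust DSNs to whichever MS SQL driver is installed on the
    // Windows box (PDO_SQLSRV here; PDO_ODBC would also work).
    $mssql = new PDO('sqlsrv:Server=axapta-host;Database=AxDB', 'readonly_user', 'secret');
    $mysql = new PDO('mysql:host=webserver;dbname=webapp;charset=utf8', 'app', 'secret');
    $mssql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // Pull the current source rows (SELECT is the only permission available).
    // For large tables, add "WHERE ModifiedDateTime > :lastSync" and store the
    // last sync time somewhere, per the third bullet above.
    $rows = $mssql->query('SELECT RecId, Name, ModifiedDateTime FROM dbo.CUSTTABLE')
                  ->fetchAll(PDO::FETCH_ASSOC);

    // 1) delete local rows whose RecId no longer exists in the source
    $ids = array_column($rows, 'RecId');
    if ($ids) {
        $in = implode(',', array_fill(0, count($ids), '?'));
        $mysql->prepare("DELETE FROM custtable_copy WHERE RecId NOT IN ($in)")
              ->execute($ids);
    }

    // 2) insert unknown RecIds and 3) update changed rows, in one statement
    // (assumes RecId is the primary key of custtable_copy)
    $upsert = $mysql->prepare(
        'INSERT INTO custtable_copy (RecId, Name, ModifiedDateTime)
         VALUES (:id, :name, :mod)
         ON DUPLICATE KEY UPDATE Name = VALUES(Name),
                                 ModifiedDateTime = VALUES(ModifiedDateTime)'
    );
    foreach ($rows as $r) {
        $upsert->execute([
            ':id'   => $r['RecId'],
            ':name' => $r['Name'],
            ':mod'  => $r['ModifiedDateTime'],
        ]);
    }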
In the end I found a powerful solution, and I recommend it to everyone with a similar issue. PDI - Pentaho Data Integration (Community Edition) is a perfect way to do the dictionary transformations I mentioned above. It can even read from .xls or .txt files. You can get it from the link below, and don't forget to be an active developer community member.
http://community.pentaho.com/projects/data-integration/
I have a website running with PHP and MySQL on CentOS (an Amazon instance). The newly brought-in ERP (to be integrated with the existing system) uses Oracle as its DB (located on a separate Windows server). Orders from the website are inserted into the master MySQL database and replicated into the slave MySQL database. These orders need to be pushed to the Oracle DB. I have arrived at 4 methods to do this:
Use a MySQL UDF for HTTP communication that would send the rows, via an INSERT trigger on the slave, to the Oracle web services on the Oracle server
Use a cron job (polling at a short interval, maybe 5 minutes) with a PHP script that fetches the new orders from MySQL and sends them to the Oracle DB via Oracle/PHP services on the Oracle-hosted server (a rough sketch of this option follows the list)
Use the sys_exec() UDF to invoke a PHP script that inserts into the Oracle DB
Use memcached with MySQL and let PHP poll memcached to retrieve data and send it to the Oracle server, though I am unsure whether we could migrate the existing MySQL version to MySQL 5.6
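For clarity, this is roughly what I picture for option 2 (a hedged sketch only — the table and column names are hypothetical, a small erp_sync state table keeps the existing orders table and website code untouched, and the oci8 extension is assumed on the PHP side):

    <?php
    // Sketch of a cron-driven polling script (run e.g. every 5 minutes).
    // erp_sync is a hypothetical one-row state table holding the last pushed id.
    $mysql = new PDO('mysql:host=localhost;dbname=shop;charset=utf8', 'app', 'secret');
    $mysql->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $oracle = oci_connect('erp_user', 'secret', '//oracle-host/ERPDB');

    $lastId = (int) $mysql->query('SELECT last_order_id FROM erp_sync WHERE id = 1')
                          ->fetchColumn();

    // Fetch orders created since the previous run.
    $stmt = $mysql->prepare('SELECT id, customer_id, total FROM orders WHERE id > ? ORDER BY id');
    $stmt->execute([$lastId]);

    $insert = oci_parse($oracle,
        'INSERT INTO web_orders (web_id, customer_id, total) VALUES (:id, :cust, :total)');

    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $order) {
        oci_bind_by_name($insert, ':id',    $order['id']);
        oci_bind_by_name($insert, ':cust',  $order['customer_id']);
        oci_bind_by_name($insert, ':total', $order['total']);
        if (!oci_execute($insert)) {      // commits on success by default
            break;                        // stop so this order is retried next run
        }
        $lastId = $order['id'];
    }

    // Remember how far we got, so nothing is pushed twice.
    $mysql->prepare('UPDATE erp_sync SET last_order_id = ? WHERE id = 1')
          ->execute([$lastId]);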
I already have the UDFs in place and have tested them; they are good to go. But I am still in a dilemma regarding data integrity and reliability when using UDFs with triggers.
Is there a better method for doing this? If not, which of these methods should I follow?
I am aware of the security risks of UDFs; you can't restrict them to a particular user.
One more thing: I am not allowed to introduce new changes to the existing website PHP code for this operation.
SymmetricDS replicates parts of, or entire, database schemas across different vendors' products. It works by installing triggers, and it replicates on a transaction basis, ensuring that transactions get replayed on the target database(s) in the right order.
I would like to synchronize a local MySQL DB with a remote MySQL DB.
Due to internet failures, we have to use a local application. Once the internet problem is solved, the local DB should synchronize with the remote DB.
We are using the same application both locally and remotely.
You can use SQLyog, which has a database synchronization feature, to sync two databases. If you are looking for an open-source tool, have a look at pt-table-sync: http://www.percona.com/doc/percona-toolkit/2.1/pt-table-sync.html
You can make a dump of your local database and import it on the hosted server when ready.
I think I would create a changes table in the local database, and whenever I make any change to the database, it is also saved in the changes table.
For syncing, I'd then:
loop through the changes table and build a query-like string
copy it to the clipboard
have a page on the hosted site with an input field to paste that string into
when submitted, PHP explodes the string into individual queries
loop through the queries and run them against the hosted database
Of course, intensive validation should be applied here to ensure that the pasted text really is a query string copied from the local server. A rough sketch of the receiving side follows.
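This is a bare-bones sketch of the last two steps on the hosted side (field, connection and delimiter choices are hypothetical; the split delimiter has to match however the local export joins its queries, and real authentication and validation are still required):

    <?php
    // Receiving page on the hosted server: split the pasted text into
    // statements and replay them inside a transaction.
    mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
    $db = new mysqli('localhost', 'app', 'secret', 'webapp');

    $batch   = isset($_POST['changes']) ? $_POST['changes'] : '';
    $queries = array_filter(array_map('trim', explode(";\n", $batch)));

    $db->begin_transaction();
    try {
        foreach ($queries as $sql) {
            // Crude whitelist: only the statement types the changes table
            // is expected to contain.
            if (!preg_match('/^(INSERT|UPDATE|DELETE)\b/i', $sql)) {
                throw new Exception('Rejected statement: ' . $sql);
            }
            $db->query($sql);
        }
        $db->commit();
        echo 'Applied ' . count($queries) . ' change(s)';
    } catch (Exception $e) {
        $db->rollback();
        http_response_code(400);
        echo $e->getMessage();
    }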
I am afraid you need to do the export-and-import thing. I don't think there is any automatic way out there... erm, I would like to know as well.
I have a local database, and all the tables are defined. Eventually I need to publish my data remotely, which I can do easily with phpMyAdmin. The problem, however, is that my remote host doesn't allow remote SQL connections at all, so writing a script that does a mysqldump and runs it through a client (which would have been ideal) won't help me here. Since the schema won't change, but the data will, I need some kind of PHP client that works in "reverse".
Edit: I want this as an automated solution, so I don't have to copy/paste the SQL every time I make a change!
My question is whether such a client exists and what you would recommend using (from experience). I just need a one-way trip here, from my local database (Rails) to the remote database (which supports PHP), preferably as simple and slick as possible. Thank you for your replies, comments and feedback!
I believe phpMyAdmin has the ability to upload and execute an SQL file, so you can just import a mysqldump that way.
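If the copy needs to run unattended rather than through phpMyAdmin's upload form, one rough "reverse" approach is a tiny importer script dropped on the remote host, which the local machine POSTs the dump to. Everything below is hypothetical (script name, token, credentials), and it only makes sense over HTTPS with a strong shared secret:

    <?php
    // import.php on the remote host. The local side posts the dump, e.g.:
    //   curl -F token=LONG_RANDOM_SECRET -F dump=@dump.sql https://example.com/import.php
    mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);

    if (!isset($_POST['token']) || $_POST['token'] !== 'LONG_RANDOM_SECRET') {
        http_response_code(403);
        exit('Forbidden');
    }

    $sql = file_get_contents($_FILES['dump']['tmp_name']);

    $db = new mysqli('localhost', 'remote_user', 'secret', 'remote_db');
    $db->multi_query($sql);                     // run the whole dump
    do {                                        // drain every result set
        if ($result = $db->store_result()) {
            $result->free();
        }
    } while ($db->more_results() && $db->next_result());

    echo 'Import finished';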