Use MySQL and PostgreSQL in the same PHP script

I'm trying to migrate data from an old site to a new site. The old site uses MySQL and the new site uses PostgreSQL. I wrote a migration script in PHP that queries info from the old DB so that I can insert it into the new DB within that same script. The reason I need the script is that I have to call other functions that manipulate the data, since the table columns aren't a one-for-one match, so I can't just do a backup-and-restore type of thing. I have a class for each DB that I use.
The MySQL queries work but the Postgres ones don't. They fail with error messages saying pg_query(): 19 is not a valid PostgreSQL link resource in xxx.
So is it possible to run them both in the same script? If I run the two scripts separately it works OK, but then I can't get the data from the old server to the new one.
I've looked everywhere and don't see many questions about needing to use both DBs in one file.
Any help would be cool.

You are using the same variable for both resources and passing the MySQL resource to the PostgreSQL function.
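For illustration, a minimal sketch of that separation, assuming the old procedural mysql_* and pg_* extensions that the error message points to (the hosts, credentials, and table/column names below are placeholders):

<?php
// Keep each connection in its own variable and always pass it explicitly.
$mysql_link = mysql_connect('old-host', 'user', 'pass');                       // old site (MySQL)
mysql_select_db('old_db', $mysql_link);

$pg_link = pg_connect('host=new-host dbname=new_db user=user password=pass');  // new site (PostgreSQL)

$result = mysql_query('SELECT id, name FROM old_table', $mysql_link);
while ($row = mysql_fetch_assoc($result)) {
    // ...massage the data here, then write it using the PostgreSQL
    // resource, never $mysql_link.
    pg_query_params($pg_link,
        'INSERT INTO new_table (id, name) VALUES ($1, $2)',
        array($row['id'], $row['name']));
}

If either variable gets overwritten between the two calls, the pg_* functions end up being handed a MySQL resource, which produces exactly the "is not a valid PostgreSQL link resource" error.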

Related

Database name in Wordpress / PDO error reporting

I apologize in advance if this turns out to be a stupid question, but I've been racking my brain for an hour already.
I normally work in PHP with Laravel, but now I've been plunged into a project that uses a Wordpress database and a custom framework which uses PDO queries directly from the controller (no models exist); I need help figuring something out since I'm very inexperienced in Wordpress and PDO.
The client gave me a link to a database named simb2317419733; it has a Wordpress structure and the table prefix is wp_wd5t1y9832_.
However, the queries on the site seem to reference tables that don't exist in this database. For example, the following query is trying to insert into a table named answers but:
1) no such table exists in the database
2) no error is thrown
Here is the query info:
http://pastebin.com/n08LnFbK
Notice that all the info matches the database above (host, user, password, prefix) but the database name itself is just wordpress. Is this a normal occurrence in Wordpress, or is this a case of the client simply giving me the wrong database which happens to have the same prefix? Or is the answers table missing but PDO isn't reporting the error for some reason?
Check how that plugin creates its database connection. There are two options: either it connects to the same database as the rest of WordPress using the constants from wp-config.php, or it has its own configuration where you can provide different connection details (set either in the wp-admin panel or in some config file). If you figure out that it is using the same database, then there are again two possibilities: as you said, you've received the wrong information, or the plugin is not installed and hasn't set up its tables.
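As a side note on the "no error is thrown" part, a small hedged check (the connection details and column names here are made up): PDO is silent by default, so switching it to exception mode and asking the server which schema it is really in usually settles both questions quickly.

<?php
// Make PDO report errors loudly and confirm which database the handle uses.
$pdo = new PDO('mysql:host=localhost;dbname=wordpress', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);  // default is silent

var_dump($pdo->query('SELECT DATABASE()')->fetchColumn());      // the schema actually in use

// With ERRMODE_EXCEPTION, an insert into a missing table now throws
// instead of failing without a word.
$pdo->exec("INSERT INTO answers (question_id, body) VALUES (1, 'test')");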

Force MySQL to write back

I have an issue where an instance of Solr is querying my MySQL database to refresh its index immediately after an update is made to that database, but the Solr query is not seeing the change made immediately prior.
I imagine the problem has to be something like Solr is using a different database connection, and somehow the change is not being "committed" (I'm not using transactions, just a call to mysql_query) before the other connection can see it. If I throw a sufficiently long sleep() call in there, it works most of the time, but obviously this is not acceptable.
Is there a PHP or MySQL function that I can call to force a write/update/flush of the database before continuing?
You might make Solr use SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED to get a more prompt view of the updated data.
You should be able to do this with the transactionIsolation property of the JDBC URL.
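If you want to double-check the MySQL side from PHP first, a rough sanity test (assuming mysqli, InnoDB, and made-up table names) is to commit the update explicitly and read it back over a second connection running READ COMMITTED, which is roughly what the Solr connection should be doing:

<?php
$writer = new mysqli('localhost', 'user', 'pass', 'mydb');
$writer->autocommit(false);
$writer->query("UPDATE articles SET title = 'updated' WHERE id = 1");
$writer->commit();                         // nothing is visible elsewhere until this returns

$reader = new mysqli('localhost', 'user', 'pass', 'mydb');
$reader->query('SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED');
$row = $reader->query('SELECT title FROM articles WHERE id = 1')->fetch_assoc();
var_dump($row['title']);                   // should already read 'updated'

If this shows the new value but Solr still lags, the isolation level on Solr's JDBC connection is the likely culprit.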

Converting the Server MYSQL DB to SQLite DB

I have a huge database on the server and I need that DB in SQLite so that I can use it in my Android app as well as in my iOS app.
The one solution I found: in phpMyAdmin I select my DB on the server, export the tables one by one into CSV files, and then import those into my SQLite browser one by one to get all the tables (and then correct the column names and types manually by editing every table's columns).
This way I end up with a .sqlite DB to be used in the app.
But I want to know more about the points below:
Is there some kind of backend application that most developers use to convert their DB into an SQLite DB? (If yes, what kind of tools do they use?)
Is there any PHP script that can do this? (If yes, what script is used and how? One possible sketch appears after this question.)
Is there any other simple way to get an SQLite DB from the server? (If yes, what are the possible ways to do it?)
Can anyone give me some idea about this?
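On the PHP-script point, here is a minimal sketch of one way such a converter could look, assuming the pdo_mysql and pdo_sqlite drivers are installed; for simplicity every column is created as TEXT, and the connection details are placeholders:

<?php
// Copy every MySQL table's rows into a local SQLite file via PDO.
$mysql  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass',
                  array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));
$sqlite = new PDO('sqlite:' . __DIR__ . '/mydb.sqlite');
$sqlite->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

foreach ($mysql->query('SHOW TABLES')->fetchAll(PDO::FETCH_COLUMN) as $table) {
    // Recreate the table in SQLite (all columns as TEXT; adjust types by hand if needed).
    $cols = $mysql->query("SHOW COLUMNS FROM `$table`")->fetchAll(PDO::FETCH_COLUMN);
    $sqlite->exec("CREATE TABLE IF NOT EXISTS \"$table\" (\"" . implode('" TEXT, "', $cols) . '" TEXT)');

    // Copy the rows with one reusable prepared statement.
    $insert = $sqlite->prepare("INSERT INTO \"$table\" VALUES (" . implode(',', array_fill(0, count($cols), '?')) . ')');
    $sqlite->beginTransaction();
    foreach ($mysql->query("SELECT * FROM `$table`", PDO::FETCH_NUM) as $row) {
        $insert->execute($row);
    }
    $sqlite->commit();
}

For very large databases a dedicated export tool is usually faster, but a loop like the one above covers the simple case.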

Rename DB and Copy DB Structure to new DB - MySQL & PHP or Ruby

Reference: Copy Database Structure of MySQL Database
Here's my problem... I have a site that I use PPC to drive traffic to. I track the visitors' keyword, PPC source, ad versions, etc. Currently I store this data in a MySQL DB (InnoDB) named visits. However, when this PPC campaign is running full throttle it generates a lot of data. Every so often my site crashes because this DB fills up and stops responding. (And because I forget to manually do a copy and empty...)
So now I want to create a PHP or Ruby script that runs once a week/month to put the gathered data into an archive DB and empty the DB used for data collection. I assume the fastest way is to rename the existing visits DB to something with a date stamp in the name, like visits_010113_020113 for the month of Jan 2013, then create a new visits with only the structure. The primary key is a 32-char hash generated by PHP's md5 function, so duplicate keys due to auto-increment are not an issue.
(I chose a DB to store the data in because I'm familiar with DBs and I wanted to be able to parse data for custom reporting. I am open to suggestions of a different architecture but I don't want to be spending the next 3 weeks coding up new classes and such for a new architecture right now.)
I ran a Google search on copying the structure of a DB to a new DB (the first result is the one I referenced above and most of the rest of the first page were very similar). However, the solutions all use mysqldump through the CLI. I want to do everything via PHP or Ruby. I could use an SSH class I have for PHP to execute the CLI but that seems like a hack.
I was hoping there was a simple SQL statement I could pass to do the renaming and copying. My preferred solution would be entirely in PHP. I use PHP 5.3.10-1ubuntu3.6 with Suhosin-Patch, mysql 5.5.29-0ubuntu0.12.04.2, and Ubuntu 12.04 server. I also use PHP's PDO object to interface with MySQL.
Thanks
So this would require you to have a list of the tables that need to be copied, but I like
CREATE TABLE cur_db.tbl_name LIKE old_db.tbl_name
So your script could rename the DB, create the new db, then run this in a loop over your table names.
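A hedged PDO sketch of that loop (the table names and archive-name format are illustrative; note that MySQL has no RENAME DATABASE, so "renaming" here means moving each table into a freshly created archive schema with RENAME TABLE):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=visits', 'user', 'pass',
               array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

$archiveDb = 'visits_' . date('mdy');         // e.g. visits_020113
$tables    = array('clicks', 'keywords');     // your list of tables to rotate

$pdo->exec("CREATE DATABASE IF NOT EXISTS `$archiveDb`");

foreach ($tables as $tbl) {
    // Move the table (data + structure) into the archive schema...
    $pdo->exec("RENAME TABLE `visits`.`$tbl` TO `$archiveDb`.`$tbl`");
    // ...then recreate an empty table with the same structure for new data.
    $pdo->exec("CREATE TABLE `visits`.`$tbl` LIKE `$archiveDb`.`$tbl`");
}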

Inserting several hundred records into mysql database using mysqli & PHP

I'm making a Flex 4 application and using ZendAMF to interact with a MySQL database. I got Flex to generate most of the services code for me (which utilizes mysqli) and roughly edited some of the code (I'm very much a novice when it comes to PHP). All works fine at this point.
My problem is - currently the application inserts ~400 records into the database when a user is created (it's saving their own data for them to load at a later date) but it does this with separate calls to the server - i.e. each record is sent to the server, then saved to the database and then the next one is sent.
This worked fine in my local environment, but since going on a live webserver it only adds these records some of the time; other times it will totally ignore them. I'm assuming it's doing this because the live database doesn't like getting spammed with hundreds of requests at practically the same time.
I'm thinking a more efficient way would be to package all of these records into an array, send that to the server just the once and then get the PHP service to do multiple inserts on each item in the array. The problem is, I'm not sure how to go about coding this in PHP using mysqli statements.
Any help or advice would be greatly appreciated - thanks!
Read up on LOAD DATA LOCAL INFILE. It seems it's just what you need for inserting multiple records: it inserts many records from a file (though not an array, unfortunately) into your table in one operation.
It's also much, much faster to do multiple-record UPDATEs with LOAD DATA LOCAL INFILE than with one-per-row UPDATEs.
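For example, a hedged sketch of feeding the array through LOAD DATA LOCAL INFILE from mysqli (the table and column names are made up, and the local_infile option has to be allowed on both the client and the server):

<?php
$mysqli = mysqli_init();
$mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true);            // must be set before connecting
$mysqli->real_connect('localhost', 'user', 'pass', 'mydb');

// $records is the array of ~400 rows sent from Flex, e.g. array(array(1, 'setting_a', 'on'), ...)
$tmp = tempnam(sys_get_temp_dir(), 'rows');
$fh  = fopen($tmp, 'w');
foreach ($records as $row) {
    fputcsv($fh, $row);
}
fclose($fh);

$mysqli->query(
    "LOAD DATA LOCAL INFILE '" . $mysqli->real_escape_string($tmp) . "'
     INTO TABLE user_settings
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'
     (user_id, name, value)"
);
unlink($tmp);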
You should handle the user defaults separately and only store the changes.
And if the records are not saved, you must check the warnings from MySQL or the errors from PHP; data doesn't just disappear.
Try
error_reporting(-1);
just before inserting, and
mysqli::get_warnings()
afterwards.
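Putting the two answers together with the array idea from the question, a hedged mysqli sketch (table, column, and variable names are made up): reuse one prepared statement for every row, wrap the loop in a transaction, and surface errors and warnings instead of letting them pass silently.

<?php
error_reporting(-1);
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);   // throw on any mysqli failure

$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$mysqli->autocommit(false);                                   // one transaction for the whole batch

$stmt = $mysqli->prepare('INSERT INTO user_settings (user_id, name, value) VALUES (?, ?, ?)');
$stmt->bind_param('iss', $userId, $name, $value);

foreach ($records as $row) {              // $records is the array sent from Flex
    list($userId, $name, $value) = $row;
    $stmt->execute();
}
$mysqli->commit();

// Anything MySQL only warned about (truncated values, bad dates, ...) shows up here.
if ($mysqli->warning_count && ($w = $mysqli->get_warnings())) {
    do {
        error_log($w->message);
    } while ($w->next());
}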
