WordPress WooCommerce second SQL database connection issue - PHP

I have just transferred a backup of a WordPress site to a new domain. I loaded all of the site's files using FTP and imported the main database (called boxedsco_master.sql) into the new site's database using phpMyAdmin. The new prefix for the databases is "boxeipxy_" rather than "boxedsco_". I also updated wp-config.php to connect to the master database, which worked fine.
However, there was also a second database called boxedsco_boxes.sql, which I also uploaded using phpMyAdmin; it is now called "boxeipxy_boxes.sql".
The site is mainly working; however, there are no notification emails when an order is placed, either to the customer or to my client. I believe this could be because the second database is not correctly connected, as it contains tables with names like "wp_easycontactforms_customforms".
How do I connect this second database? Would there be a PHP file to update with the new database name? I can't find a PHP file that references "boxedsco_boxes.sql" in order to update it.
Using WordPress 3.8.5 and WooCommerce 2.0.20.

No wonder no one responded! There should not be two databases! I thought it was very strange!
The email notification issue was a bug with WooCommerce not liking the send-from address! Changing it to "wordpress@[yourdomain].com" fixed the issue completely!
Thanks anyway folks!

The fact that you have two files with .sql suffixes doesn't mean there are two databases. Crack open the files and look inside. They are most likely two dumps of the same database; check whether or not they contain the same tables and/or data.
As the philosopher says, "a poem about the moon is not the moon". An SQL dump isn't a database; it's a representation of one. Most likely you have two poems about the same moon.

Related

Setting Up a Connection to a Database from a Website and Running Queries

I am working on a project for a Database Systems class where we created a database and have to present it somehow. Our group had the idea to create an 'IMDb'-type website where various information about movies could be stored and presented to the user. We have our database created and are about to begin work on the front end of the website.
We have some confusion among our group as to how to proceed. We understand that we need to use PHP to connect to the database with the username and password credentials, but from there:
Do you have to have the PHP connection code on every webpage, or just the home page?
What do the queries look like when communicating with the database to retrieve and display data?
Would a WordPress site be easier to use when working in this capacity as it's more or less a 'Proof of Concept' type project, not a fully fleshed out site?
As itoctopus said, every page will need to connect to the database. I have found the easiest way to accomplish this is to build your DB connection in a "header" file (a file that is "included" in every PHP page that needs access to the DB).
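A minimal sketch of that pattern, assuming MySQLi; the file names, credentials, and table are placeholders, not anything from the original project:

<?php
// db.php - connection include, required by every page that needs the DB.
// Host, credentials, and database name are placeholders.
$db = new mysqli('localhost', 'username', 'password', 'movies_db');
if ($db->connect_error) {
    die('Database connection failed: ' . $db->connect_error);
}

<?php
// index.php - any page that needs data just pulls in the include.
require_once 'db.php';

// Fetch the ten newest movies (same shape as the query shown below).
$result = $db->query("SELECT `name` FROM `movies` ORDER BY `ID` DESC LIMIT 0, 10");
while ($row = $result->fetch_assoc()) {
    echo htmlspecialchars($row['name']), "<br>\n";
}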
Here are your answers:
Yes, you will need to have the connection on every page. Usually, the connection parameters and script are located in one file, which you require on every page.
A query looks like:
SELECT `name` FROM `movies` ORDER BY `ID` DESC LIMIT 0, 10;
You may find some WordPress plugins which do something similar. If it's a proof of concept, then I suggest you go that way (if such a plugin exists). It's also a good idea to read extensively on PHP/MySQL development, or to hire someone who can do that, if you want to make the project real.

Copying a MySQL/Debian 8 database to a different platform

I found variations on how to make copies of a DB and fill in its tables within the same platform, but I want to port to a different platform.
I tried archiving the folder with the same name as my DB from /var/lib/mysql, changing the folder name, then extracting it back to /var/lib/mysql.
When I fired up phpMyAdmin, it showed the new database name, but no tables.
I expect the same thing to happen when I extract on the new platform, but there I won't be able to insert from a local copy.
Which directories and/or files need to be tweaked for MySQL on Debian 8 to show the tables?

How to merge local and live databases?

We've been developing for WordPress for several years, and whilst our workflow has been upgraded at several points, there's one thing that we've never solved: merging a local WordPress database with a live database.
So I'm talking about having a local version of the site where files and data are changed, whilst the data on the live site is also changing at the same time.
All I can find is the perfect-world scenario of pulling the site down, nobody (not even customers) touching the live site, then pushing the local site back up, i.e. copying one thing over the other.
How can this be done without running a tonne of MySQL commands? (It feels like they could fall over if they're not properly checked!) Can this be done via Gulp (I've seen it mentioned) or a plugin?
Just to be clear, I'm not talking about pushing/pulling data back and forth via something like WP Migrate DB Pro, BackupBuddy or anything similar - this is a merge, not replacing one database with another.
I would love to know how other developers get around this!
File changes are fairly simple to get around; it's the data changes that cause the nightmare.
WP Stagecoach does do a merge, but you can't work locally; it creates a staging site from the live site that you're supposed to work on. The merge works great, but it's a killer blow not to be able to work locally.
I've also been told by the developers that datahawk.io will do what I want, but there's no release date on that.
It sounds like VersionPress might do what you need:
VersionPress staging
A couple of caveats: I haven't used it, so I can't vouch for its effectiveness, and it's currently in early access.
Important: take a backup of the live database before merging local data into it.
Following these steps might help in migrating a large percentage of the data and merging it into the live site:
Go to the WP back end of the local site, Tools -> Export.
Select the "All content" radio button (if not selected by default).
This will produce an XML file containing all the local data, comprising all default post types and custom post types.
Open this XML file in Notepad++ or any editor and find-and-replace the local URL with the live URL (a PHP sketch of this step appears after the list).
Now visit the live site and import the XML under Tools -> Import.
Upload the files (images) manually.
This will bring a large percentage of the data from local to live.
For the rest of the data you will have to write custom scripts.
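A minimal sketch of the find-and-replace step in PHP, in case the export file is too large for an editor; the file names and URLs are placeholders:

<?php
// replace-urls.php - rewrite the local URL to the live URL in the export.
// File names and both URLs are placeholders; adjust to your own export.
$xml = file_get_contents('local-export.xml');
$xml = str_replace('http://local.example.test', 'https://www.example.com', $xml);
file_put_contents('live-import.xml', $xml);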
Risk factors are:
When uploading the images from local to live, images of the same name will be overwritten.
WordPress saves the images in post_meta, generating serialized data for the images, which should be taken care of when uploading the database.
The serialized data in post_meta for post_type="attachment" covers three or four dimensions (sizes) of each image.
When importing the data, usernames or email IDs of users may already exist (WP checks for unique usernames and emails), in which case those users will not be imported.
If I were you I'd do the following (slow, but it affords you the greatest chance of success).
First off, set up a third database somewhere. Cloud services would probably be ideal, since you could get a powerful server with an SSD for a couple of hours. You'll need that horsepower.
Second, we're going to mysqldump the first DB and pipe the output into our cloud DB (note the target database name at the end of the mysql command; a single-database dump doesn't include a USE statement, so without it the import has no default database):
mysqldump -u user -ppassword dbname | mysql -u root -ppass -h somecloud.db.internet dbname
Now we have a full copy of DB #1. If your cloud supports snapshotting data, be sure to take one now.
The last step is to write a PHP script that, slowly but surely, selects the data from the second DB and writes it into the third. We want to do this one record at a time. Why? Because we need to maintain the relationships between records. Take comments and posts: when we pull post #1 from DB #2, it can't keep ID #1, because DB #1 already had a post #1, so post #1 becomes, say, post #132. That means all the comments for post #1 now need to be written as belonging to post #132. You'll also have to pull the records for the users who made those posts, because their user IDs will also change.
There's no easy fix for this, but the WP structure isn't terribly complex. Building a simple loop to pull the data and translate it shouldn't be more than a couple of hours of work.
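A minimal sketch of that loop, assuming MySQLi connections and the standard WP table names; the hosts, credentials, and column handling are placeholders, not a drop-in implementation:

<?php
// Hypothetical record-by-record copy from DB #2 into the merged DB #3.
$src = new mysqli('localhost', 'user', 'pass', 'db_two');                // DB #2
$dst = new mysqli('somecloud.db.internet', 'root', 'pass', 'db_merged'); // DB #3

$postIdMap = []; // old post ID in DB #2 => new post ID in DB #3

// 1) Copy posts one at a time, letting auto-increment assign new IDs.
$posts = $src->query("SELECT * FROM wp_posts");
while ($post = $posts->fetch_assoc()) {
    $oldId = $post['ID'];
    unset($post['ID']); // drop the old primary key

    $cols  = '`' . implode('`, `', array_keys($post)) . '`';
    $marks = implode(', ', array_fill(0, count($post), '?'));
    $vals  = array_values($post);
    $stmt  = $dst->prepare("INSERT INTO wp_posts ($cols) VALUES ($marks)");
    $stmt->bind_param(str_repeat('s', count($vals)), ...$vals);
    $stmt->execute();

    $postIdMap[$oldId] = $dst->insert_id; // remember the translation
}

// 2) Copy comments, rewriting the post reference through the map.
$comments = $src->query("SELECT * FROM wp_comments");
while ($c = $comments->fetch_assoc()) {
    unset($c['comment_ID']);
    $c['comment_post_ID'] = $postIdMap[$c['comment_post_ID']] ?? 0;
    // ...insert into $dst exactly as above; users (post_author) need
    // the same old-ID-to-new-ID treatment.
}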
If I understand you correctly: to merge local and live databases, I have until now used other software, such as Navicat Premium, which has a Data Synchronization feature.
This can be achieved live using Spring XD: create a JDBC stream to pull data from one DB and insert it into the other. (This acts as streaming, so you don't have to disturb either environment.)
The first thing you need to do is assess whether it would be easier to do some copy-paste data entry instead of a migration script. Sometimes the best answer is to suck it up and do it manually using the CMS interface. This avoids any potential conflicts from merging primary keys, but you may need to watch for references like the creator of a post or similar data.
If it's just outright too much to manually migrate, you're stuck with writing a script or finding one that is already written for you. Assuming there's nothing out there, here's what you do...
ALWAYS MAKE A BACKUP BEFORE RUNNING MIGRATIONS!
1) Make a list of what you need to transfer. Do you need users, posts, etc.? Find the database tables and add them to the list.
2) Make a note of all possible foreign keys in the database tables being merged into the new database. For example, wp_posts has post_author referencing wp_users. These will need specific attention during the migration. Use the WordPress database schema documentation to help find them.
3) Once you know what tables you need and what they reference, you need to write the script. Start by figuring out what content is new to the other database. The safest way is to do this manually with some kind of side-by-side list. However, you can come up with your own rules on how to automatically match table rows; maybe check for $post1->post_content === $post2->post_content in cases where the text needs to be the same. The only catch here is that the primary/foreign keys are off limits for these rules.
4) How do you merge new content? The general idea is that all primary keys will need to be changed for any new content. You want to take everything except the ID of the post and insert that into the new database. Auto-increment will create the new ID, so you won't need the previous one (unless you want it for script output/debugging).
5) The tricky part is handling the foreign keys. This process is going to vary wildly depending on what you plan on migrating. What you need to know is which foreign key goes to which (possibly new) primary key. If you're only migrating posts, you may need to hard-code a user id to user id mapping for the post_author column, then use this to replace the values.
But what if I don't know the user ids for the mapping because some users also need to be migrated?
This is where it gets tricky. You will need to first define the merge rules to see if a user already exists. For new users, you need to record the ID of each newly inserted user. Then, after all users are migrated, the post_author value will need to be replaced wherever it references a newly merged user.
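A minimal sketch of that user merge rule, assuming MySQLi connections $src and $dst as in the earlier sketch; matching on email is an assumption, so adjust the rule to your own data:

<?php
$userIdMap = []; // old user ID => user ID in the target DB

$users = $src->query("SELECT * FROM wp_users");
while ($u = $users->fetch_assoc()) {
    // Merge rule: does a user with this email already exist in the target?
    $stmt = $dst->prepare("SELECT ID FROM wp_users WHERE user_email = ?");
    $stmt->bind_param('s', $u['user_email']);
    $stmt->execute();
    $existing = $stmt->get_result()->fetch_assoc();

    if ($existing) {
        $userIdMap[$u['ID']] = $existing['ID']; // reuse the existing account
    } else {
        // New user: insert and record the freshly assigned ID.
        $ins = $dst->prepare("INSERT INTO wp_users
                (user_login, user_pass, user_email, user_registered)
                VALUES (?, ?, ?, ?)");
        $ins->bind_param('ssss', $u['user_login'], $u['user_pass'],
                $u['user_email'], $u['user_registered']);
        $ins->execute();
        $userIdMap[$u['ID']] = $dst->insert_id;
    }
}

// Later, when copying posts, translate the foreign key through the map:
// $post['post_author'] = $userIdMap[$post['post_author']];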
6) Write and test the script! Test it on dummy databases first. And again, make backups before using it on your databases!
I've done something similar with an ETL (Extract, Transform, Load) process when I was moving data from one CMS to another.
Rather than writing a script, I used the Pentaho Data Integration (Kettle) tool.
The idea of ETL is pretty much straightforward:
Extract the data (for instance from one database)
Transform it to suit your needs
Load it into the final destination (your second database).
The tool is easy to use, and it allows you to experiment with various steps and outputs to investigate the data. Once you have designed the right ETL process, you are ready to merge those databases of yours.
How can this be done without running a tonne of mysql commands?
No way. If both the local and live sites are running at the same time, how can you prevent ending up with the same IDs holding different content?
So if you want to do this, you can use MySQL replication; I think it will help you merge the different MySQL databases.

Problems migrating MS Access to MySQL Database

We recently developed a PHP/MySQL program that works fine when test data is entered through the program but when we migrated actual data from MS Access to MySQL using ODBC there were problems.
The MySQL and Access databases have quite different schemas. Looking at the migrated data in phpMyAdmin, it appears that the data was imported (all the data is there), but when we try to view data through the program, retrieval is selective (some data is retrieved and some is not). For example, if I select “company” on a certain page from a dropdown, it correctly displays company information, but if I navigate to another page that is supposed to display products associated with a subdivision and select “subdivision” from a dropdown, it either doesn’t display any data or displays only one product even though there are several.
How do I troubleshoot this problem?
Originally we were looking for a protocol or some suggestions on how to troubleshoot this issue. We decided it would be more manageable to initially look at only a small section of data. We discovered that the data needed to be cleaned: there were text strings in fields that should not allow them, and strange characters in certain fields. There was also an issue with blank spaces in certain data; to fix this, we had to strip all the blanks and then add them in again. Once the data was cleaned, the program worked as it should.

replicating specific data between database tables

I've seen some posts on here regarding similar ideas to this, but to be specific I thought I should point out my requirement exactly.
I have a database-driven site, and the client wants a replica of it for users from the USA. They want most of the site to be the same, except some of the data, which they want to be different for US visitors.
The site runs on a PHP/MySQL database content management system I have written. I think we are going to approach the 'USA' version like this...
Place a clone of the whole site in a folder called /us (no surprises there)
Duplicate all the tables, but prefix the names with us_
Add a field called 'replicate', for example, to the original site's tables, and then, every 15 minutes or so, run a script to copy all the records from the original tables to the us_ tables where the replicate field is marked 'yes' (a sketch of that copy step follows below)
On the US version of the content management system, all records that are copied from the UK site are somehow locked, so only records marked 'no' at the original site can differ on the US site.
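A minimal sketch of that copy step, meant to be run from cron; the table name pages and the replicate column are assumptions standing in for the real CMS schema:

<?php
// replicate.php - run every 15 minutes, e.g.
// */15 * * * * php /path/to/replicate.php
$db = new mysqli('localhost', 'user', 'pass', 'cms');

// REPLACE INTO deletes and re-inserts rows whose primary keys match,
// so repeated runs keep the us_ copy in sync with the original.
$db->query("REPLACE INTO us_pages
            SELECT * FROM pages
            WHERE `replicate` = 'yes'");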
Does this sound like I'm heading along the right lines?
Why not make a new database for ONLY the tables/rows that are specific to US visitors?
Make a PHP array or something that says which table should be read from which database.
Seems like less effort to me.
Did you check the MySQL manual for topics about replication?
