We recently developed a PHP/MySQL program that works fine when test data is entered through the program, but when we migrated actual data from MS Access to MySQL using ODBC, there were problems.
The MySQL and Access databases have quite different schemas. Looking at the migrated data in phpMyAdmin, it appears that the data was imported (all the data is there), but when we try to view data through the program, retrieval is selective (some data is retrieved and some is not). For example, if I select “company” on a certain page from a dropdown, it correctly displays company information, but if I navigate to another page that is supposed to display products associated with a subdivision and select “subdivision” from a dropdown, it either doesn’t display any data or displays only one product even though there are several.
How do I troubleshoot this problem?
Originally we were looking for a protocol or some suggestions on how to troubleshoot this issue. We decided it would be more manageable to initially look at just a small section of data. We discovered that the data needed to be cleaned: there were text strings in fields that should not allow them, and strange characters in certain fields. There was also an issue with stray blank spaces in some values; to fix this, we had to strip all the blanks and then add them back in where they belonged. Once the data was cleaned, the program worked as it should.
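For anyone facing the same thing, here is a rough sketch of the kind of clean-up we ran; the table and column names are made up and the connection details are placeholders:

<?php
// Hypothetical table/column names -- substitute your own schema.
$db = new mysqli('localhost', 'user', 'password', 'mydb');

// Strip the leading/trailing blanks that came across from Access.
$db->query("UPDATE products SET product_name = TRIM(product_name)");

// Remove strange (non-printable) characters from each value.
$select = $db->query("SELECT id, product_name FROM products");
$update = $db->prepare("UPDATE products SET product_name = ? WHERE id = ?");
while ($row = $select->fetch_assoc()) {
    $clean = preg_replace('/[^\x20-\x7E]/', '', $row['product_name']);
    if ($clean !== $row['product_name']) {
        $update->bind_param('si', $clean, $row['id']);
        $update->execute();
    }
}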
Related
I have a Wordpress site that utilizes a custom post type, call it CPT-1, that I created using JetEngine. Inside CPT-1 are meta fields. Once that was set up, I did a bulk insert of data using Ultimate CSV Importer Pro, which put this information into CPT-1 and let me map each column of data to the meta fields I wanted to use. These fields are then used later in tables.
Is there a way to go around the CSV Importer part of this process and just pull from a database? In the long term, I'd like to make changes to certain posts and upload different posts while using CPT-1, but I don't think using a CSV every time will be easy or accurate. If I could just pull from a database that I make updates to, I could track those changes easily and manage them.
I have database experience but not so much with Wordpress databases. What tables would I have to pay attention to if I were to go down this route?
Wordpress uses MySQL as a backend, so there is no reason you can't just insert the data directly. You'll need to get the credentials Wordpress uses to connect to the database, and then connect yourself, probably from your own custom PHP script.
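For instance, a minimal sketch of making that connection yourself (copy the real values from the DB_HOST / DB_NAME / DB_USER / DB_PASSWORD constants in wp-config.php; the ones below are placeholders):

<?php
// Placeholder credentials -- use the values from your wp-config.php.
$wp = new mysqli('localhost', 'wp_user', 'wp_password', 'wordpress_db');
if ($wp->connect_error) {
    die('Connection failed: ' . $wp->connect_error);
}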
I am generally skittish about doing things like you described because Wordpress is a complex piece of software and I don't have a lot of awareness of what it is doing behind the scenes (nor do they really intend users to have such awareness; most functionality is hidden from the user).
However, if you have been doing a CSV import, and you have tested it extensively, and it's working fine with that method, there is no reason you couldn't carry out this same thing with less manual work on your part via a PHP script.
I'm afraid I can't get much more specific in my answer because I don't have information about what exactly you did with the CSV.
A straightforward (but not super efficient) way of doing this would be a PHP script where you initiate a database connection to the database you update, and a second connection to Wordpress's MySQL database, run a query for whatever rows you want to update (whatever you would normally be exporting via CSV), then iterate row by row and insert that data into the Wordpress database. You can make this significantly more efficient by making a single prepared statement and then executing it repeatedly with each row of values.
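As a rough sketch of that row-by-row approach, assuming a hypothetical source table product_updates, a post type slug of cpt-1 and a JetEngine meta key of price (it only updates an existing meta value for posts matched by title; inserting brand-new posts touches many more wp_posts columns and is left out here):

<?php
// Connection one: the database you maintain. Connection two: Wordpress's MySQL.
$src = new mysqli('localhost', 'src_user', 'src_pass', 'source_db');
$wp  = new mysqli('localhost', 'wp_user', 'wp_pass', 'wordpress_db');

// Prepare once, execute per row -- this is the efficiency gain mentioned above.
$findPost = $wp->prepare(
    "SELECT ID FROM wp_posts WHERE post_type = 'cpt-1' AND post_title = ? LIMIT 1"
);
$setMeta = $wp->prepare(
    "UPDATE wp_postmeta SET meta_value = ? WHERE post_id = ? AND meta_key = 'price'"
);

$rows = $src->query("SELECT title, price FROM product_updates");
while ($row = $rows->fetch_assoc()) {
    $findPost->bind_param('s', $row['title']);
    $findPost->execute();
    $post = $findPost->get_result()->fetch_assoc();
    if ($post) {
        $setMeta->bind_param('si', $row['price'], $post['ID']);
        $setMeta->execute();
    }
}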
A more efficient way of doing it would be to pull the data into your PHP script and then format it as a single query which you then run against the MySQL database.
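A sketch of that single-query variant, reusing the $wp connection and the hypothetical price meta key from above, with $pairs assumed to be an array of [post_id, meta_value] pairs:

// $wp and $pairs as described above.
$values = [];
foreach ($pairs as $p) {
    // Escape each value and build one big VALUES list.
    $values[] = sprintf("(%d, 'price', '%s')", (int) $p[0], $wp->real_escape_string($p[1]));
}
if ($values) {
    $wp->query("INSERT INTO wp_postmeta (post_id, meta_key, meta_value) VALUES " . implode(',', $values));
}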
If you already have CSV importing working, you could even do a "lazy" solution where you just write a PHP script that generates a CSV and then feeds it into MySQL, importing it the same way your other process does now. It's hard for me to tell from what you've said which of these solutions would work best. However, I have used all three, depending on what I'm doing and what kind of error handling I want.
In general, if errors happen rarely to never, you are probably better off with the single bulk-insert methods, whether that's one query or a PHP script automating the export of a CSV and then passing it to be imported into Wordpress's MySQL database.
We've been developing for Wordpress for several years and whilst our workflow has been upgraded at several points there's one thing that we've never solved... merging a local Wordpress database with a live database.
So I'm talking about having a local version of the site where files and data are changed, whilst the data on the live site is also changing at the same time.
All I can find is the perfect-world scenario of pulling the site down, nobody (not even customers) touching the live site, then pushing the local site back up - i.e. copying one over the other.
How can this be done without running a tonne of MySQL commands? (It feels like they could fall over if they're not properly checked!) Can this be done via Gulp (I've seen it mentioned) or a plugin?
Just to be clear, I'm not talking about pushing/pulling data back and forth via something like WP Migrate DB Pro, BackupBuddy or anything similar - this is a merge, not replacing one database with another.
I would love to know how other developers get around this!
File changes are fairly simple to get around; it's the data changes that cause the nightmare.
WP Stagecoach does do a merge but you can't work locally, it creates a staging site from the live site that you're supposed to work on. The merge works great but it's a killer blow not to be able to work locally.
I've also been told by the developers that datahawk.io will do what I want but there's no release date on that.
It sounds like VersionPress might do what you need:
VersionPress staging
A couple of caveats: I haven't used it, so can't vouch for its effectiveness; and it's currently in early access.
Important: take a backup of the live database before merging local data into it.
Following these steps should help migrate a large percentage of the data and merge it into the live site:
Go to the WP back-end of the local site: Tools -> Export.
Select the "All content" radio button (if not selected by default).
This will produce an XML file containing all the local data, comprising all default post types and custom post types.
Open this XML file in Notepad++ or any editor and find-and-replace the local URL with the live URL.
Now visit the live site and import the XML under Tools -> Import.
Upload the files (images) manually.
This will bring a large percentage of the data from local to live.
For the rest of the data you will have to write custom scripts.
Risk factors are:
When uploading images from local to live, images with the same name will be overridden.
Wordpress saves image information in post_meta as serialized data, which has to be taken care of when uploading the database.
The serialized data in post_meta for post_type="attachment" covers the 3 or 4 generated sizes of each image.
Usernames or email IDs of the users being imported may already exist; since WP checks for unique usernames and emails, those users may not be imported.
If I were you I'd do the following (slow, but it affords you the greatest chance of success):
First off, set up a third database somewhere. Cloud services would probably be ideal, since you could get a powerful server with an SSD for a couple of hours. You'll need that horsepower.
Second, we're going to mysqldump the first DB and pipe the output into our cloud DB.
mysqldump -u user -ppassword dbname | mysql -u root -ppass -h somecloud.db.internet
Now we have a full copy of DB #1. If your cloud supports snapshotting data, be sure to take one now.
The last step is to write a PHP script that, slowly but surely, selects the data from the second DB and writes it to the third. We want to do this one record at a time. Why? Well, we need to maintain the relationships between records. So let's take comments and posts. When we pull post #1 from DB #2 it won't be able to keep record #1 because DB #1 already had one. So now post #1 becomes post #132. That means that all the comments for post #1 now need to be written as belonging to post #132. You'll also have to pull the records for the users who made those posts, because their user IDs will also change.
There's no easy fix for this, but the WP structure isn't terribly complex. Building a simple loop to pull the data and translate it shouldn't be more than a couple of hours of work.
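A very rough sketch of that translate-and-insert loop, covering posts and comments only (no users, post meta, or error handling; column values containing NULL would also need handling):

<?php
// $db2 is a connection to the database being merged in; $db3 is the combined copy.
$db2 = new mysqli('localhost', 'user', 'pass', 'site_two');
$db3 = new mysqli('somecloud.db.internet', 'root', 'pass', 'merged');

$oldToNewPost = [];

$posts = $db2->query("SELECT * FROM wp_posts ORDER BY ID");
while ($post = $posts->fetch_assoc()) {
    $oldId = $post['ID'];
    unset($post['ID']);                       // let auto-increment assign the new ID
    $cols = '`' . implode('`,`', array_keys($post)) . '`';
    $vals = "'" . implode("','", array_map([$db3, 'real_escape_string'], $post)) . "'";
    $db3->query("INSERT INTO wp_posts ($cols) VALUES ($vals)");
    $oldToNewPost[$oldId] = $db3->insert_id;  // remember old => new mapping
}

$comments = $db2->query("SELECT * FROM wp_comments ORDER BY comment_ID");
while ($c = $comments->fetch_assoc()) {
    unset($c['comment_ID']);
    // Re-point the comment at the post's new ID.
    $c['comment_post_ID'] = $oldToNewPost[$c['comment_post_ID']] ?? 0;
    $cols = '`' . implode('`,`', array_keys($c)) . '`';
    $vals = "'" . implode("','", array_map([$db3, 'real_escape_string'], $c)) . "'";
    $db3->query("INSERT INTO wp_comments ($cols) VALUES ($vals)");
}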
If I understand you correctly: to merge local and live databases, up until now I've been using other software such as Navicat Premium, which has a Data Sync feature.
This can be achieved live using Spring XD: create a JDBC stream to pull data from one DB and insert it into the other. (This acts as streaming, so you don't have to disturb either environment.)
The first thing you need to do is assess whether it would be easier to do some copy-paste data entry instead of a migration script. Sometimes the best answer is to suck it up and do it manually using the CMS interface. This avoids any potential conflicts with merging primary keys, but you may need to watch for references like the creator of a post or similar data.
If it's just outright too much to manually migrate, you're stuck with writing a script or finding one that is already written for you. Assuming there's nothing out there, here's what you do...
ALWAYS MAKE A BACKUP BEFORE RUNNING MIGRATIONS!
1) Make a list of what you need to transfer. Do you need users, posts, etc.? Find the database tables and add them to the list.
2) Make a note of all possible foreign keys in the database tables being merged into the new database. For example, wp_posts has post_author referencing wp_users. These will need specific attention during the migration. Use this documentation to help find them.
3) Once you know what tables you need and what they reference, you need to write the script. Start by figuring out what content is new for the other database. The safest way is to do this manually with some kind of side-by-side list. However, you can come up with your own rules on how to automatically match table rows - maybe check for $post1->post_content === $post2->post_content in cases where the text needs to be identical. The only catch here is that the primary/foreign keys are off limits for these rules.
4) How do you merge new content? The general idea is that all primary keys will need to be changed for any new content. You want to take everything except the post's ID and insert that into the new database. The auto-increment will create the new ID, so you won't need the previous one (unless you want it for script output/debugging).
5) The tricky part is handling the foreign keys. This process is going to vary wildly depending on what you plan on migrating. What you need to know is which foreign key goes to which (possibly new) primary key. If you're only migrating posts, you may need to hard-code a user id to user id mapping for the post_author column, then use this to replace the values.
But what if I don't know the user ids for the mapping because some users also need to be migrated?
This is where it gets tricky. You will first need to define the merge rules to see whether a user already exists. For new users, you need to record the ID of each newly inserted user. Then, after all users are migrated, the post_author value will need to be replaced wherever it references a newly merged user (see the sketch after this list).
6) Write and test the script! Test it on dummy databases first. And again, make backups before using it on your databases!
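A hedged sketch of steps 4 and 5 for users and post_author only; $old and $new are mysqli connections to the two databases, and the matching rule used here ("same email means same user") is an assumption you may want to change:

<?php
$old = new mysqli('localhost', 'user', 'pass', 'old_site');
$new = new mysqli('localhost', 'user', 'pass', 'merged_site');

$userMap = [];   // old user ID => user ID in the merged database

$find = $new->prepare("SELECT ID FROM wp_users WHERE user_email = ?");
$ins  = $new->prepare(
    "INSERT INTO wp_users (user_login, user_pass, user_email, user_registered)
     VALUES (?, ?, ?, ?)"
);

$users = $old->query("SELECT * FROM wp_users");
while ($u = $users->fetch_assoc()) {
    $find->bind_param('s', $u['user_email']);
    $find->execute();
    $existing = $find->get_result()->fetch_assoc();

    if ($existing) {
        $userMap[$u['ID']] = $existing['ID'];    // user already there, just map it
    } else {
        $ins->bind_param('ssss', $u['user_login'], $u['user_pass'],
                                 $u['user_email'], $u['user_registered']);
        $ins->execute();
        $userMap[$u['ID']] = $new->insert_id;    // record the new ID (step 5)
    }
}

// Step 5: when copying posts, rewrite the foreign key through the map, e.g.
// $post['post_author'] = $userMap[$post['post_author']];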
I've done something similar with an ETL (Extract, Transform, Load) process when I was moving data from one CMS to another.
Rather than writing a script, I used the Pentaho Data Integration (Kettle) tool.
The idea of ETL is pretty straightforward:
Extract the data (for instance from one database)
Transform it to suit your needs
Load it to the final destination (your second database).
The tool is easy to use and it allows you to experiment with various steps and outputs to investigate the data. When you have designed the right ETL process, you are ready to merge those databases of yours.
How can this be done without running a tonne of mysql commands?
No way. If both the local and live sites are running at the same time, how can you avoid ending up with the same IDs holding different content?
So if you want to do this, you can use MySQL replication. I think it will help you merge the different MySQL databases.
I have just transferred a backup of a WordPress site to a new domain. I uploaded all of the site's files using FTP and imported the main database (called boxedsco_master.sql) into the new site's database using phpMyAdmin. The new prefix for the databases is "boxeipxy_" rather than "boxedsco_". I also updated wp-config.php to connect to the master database, which worked fine.
However, there was also a second database called boxedsco_boxes.sql, which I also uploaded using phpMyAdmin; it is now called "boxeipxy_boxes.sql".
The site is mainly working; however, there are no notification emails when an order is placed, either to the customer or to my client. I believe this could be because the second database is not correctly connected, as that database contains tables with names like "wp_easycontactforms_customforms".
How do I connect this second database? Would there be a PHP file to update with the new database name? I can't find a PHP file that references "boxedsco_boxes.sql" in order to update it.
Using Wordpress 3.8.5 and WooCommerce 2.0.20
No wonder no-one responded! There should not be 2 databases! I thought it was very strange!
The email notifications issue was a bug with WooCommerce not liking the send-from address! Changing it to "wordpress@[yourdomain].com" fixed the issue completely!
Thanks anyway folks!
The fact that you have two files with .sql suffixes doesn't mean there are two databases. Crack open the files and look inside. They are most likely two dumps from the same database; have a look at whether or not they contain the same data and/or tables.
As the philosopher says, "a poem about the moon is not the moon". An SQL dump isn't a database; it's a "representation" of a database. Most likely you have two poems about the same moon.
I'm making a Flex 4 application and using ZendAMF to interact with a MySQL database. I got Flex to generate most of the services code for me (which utilizes mysqli) and roughly edited some of the code (I'm very much a novice when it comes to PHP). All works fine at this point.
My problem is - currently the application inserts ~400 records into the database when a user is created (it's saving their own data for them to load at a later date) but it does this with separate calls to the server - i.e. each record is sent to the server, then saved to the database and then the next one is sent.
This worked fine in my local environment, but since going onto a live web server it only adds these records some of the time. Other times it will totally ignore them. I'm assuming it's doing this because the live database doesn't like getting spammed with hundreds of requests at practically the same time.
I'm thinking a more efficient way would be to package all of these records into an array, send that to the server just the once and then get the PHP service to do multiple inserts on each item in the array. The problem is, I'm not sure how to go about coding this in PHP using mysqli statements.
Any help or advice would be greatly appreciated - thanks!
Read up on LOAD DATA LOCAL INFILE. It seems it's just what you need for inserting multiple records: it inserts many records from a file (though not an array, unfortunately) into your table in one operation.
It's also much, much faster to load many records in one go with LOAD DATA LOCAL INFILE than to send one-per-row INSERTs or UPDATEs.
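If you want to try it from PHP, here is a hedged sketch, assuming a made-up table user_data with columns (user_id, item_key, item_value) and that local_infile is allowed on the server:

<?php
// The ~400 rows received from the Flex client, shortened here.
$records = [
    ['user_id' => 1, 'item_key' => 'color', 'item_value' => 'blue'],
    ['user_id' => 1, 'item_key' => 'size',  'item_value' => 'large'],
];

$db = new mysqli();
$db->options(MYSQLI_OPT_LOCAL_INFILE, true);   // must be enabled client-side too
$db->real_connect('localhost', 'user', 'pass', 'mydb');

// Write the array out as a temporary CSV file...
$tmp = tempnam(sys_get_temp_dir(), 'rows');
$fh  = fopen($tmp, 'w');
foreach ($records as $row) {
    fputcsv($fh, [$row['user_id'], $row['item_key'], $row['item_value']]);
}
fclose($fh);

// ...then load the whole file in a single statement.
$db->query(
    "LOAD DATA LOCAL INFILE '" . $db->real_escape_string($tmp) . "'
     INTO TABLE user_data
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'
     (user_id, item_key, item_value)"
);
unlink($tmp);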
You should handle the user defaults separately and only store the changes,
and if they are not saved you must check the warnings from MySQL or the errors from PHP; data doesn't just disappear.
Try
error_reporting(-1);
just before inserting
and
mysqli::get_warnings()
afterwards.
Hi I've got a tricky question (aren't they all tricky?)
I'm converting a database-driven site that uses PHP to a site being built with Dashcode.
The current site selects data held in a mySQL database and dynamically creates the page content. Originally this was done to reduce site maintenance because all the current content could be maintained and checked offline before uploading to the live database, therefore avoiding code changes.
In Dashcode you can work from a JSON file as a data source - which is fine, it works - except for the maintenance aspect. The client is not willing (and I understand why) to update several hundred lines of fairly structured JS object code when the database holds the data and is updated from elsewhere.
So - What's the best way to get Dashcode to link to the database data?
Where are you getting the JSON from? Is it being generated from the original MySQL? Could you not generate the JSON from MySQL and therefore keep the original maintenance procedure prior to uploading to MySQL?
For my projects I usually create a PHP intermediary that, when accessed, logs into the MySQL database and formats the results as XML in the body of the page. Just point Dashcode to the PHP file in the data source. Parameters can even be passed into the PHP script with GET through the URL in the data source.
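As a hedged sketch of that intermediary, assuming a made-up table content with columns (id, section, title, body); point Dashcode's data source at something like data.php?section=news:

<?php
// Placeholder credentials and schema -- adjust to your site.
$db = new mysqli('localhost', 'user', 'pass', 'site_db');

$section = isset($_GET['section']) ? $_GET['section'] : 'default';
$stmt = $db->prepare("SELECT id, title, body FROM content WHERE section = ?");
$stmt->bind_param('s', $section);
$stmt->execute();
$result = $stmt->get_result();

// Emit the rows as XML in the body of the page (use json_encode() here
// instead if you would rather feed Dashcode JSON).
header('Content-Type: text/xml');
$xml = new SimpleXMLElement('<items/>');
while ($row = $result->fetch_assoc()) {
    $item = $xml->addChild('item');
    $item->addChild('id', $row['id']);
    $item->addChild('title', htmlspecialchars($row['title']));
    $item->addChild('body', htmlspecialchars($row['body']));
}
echo $xml->asXML();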