Retrieving values from a PostgreSQL TEXT column with PHP

We have an app based on JSF and JBoss that stores textual data in a regular PostgreSQL database. When we use our system via the JSF application, everything works perfectly (i.e. we are able to retrieve, save and do everything else through the system).
We also have a PHP website, however, that retrieves information from the same database. The problem is that when it asks for information from any column of type TEXT, the database only outputs a series of numbers (it is using the value as if it were a BLOB or CLOB, I think), because when I look directly into the database, I see the same number.
How should I go about solving this?

I think that you would only need to emulate something like a CLOB if you are storing so much text that a streaming API made sense. I have yet to see a case for this in PostgreSQL, but if you did, you could use LOBs and just unescape them in your app (they are basically BLOBs).
Pulling a value from the db is generally well documented in PHP's documentation, but if you are using the pg_ functions, keep in mind that they do not loop over the result set for you (you need to do that yourself, as in the sketch below). The modules in PEAR etc. may wrap that for you, so make sure to consult your framework's documentation.
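A minimal sketch of that loop with the pg_ functions (connection string, table, and column names here are placeholders):

<?php
// Fetch TEXT values and iterate over the result set yourself.
$conn = pg_connect('host=localhost dbname=mydb user=me password=secret');
if (!$conn) {
    die('Connection failed');
}
$result = pg_query($conn, 'SELECT id, body FROM documents');
while ($row = pg_fetch_assoc($result)) {
    // A TEXT column comes back as a plain PHP string.
    echo $row['id'] . ': ' . $row['body'] . "\n";
}
pg_close($conn);

If body prints as a number instead of text here, the column is likely an OID pointing at a large object rather than a true TEXT column.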

Related

How to merge local and live databases?

We've been developing for WordPress for several years, and whilst our workflow has been upgraded at several points, there's one thing that we've never solved... merging a local WordPress database with a live database.
So I'm talking about having a local version of the site where files and data are changed, whilst the data on the live site is also changing at the same time.
All I can find is the perfect-world scenario of pulling the site down, nobody (even customers) touching the live site, then pushing the local site back up, i.e. copying one thing over the other.
How can this be done without running a tonne of MySQL commands? (It feels like they could fall over if they're not properly checked!) Can this be done via Gulp (I've seen it mentioned) or a plugin?
Just to be clear, I'm not talking about pushing/pulling data back and forth via something like WP Migrate DB Pro, BackupBuddy or anything similar - this is a merge, not replacing one database with another.
I would love to know how other developers get around this!
File changes are fairly simple to get around, it's when there's data changes that it causes the nightmare.
WP Stagecoach does do a merge, but you can't work locally: it creates a staging site from the live site that you're supposed to work on. The merge works great, but it's a killer blow not to be able to work locally.
I've also been told by the developers that datahawk.io will do what I want but there's no release date on that.
It sounds like VersionPress might do what you need:
VersionPress staging
A couple of caveats: I haven't used it, so can't vouch for its effectiveness; and it's currently in early access.
Important: take a backup of the Live database before merging Local data into it.
Following these steps might help in migrating a large percentage of the data and merging it into Live:
Go to the wp back-end of the Local site, Tools->Export.
Select the All content radio button (if not selected by default).
This will produce an XML file containing all the local data, comprising all default post types and custom post types.
Open this XML file in Notepad++ or any editor and find-and-replace the Local URL with the Live URL (see the sketch at the end of this answer).
Now visit the Live site and import the XML under Tools->Import.
Upload the files (images) manually.
This will bring a large percentage of the data from Local to Live.
For the rest of the data you will have to write custom scripts.
Risk factors are:
When uploading the images from Local to Live, images with the same name will be overwritten.
WordPress saves the images in post_meta, generating serialized data for the images; that should be taken care of when uploading the database.
Serialized data in post_meta for post_type="attachment" stores 3 or 4 dimensions of each image.
Usernames or email IDs of imported users can collide with existing ones; since WordPress enforces unique usernames and emails, those users might not be imported.
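As a rough illustration of the find-and-replace step, a PHP sketch (file names and URLs are placeholders):

<?php
// Replace the Local URL with the Live URL throughout the export file.
$xml = file_get_contents('local-export.xml');
$xml = str_replace('http://localsite.test', 'http://www.livesite.com', $xml);
file_put_contents('live-import.xml', $xml);

One caveat: a plain str_replace corrupts PHP-serialized strings (their length prefixes no longer match), so for serialized post_meta values use a serialization-aware tool such as WP-CLI's wp search-replace instead.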
If I were you I'd do the following (slow but affords you the greatest chance of success)
First off, set up a third database somewhere. Cloud services would probably be ideal, since you could get a powerful server with an SSD for a couple of hours. You'll need that horsepower.
Second, we're going to mysqldump the first DB and pipe the output into our cloud DB.
mysqldump -u user -ppassword dbname | mysql -u root -ppass -h somecloud.db.internet
Now we have a full copy of DB #1. If your cloud supports snapshotting data, be sure to take one now.
The last step is to write a PHP script that, slowly but surely, selects the data from the second DB and writes it to the third. We want to do this one record at a time. Why? Well, we need to maintain the relationships between records. So let's take comments and posts. When we pull post #1 from DB #2 it won't be able to keep record #1 because DB #1 already had one. So now post #1 becomes post #132. That means that all the comments for post #1 now need to be written as belonging to post #132. You'll also have to pull the records for the users who made those posts, because their user IDs will also change.
There's no easy fix for this, but the WP structure isn't terribly complex. Building a simple loop to pull the data and translate it shouldn't be more than a couple of hours of work.
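A minimal sketch of that loop for posts and their comments, assuming stock WordPress table names and PDO connections (hosts and credentials are placeholders):

<?php
// Copy each post from DB #2 into the merged DB, let auto-increment assign a
// new ID, and remember the old->new mapping so comments can be rewritten.
$src = new PDO('mysql:host=localhost;dbname=db2', 'user', 'pass');
$dst = new PDO('mysql:host=somecloud.db.internet;dbname=merged', 'root', 'pass');
$src->setAttribute(PDO::ATTR_DEFAULT_FETCH_MODE, PDO::FETCH_ASSOC);

$postIdMap = [];
foreach ($src->query('SELECT * FROM wp_posts') as $post) {
    $oldId = $post['ID'];
    unset($post['ID']);                    // the new ID comes from auto-increment
    $cols  = implode(', ', array_keys($post));
    $marks = implode(', ', array_fill(0, count($post), '?'));
    $dst->prepare("INSERT INTO wp_posts ($cols) VALUES ($marks)")
        ->execute(array_values($post));
    $postIdMap[$oldId] = (int) $dst->lastInsertId();
}

// Rewrite each comment so it belongs to its post's new ID.
foreach ($src->query('SELECT * FROM wp_comments') as $comment) {
    $comment['comment_post_ID'] = $postIdMap[$comment['comment_post_ID']];
    unset($comment['comment_ID']);
    $cols  = implode(', ', array_keys($comment));
    $marks = implode(', ', array_fill(0, count($comment), '?'));
    $dst->prepare("INSERT INTO wp_comments ($cols) VALUES ($marks)")
        ->execute(array_values($comment));
}

Users need the same treatment, since post_author values will change too.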
If I understand you correctly: to merge local and live databases, until now I have been using other software such as Navicat Premium, which has a Data Synchronization feature.
This can be achieved live using Spring XD: create a JDBC stream to pull data from one db and insert it into the other. (This acts as streaming, so you don't have to disturb either environment.)
The first thing you need to do is asses if it would be easier to do some copy-paste data entry instead of a migration script. Sometimes the best answer is to suck it up and do it manually using the CMS interface. This avoids any potential conflicts with merging primary keys, but you may need to watch for references like the creator of a post or similar data.
If it's just outright too much to manually migrate, you're stuck with writing a script or finding one that is already written for you. Assuming there's nothing out there, here's what you do...
ALWAYS MAKE A BACKUP BEFORE RUNNING MIGRATIONS!
1) Make a list of what you need to transfer. Do you need users, posts, etc.? Find the database tables and add them to the list.
2) Make a note of all possible foreign keys in the database tables being merged into the new database. For example, wp_posts has post_author referencing wp_users. These will need specific attention during the migration. Use this documentation to help find them.
3) Once you know what tables you need and what they reference, you need to write the script. Start by figuring out what content is new for the other database. The safest way is to do this manually with some kind of side-by-side list. However, you can come up with your own rules on how to automatically match table rows, maybe checking for $post1->post_content === $post2->post_content in cases where the text needs to be the same. The only catch here is that the primary/foreign keys are off limits for these rules.
4) How do you merge new content? The general idea is that all primary keys will need to be changed for any new content. You want to use everything except for the id of the post and insert that into the new database. There will be an auto-increment to create the new id, so you won't need the previous id (unless you want it for script output/debug).
5) The tricky part is handling the foreign keys. This process is going to vary wildly depending on what you plan on migrating. What you need to know is which foreign key goes to which (possibly new) primary key. If you're only migrating posts, you may need to hard-code a user id to user id mapping for the post_author column, then use this to replace the values.
But what if I don't know the user ids for the mapping because some users also need to be migrated?
This is where it gets tricky. You will need to first define the merge rules to see if a user already exists. For new users, you need to record the id of the newly inserted users. Then, after all users are migrated, the post_author value will need to be replaced when it references a newly merged user (a sketch of this follows below).
6) Write and test the script! Test it on dummy databases first. And again, make backups before using it on your databases!
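A hedged sketch of the user mapping from step 5, assuming the merge rule is "same email means same user" (stock WordPress table and column names, placeholder credentials):

<?php
// Match users by email, insert the genuinely new ones, and record
// old ID -> new ID so post_author can be fixed up afterwards.
$src = new PDO('mysql:host=localhost;dbname=local_wp', 'user', 'pass');
$dst = new PDO('mysql:host=localhost;dbname=live_wp', 'user', 'pass');
$src->setAttribute(PDO::ATTR_DEFAULT_FETCH_MODE, PDO::FETCH_ASSOC);

$userIdMap = [];
$find = $dst->prepare('SELECT ID FROM wp_users WHERE user_email = ?');
foreach ($src->query('SELECT * FROM wp_users') as $user) {
    $find->execute([$user['user_email']]);
    if ($existingId = $find->fetchColumn()) {
        $userIdMap[$user['ID']] = (int) $existingId;  // already on the live site
        continue;
    }
    $oldId = $user['ID'];
    unset($user['ID']);
    $cols  = implode(', ', array_keys($user));
    $marks = implode(', ', array_fill(0, count($user), '?'));
    $dst->prepare("INSERT INTO wp_users ($cols) VALUES ($marks)")
        ->execute(array_values($user));
    $userIdMap[$oldId] = (int) $dst->lastInsertId();
}
// Afterwards, update migrated posts: post_author = $userIdMap[old post_author].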
I've done something similar with an ETL (Extract, Transform, Load) process when I was moving data from one CMS to another.
Rather than writing a script I used a Pentaho Data Integration (Kettle) tool.
The idea of ETL is pretty much straightforward:
Extract the data (for instance from one database)
Transform it to suit your needs
Load it to the final destination (your second database).
The tool is easy to use and it allows you to experiment with various steps and outputs to investigate the data. When you design the right ETL process, you are ready to merge those databases of yours.
How can this be done without running a tonne of mysql commands?
No way. If both the local and the live site are running at the same time, how can you prevent ending up with the same IDs pointing at different content?
So if you want to do this, you can use MySQL replication. I think it will help you merge the different MySQL databases.

Recreate a database using existing PHP code

So I have an old website which was coded over an extended period of time but has been inactive for 3 or so years. I have the full PHP source to the site, but the problem is I no longer have a backup of the database. I'm wondering what the best solution to recreating the database would be? It is a large site, so manually going through each PHP file and trying to keep track of which tables are referenced is no small task. I've tried googling for the answer but have had no luck. Does anyone know of any tools that are available to help extract this information from the PHP and at least give me the basis of a database skeleton? Otherwise, has anyone ever had to do this? Any tips to help me along and possibly speed up the process? It is a MySQL database I'm trying to use.
The way I would do it:
Write a stub of mysqli or whatever interface was used to access the DB, so as to intercept all DB accesses (a sketch follows below).
Replace all DB accesses with the dummy version of yours.
The basic idea is to emulate the DB so that the PHP code runs long enough to activate the various DB accesses, which in turn will allow you to analyze the way the DB is built and used.
From within these dummy functions:
print the SQL code used
regenerate just enough dummy results to let the rest of the code run, based on the tables and fields mentioned in the query parameters and the PHP code that retrieves them (you won't learn much from a SELECT *, but you can see what fields the PHP code expects to get from it)
once you have understood enough of the DB structure, recreate the tables and let the original code work on them little by little
have the previous designer flogged to death for not having provided a way to recreate the DB programmatically
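A hedged sketch of that dummy layer; the function name and log file are invented, and the idea is simply to reroute every legacy query through it:

<?php
// Log every statement the old code tries to run, and return just enough
// dummy data to keep the calling code alive.
function intercepted_query(string $sql)
{
    file_put_contents('captured-queries.sql', $sql . ";\n", FILE_APPEND);
    // An empty result set makes SELECT loops run zero times; enrich this
    // with dummy rows once you see which fields the callers expect.
    return [];
}

// Usage: search-and-replace call sites such as mysql_query($sql) with
// intercepted_query($sql), then click through the site and read the log.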
There are currently two answers based on the information you provided.
1) You can't do this.
PHP is a typeless language. You could check your SQL statements for field and table names, but the picture will not be complete: if there is a SELECT * FROM table, you can't see the fields, so you need to check where the PHP accesses the fields, maybe by name or by index. You can count yourself lucky if it is done by name, because then you can extract the names of the fields. Even then the data types will be missing, and so will everything else: which columns are indexed, what the primary keys and constraints are, etc.
2) Easy, yes you can!
Because your PHP uses a modern framework that contains an ORM, which created the database for you. The meta information is included in the PHP classes/design.
Just check the manual for how to recreate the database.
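For instance, if the framework turned out to be Doctrine (purely an assumption here), its bundled CLI can rebuild the schema from the entity metadata:

php vendor/bin/doctrine orm:schema-tool:create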

Using gettext and database-driven translations simultaneously

I am currently developing a website in PHP and I decided to go with gettext to manage the translations. I set up a nice Pootle server so that I can easily manage the translations and a bash script that runs with cron that extracts all of the values from the PHP files to be translated and creates the .pot translation template file.
So far, so good. However, I just remembered that part of the text of the site is stored in a database. Let's call it "products" for simplicity. I want the product description, name, and a few other fields to be translatable, but it would be great if I could have a centralized way to translate them without having to create a separate interface just to translate the database entries. Since Pootle is already set up, it would be nice to be able to use that.
I thought of two solutions:
Forget using a database and use only PHP arrays
Write a script that will extract all of the values from the database and generate a file that will then be scanned by the aforementioned bash script and add the values to the pot file, and another script that will run just after the bash script to re-update all of the values in the DB.
Neither of these solutions really seems ideal. The first one would be easy to set up and easy to use with Pootle, but I lose all the flexibility that comes with using a DBMS, and I would have to import the entire array every time I want to use it. The loss of functionality isn't really that bad, because I (currently) am not performing any advanced calculations on the rows, basically just SELECTs and that's it. The second one could work, but would take significantly more planning (and coding) to set up correctly.
Are there any other ways that I'm missing that would give me the flexibility of a database, but allow me to easily translate it in a centralized place along with the rest of the site, like Pootle?
You can generate pot/po files directly from the database and then feed them into Pootle. You would then be able to use gettext functions directly on the values returned from the database.
As an example, you can look at phpMyAdmin, where we use a similar approach to translate a structured text file.
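A hedged sketch of generating such a template from the products table (table name, columns, and credentials are assumptions):

<?php
// Dump translatable DB values into a .pot template that Pootle can serve.
$db = new PDO('mysql:host=localhost;dbname=shop;charset=utf8mb4', 'user', 'pass');
$pot = fopen('products.pot', 'w');
fwrite($pot, "msgid \"\"\nmsgstr \"\"\n\"Content-Type: text/plain; charset=UTF-8\\n\"\n\n");

foreach ($db->query('SELECT id, name, description FROM products', PDO::FETCH_ASSOC) as $row) {
    foreach (['name', 'description'] as $field) {
        $msgid = addcslashes($row[$field], "\0..\37\"\\");  // escape for the PO format
        fwrite($pot, "#: products.$field:{$row['id']}\n");
        fwrite($pot, "msgid \"$msgid\"\nmsgstr \"\"\n\n");
    }
}
fclose($pot);

At runtime the site can then translate the stored value with plain gettext, e.g. echo _($product['name']);.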

Displaying results of many sql queries without writing PHP code?

Sorry if this question might sound stupid to you guys, but I am a total newbie to programming, apart from knowing SQL. The thing is, I have been given a MySQL database containing various information about kids' diseases, and a web interface written in PHP to create reports from the database that can be accessed via the interface. There are almost 25 different variables that need to be computed in the report, and I have written SQL queries to compute all these values, but I don't know anything about PHP. If I have all these queries, isn't there a way for me to combine them to display results on a webpage and come up with this report without writing PHP code?
Thanks for your help.
Again, very sorry if this is too basic.
As mr_jp suggests, phpMyAdmin provides a simple front end for running queries, but also for changing the data and modifying the schema. Although you can restrict named users to only have the SELECT privilege, they'll still need to know SQL to run the queries.
It's not that hard to build a front end to take a set of parameters, substitute them into a SELECT statement and send the output to a formatted table. There are lots of datagrid tools (e.g. phplens, phpgrid, have a google for 'mysql datagrid' for more) which will handle the formatting of a MySQL resultset (or just download it as CSV - your browser should be able to transfer the data into your spreadsheet program automatically).
There are a couple of report generators for PHP - but the last time I looked at this in any depth, I wasn't overly impressed.
Your web host would probably have phpMyAdmin installed. Try getting access from the web host.
You can enter your queries there and export the results as HTML, CSV, Excel and other formats.
You could write Python. Or Ruby. Or something you know. ;-)
But you need something to output your queried data.
If you just want to check the results by yourself without needing to publish them directly, you might use a MySQL query browser or administration tool like phpMyAdmin or MySQL Workbench. Those tools allow you to query the database but display the returned data only as raw tables. If you need some styling or your own layout, you'll have to use your own application or edit the exported data manually (e.g. using a CSV export and re-opening it in some spreadsheet application like Excel or Calc).
The combination PHP + MySQL is a very popular one and it's highly recommended that you use them together.
The code that you will need to write in order to display that information using PHP is pretty straightforward and not very hard. If you do know some basic programming concepts, you can learn to do that in a matter of hours. PHP is well known for its extremely accessible learning curve. There are thousands of code samples online that you can look at to see how this is done.
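To give a feel for how little code it takes, here is a minimal page that runs one query and prints the result as an HTML table (credentials, database name, and the query itself are placeholders):

<?php
// Run one of the existing report queries and render it as a table.
$db = new mysqli('localhost', 'user', 'pass', 'kids_diseases');
$result = $db->query('SELECT disease, cases FROM report_summary');

echo "<table>\n<tr><th>Disease</th><th>Cases</th></tr>\n";
while ($row = $result->fetch_assoc()) {
    echo '<tr><td>' . htmlspecialchars($row['disease']) . '</td><td>'
       . htmlspecialchars($row['cases']) . "</td></tr>\n";
}
echo "</table>\n";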

ODBC: when is the best time to create my database?

I have a Windows program which generates PHP forms which will be filled in later.
Those PHP forms will populate a database. It looks very much like MySQL, but I can't be certain, so let's call it ODBC.
And, yes, it does have to be a windows program.
There will also be PHP forms which query the database - examine which tables and fields it contains - and then generate forms which can be used to search the database (e.g. it finds a table with fields "employee_name", etc. and generates a form which lets you search based on employee name).
Let's call that design time and run time.
At design time, some manager or IT guy or similar gets to define the nature of the database and at runtime 1) a worker fills in the form daily and 2) management can extract reports.
Here's my question: given that the database is defined at "design time" (and populated at run time), where and how is best to do so?
1) I could use an ODBC interface from the Windows program, but I am having difficulty finding something good to work with Delphi. Things like ADO and Firebird tend to expect you to already have a database and allow you to manipulate it, but I can find no code example of how to create a database and some tables, so ...
2) I could use DOS commands from Delphi in my Windows program. I just tried and got a response to MySQL --version, but am not sure whether MySQL etc. are more interactive. That is, can I use a script file or a very long stacked command with semicolons and returns separating the statements? e.g. 'CREATE DATABASE db; CREATE TABLE t1;'
3) Since the best way to work with databases seems to be PHP, perhaps my Windows program could spit out a PHP page which would, when run in a browser, create the database.
I have tried to make this as uncomplicated as I can, but please feel free to ask questions. It may be that there are several valid ways, but there is probably one 'better' solution in terms of ease of implementation or maintenance.
Better scratch option 3. What if the user later wants to come back and have the Windows program change the input form? It needs to update the database too.
Creating a database is usually a database administrator's task. Unless it is a local database, maybe an embedded one, the user would need to know where and how to create the database on the remote server, and she may have no clue about it. Where to store the database files? Which disks are available? And there could be many more parameters to set (memory buffer sizes, etc.), users to be created, and so on. You also need very elevated privileges to be able to create a database, not something you give to average users or applications.
Therefore, usually you ask the database administrator to create your database/schema, he will give you the credentials you need to connect, and then your application (or its setup) will create and initialize the needed objects (tables, etc.). Creating tables (and other objects) is usually as simple as running "CREATE TABLE..." statements. Just remember SQL takes only one command at a time; if you need to run several commands you have to send them one after another yourself (see the sketch below), although there are Delphi components which are able to split a script into commands and run them one after another.
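To illustrate that last point, a sketch in PHP for brevity (the same send-one-command-at-a-time pattern applies from Delphi components; credentials and the schema are placeholders):

<?php
// The setup loop sends each statement separately, since the interface
// accepts only one command per call.
$db = new mysqli('dbserver', 'app_user', 'secret', 'formdata');
$statements = [
    'CREATE TABLE IF NOT EXISTS employees (
        id INT AUTO_INCREMENT PRIMARY KEY,
        employee_name VARCHAR(100) NOT NULL
    )',
    'CREATE TABLE IF NOT EXISTS timesheets (
        id INT AUTO_INCREMENT PRIMARY KEY,
        employee_id INT NOT NULL,
        hours DECIMAL(5,2) NOT NULL
    )',
];
foreach ($statements as $sql) {
    if (!$db->query($sql)) {  // one command per query() call
        die('Setup failed: ' . $db->error);
    }
}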
