Although I have found quite a lot of resources related to this question, none of them precisely answers how to build a multilingual CMS using Zend Framework.
There are many Zend_Translate adapters available in Zend Framework, but the one most needed for database (MySQL) driven websites, an SQL adapter, has not been released yet.
For multilingual websites that are not database driven, content can be placed in files (XML, .mo, or another format) and one of the Zend_Translate adapters is used to serve the correct language.
How should we deal with a database-driven multilingual website? Previously we were in the habit of using plain PHP with a well-designed multilingual database, keeping each article (page) in a table together with every required translation. If we do the same using Zend Framework, would that be overkill or make the site slow? We could still use Zend_Cache to make it faster, but we would not be able to take advantage of Zend_Translate. Later, when the Zend_Translate adapter for SQL becomes available, would it be easy to switch such a multilingual content managed system over to Zend_Translate?
Has anyone tried this? What would be the pros and cons?
Another solution could be to keep our well-designed multilingual database and generate XML-based language files every time an admin makes a change through the GUI in the admin area, and then use one of the Zend_Translate adapters to handle those XML files. I guess that could be overkill, killing a bird with a cannon :)
When I talk about placing a whole page's content in the database, it can include some HTML tags such as b, span, br, p, etc. How well can Zend_Translate deal with content that has HTML tags in it?
If someone has implemented this before, what would be the best way to build a multilingual content managed website using Zend Framework?
Any expert opinion is welcome!
There are many Zend_Translate adapters available in Zend Framework, but the one most needed for database (MySQL) driven websites, an SQL adapter, has not been released yet.
For multilingual websites that are not database driven, content can be placed in files (XML, .mo, or another format) and one of the Zend_Translate adapters is used to serve the correct language.
These are the wrong assumptions. Nothing says that a DB-driven application needs to use a DB-driven translation system; you can easily use a static-file system.
How should we deal with a database-driven multilingual website? Previously we were in the habit of using plain PHP with a well-designed multilingual database, keeping each article (page) in a table together with every required translation.
I think you are a little bit mistaken: I understand that you would like to use Translate for the dynamic content of your page (the articles). Translate is really designed to internationalize the views, i.e. the static content, things like "login", "register" or welcome text. These really should live in files (consider the files a static cache) rather than in the DB, because of the load they would otherwise generate (the DB should be cached anyway). Articles stored in the DB are a different thing; what you want to achieve there is multilingual page content. You can handle that easily without Translate (remember, Translate is good for views!): simply add a country/language column to your tables and retrieve the suitable (filtered for the given language) data through your model, as in the sketch below. It is really straightforward and doesn't need any translation backend.
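For illustration, here is a minimal sketch of that approach using a ZF1 table model. The table name, its slug/lang columns and the fallback behaviour are assumptions of this example, not something from the original post.

```php
<?php
// A hypothetical "articles" table: one row per article per language,
// with a "lang" column holding the locale code.
class Application_Model_DbTable_Articles extends Zend_Db_Table_Abstract
{
    protected $_name = 'articles';

    /**
     * Fetch a single article in the requested language,
     * falling back to a default language if no translation exists yet.
     */
    public function fetchBySlug($slug, $lang, $fallback = 'en')
    {
        $select = $this->select()
                       ->where('slug = ?', $slug)
                       ->where('lang = ?', $lang);
        $row = $this->fetchRow($select);

        if (null === $row && $lang !== $fallback) {
            // No translation stored yet: serve the default language instead.
            $select = $this->select()
                           ->where('slug = ?', $slug)
                           ->where('lang = ?', $fallback);
            $row = $this->fetchRow($select);
        }
        return $row;
    }
}
```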
I am not sure exactly how Translate works internally, but I can assume that it checks the language and then loads the whole translation file into script memory as a collection (or a simple associative array) to provide a quick and robust translation mechanism: it doesn't need to call the DB or a file for every key, because all of them are already in memory. Keeping whole pages or articles this way wouldn't make sense at all, mainly because you only need one or two articles per page (so why waste memory?), while you may need hundreds of localized view strings on a single page (so you don't want a DB or file call for each of them).
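As a rough illustration of that in-memory behaviour, here is a minimal sketch using Zend_Translate's array adapter; the keys and German strings are made up for the example.

```php
<?php
// The whole translation set is handed to Zend_Translate up front and kept
// in memory, so lookups don't touch the disk or the database again during
// the request.
$translate = new Zend_Translate('array', array(
    'login'   => 'Anmelden',
    'welcome' => 'Willkommen',
), 'de');

echo $translate->_('welcome'); // prints "Willkommen"
```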
Another solution could be to keep our well-designed multilingual database and generate XML-based language files every time an admin makes a change through the GUI in the admin area, and then use one of the Zend_Translate adapters to handle those XML files. I guess that could be overkill, killing a bird with a cannon :)
If we are talking about translations for static content, that really is a very common solution: keep the translations in the DB for easy access and editing, and regenerate the XML/CSV/whatever files whenever a change occurs, along the lines of the sketch below.
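A rough, framework-agnostic sketch of that export step might look like this; the translations table and its columns are invented for the example, and per-language PHP array files are just one possible target format (Zend_Translate's array adapter can read such files directly).

```php
<?php
// Regenerate static translation files whenever an admin edits a translation
// in the database. Table/column names ("translations", "lang", "msg_key",
// "msg_value") are made up for this sketch.
function exportTranslations(PDO $db, $targetDir)
{
    $stmt = $db->query('SELECT lang, msg_key, msg_value FROM translations');
    $byLang = array();
    foreach ($stmt as $row) {
        $byLang[$row['lang']][$row['msg_key']] = $row['msg_value'];
    }

    foreach ($byLang as $lang => $messages) {
        // One PHP array file per language, e.g. lang/de.php
        $code = '<?php return ' . var_export($messages, true) . ';';
        file_put_contents($targetDir . '/' . $lang . '.php', $code);
    }
}
```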
When I talk about placing a whole page's content in the database, it can include some HTML tags such as b, span, br, p, etc. How well can Zend_Translate deal with content that has HTML tags in it?
It would probably cope well enough, but again, you are thinking about dynamic content; static content should be formatted inside the view. So that's a dead end, I guess.
Bottom line: using Translate for everything you mentioned would be killing a bird with a cannon :)
Related
Though there are a lot of similar questions already asked here, I didn't find the answer I was looking for.
What's the best way to develop a multi-language application? It should be very fast, and I don't know how much text I will need to translate.
Method 1: Create and keep all the text in an array for every language I want to support and include that file everywhere.
Method 2: Use gettext (.mo, .po files).
Method 3: Store all the translations in a text file and write a function to go through all the text and, when a key is matched, display its value.
Method 4: Store all the text and its translations in a database, but I don't think that will be faster than storage in the filesystem.
Method 5: Same as method 1, but I would create multiple files per language just to keep everything structured.
Though all of these will work, which do you think will be the fastest method? And do let me know if I missed any.
This is a complicated problem, and it's not always as obvious as you might think. In some cases, with right-to-left languages or for particular cultural reasons, you may need to develop separate layouts for a particular region.
Regardless of which method you choose, you will want to cache all or parts of your pages, and serve a cached version if one is available before regenerating the page again.
I would probably avoid 3 and 4. You don't want to be reading from disk more than you have to. If you can cache the translation arrays in memcached, you can save yourself the disk access of loading translation tables; see the sketch below.
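Here is a hedged sketch of that caching idea, assuming per-language PHP array files and the stock Memcached extension; the key naming and file layout are illustrative only.

```php
<?php
// Cache a per-language translation array in Memcached so the language file
// is only parsed when the cache is cold.
function getTranslations(Memcached $cache, $lang)
{
    $key = 'translations_' . $lang;
    $messages = $cache->get($key);

    if ($messages === false) {
        // Cache miss: read and parse the static file once, then store it.
        $messages = include __DIR__ . '/lang/' . $lang . '.php';
        $cache->set($key, $messages, 3600); // keep for an hour
    }
    return $messages;
}
```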
As a person managing localization projects for developers, I have to say that both sides (translators and developers) have been very happy with gettext (.po files). It's relatively simple to implement in your code (basically a wrapper around any text you want localized), it's fast, and most importantly: it scales and updates flawlessly.
The main advantage is that lots of tools exist for creating, updating, managing, and translating .po/.pot files, including the cross-platform PoEdit. When you have dozens of languages to do, it's as easy as extracting the latest .pot file and sending that to the translation team. They'll return individual files for each language. I haven't seen many systems that can scan and locate new strings as easily or present them to translators for use as simply.
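For reference, a minimal sketch of PHP's built-in gettext usage; the "messages" domain name and the locale/<lang>/LC_MESSAGES directory layout are just the conventional defaults, not anything specific to the answer above.

```php
<?php
// Assumes the gettext extension is enabled and a compiled catalogue exists at
// ./locale/de_DE/LC_MESSAGES/messages.mo
$locale = 'de_DE.utf8';
putenv('LC_ALL=' . $locale);
setlocale(LC_ALL, $locale);

bindtextdomain('messages', __DIR__ . '/locale');
textdomain('messages');

echo _('Welcome to our site'); // prints the German string from messages.mo
```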
I would recommend looking at PHP frameworks which support multiple languages. Investigate the most popular first: Zend, Symfony and Yii. I have used Yii before and it has multi-language support.
http://www.yiiframework.com/extension/yii-multilanguage/
I am creating a website and it has to be multi-language. The translations have to be prepared in advance (no auto-translation API). My question is: what is more efficient?
Create one file set for each language.
Create one file set and show text through PHP constants.
I also thought of making a MySQL query to get an array with all the translations at the beginning of the document.
Note: there will not be any really large texts.
Longer term, your best option is going to be using one file set for each language. If you use an industry-standard format such as GNU gettext, PHP has built-in support. Also, third-party translation companies and translation tools generally support the format, so long-term site maintenance requires fewer dependencies on developers.
I'm using Zend Translate for a website I'm working on.
Most of the components in Zend Framework can be used as standalone components. I'm using the full stack but it shouldn't be a major problem to use only Zend Translate.
As for using a database to get translations, I think it depends on the type of content you are dealing with. For instance, for Joomla! there are components that store different versions of the same article, in different languages.
I would recommend Zend Translate, as you have different options to get the translations from: PHP arrays, INI files, gettext, XML.
You can even extend the adapter class to create a database-backend adapter.
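As a rough sketch of that idea (this assumes ZF1's adapter contract, which may differ in your framework version, and an invented translations table; instantiation would look like new My_Translate_Adapter_Db($dbAdapter, 'en')):

```php
<?php
// A database-backed adapter for Zend_Translate. _loadTranslationData() and
// toString() are the methods ZF1 custom adapters implement, as I recall it;
// check against your Zend Framework version.
class My_Translate_Adapter_Db extends Zend_Translate_Adapter
{
    protected function _loadTranslationData($data, $locale, array $options = array())
    {
        // $data is assumed to be a Zend_Db adapter in this sketch; fetch all
        // key/value pairs for the requested locale.
        $rows = $data->fetchPairs(
            'SELECT msg_key, msg_value FROM translations WHERE lang = ?',
            array($locale)
        );
        return array($locale => $rows);
    }

    public function toString()
    {
        return 'Db';
    }
}
```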
Hope it helps.
I think it will help you: http://www.youtube.com/watch?v=v7vCp_TFcdU
It uses the session to store the chosen language and uses an array for each language.
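A small sketch of that pattern, with illustrative file names and keys:

```php
<?php
// Remember the chosen language in the session and include one translation
// array per language (e.g. lang/en.php returns array('welcome' => '...')).
session_start();

if (isset($_GET['lang'])) {
    // Keep only letters and underscores so the value is safe to use in a path.
    $_SESSION['lang'] = preg_replace('/[^a-z_]/i', '', $_GET['lang']);
}
$lang = isset($_SESSION['lang']) ? $_SESSION['lang'] : 'en';

$messages = include __DIR__ . '/lang/' . $lang . '.php';

echo $messages['welcome'];
```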
I have another CodeIgniter CMS suggestion question. Pretty much, I am just looking for a CMS that allows my client to easily add content that doesn't necessarily need to be tied to a page. I basically want a MySQL database with a GUI that allows the client to upload content to certain tables in the DB. I don't want any themes attached to the CMS, as the site code will all be custom built, and I would prefer to write all the DB calls to pull data for specific pages myself. I just need a way for the client to easily upload data to a table, from which I can create a model to pull the data.
I have heard of FuelCMS, Ionize, and PyroCMS, but all of these seem like they have too much. I am looking for a pretty barebones DB that has a GUI and good documentation for the APIs. That's all.
Thanks!
I know you wouldn't think of ExpressionEngine as "lightweight", but from the perspective you are speaking from, it is...
It allows complete control of content, separate from any designs or concepts of "pages". Its power is that you define "objects", or channels, that contain specific information which you then construct the pages around. Channels are a more user-friendly version of the MySQL tables you're talking about.
Downside: it's not free. But because of that you get good support, and it's worth the money.
Its module development pattern will also be familiar if you are a CodeIgniter developer.
Are there any PHP-based CMSes which could be integrated with an existing database? My client already has a big inventory solution which was written in VBA. Now we need to set up a web-based shop for them, and we are thinking of using an off-the-shelf CMS.
Is there any way we can integrate the current database schema with that of the CMS?
That depends entirely on what the big inventory solution does, where it is running, how it is structured, and how the two solutions are supposed to coexist.
Usually, though, you will want to make use of existing export/import functions on both sides (e.g. XML) instead of having two applications meddle with the same database. That often ends in tears.
The issue is that the CMS would have to understand how the database is laid out. Even having a different number of columns would prevent the CMS from understanding the structure of the tables. A CMS isn't a human: it just interacts with data the way it's told to; it doesn't interpret or understand the data.
Your best bet would be to first install a simple CMS with a table prefix (such as installing WordPress with "wp_" as the table prefix) to prevent it from overwriting any existing tables. After that, you would need to write a plugin for the CMS which tells it how to read your database; a rough sketch of the idea follows below. It may be possible to find a plugin which already does what you want and modify it to use your table design rather than the intended one.
In either case, though, the information in a database is part of the site's content, and as such it's the job of the content management system to both create and maintain it. Creating it outside of the CMS will generally confuse the CMS and require some work to integrate.
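To make the plugin idea above concrete, here is a hedged sketch using WordPress as the example CMS: a shortcode that reads directly from the pre-existing inventory table, which WordPress itself never manages. The table and column names are hypothetical.

```php
<?php
// Registers an [inventory_list] shortcode that queries the legacy table
// directly instead of a wp_-prefixed one.
add_shortcode('inventory_list', function () {
    global $wpdb;
    $items = $wpdb->get_results(
        'SELECT name, price FROM legacy_inventory ORDER BY name LIMIT 50'
    );

    $html = '<ul>';
    foreach ($items as $item) {
        $html .= '<li>' . esc_html($item->name) . ': '
               . esc_html($item->price) . '</li>';
    }
    return $html . '</ul>';
});
```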
First of all, just out of curiosity: why a CMS and not a framework? Using a framework will ensure you can fully integrate your existing database with less effort (in terms of long-term consistency and reliability).
Still, if you want to stay with a CMS, I would recommend Drupal. It has a nice feature which allows you to interact with multiple databases without modifying your current data structure (see the sketch below). Furthermore, you can build your own customized modules or even an installation profile to suit your needs.
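As a rough sketch of that Drupal feature (this assumes Drupal 7's $databases syntax; the connection details and the legacy table are placeholders):

```php
<?php
// settings.php: declare the legacy database alongside Drupal's own.
// (The 'default' connection is already defined by the installer.)
$databases['inventory']['default'] = array(
  'driver'   => 'mysql',
  'database' => 'legacy_inventory',
  'username' => 'shop',
  'password' => 'secret',
  'host'     => 'localhost',
);

// Later, in a custom module: switch to the legacy connection, query it,
// then switch back to Drupal's default connection.
db_set_active('inventory');
$items = db_query('SELECT name, price FROM products')->fetchAll();
db_set_active();
```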
In your experience, is it better to use one language file or multiple smaller language files for each language in a PHP project using the gettext extension? I am not even sure whether it is possible to use multiple files; it is hard for me to test, since the server caches the language files.
I am adding multiple languages to a social network site. So far I have only done the signup page, which is about 1 of the 200 pages to go, and it alone has 35 text strings to translate. At this pace, the language file for each language would be really large, so I was thinking it might be better to have different language files for different pages, or perhaps for sections such as the forums section and the blogs section. But if it makes no difference, I would rather not waste my time making multiple smaller files for each language.
I realize every situation is different and the only real answer is to test it, but I am hoping to avoid that this time and just get some opinions from people with more experience. This is my first time using gettext. Thanks!
I would make the language files module-based. With gettext you need to specify a locale for each language, and it would fit best to have separate .po/.mo files for each module or big part of your site, along the lines of the sketch below.
That's my opinion. :-)
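A minimal sketch of the per-module idea using separate gettext domains; the domain names and the locale/<lang>/LC_MESSAGES/<module>.mo layout follow the usual gettext convention and are only illustrative.

```php
<?php
// One .mo catalogue per module ("forum", "blog"), each registered as its own
// gettext domain under the same locale directory.
setlocale(LC_ALL, 'fr_FR.utf8');

bindtextdomain('forum', __DIR__ . '/locale');
bindtextdomain('blog',  __DIR__ . '/locale');

// dgettext() reads from a specific domain, so forum strings and blog strings
// live in separate, smaller files.
echo dgettext('forum', 'New topic');
echo dgettext('blog',  'Latest posts');
```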
I typically automate the process and have multiple languages in multiple files by using a database to edit the site (using a simple db lookup). This lets me hire translators to come in and verify the current translation easily. Deploying to production then is simply turning the database into a set of language files.
From experience, I would break the languages down on a per-file basis, as otherwise the management overhead becomes heavy and there is great scope for duplication and mistakes.
The other advantage is that, by using a directory structure and a naming convention, the correct language can be selected programmatically more easily than with one large file (see the sketch below), and it is easier to write management tools at a later stage in the project.
It is also worth looking at the formats other people use. Many frameworks use this sort of structure: Dashcode, Symfony, Zend, etc. And there is an XML format, XLIFF, which is built to handle translation and integrates with many of the tools that translators use.
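A small sketch of how such a directory and naming convention keeps the lookup programmatic; the paths and the English fallback are assumptions of this example.

```php
<?php
// Language and section are simply path components (lang/<lang>/<section>.php),
// so no large master file has to be loaded or searched.
function loadMessages($lang, $section, $baseDir = null)
{
    $baseDir = $baseDir !== null ? $baseDir : __DIR__ . '/lang';
    $file = sprintf('%s/%s/%s.php', $baseDir, $lang, $section);
    if (!is_file($file)) {
        // Fall back to English if that section has not been translated yet.
        $file = sprintf('%s/en/%s.php', $baseDir, $section);
    }
    return include $file; // each file returns a plain key => string array
}

$messages = loadMessages('de', 'forums');
```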
Multiple files are the best way to go, but things can get disorganized.
We've just launched a free new service called String which solves most of the problems of managing multiple language files - like a Basecamp for localization. You can either import existing files, or start from scratch with keys and strings in the system. When you're ready, you can export the files again to run your app. It works with PHP (array), PHP (define), .po, YAML, INI and .strings formats.
String allows you to collaborate with translators easily - you just invite them to a project and set their language permissions. Translators can leave comments and questions on each string if they need more info - and you can revert strings back using the History function if things aren't quite right.
Anyway enough sales pitch!
Check it out at http://mygengo.com/string - we'd love your feedback.