PHP performance: template files in database or file?

I just wonder what the best practice is for storing template files. In the CMS I'm using, templates and some of the parameters are stored in the database... but there are issues when I need to change something in the templates or change one of the parameters on many pages. The site has 100K unique visits every day, and I don't really want to experiment on it. I just want to know what is best for performance: storing templates and parameters in the database or in files?

Database access will, on the whole, be quicker than disk access for many concurrent reads (unless the disks are highly mirrored). It is also more scalable, but that depends on configuration, which makes this highly subjective.
Extended: because your files will be quite small, a memcache+SQL backend is still better than JBOD or syncing directories between nodes. You could use a SAN/NAS, but that will work out more expensive if you just want to serve a bunch of small text segments. This assumes you are probably already using an RDBMS of some kind.
Really, it depends on too many factors to go into.
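
For illustration, a minimal sketch of the memcache+SQL approach, assuming a hypothetical templates table (name, body) in MySQL and a local Memcached instance:

    <?php
    // Minimal sketch: templates cached in Memcached, falling back to MySQL.
    // The `templates` table and the key scheme are assumptions.
    function load_template(PDO $db, Memcached $mc, string $name): ?string
    {
        $key = 'tpl:' . $name;

        $body = $mc->get($key);          // cache hit: no DB access at all
        if ($body !== false) {           // (template bodies are strings)
            return $body;
        }

        // Cache miss: read from the database, then populate the cache.
        $stmt = $db->prepare('SELECT body FROM templates WHERE name = ?');
        $stmt->execute([$name]);
        $body = $stmt->fetchColumn();
        if ($body === false) {
            return null;                 // unknown template
        }

        $mc->set($key, $body, 300);      // expire after five minutes
        return $body;
    }

On a cache hit the database is never touched, which is what makes this competitive with (or faster than) reading many small files off disk.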

Related

Physical directories for posts and users

Is there anything wrong with the practice of creating a physical directory for each post that is created and each user who registers on a website?
For an architecture which resembles:
www.site.com/u/john
www.site.com/u/mike
www.site.com/post/za634df
www.site.com/post/df124zs
(PHP, Linux based FYI).
I always thought I needed a physical folder for a post's URL to be shareable on Facebook, for example. Is this true?
Plus, by doing so, would I run into problems? Slowdowns? Server directory limits?
I don't think it's a terribly good idea. For one, it's orders of magnitude slower to manipulate the filesystem than to make an entry in a database, so there's an obvious performance loss, and the number of subdirectories may be limited depending on the filesystem (which may cause portability issues in the long run). There may be other effects, but those would only come up with really big traffic.
But the design perspective is much more important here: the physical file structure is a wholly different layer than the URI layout, which is an abstraction over it. By tying your URI scheme to the physical layout you discard the advantages of this abstraction. As usual with design principles, this may not seem like a big deal if your project is small, short-lived, or you're in a big hurry and don't care; but in the long run, keeping the separation of concerns can be quite vital for everybody's sanity.
That said, your idea may have merits if properly implemented, though in my opinion it'd be better to use a database and a URL-rewriting engine (like Apache's mod_rewrite) to achieve the same effect. And even if you do end up creating folders for everything, make absolutely sure that the procedure is properly abstracted: no piece of code should rely on stuff being in the same directory, and no piece of code should manipulate the filesystem directly (rely instead on a single unified helper class to do it).
Oh and, no, Facebook can't see what's behind your URI scheme.
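
As a concrete illustration of the mod_rewrite approach, a minimal front-controller sketch; the rewrite rules shown in the comment and the two helper functions are hypothetical:

    <?php
    // index.php - a single entry point. Assumes an .htaccess roughly like:
    //
    //   RewriteEngine On
    //   RewriteCond %{REQUEST_FILENAME} !-f
    //   RewriteRule ^ index.php [L]
    //
    // so /u/john and /post/za634df reach this script with no physical folders.
    $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

    if (preg_match('#^/u/([a-z0-9_]+)$#i', $path, $m)) {
        show_user_profile($m[1]);   // hypothetical helper: looks the user up in the DB
    } elseif (preg_match('#^/post/([a-z0-9]+)$#i', $path, $m)) {
        show_post($m[1]);           // hypothetical helper: loads the post from the DB
    } else {
        http_response_code(404);
        echo 'Not found';
    }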
Why would you want to make a separate directory for each user and put the same code in each directory, when you can simply handle them all from a single script using URL rewriting?
Duplicated scripts will also cost you much more storage and will make you cry when updating or adding a new feature to your site.

What's the best & fastest method to support multiple languages in a PHP application?

Though there are a lot of similar questions already asked here, I didn't find the answer I was looking for.
What's the best way to develop a multi-language application? It should be very fast, and I don't know how much text I will translate.
Method 1: Create and keep all the text in an array for every language I want to support, and include that file everywhere.
Method 2: Use gettext (.MO, .PO files).
Method 3: Store all the translations in a text file and write a function that goes through all the text and, when a string matches, displays its translation.
Method 4: Store all the text and its translations in a database, but I don't think it will be faster than storage in the filesystem.
Method 5: Same as method 1, but with multiple files per language just to keep everything structured.
Though all of these will work, which do you think will be the fastest method? Do let me know if I missed any.
This is a complicated problem, and it's not always as obvious as you might think. In some cases, with right-to-left languages or for particular cultural reasons, you may need to develop separate layouts for a particular region.
Regardless of which method you choose, you will want to cache all or parts of your pages and serve a cached version if available before regenerating the page.
I would probably avoid 3 and 4. You don't want to be reading from the disk more than you have to. If you can cache translation arrays in memcached, you can save yourself disk access in loading translation tables.
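
A minimal sketch of methods 1/5 with a per-request cache; the lang/ file layout and the t() helper are assumptions, and the memcached suggestion above would simply replace the static cache with a cross-request one:

    <?php
    // Sketch of methods 1/5: one PHP array file per language, loaded at
    // most once per request. lang/en.php would contain something like:
    //   return ['greeting' => 'Hello', 'bye' => 'Goodbye'];
    function t(string $key, string $lang = 'en'): string
    {
        static $tables = [];

        if (!isset($tables[$lang])) {
            $tables[$lang] = include __DIR__ . "/lang/{$lang}.php";
        }
        return $tables[$lang][$key] ?? $key; // fall back to the key itself
    }

    echo t('greeting', 'fr');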
As a person managing localization projects for developers, I have to say that both sides (translators and developers) have been very happy with Gettext (.po files). It's relatively simple to implement in your code (basically a wrapper around any text you want localized), it's seamlessly fast, and most importantly: it scales and updates flawlessly.
The main advantage is that lots of tools exist for creating, updating, managing, and translating .po/.pot files, including the cross-platform PoEdit. When you have dozens of languages to do, it's as easy as extracting the latest .pot file and sending that to the translation team. They'll return individual files for each language. I haven't seen many systems that can scan and locate new strings as easily or present them to translators for use as simply.
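
For reference, a minimal Gettext setup in PHP looks roughly like this (requires PHP's gettext extension; the "myapp" domain and locale directory layout are illustrative):

    <?php
    // .mo files would live at locale/de_DE/LC_MESSAGES/myapp.mo.
    $locale = 'de_DE.UTF-8';
    putenv('LC_ALL=' . $locale);
    setlocale(LC_ALL, $locale);

    bindtextdomain('myapp', __DIR__ . '/locale');
    bind_textdomain_codeset('myapp', 'UTF-8');
    textdomain('myapp');

    // _() is the wrapper mentioned above: translators see these strings
    // in the extracted .pot file.
    echo _('Welcome back!');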
I would recommend looking at PHP frameworks which support multiple languages. Investigate the most popular first: Zend, Symfony, and Yii. I have used Yii before, and it has multi-language support.
http://www.yiiframework.com/extension/yii-multilanguage/

Storing website parameters in a database or flat files

For the most part, the 3 sites for an organization I run have a single MySQL database that they share. This allows them to interact with each other nicely.
I have a bunch of simple parameters that the sites need to know about, and I wasn't sure what the best route to take is:
Make a table with 2 fields (key, value) where I store the params
Store the values in one or many flat files
They each have advantages and disadvantages.
The database allows a single entry to be used for all three sites (however, this doesn't occur often), all the information is centralized, and the interface is already well defined.
The flat files are easier to work with, since FTP and a text editor can be used in addition to the website administration; the flat files can also be written as PHP, meaning the site doesn't have to do any parsing (just include the file and use the variables); but they can't be shared between sites.
I can go on and on. What do you think is the better route to take?
My opinion is to use a database if you have the chance; it's easier in the long run. Say you have built the website with flat files, but now the customer wants an additional page with slightly different parameters, so you have to add a new file. You finish that, and he asks again... well, you get the point.
It is not organised at all. So if you have the chance to use a database, use it. There is a reason it was invented.
But just to clear up my thoughts and firmly know what you are talking about, please tell me more about the settings you would like to store. I can imagine that you are talking about some global variables, or maybe you are even going to use define(), but it is also possible you want to store strings.
So please define "bunch of simple parameters" for us.
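
To make the database option concrete, a minimal sketch of the two-column table with a per-request cache; the table and column names are assumptions:

    <?php
    // Assumed schema, shared by all three sites:
    //   CREATE TABLE settings (
    //       name  VARCHAR(64) PRIMARY KEY,
    //       value TEXT NOT NULL
    //   );
    function get_setting(PDO $db, string $name, ?string $default = null): ?string
    {
        static $cache = null;

        // Load the whole table once per request; it is small by definition.
        if ($cache === null) {
            $cache = $db->query('SELECT name, value FROM settings')
                        ->fetchAll(PDO::FETCH_KEY_PAIR);
        }
        return $cache[$name] ?? $default;
    }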

Sharing model data between web applications

I'm looking for the best possible way of sharing model data between two MVC (I'm using Symfony) driven web sites.
Background information
We have two web sites A and B. The same software is used for both sites, but there are different customers and data. Customers are allowed to release content. Now we're going to introduce a new payment option with the advantage that the user's content is released on both web sites automatically.
Implementation ?!
I have three ideas for the implementation:
Using the same database for both applications. Then I would have to extend some tables by one column which indicates the appropriate target web site (A/B).
I think that this would be bad design. A lot of code would have to be rewritten in order to exclude records that do not belong to the respective web site from query result sets.
Using two databases.
In my opinion, this would decrease performance significantly and would be very hard to implement. Data would always have to be requested twice. Also, in the future there may be web sites C, D, E...
Synchronizing two databases via web-service.
Some data would be stored twice. Therefore, all operations on such a piece of data would have to be performed twice (create, read, update, destroy).
Now I'm stuck, because each solution has serious disadvantages.
Do you have any ideas? If not, which one do you think is the best of mine?
I think your first option is the best. You're going to reduce duplicate data as much as possible, and you should get the best performance. You will have to add an extra check to exclude the records not belonging to each particular website, but all of the solutions will require work.
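
A minimal sketch of what that first option could look like; the content table, the site column, and the use of a MySQL SET type are assumptions for illustration (with $db an existing PDO connection):

    <?php
    // Assumed schema change:
    //   ALTER TABLE content ADD COLUMN site SET('A','B') NOT NULL DEFAULT 'A';
    // A SET column lets one row carry both flags ('A,B'), so content
    // released on both web sites is stored once rather than duplicated.
    $site = 'A'; // which site this installation serves

    $stmt = $db->prepare(
        'SELECT * FROM content WHERE FIND_IN_SET(:site, site) > 0'
    );
    $stmt->execute(['site' => $site]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);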

PHP performance considerations?

I'm building a PHP site, but for now the only PHP I'm using is a half-dozen or so includes on certain pages. (I will probably use some database queries eventually.)
Are simple include() statements a concern for speed or scaling, as opposed to static HTML? What kinds of things tend to cause a site to bog down?
Certainly include() is slower than static pages. However, with modern systems you're not likely to see this as a bottleneck for a long time - if ever. The benefits of using includes to keep common parts of your site up to date outweigh the tiny performance hit, in my opinion (having different navigation on one page because you forgot to update it leads to a bad user experience, and thus bad feelings about your site/company/whatever).
Using caching will really not help either - caching code is going to be slower than just an include(). The only time caching will benefit you is if you're doing computationally-intensive calculations (very rare, on web pages), or grabbing data from a database.
Sounds like you are engaging in a bit of premature optimization. While performance concerns are good to be aware of, if the application is not yet built, your primary concern should be getting it written.
Includes are a fact of life. Don't worry about number, worry about keeping your code well organized (PEAR folder structure is a lovely thing, if you don't know what I'm talking about look at the structure of the Zend Framework class files).
Focus on getting the application written with a reasonable amount of abstraction. Group all of your DB calls into a class (or classes) so that you minimize code duplication (KISS principles and all) and when it comes time to refactor and optimize your queries they are centrally located. Also get started on some unit testing to prevent regression.
Once the application is up and running, don't ask us what is faster or better, since it depends on each application where your bottleneck will be. It may turn out that even though you have lots of includes, your loops are eating up your time, or whatever. Use Xdebug and profile your code once it's up and running. Look for the segments of code that eat up a disproportionate amount of time, then refactor. If you focus too much now on the performance difference between include and include_once, you'll end up chasing a ghost while those curl requests running synchronously are eating your breakfast.
In the meantime, the best suggestion is to look through the php.net manual, and if there's a built-in function that does what you are trying to do, use it! PHP's C-based extensions will always be faster than any PHP code you could write, and you'll be surprised how much of what you are trying to do has already been done.
But again, I cannot stress this enough: premature optimization is BAD! Just get your application off the ground with good levels of abstraction, profile it, then fix what is actually eating up your time rather than what you think might.
Strictly speaking, straight HTML will always serve faster than a server-side approach since the server doesn't have to do any interpretation of the code.
To answer the bigger question, there are a number of things that will cause your site to bog down; there's just no specific threshold for when your code is causing the problem vs. PHP. (keep in mind that many of Yahoo's sites are PHP-driven, so don't think that PHP can't scale).
One thing I've noticed is that the slowest PHP-driven sites are the ones that include more than is necessary to display a specific page. OSCommerce (oscommerce.com) is one of the most popular PHP-driven shopping carts, but it has a bad habit of including all of its core functionality (just in case it's needed) on every single page. So even if you don't need to display an 'info box', the function is loaded.
On the other hand, there are many PHP frameworks out there (such as CakePHP, Symfony, and CodeIgniter) that take a 'load it as you need it' approach.
I would advise the following:
Don't include more functionality than you need for a specific page
Keep base functions separate (use an MVC approach when possible)
Use require_once instead of include if you think you'll have nested includes (e.g. page A includes file B, which includes file C). This avoids including the same file more than once, and it will also stop the process if a file can't be found, helping your troubleshooting process ;) (see the sketch after this list)
Cache static pages as HTML if possible - to avoid having to reparse when things don't change
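
The nested-include situation from point 3 in miniature (file names are hypothetical):

    <?php
    // pageA.php
    require_once 'fileB.php'; // fileB.php itself does: require_once 'fileC.php';
    require_once 'fileC.php'; // no-op: already loaded via fileB.php
    // With plain include(), fileC.php would be parsed twice, and any function
    // definitions in it would trigger a "cannot redeclare" fatal error.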
Nah includes are fine, nothing to worry about there.
You might want to think about tweaking your caching headers a bit at some point, but unless you're getting significant hits it should be no problem. Assuming this is all static data, you could even consider converting the whole site to static HTML (the easiest way: write a script that grabs every page via the webserver and dumps it into a matching directory structure; see the sketch below).
Most web applications are limited by the speed of their database (or whatever their external storage is, but 9/10 times that'll be a database), the application code is rarely cause for concern, and it doesn't sound like you're doing anything you need to worry about yet.
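
A minimal sketch of such a dump script, assuming a hypothetical page list and output directory (in practice the list might come from a sitemap):

    <?php
    // Fetch each page through the webserver and write it into a matching
    // directory structure.
    $base  = 'http://localhost';
    $pages = ['/', '/about', '/contact'];
    $out   = __DIR__ . '/static';

    foreach ($pages as $path) {
        $html = file_get_contents($base . $path);
        if ($html === false) {
            continue; // skip pages that fail to fetch
        }
        $file = $out . ($path === '/' ? '/index.html' : $path . '/index.html');
        if (!is_dir(dirname($file))) {
            mkdir(dirname($file), 0755, true);
        }
        file_put_contents($file, $html);
    }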
Before you make any long-lasting decisions about how to structure the code for your site, I would recommend that you do some reading on the Model-View-Controller design pattern. While there are others, this one appears to be gaining a great deal of ground in web development circles and will certainly be around for a while. You might want to take a look at some of the other design patterns suggested by Martin Fowler in his Patterns of Enterprise Application Architecture before making any final decisions about what sort of design will best fit your needs.
Depending on the size and scope of your project, you may want to go with a ready-made framework for PHP like Zend Framework or PHP On Trax or you may decide to build your own solution.
Specifically regarding the rendering of HTML content, I would strongly recommend that you use some form of templating in order to keep your business logic separate from your display logic. I've found that this one simple rule in my development has saved me hours of work when one or the other needed to be changed. I've used Smarty (http://www.smarty.net/) and I know that most of the frameworks out there either have a template system of their own or provide a plug-in architecture that allows you to use your own preferred method. As you look at possible solutions, I would recommend that you look for one that is capable of creating cached versions.
Lastly, if you're concerned about speed on the back-end, then I would highly recommend that you look at ways to minimize your calls to your back-end data store (whether it be a database or just system files). Try to avoid loading and rendering too much content (say, a large report stored in a table that contains hundreds of records) all at once. If possible, look for ways to make the user interface load smaller bits of data at a time.
And if you're specifically concerned about the actual load time of your html content and its CSS, Javascript or other dependencies I would recommend that you review these suggestions from the guys at Yahoo!.
To add on to what JayTee mentioned about loading functionality only when you need it: if you're not using any of the frameworks that do this automatically, you might want to look into the __autoload() functionality that was introduced in PHP 5. Basically, your own logic is invoked when you instantiate a particular class that isn't already loaded, which gives you a chance to include() the file that defines that class on demand.
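
A minimal autoloading sketch; note that modern code registers the callback with spl_autoload_register() rather than defining __autoload() itself, and the classes/ naming convention and Shopping_Cart class here are assumptions:

    <?php
    // Register an autoloader so class files are loaded on first use.
    spl_autoload_register(function (string $class): void {
        $file = __DIR__ . '/classes/' . str_replace('_', '/', $class) . '.php';
        if (is_file($file)) {
            require $file; // definition loaded on demand, exactly once
        }
    });

    $cart = new Shopping_Cart(); // triggers a load of classes/Shopping/Cart.php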
The biggest thing you can do to speed up your application is to use an Opcode cache, like APC. There's an excellent list and description available on Wikipedia.
As far as simple includes are concerned, be careful not to include too many files on each request as the disk I/O can cause your application not to scale well. A few dozen includes should be fine, but it's generally a good idea to package your most commonly included files into a single script so you only have one include. The cost in memory of having a few classes here and there you don't need loaded will be better than the cost of disk I/O for including hundreds of smaller files.
