Update
I have looked into various PHP frameworks (the Yii framework looks particularly good/interesting) and they seem to offer some very good features compared to simple template engines (including MVC and other features like database integration). I am certain that some form of separation between code and display is needed. Whether that is just plain PHP (as a template engine), a template engine (e.g. Smarty) or a framework probably depends a lot on the application and the company's/programmer's choice (and it is an issue I can continue to research in my own time, and will probably not get definite answers for).
I still have one remaining question. Is it possible to have a multi-tiered setup (as in multiple separate servers) where one tier runs PHP code ("application logic") and outputs its results in XML, JSON or some other data-interchange format, which is then sent to a web/HTML tier that converts that output into HTML, so that the total average number of pages served per second is higher than with a single tier? (Even if that single tier got all the servers from the two separate tiers combined for itself. I'm also guessing that the parsing time for XML (and probably JSON) would reduce the benefit, but a new protocol optimized for this purpose could be used between the two tiers.)
I was pondering HTML and code separation (both to implement MVC and to allow web designers (as in looks/views) and web developers (as in code/controllers) to work independently), and I thought it should be possible to have application servers that run PHP (application/business logic/controller) and web servers that take the output from the application servers and insert it into HTML markup (looks/views).
In theory it would work a bit like the separation of an application server and a database server: while a single request might be slightly slower for one user due to network overhead, you can handle considerably more simultaneous requests with two small servers than with one big server. For example, the application server could send its processed (view-independent) information to the web server, which would then insert that into the HTML (which could differ depending on the client, e.g. mobile browsers). It may be possible to cache the HTML in RAM but not the dynamic content, so that even if the page looks fairly different for every user (e.g. Facebook) some speed benefit is still gained compared to a single HTML/PHP combo. It would also, of course, separate the HTML from the application code (PHP).
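For illustration, here is a minimal sketch of how such a two-tier setup could look, assuming the application tier exposes a hypothetical script (get_profile.php) that returns JSON over an internal URL, and the web/HTML tier fetches it and inserts the values into markup; the host name, script name and field names are made up:

<?php
// --- Application tier: app-server/get_profile.php ---
// Runs the business logic and returns view-independent data as JSON.
$profile = array(
    'name'    => 'Example User',
    'friends' => 42,
);
header('Content-Type: application/json');
echo json_encode($profile);

<?php
// --- Web/HTML tier: fetches the data and renders it into HTML ---
$json    = file_get_contents('http://app-server.internal/get_profile.php');
$profile = json_decode($json, true);
?>
<div class="profile">
    <h1><?php echo htmlspecialchars($profile['name']); ?></h1>
    <p><?php echo (int) $profile['friends']; ?> friends</p>
</div>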
I looked into this and came across numerous PHP template engines that would facilitate the separation of HTML from code; however, some of them are considerably slower than using plain PHP, and "compiling" the template doesn't seem to make that much of a difference (and would prevent using separate web/HTML and code/PHP servers). While using PHP itself as a template engine might work well for me as a single developer, it will not work in an environment with separate web designers.
So to summarize, I am looking for (or to create) a combination of an MVC framework and a template engine/system that will facilitate HTML/code separation and view/model/controller separation, and that will actually be faster than using a single tier. Or, more accurately, one that scales better than a single tier (although I wouldn't expect much of a speed decrease, if any, for single pages).
Have you looked into one of the gazillion PHP frameworks around? Most of them sport all of this in one form or another. Heck, I've written a few myself, although with a strict XML templating engine to get away from greedy PHP developers. :) (xSiteable if you want to peek at the code)
MVC is nice and all, but there are many ways of skinning this cat, and to be frank, MVC (or any of its many incarnations, mutations and variations) gives separation at the cost of complexity, so you'll certainly not make it faster per se. The only thing I can think of is that your template engine spits out (i.e. writes to disk in a cached fashion) pure PHP files based on backend business logic, and then you can put various accelerators to good use. You still have to decide on what the templating environment should be, though. HTML with interspersed notation or PHP, XML, or something else?
A compiler is easy enough to make, but I'm a bit wary. I doubt you'll get much speed improvement (at least not compared to the added complexity) that way over a well-cached templating engine, but it's certainly doable.
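To make the "write cached pure PHP files" idea concrete, here is a rough sketch under the assumption of an invented {placeholder} template syntax; the compiled file is plain PHP on disk, so an opcode accelerator can cache it:

<?php
// Hypothetical compile step: turn "{name}" placeholders into PHP echo statements.
function compile_template($templateFile, $cacheFile) {
    $source   = file_get_contents($templateFile);
    $compiled = preg_replace(
        '/\{(\w+)\}/',
        '<?php echo htmlspecialchars($data[\'$1\']); ?>',
        $source
    );
    file_put_contents($cacheFile, $compiled);
}

// Render: recompile only when the template changed, then include the cached PHP file.
function render($templateFile, $cacheFile, array $data) {
    if (!file_exists($cacheFile) || filemtime($cacheFile) < filemtime($templateFile)) {
        compile_template($templateFile, $cacheFile);
    }
    include $cacheFile; // the compiled template reads from $data
}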
But why not just use PHP itself, and simple Apache rewrite rules (using 'uri' as parameter)?
<?php
$theme = 'default';
$dbase = 'mysql';
$logic = $_REQUEST['uri']; // or some other method, like __this__ with starting folder snipped
                           // (sanitize this before using it in include paths)

include 'themes/top.html';
include 'logic/'.$logic.'/header.php';
include 'themes/mid.html';
include 'logic/'.$logic.'/menu.php';
include 'themes/section1.html';
include 'logic/'.$logic.'/section1.php';
include 'themes/section2.html';
include 'logic/'.$logic.'/section2.php';
include 'themes/section3.html';
include 'logic/'.$logic.'/section3.php';
include 'themes/bottom.html';
include 'logic/'.$logic.'/footer.php';
include 'themes/end.html';
It's brute force, fast, and does provide what you want, although it's neither elegant nor pretty nor, uh, recommended. :)
Related
It has recently been highlighted (in my previous questions) that the way I have designed web applications is not ideal.
Consider the following. I am working on a multi-user website with lots of different sections including profiles and forums and support tickets. The structure is as follows:
A main page into which all the other pages are included (or *required_once*); we'll call it home.php.
In home.php, one of the first things loaded is router.php. This handles every single $_GET and $_POST that the user could possibly produce, and every form and process is sorted via a main variable called $data_process. router.php is essentially just one giant switch() statement on $data_process. This parses all the data and gives a result.
Next included is header.php, which will not only process the necessary variables for the page that will be loaded but also set up the header and decide exactly what is going to be shown there, e.g. menu, user info, and information about the page currently being viewed (i.e. Home > Support > View Ticket).
Then the page is loaded according to $page variable. A simple include.
Then footer.php, then close.
And so the dynamic website is created. I was told this is bad practice by a user named #HorusKol. I am very pleased with this website, as it is the most streamlined and easy-to-write site I have ever built. Is this still bad code design? What is perfect code design?
PS - can anyone recommend any good, easy-to-read books that explain PHP, MySQL and design structure for me?
It is bad design because you process a lot of data that is perhaps not necessary for the rest of the process. The router should only process the URL; processing of POST data is handled somewhere else. Only include what you need; including everything makes things slow.
A better way is to structure your app into different parts: a router that processes the URL, a controller that runs an action based on the routed request, a view that produces the HTML and pages, and a model for accessing data. MVC is what comes to mind.
There is no such thing as the perfect code design.
There's no canonical definition of "good design" - the best you can hope for is that your design balances the various forces on your project in the optimum way. Forces on your project might be maintainability, performance, scalability and extensibility - classic non-functional requirements - but also things like search engine optimization, standards compliance and accessibility (things that apply to web projects in particular).
If all your URLs are of the form "www.mysite.com/home.php?action=getDetails&productID=123", your search engine friendliness is pretty low. It's far better to have semantic URLs - "www.mysite.com/products/DesktopPc/details.php". You can achieve this through cunning .htaccess trickery in your current design.
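For instance, a minimal sketch of the PHP side, assuming a rewrite rule already routes every request to one script and the original path is still available in $_SERVER['REQUEST_URI'] (the segment names here are purely illustrative):

<?php
// "/products/DesktopPc/details" -> section=products, category=DesktopPc, action=details
$path     = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$segments = array_values(array_filter(explode('/', $path)));

$section  = isset($segments[0]) ? $segments[0] : 'home';
$category = isset($segments[1]) ? $segments[1] : null;
$action   = isset($segments[2]) ? $segments[2] : 'index';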
From a maintainability point of view, your design has some issues. If I've understood it correctly, adding a new page to the site requires you to modify the code in several different source files - router.php (your giant switch statement), the page itself, and probably header.php as well. That indicates that the code is tightly coupled. Modifying the giant switch statement sounds like a likely source of entertaining bugs, and the combination of the router and the header manipulating the variables, plus the actual page itself, seems a little fragile. This is okay if you're the only person working on the project and you're going to be around for the long term; if that's not the case, it's better to use an off-the-shelf framework (MVC is the current favourite; Zend, Symfony and Cake all do this well in PHP) because you can point new developers at the documentation and expect them to get up to speed.
One of the biggest enemies of maintainability is complexity - complex code is harder to work with and harbours more bugs. There's a formal metric for complexity (cyclomatic complexity), and I'm pretty sure your switch statement scores very highly on that metric - in itself not necessarily a huge problem, but definitely something to keep an eye on. Lots of MVC frameworks avoid this by having the routing defined as data rather than code (i.e. they keep the routes in a configuration file), and/or by using convention over configuration (i.e. if the request is for page "productDetails", include the file "/inc/productDetails.inc").
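As an illustration of routing defined as data rather than code, a small sketch in which the giant switch becomes a configuration array (the page names and file paths are invented):

<?php
// Routes live in data; adding a page means adding a line, not another case block.
$routes = array(
    'home'           => 'pages/home.php',
    'support'        => 'pages/support.php',
    'productDetails' => 'inc/productDetails.inc',
);

$page = isset($_GET['page']) ? $_GET['page'] : 'home';

if (isset($routes[$page])) {
    include $routes[$page];
} else {
    include 'pages/404.php';
}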
Extensibility could be another concern - imagine having to expose your site content as JSON or XML as well as HTML. In your current design, that would require a lot of change, because every single item in your page-processing pipeline needs to know and care: home.php needs to know not to send HTML, the header and footer need to know, and the router needs to understand how to handle the additional data type, almost certainly making the switch statement even bigger. Again, this may not be a big deal.
Both extensibility and maintainability are helped by being able to unit test your code. Test-driven development turns this into a whole routine in its own right; I'm guessing that testing your application is hard - but that's just a guess; it depends more on how you've factored the individual lumps of code than on what you've described above. However, another benefit of MVC is that it makes it easy to write unit tests for key parts of your system.
So, if the forces on your project don't include an emphasis on maintainability or extensibility, and you can handle the SEO aspect, I don't think your design is necessarily "bad"; even if you do care about those things, there are other things you can do to accommodate those forces - write documentation, hire lots of cheap coders.
The best way to get up to speed with these design topics is not books on PHP or MySQL; I'd recommend "Refactoring" and "Patterns of Enterprise Application Architecture" by Martin Fowler, "Design Patterns" by Gamma et al., and "Code Complete" by McConnell (though that's a touch out of date by now).
What are the advantages and disadvantages of separating PHP and HTML content?
The advantages mostly pertain to code readability, which in large applications plays a huge part in the maintenance of the application.
The disadvantages are that it sometimes makes advanced functionality harder to implement. Most of the time it can be done while keeping the two separate, but it's often much simpler and easier to just insert PHP snippets into HTML code, or vice versa, if it's only a small amount of code.
It's a trade off between ease of execution in certain cases and readability. For the most part, I would recommend trying to keep them separate.
HTML is just for the presentation of your results/forms etc. (your view); see the MVC pattern for more info.
If you separate it from your business logic, you'll be able to generate other views easily (e.g. JSON for JavaScript).
If you're using a templating engine as well, your HTML/CSS gurus can work on the look and feel independently as well.
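A tiny sketch of that benefit: the same data produced by the business logic can be handed either to an HTML template or straight to json_encode(), depending on what the client asked for (the 'format' parameter and template file name are assumptions):

<?php
// Business logic produces plain data with no markup in it.
$result = array('title' => 'Hello', 'items' => array('one', 'two'));

if (isset($_GET['format']) && $_GET['format'] === 'json') {
    header('Content-Type: application/json');
    echo json_encode($result);           // view for JavaScript clients
} else {
    include 'templates/result.html.php'; // HTML view renders $result
}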
Separating program logic (the PHP part) from presentation (the HTML part) is beneficial for several reasons:
It allows you to change the presentation without affecting the program's inner workings, that is, you're not altering what it does by changing the layout
It allows for independent testing of both parts: you can execute parts of the program logic from a test script and inspect the results, which means you can automate a large portion of your testing
Maintenance becomes much easier, because you have less code to look through when looking for errors
For larger teams, the application can be structured in a way that allows designers (that is, people with little understanding of programming) to alter the HTML part independently without much risk of breaking the program logic
It enables you to focus on one problem at a time. You don't want to burden your mind with HTML details while debugging algorithms, and vice versa
Code reuse: If your presentation layer delegates its calls to the logic layer, instead of doing it itself, chances are you'll be reusing that logic elsewhere; having it in the logic layer means you can just call it instead of copy-pasting all over the place (which in turn leads to maintenance nightmares)
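To illustrate that last point, a minimal sketch in which the presentation layer only calls into the logic layer, so the same function can be reused by any other view (file and function names are made up):

<?php
// logic/orders.php -- logic layer, no HTML in here
function get_open_orders($userId) {
    // ...query the database and return plain arrays...
    return array(array('id' => 1, 'total' => 9.99));
}

<?php
// views/orders.php -- presentation layer, only displays what the logic returns
require 'logic/orders.php';

$orders = get_open_orders(42);
foreach ($orders as $order) {
    echo '<li>Order #' . (int) $order['id'] . ': ' . number_format($order['total'], 2) . '</li>';
}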
There is a great advantage to separating the two because you can edit html code without breaking the PHP code. Smarty is a good template engine to learn.
The main reason is that your code will be ugly if you merge the PHP (business logic) with the HTML (presentation), which in turn makes it hard to read and hard to maintain. It won't be a problem if your web application is a simple one, but if it's an enterprise-scale project, maintaining this merged code will be a nightmare for anyone.
Besides, the kind of programming you do for each part and sometimes the language you do it in are different. Separating the two allows one to use the best tools for that particular part.
source: http://www.paragoncorporation.com/ArticleDetail.aspx?ArticleID=21
If you want to separate the PHP code from the HTML, use one of the PHP frameworks, such as CodeIgniter, CakePHP, Zend or Yii. The main advantage is that if you change the site design but not the functionality, it will be very useful, and you can also develop the code in a reusable manner.
I'm working on a project that allows multiple users to submit large data files and perform operations on them. The "backend" which performs these operations is written in Perl, while the "frontend" uses PHP to load HTML template files and determine which content to deliver. Data is stored in a database (MySQL, SQLite, Oracle), and while there is data which has not yet been acted upon, Perl adds it to a running queue which delivers data to other threads based on system load. In addition, there may be pre- and post-processing of the data before and after the main Perl script operates (the specifications are unclear), so I may want to allow these processors to be user-selectable plugins. I had been writing this project in a more procedural fashion, but I am quickly realizing the benefit of separating concerns so as to limit the scope one change has on the rest of the project.
I'm quite inexperienced with design patterns and am curious what the best way to proceed is. I've heard MVC thrown around quite a bit, but I am unsure of how to apply it. Specifically, what are some good options for structuring this code (in terms of design patterns and folder hierarchy)? How can I achieve this with both PHP and Perl while minimizing duplicated code between languages? Should I keep my PHP files in the top level so I don't have ugly paths in the URL?
Also, if I want to provide interchangeable databases, does each table need its own DAO implementation?
This is really several questions, but here goes:
MVC
To implement MVC you'll need to have a clear idea of what the Model, View and Controller sections are responsible for. You can read about this for yourself, but in this case:
The model should only contain the code which performs operations on the data files, i.e. the back-end Perl scripts
The view will be the HTML templates only. No PHP logic should be embedded in them except what's necessary to display the page.
The controller will be the rest of the application: the parts that connect the PHP front-end to the Perl back-end, and possibly the Perl scripts that poll for new files.
For every PHP, HTML or Perl file you create, it should be absolutely clear which section it belongs to, in its entirety. Never mix model, view or controller code in the same file.
There should be no reason why you can't continue writing procedurally. You don't necessarily need a framework either; it may help you to slot things into place, but it may also take some time to learn.
MVC is more of a separation of concerns that you need to keep in mind. A good way to think about it is: "Can each of these components work separately from the others?", e.g.:
can you write a 'mocked up' (example) data file, and have the Perl scripts process it without running any of the PHP code?
can you request an operation from the front-end and have it deliver all the parameters to a single place, ready for a single Perl routine to pick up, without running any Perl code?
You shouldn't need to worry about code duplication if the PHP and Perl scripts are doing totally different things (PHP is only setting up user parameters and input files, and Perl is only taking those parameters and files and outputting new files).
As for folder hierarchy, if you're not using any existing framework, the most important thing is just that it makes sense, is consistent, and that you document your decisions, e.g. in a readme file.
Ugly URLs
You don't need to put your php files in any particular place. Use Apache rewrite rules to make the URLs pretty afterwards. (rewrite rules generator - see the "301 Redirect File" section). But a good MVC framework will solve this for you.
User-selectable plugins
Be careful not to optimise too early. You may be able to develop the new pre/post-processing steps yourself, and just put them in a list for users to select from.
MVC frameworks are a great tool for any web developer, providing the well-defined separation that is indispensable for keeping your code organized.
For PHP I recommend Zend Framework
Database schema will change depending on what platform you use.
I am working on a great website (a social network in PHP) and I've decided to create only one PHP page (index.php), but this page will contain if conditions and statements on the $_GET value and will display the requested page (within the same index.php).
This means that the code (JavaScript + XHTML + PHP) will be very large (nearly the whole project in one page).
I will also use .htaccess to rewrite the URLs of those pages to avoid any malicious requests (so it will appear just like a normal website).
But before doing so, I just want to know the advantages and downsides of this technique, seen from all sides (security, server resources, etc.).
I think what you're trying to do is organize your code properly and effectively, which I commend.
However if I understand correctly, you're going to put all of your javascript, html, and PHP in one file, which is really bad. You want your code to be modular, not lumped together in a single file.
I think you should look into using a framework (e.g. Zend) - PHP frameworks are specifically designed to help your code remain organized, modular, and secure. Your intent (organizing your code effectively) is great, but your idea for how to organize your code isn't very good. If you're absolutely adamant about not using a framework (for example, if this is a learning/school project), you should at least make sure you're following best practices.
This approach is not good because of server resource usage. In order to get access to, say, jQuery.js, your web server is going to:
Determine that jQuery.js actually passes through index.php
Pass index.php through the PHP parser
Wait for PHP to generate a response.
Serve that response.
Or, you could serve it like this:
Determine jQuery.js exists in /var/www/mysite/jQuery.js
Serve it as the response.
Likewise for anything that's "static", i.e. isn't generated by PHP directly. The greater the number of ifs in the PHP script, the more tests will need to be done to find your file.
You do not need to pass your static content through some form of URL routing; only your dynamic content. For real speed, it's better to have responses generated and ready as well, which is called caching, particularly if the dynamic content is expensive in terms of CPU cycles to generate. Other caching techniques include keeping frequently accessed database data in memory, which is what memcached does.
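For example, a minimal sketch using the memcached extension for PHP to keep an expensive query result in memory ($db stands in for an existing database connection; the key name and five-minute lifetime are arbitrary):

<?php
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$users = $cache->get('active_users');
if ($users === false) {
    // Cache miss: hit the database once, then keep the result for 5 minutes.
    $users = $db->query('SELECT * FROM users WHERE active = 1')->fetchAll();
    $cache->set('active_users', $users, 300);
}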
If you're developing a social network, these things really do matter. Heck, Facebook wrote a PHP-to-C++ compiler to save clock cycles.
I second the framework recommendation because it really will make code organisation easier and might integrate with a caching-based solution.
In terms of PHP frameworks, there are many. Here's a list of many web application frameworks in many languages and from the same page, the PHP ones. Take a look and decide which you like best. That's what I did and I ended up learning Python to use Django.
I came by this question while searching, and since the best answer is old, here is a more modern one, from this question:
Why use a single index.php page for entire site?
A front controller (index.php) ensures that everything that is common to the whole site (e.g. authentication) is always correctly handled, regardless of which page you request. If you have 50 different PHP files scattered all over the place, it's difficult to manage that. And what if you decide to change the order in which the common library files get loaded? If you have just one file, you can change it in one place. If you have 50 different entry points, you need to change all of them.
Someone might say that loading all the common stuff all the time is a waste of resources and you should only load the files that are needed for this particular page. True. But today's PHP frameworks make heavy use of OOP and autoloading, so this "waste" doesn't exist anymore.
A front controller also makes it very easy for you to have pretty URLs in your site, because you are absolutely free to use whatever URL you feel like and send it to whatever controller/method you need. Otherwise you're stuck with every URL ending in .php followed by an ugly list of query strings, and the only way to avoid this is to use even uglier rewrite rules in your .htaccess file. Even WordPress, which has dozens of different entry points (especially in the admin section), forces most common requests to go through index.php so that you can have a flexible permalink format.
Almost all web frameworks in other languages use single points of entry -- or more accurately, a single script is called to bootstrap a process which then communicates with the web server. Django works like that. CherryPy works like that. It's very natural to do it this way in Python. The only widely used language that allows web applications to be written any other way (except when used as an old-style CGI script) is PHP. In PHP, you can give any file a .php extension and it'll be executed by the web server. This is very powerful, and it makes PHP easy to learn. But once you go past a certain level of complexity, the single-point-of-entry approach begins to look a lot more attractive.
It will be a hell of a mess.
You also won't be able to upgrade parts of the website, or work on them, without messing with the whole thing.
You will not be able to apply a programming architecture like MVC.
It could theoretically be faster, because you have only one file that needs to be fetched from disk, but only under the assumption that all or at least almost all the code is going to be executed.
So you will have to load and compile the whole file for every single request, including the parts that are not needed, which will slow you down.
What you CAN do, however, is have a single point of entry where all requests originate from. That helps a lot with keeping control and is called a bootstrap file.
But most importantly:
Why would you want that?
From what I know, most CMSes (and probably all modern ones) are made so that the requested page is always the same index.php, but that file is just a dispatcher to other sections. The code is written properly in different files that are pulled together with includes.
Edit: If you're afraid your included scripts are vulnerable, the solution is trivial: put them outside of the web root.
Simplistic example:
<?php
/* This folder shouldn't even be in the site root,
   it should be in a totally different place on the server
   so there is no way someone could request something from it */
$safeRoot = '/path/to/safe/folder/';

include $safeRoot.'all_pages_need_this.php'; // aka The Bootstrap //

switch ($_GET['page']) {
    case 'home':
        include $safeRoot.'home.module.php';
        break;
    case 'blog':
        include $safeRoot.'blog.module.php';
        break;
    case 'store':
        include $safeRoot.'store.module.php';
        break;
    default:
        include $safeRoot.'404.module.php';
}
This means that the code (JavaScript + XHTML + PHP) will be very large (nearly the whole project in one page).
Yes, and it'll be slow.
So you're not going to have any HTML caching?
It's all purely in one file, hard to update and slow to interpret? Geesh, good luck.
What you are referring to is called single point of entry and is something many web applications (most notably the ones built following the MVC pattern) use.
The code of your point of entry file doesn't have to be huge as you can simply include() other files as needed. For example:
<?php
if ($_GET['module'] == 'messages') {
    include('inbox.php');
}
if ($_GET['module'] == 'profile') {
    include('profile.php');
}
// ...and so on for each module
I'm building a PHP site, but for now the only PHP I'm using is a half-dozen or so includes on certain pages. (I will probably use some database queries eventually.)
Are simple include() statements a concern for speed or scaling, as opposed to static HTML? What kinds of things tend to cause a site to bog down?
Certainly include() is slower than static pages. However, with modern systems you're not likely to see this as a bottleneck for a long time - if ever. The benefits of using includes to keep common parts of your site up to date outweigh the tiny performance hit, in my opinion (having different navigation on one page because you forgot to update it leads to a bad user experience, and thus bad feelings about your site/company/whatever).
Using caching will really not help either - caching code is going to be slower than just an include(). The only time caching will benefit you is if you're doing computationally-intensive calculations (very rare, on web pages), or grabbing data from a database.
Sounds like you are participating in a bit of premature optimization. If the application is not built, while performance concerns are good to be aware of, your primary concern should be getting the app written.
Includes are a fact of life. Don't worry about the number; worry about keeping your code well organized (the PEAR folder structure is a lovely thing; if you don't know what I'm talking about, look at the structure of the Zend Framework class files).
Focus on getting the application written with a reasonable amount of abstraction. Group all of your DB calls into a class (or classes) so that you minimize code duplication (KISS principles and all) and when it comes time to refactor and optimize your queries they are centrally located. Also get started on some unit testing to prevent regression.
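As a sketch of grouping the DB calls into a class (assuming PDO; the table and class names are invented), so the queries sit in one place when it's time to refactor and optimize them:

<?php
// All ticket-related queries live here; callers never write SQL themselves.
class TicketRepository {
    private $pdo;

    public function __construct(PDO $pdo) {
        $this->pdo = $pdo;
    }

    public function findOpenByUser($userId) {
        $stmt = $this->pdo->prepare(
            'SELECT * FROM tickets WHERE user_id = ? AND status = "open"'
        );
        $stmt->execute(array($userId));
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }
}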
Once the application is up and running, don't ask us what is faster or better, since it depends on each application what your bottleneck will be. It may turn out that even though you have lots of includes, your loops are eating up your time, or whatever. Use Xdebug and profile your code once it's up and running. Look for the segments of code that are eating up a disproportionate amount of time, then refactor. If you focus too much now on the performance hit between include and include_once, you'll end up chasing a ghost when those curl requests running in sync are eating your breakfast.
In the meantime, the best suggestion is to look through the php.net manual, and if there's a built-in function doing something you are trying to do, use it! PHP's C-based extensions will always be faster than any PHP code that you could write, and you'll be surprised how much of what you are trying to do is done already.
But again, I cannot stress this enough, premature optimization is BAD!!! Just get your application up off the ground with good levels of abstraction, profile it, then fix what actually is eating up your time rather than fixing what you think might eat up your time.
Strictly speaking, straight HTML will always serve faster than a server-side approach since the server doesn't have to do any interpretation of the code.
To answer the bigger question, there are a number of things that will cause your site to bog down; there's just no specific threshold for when your code is causing the problem vs. PHP. (keep in mind that many of Yahoo's sites are PHP-driven, so don't think that PHP can't scale).
One thing I've noticed is that the PHP-driven sites that are the slowest are the ones that include more than is necessary to display a specific page. OSCommerce (oscommerce.com) is one of the most popular PHP-driven shopping carts. It has a bad habit, however, of including all of their core functionality (just in case it's needed) on every single page. So even if you don't need to display an 'info box', the function is loaded.
On the other hand, there are many PHP frameworks out there (such as CakePHP, Symfony, and CodeIgniter) that take a 'load it as you need it' approach.
I would advise the following:
Don't include more functionality than you need for a specific page
Keep base functions separate (use an MVC approach when possible)
Use require_once instead of include if you think you'll have nested includes (e.g. page A includes file B which includes file C). This will avoid including the same file more than once. It will also stop the process if a file can't be found; thus helping your troubleshooting process ;)
Cache static pages as HTML if possible - to avoid having to reparse when things don't change
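On that last point, a very small sketch of caching a rendered page to disk with output buffering, so an unchanged page can be served without re-running the PHP that builds it (the cache path, page include and one-hour lifetime are placeholders):

<?php
$cacheFile = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';

// Serve the cached copy if it is less than an hour old.
if (file_exists($cacheFile) && filemtime($cacheFile) > time() - 3600) {
    readfile($cacheFile);
    exit;
}

ob_start();
include 'pages/current_page.php';                 // build the page as usual
file_put_contents($cacheFile, ob_get_contents()); // save the rendered HTML
ob_end_flush();                                   // and send it to the browser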
Nah, includes are fine; nothing to worry about there.
You might want to think about tweaking your caching headers a bit at some point, but unless you're getting significant hits it should be no problem. Assuming this is all static data, you could even consider converting the whole site to static HTML (easiest way: write a script that grabs every page via the webserver and dumps it out in a matching dir structure)
Most web applications are limited by the speed of their database (or whatever their external storage is, but 9/10 times that'll be a database), the application code is rarely cause for concern, and it doesn't sound like you're doing anything you need to worry about yet.
Before you make any long-lasting decisions about how to structure the code for your site, I would recommend that you do some reading on the Model-View-Controller design pattern. While there are others, this one appears to be gaining a great deal of ground in web development circles and will certainly be around for a while. You might want to take a look at some of the other design patterns suggested by Martin Fowler in his Patterns of Enterprise Application Architecture before making any final decisions about what sort of design will best fit your needs.
Depending on the size and scope of your project, you may want to go with a ready-made framework for PHP like Zend Framework or PHP On Trax or you may decide to build your own solution.
Specifically regarding the rendering of HTML content, I would strongly recommend that you use some form of templating in order to keep your business logic separate from your display logic. I've found that this one simple rule in my development has saved me hours of work when one or the other needed to be changed. I've used Smarty (http://www.smarty.net/), and I know that most of the frameworks out there either have a template system of their own or provide a plug-in architecture that allows you to use your own preferred method. As you look at possible solutions, I would recommend that you look for one that is capable of creating cached versions.
Lastly, if you're concerned about speed on the back-end then I would highly recommend that you look at ways to minimize your calls your back-end data store (whether it be a database or just system files). Try to avoid loading and rendering too much content (say a large report stored in a table that contains hundreds of records) all at once. If possible look for ways to make the user interface load smaller bits of data at a time.
And if you're specifically concerned about the actual load time of your html content and its CSS, Javascript or other dependencies I would recommend that you review these suggestions from the guys at Yahoo!.
To add on what JayTee mentioned - loading functionality when you need it. If you're not using any of the frameworks that do this automatically, you might want to look into the __autoload() functionality that was introduced in PHP5 - basically, your own logic can be invoked when you instantiate a particular class if it's not already loaded. This gives you a chance to include() a file that defines that class on-demand.
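A minimal sketch of that idea; __autoload() behaves as described, although spl_autoload_register() (also available in PHP 5) is the more flexible way to register the same callback. The class-to-file mapping here is an assumption:

<?php
// Called automatically the first time an unknown class is used.
function __autoload($className) {
    $file = 'classes/' . $className . '.php';
    if (file_exists($file)) {
        include $file;
    }
}

$report = new SalesReport(); // classes/SalesReport.php gets included on demand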
The biggest thing you can do to speed up your application is to use an Opcode cache, like APC. There's an excellent list and description available on Wikipedia.
As far as simple includes are concerned, be careful not to include too many files on each request as the disk I/O can cause your application not to scale well. A few dozen includes should be fine, but it's generally a good idea to package your most commonly included files into a single script so you only have one include. The cost in memory of having a few classes here and there you don't need loaded will be better than the cost of disk I/O for including hundreds of smaller files.