Web link management - PHP vs database

I am new to web development, and I started working on a small website just recently. The problem is that, since it is my first time, I move my pages around on the server a lot, which means I have to keep updating all the other pages that link to them. So I was thinking of a dynamic way of linking the pages, so that I don't have to update links in several places, but only in one.
Here is how it is going to work:
there is going to be a separate database kind of thing that contains every webpage's current address, plus a unique key to identify it, e.g. page12345 = "/about/us.php"
Anywhere I want to include a link to a page, instead of typing .., I'll type something like .., or something along those lines.
This method will also let me assign tags/categories to pages, or add other properties to them. I'll probably use it for media files as well later on.
Now, I can think of only two ways to do this: one is using a PHP array, and the other is using a MySQL database. The array will probably be too much to handle when the site grows and there are, like, thousands of pages; on the other hand, the MySQL database will probably prove to be slower, and at the same time more of a hassle.
So what do you suggest? Which will be more efficient? Or is there a better way? I am open to any other ideas you may have.
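To make the two options concrete, here is roughly what I mean (the keys, paths, and credentials are just placeholders):

    <?php
    // Option 1: a plain PHP array, kept in one include file.
    $pages = [
        'page12345' => '/about/us.php',
        'page12346' => '/contact.php',
    ];

    // Look a page up by its key; fall back to the home page if unknown.
    function link_to(array $pages, string $key): string
    {
        return $pages[$key] ?? '/';
    }

    echo '<a href="' . link_to($pages, 'page12345') . '">About us</a>';

    // Option 2: the same lookup against a MySQL table, e.g.
    //   CREATE TABLE pages (page_key VARCHAR(32) PRIMARY KEY, path VARCHAR(255));
    // $pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
    // $stmt = $pdo->prepare('SELECT path FROM pages WHERE page_key = ?');
    // $stmt->execute(['page12345']);
    // $path = $stmt->fetchColumn();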

The typical way to manage that is to not worry about URLs manually at all and leave it up to a router. In the end, URLs are just a technical implementation detail of the HTTP protocol. What you really want to do is identify specific pages/actions uniquely. Have a look at any reverse-routing capable router; here is the Symfony implementation:
blog_show:
    path: /blog/{slug}
    defaults: { _controller: 'BlogController::showAction' }
This is admittedly a very high level abstraction, using YAML for specifying routes and Twig for templating with a custom defined function. However, it hopefully demonstrates the goal: don't worry about URLs much at all in your actual links. You need to have one canonical place where URLs are defined (the path in the above example), everywhere else you just refer to your target page by name (blog_show here). If you need to move URLs around, there's exactly one place where you need to do so. The thing in the middle that makes this work is the router.
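If you want the same idea without a framework, a bare-bones reverse router in plain PHP could look roughly like this (the route names and patterns are made up):

    <?php
    // The one canonical place where URL patterns are defined.
    $routes = [
        'blog_show' => '/blog/{slug}',
        'about'     => '/about/us',
    ];

    // Build a URL from a route *name* plus parameters. If a URL has to
    // change, only the $routes map above needs editing.
    function route(array $routes, string $name, array $params = []): string
    {
        $path = $routes[$name];
        foreach ($params as $key => $value) {
            $path = str_replace('{' . $key . '}', rawurlencode($value), $path);
        }
        return $path;
    }

    echo route($routes, 'blog_show', ['slug' => 'hello-world']); // /blog/hello-world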

Related

Storing website parameters in a database or flat files

For the most part, the 3 sites for an organization I run have a single MySQL database that they share. This allows them to interact with each other nicely.
I have a bunch of simple parameters that the sites need to know about, and I wasn't sure which is the best route to take:
Make a table with 2 fields (key, value) where I store the params
Store the values in one or many flat files
They each have advantages and disadvantages.
The database allows a single entry to be used for all three sites (however, this doesn't occur often), all the information is centralized, and the interface is already well defined.
The flat files are easier to work with, since FTP and a text editor can be used in addition to the website administration; the flat files can be written as PHP, meaning the site doesn't have to do any parsing (just include the file and use the variables); but they can't be shared between sites.
I can go on and on. What do you think is the better route to take?
My opinion is to use a database if you have the chance. It's easier in the long run. Say you have built the website with flat files, but now the customer wants an additional page with slightly different parameters, so you have to add a new file. Now you are done with that and he asks you again... well, you get the point.
It is not organised at all. So if you have the chance to use a database, use it. There is a reason it was invented.
But just to get my thoughts clear and know exactly what you are talking about, please tell me more about the settings you would like to store. I can imagine you are talking about some global variables, or maybe even going to use define(), but it is also possible you want to store strings.
So please define "bunch of simple parameters" for us.
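To make the database option concrete, the key/value table can be this simple (the table, column, and credential names are placeholders):

    <?php
    // Hypothetical schema:
    //   CREATE TABLE settings (name VARCHAR(64) PRIMARY KEY, value TEXT);
    $pdo = new PDO('mysql:host=localhost;dbname=shared', 'user', 'pass');

    // Load all parameters once per request into an associative array.
    $settings = [];
    foreach ($pdo->query('SELECT name, value FROM settings') as $row) {
        $settings[$row['name']] = $row['value'];
    }

    echo $settings['site_title'] ?? 'Default title';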

Why should MVC for websites require a single point of entry?

I see that many MVC implementations for websites have a single entry point, such as an index.php file, and then parse the URL to determine which controller to run. This seems rather odd to me because it involves having to rewrite the URL using Apache rewrites, and with enough pages that single file will become bloated.
Why not instead just have the individual pages be the controllers? What I mean is, if you have a page on your site that lists all the registered members, then the members.php page users navigate to will be the controller for the members. This PHP file will query the members model for the list of members from the database and pass it to the members view.
I might be missing something, because I have only recently discovered MVC, but this one issue has been bugging me. Wouldn't this kind of design be preferable? Instead of having one bloated entry file that all pages unintuitively call, the models and views for a specific page are contained, encapsulated, and called from their respective page.
From my experience, having a single entry point has a couple of notable advantages:
It eases centralized tasks such as resource loading (connecting to the DB or to a memcache server, logging execution times, session handling, etc.). If you want to add or remove a centralized task, you just have to change a single file: the index.php.
Parsing the URL in PHP makes the "virtual URL" decoupled from the physical file layout on your webserver. That means that you can easily change your URL system (for example, for SEO purposes, or for site internationalization) without having to actually change the location of your scripts in the server.
However, sometimes having a single entry point can be a waste of server resources. That applies obviously to static content, but also when you have a set of requests that have a very specific purpose and need only a very small subset of your resources (maybe they don't need DB access, for instance). Then you should consider having more than one entry point. I have done that for the site I am working on. It has an entry point for all the "standard" dynamic contents and another one for the calls to the public API, which need far fewer resources and have a completely different URL system.
And a final note: if the site is well implemented, your index.php doesn't necessarily have to become bloated :)
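As a rough illustration of both points, a stripped-down front controller might look like this (not any particular framework; the file names are invented):

    <?php
    // index.php - the single entry point (minimal sketch)

    session_start();                  // centralized tasks happen once, here
    // ...connect to the database, start logging, etc.

    // Map the "virtual URL" to a controller, independent of file layout.
    $path = parse_url($_SERVER['REQUEST_URI'] ?? '/', PHP_URL_PATH);

    $controllers = [
        '/'        => __DIR__ . '/controllers/home.php',
        '/members' => __DIR__ . '/controllers/members.php',
    ];

    if (isset($controllers[$path])) {
        require $controllers[$path];  // run the page-specific controller
    } else {
        http_response_code(404);
        echo 'Not found';
    }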
It is all about being DRY: if you have many PHP files handling requests, you will have duplicated code, and that just makes for a maintenance nightmare.
Have a look at the 'main' index page for CakePHP: https://github.com/cakephp/cakephp/blob/master/app/webroot/index.php
No matter how big the app gets, I have never needed to modify that file. So how can it get bloated?
Deeplinking directly into the controllers of an MVC framework eliminates the possibility of implementing controller plugins or filters, depending on the framework you are using. Having a single point of entry standardizes the bootstrapping of the application and its modules, and lets the previously mentioned plugins execute before a controller is accessed.
Also, Zend Framework uses its own URL rewriting in the form of routing. The Zend Framework applications I work on have an .htaccess file of maybe six lines of rewrite rules and conditions.
A single entry point certainly has its advantages, but you can get pretty much the same benefit from a central required file at the top of every single page that handles database connections, sessions, etc. It's not bloated, it conforms to DRY principles (except for that one require line), it separates logic and presentation, and if you change file locations, a simple search and replace will fix it.
I've used both and I can't say one is drastically better or worse for my purposes.
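For reference, the shared-include style I mean is roughly this (the bootstrap file name and its contents are placeholders):

    <?php
    // bootstrap.php - the one central file every page requires (sketch)
    session_start();
    $pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');

    // Each individual page (members.php, news.php, ...) then starts with
    // a single line before its own logic and presentation:
    //   require __DIR__ . '/bootstrap.php';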
Software engineering principles are what drive the single-point-of-entry paradigm. "Why not instead just have the individual pages be the controllers?"
Individual pages are already controllers, in a sense.
In PHP, there is going to be some boilerplate code that runs for every HTTP request: the autoloader require statement (PSR-4), error handler code, sessions, and, if you are wise, wrapping the core of your code in a try/catch with Throwable as the top exception to catch. By centralizing code, you only need to make changes in one place!
True, the centralized PHP will use at least one require statement (to load the autoloader code), but even if you have many require statements they will all be in one file, the index.php (not spread out over a galaxy of files under the document root).
If you are writing code with security in mind, again, you may have certain encoding checks/sanitizing/validating that happen with every request. Using values in $_SERVER or filter_input_array()? Then you might as well centralize that.
The bottom line is this: the more you do on every page, the more reason you have to centralize that code.
Note that this way of thinking leads one down the path of looking at your website as a web application. From the web application perspective, a single point of entry is justifiable, because a problem solved once should only need to be modified in one place.
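Put together, that per-request boilerplate might be centralized roughly like this (a sketch; the autoloader path and the input field are placeholders):

    <?php
    // index.php - centralized per-request boilerplate (minimal sketch)

    require __DIR__ . '/vendor/autoload.php'; // the one require: PSR-4 autoloader

    session_start();

    // Sanitize request input once, centrally (the 'page' field is made up).
    $input = filter_input_array(INPUT_GET, ['page' => FILTER_SANITIZE_URL]) ?: [];

    try {
        // ...dispatch to the appropriate controller here...
    } catch (Throwable $e) {
        // The top-level safety net: one catch instead of one per page.
        error_log($e->getMessage());
        http_response_code(500);
        echo 'Something went wrong.';
    }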

Is this technique OK with frameworks: storing all text in a separate plain text file?

I am considering using a PHP framework (I have never used one before). I know I have to abandon the way I am used to working (I can deal with this). Among many concerns, the first thing that comes to mind is this:
I have two techniques I like to use (I have used them for years).
I always use index.php?somePage.php as the href, so nothing other than index.php ever loads; index.php then includes the somePage.php.
I have always stored all text (titles, button names, link names, stories, articles, anything) in a single file or several files, or in the $GLOBALS array (depending on size).
I want to ask: is this approach wrong? Is there a better way?
Second, from what I have read, frameworks have some rules; does my approach create a conflict with them? I am thinking of the KISS_MVC framework, because it declares itself to be easy for framework beginners.
I have no experience with frameworks and I am concerned about all this. I can't wait for the day I will feel at home using a framework.
Thank you all in advance!
That pattern is known as a front controller: it receives all requests and routes them internally (not based on which file was loaded). That pattern is fine; it should look like /index.php/whatever, which you can then patch over with .htaccess to make /whatever (examine $_SERVER['REQUEST_URI']).
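A minimal sketch of that setup (the rewrite rule appears as a comment; the page names are invented):

    <?php
    // index.php - receives every request. Assumes an Apache rewrite such as:
    //   RewriteEngine On
    //   RewriteCond %{REQUEST_FILENAME} !-f
    //   RewriteRule ^ index.php [QSA,L]

    // /whatever arrives here; examine the requested path.
    $page = trim(parse_url($_SERVER['REQUEST_URI'] ?? '/', PHP_URL_PATH), '/');

    // Whitelist the includable pages - never include raw user input.
    $pages = ['' => 'pages/home.php', 'about' => 'pages/about.php'];

    if (isset($pages[$page])) {
        include __DIR__ . '/' . $pages[$page];
    } else {
        http_response_code(404);
    }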
A database is a good place to store large amounts of text. Files you create yourself rarely are; they offer none of the advantages of using a database. $GLOBALS is rarely a good place to store data. You should generally keep as little as possible available globally. You can make a registry class to store global stuff if you need to.
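The registry class I mention could be as small as this (a sketch, not a specific library):

    <?php
    // A tiny registry as an alternative to $GLOBALS (minimal sketch).
    final class Registry
    {
        private static array $items = [];

        public static function set(string $key, $value): void
        {
            self::$items[$key] = $value;
        }

        public static function get(string $key)
        {
            return self::$items[$key] ?? null;
        }
    }

    Registry::set('db_name', 'site');   // stored in one known place...
    $dbName = Registry::get('db_name'); // ...and retrieved explicitly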
The best way to know what does and doesn't work with a framework is to try to get familiar with a popular one such as Zend, Kohana, Yii, etc.

Why use a single index.php page for entire site?

I am taking over an existing PHP project. I noticed that the previous developer uses one index.php page for the entire site, currently 10+ pages. This is the second project that I have seen done like this. I don't see the advantage of this approach. In fact, it seems to overcomplicate everything, because now you can't just add a new page to the site and link to it. You also have to make sure you update the main index page with an if clause to check for that page type and then load the page. It seems that if they are just trying to reuse a template, it would be easier to use includes for the header and footer and then create each new page with those files referenced.
Can someone explain why this approach would be used? Is this some form of an MVC pattern that I am not familiar with? PHP is a second language for me, so I am not as familiar with best practices.
I have tried doing some searches on Google for "single index page with php" and the like, but I cannot find any good articles explaining why this approach is used. I really want to kick this old stuff to the curb and not continue down that path, but I want to have some sound reasoning before making the suggestion.
A front controller (index.php) ensures that everything that is common to the whole site (e.g. authentication) is always correctly handled, regardless of which page you request. If you have 50 different PHP files scattered all over the place, it's difficult to manage that. And what if you decide to change the order in which the common library files get loaded? If you have just one file, you can change it in one place. If you have 50 different entry points, you need to change all of them.
Someone might say that loading all the common stuff all the time is a waste of resources and you should only load the files that are needed for this particular page. True. But today's PHP frameworks make heavy use of OOP and autoloading, so this "waste" doesn't exist anymore.
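To see why, note that with autoloading a class file is only read when the class is first used; here is a minimal PSR-4-style sketch (the App\ prefix and src/ directory are invented):

    <?php
    // Minimal PSR-4-style autoloader (sketch).
    spl_autoload_register(function (string $class): void {
        $prefix  = 'App\\';
        $baseDir = __DIR__ . '/src/';
        if (strncmp($class, $prefix, strlen($prefix)) === 0) {
            $relative = substr($class, strlen($prefix));
            require $baseDir . str_replace('\\', '/', $relative) . '.php';
        }
    });

    // Nothing is loaded up front; a line like the following alone would
    // trigger the require of src/Model/Member.php:
    //   $member = new App\Model\Member();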
A front controller also makes it very easy for you to have pretty URLs in your site, because you are absolutely free to use whatever URL you feel like and send it to whatever controller/method you need. Otherwise you're stuck with every URL ending in .php followed by an ugly list of query strings, and the only way to avoid this is to use even uglier rewrite rules in your .htaccess file. Even WordPress, which has dozens of different entry points (especially in the admin section), forces most common requests to go through index.php so that you can have a flexible permalink format.
Almost all web frameworks in other languages use single points of entry -- or more accurately, a single script is called to bootstrap a process which then communicates with the web server. Django works like that. CherryPy works like that. It's very natural to do it this way in Python. The only widely used language that allows web applications to be written any other way (except when used as an old-style CGI script) is PHP. In PHP, you can give any file a .php extension and it'll be executed by the web server. This is very powerful, and it makes PHP easy to learn. But once you go past a certain level of complexity, the single-point-of-entry approach begins to look a lot more attractive.
Having a single index.php file in the public directory can also protect you in the case of the PHP interpreter going down. A lot of frameworks use the index.php file to include the bootstrap file from outside of the doc root. If the interpreter fails, the user will be able to see the source code of this single file instead of the entire codebase.
Well, if the only thing that changes is the URL, it doesn't seem like it's done for any reason besides aesthetic purposes...
As for me, a single entry point can help you keep better control of your application: it helps you handle errors easily, route requests, and debug the application.
A single "index.php" is an easy way to make sure all requests to your application flow through the same gate. This way, when you add a second page, you don't have to make sure bootstrapping, authentication, authorization, logging, etc. are all configured; you get that for free by merit of the framework.
In modern web frameworks this would be done using a front controller, but it is impossible to tell, since a lot of PHP code/developers suffer from NIH syndrome.
Typically, such approaches are used when the contents of the pages are determined by database contents; thus all the work gets done in a single file. This is often seen in CMS systems.
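Reduced to its essentials, that CMS pattern might look like this (the schema and field names are invented):

    <?php
    // index.php - page contents determined by database rows (sketch).
    // Hypothetical schema:
    //   CREATE TABLE pages (slug VARCHAR(64) PRIMARY KEY, title TEXT, body TEXT);
    $pdo  = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');
    $slug = $_GET['page'] ?? 'home';

    $stmt = $pdo->prepare('SELECT title, body FROM pages WHERE slug = ?');
    $stmt->execute([$slug]);
    $page = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($page === false) {
        http_response_code(404);
        exit('Page not found');
    }

    echo '<h1>' . htmlspecialchars($page['title']) . '</h1>';
    echo $page['body'];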

Serving multiple sites with one Drupal (not using multi-site)

I am looking for expert advice on how to best serve multiple sites with one Drupal instance (using Pressflow 6.x). Let's say the company needing this is called "ABC Group of Companies" and it has three sister companies. So altogether there will be four sites:
www.abcgroup.com
www.company-a.com
www.company-b.com
www.company-c.com
Here are the things that are most interesting:
The users will be shared among all the sites
Each site will "mostly" host their own content (say the welcome text on home page, or menu items - different for each site)
Some contents, will be shown in all of the sites (say, a company-wide notice....or an employee directory)
The theme for each site will be different
Now, I am thinking of having DNS entries so that each of the domains points to the same Drupal installation, and when Drupal gets bootstrapped, I would like to sniff the $_SERVER array to know which site is being hit. I'd then like to load the theme accordingly, show the contents specific to that site, and also show the contents that are shared by all the sites.
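For illustration, the host sniffing I have in mind would be something like this (the theme names are placeholders):

    <?php
    // Map each incoming domain to a site key and a theme (hypothetical values).
    $sites = [
        'www.abcgroup.com'  => ['site' => 'abcgroup',  'theme' => 'abc_theme'],
        'www.company-a.com' => ['site' => 'company_a', 'theme' => 'a_theme'],
        'www.company-b.com' => ['site' => 'company_b', 'theme' => 'b_theme'],
        'www.company-c.com' => ['site' => 'company_c', 'theme' => 'c_theme'],
    ];

    $host    = $_SERVER['HTTP_HOST'] ?? 'www.abcgroup.com';
    $current = $sites[$host] ?? $sites['www.abcgroup.com'];

    // $current['site'] can then drive which content is selected,
    // and $current['theme'] which theme gets loaded.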
To make this happen in Drupal, so far I have created a node type called "Site" and have created four nodes, one for each of the sites. Then, for every other content type (say, Page), I have added a multiple-value node reference to the "Site" content type, so that when creating new content, the administrator can specify on which sites that content will be shown. However, after that I am stuck.
I have tried to understand Contexts, Spaces, and PURL, but haven't figured them out fully yet, and I believe I could use the community's power to help me out. What do you think is the best approach to handle this scenario?
It'd be greatly helpful if anybody can suggest a direction.
Regards,
Emran
The way you are suggesting is certainly a way you could do it, but have you considered Domain Access? I have used it in the past and found it to be very useful. There is also quite a large collection of modules which work with it. Different themes, options as to which nodes should appear on which sites, and shared users are all features that it has.
Hope this helps!
http://drupal.org/project/domain
First up, I strongly second hookds' suggestion of using the Domain Access module for this (+1). It has extensive support/features for your scenario and already covers most of the hard parts you'd otherwise need to solve yourself.
Second, if you insist on trying to do this yourself, I can assure you that it is possible, as we have done something pretty similar recently (some special requirements ruled out Domain Access), but it was a lot of work, especially where functionality provided by contributed modules would not fit well into our 'unusual' scenario.
Given the multitude of special cases you'd have to cover, it is hard to point out a general direction (apart from suggesting the Domain Access module ;)), but one major point would be to check out the custom_url_rewrite_inbound()/custom_url_rewrite_outbound() function combo. These let you do pretty low-level URL manipulation for incoming requests, as well as for URLs generated for output, both of which you'll need if you want to serve multiple domains from the same instance.
Did I mention that you should check out Domain Access Module before you try to build this yourself?
It sounds like there will be virtually no content shared between these sites. Will you be wanting a single login across all sites?
Remember, Domain Access uses 1 shared database.
You could also just do a regular multi-site install, and share certain tables.
I give Domain Access two thumbs up, but just make sure you really need what it actually does.
Also, I would look into the Feeds module. You can pull content from anywhere (especially another Drupal site) and it imports the content directly, creating nodes and fields from it automatically.
