A PHP newbie question. I am trying to make my first web page using a PHP class for navigation and some PHP for including HTML files - trying to be efficient. It all works fine at the top of the directory structure, but I don't know how to make it work in all the sub-folders.
The (linux) server is a fileshare and I have access to a root folder with a www folder for serving web pages.
I have trawled this site and the web but I am completely confused about how to specify the path to the php and class files to be included in each page.
I know this is really basic php but I am really old (well, almost 60) and learning is getting harder.
The most efficient way to do it is the one Jeremy Miller suggested in the comments: define a variable (or constant) holding your root folder as an absolute path, and refer to it in every include. Otherwise, you should know that the "../path" notation refers to the parent folder, and you can chain it to traverse further up the tree: "../../path" and so on.
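A minimal sketch of that idea (the file and folder names here are placeholders, not taken from your setup) - define the absolute path once in a config file at the top of the site, then build every include from that constant, so the same code works no matter how deep the calling page sits:

<?php
// config.php, in the top-level www folder - defines the absolute root path once.
// __DIR__ is the directory this file lives in, so the constant stays correct
// even if the site is later moved to another folder or server.
define('SITE_ROOT', __DIR__);

<?php
// A page buried in a sub-folder, e.g. www/sub/deeper/page.php:
// reach config.php relative to this file once, then use SITE_ROOT for everything else.
require_once __DIR__ . '/../../config.php';
require_once SITE_ROOT . '/classes/Navigation.php';  // your navigation class
include SITE_ROOT . '/includes/header.html';         // your shared HTML snippet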
Goal: to require files from an old monolithic code base that is still an active website, and use those files as the logic for an API running on the same server in its own instance of PHP. The hope is to leverage the old code's deeply buried business logic, since the old code base must keep running while we transition away from it.
Problem: the old code base uses $_SERVER['DOCUMENT_ROOT'] in its require paths, so the new API can't include those files, because under the API that variable points at the API's own web root. The old code files contain MANY requires, and those required files include many more requires, so going through and replacing the document-root variable with relative paths would be a mammoth job. For every new route I write, I would need to go through all of the old code's requires and change the paths to relative ones.
Using curl from the new API works fine, but I’m trying to avoid the overhead.
I thought of using an $_ENV var but again it’s the server environment and it has the same problem.
Is there a way to circumvent this problem?
The easiest by far would be to write a script that automatically updates all the require paths. This is probably far better than any workaround you can come up with.
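As a rough sketch of such a one-off script (the old-code location and the idea of swapping $_SERVER['DOCUMENT_ROOT'] for a constant defined in the API's bootstrap are assumptions - adjust the search/replace to whatever you actually want, and add extra patterns if the old code also uses double quotes):

<?php
// rewrite_requires.php - one-off migration script, run from the CLI.
// Replaces every occurrence of $_SERVER['DOCUMENT_ROOT'] in the old code base
// with a constant, so the files can be required from the new API regardless
// of which document root PHP reports.
$oldRoot = '/var/www/old_site';              // assumed location of the old code base
$search  = "\$_SERVER['DOCUMENT_ROOT']";
$replace = 'OLD_SITE_ROOT';                  // define() this constant in the API's bootstrap

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($oldRoot, FilesystemIterator::SKIP_DOTS)
);

foreach ($files as $file) {
    if ($file->getExtension() !== 'php') {
        continue;
    }
    $code = file_get_contents($file->getPathname());
    if (strpos($code, $search) !== false) {
        file_put_contents($file->getPathname(), str_replace($search, $replace, $code));
    }
}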
I have looked at a lot of frameworks like Laravel, Zend and Symfony, and I noticed that they put their PHP files in the root directory, but when I looked at WordPress, vBulletin and a lot of other well-known scripts, I noticed that they put their PHP files in the public directory.
I want to know the best place to put my PHP files: the root directory or the public_html directory? Which is better, and why? And what is the difference between them?
Assuming by "root directory" you mean somewhere outside of the web server's document root, and that "public_html" is the web server's document root...
It's best practice to only have scripts that must be directly accessible within your web server's doc root. If something happens where PHP gets disabled, you don't want the whole world downloading copies of your source code. Scripts can include files from wherever they can access on disk, so it's possible to only put a loader in the web server's doc root and keep the whole application out of there. This is common and best practice.
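A minimal sketch of that layout (directory names invented for illustration): only a thin loader lives in the doc root, and the whole application sits one level above it.

/home/example/app/bootstrap.php        <- the real application, outside the doc root
/home/example/public_html/index.php    <- the only script the web server exposes

<?php
// public_html/index.php - the loader. If PHP ever gets disabled, the only thing
// visitors can download is this stub, not your application code.
require __DIR__ . '/../app/bootstrap.php';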
WordPress likely doesn't do this by default because most folks installing WordPress don't really know what they're doing. It's easier for them to just extract a tarball to one place and not have to worry about it. The ones that do know what they're doing can customize the installation paths if desired.
Here is my (probably very silly) newbie method of working on projects.
I have a directory C:\websites\ and I keep separate folders for all my sites in there.
Whenever I want to work on one of those sites, I copy and paste the files for that site from its directory within C:\websites\ - for example:
C:\websites\website_one\*.* (just the files, not the directory) - and put those files straight in C:\Program Files\Apache Software Foundation\Apache2.2\htdocs\ - then open those files in notepad++.
It is extremely annoying to have to move the site I am done with for the day back to C:\websites\, and to move a different site into htdocs\, every time I want to switch projects.
Could anyone help me with the standard way of structuring the directories of your local development sites?
I want to be able to access/work on my different projects more easily, and also be able to upload any of them for online testing and not have to change any links/includes because the directory structure doesn't match.
Locally, should I just replicate the web hosts directory structure in htdocs? For example:
htdocs\public_html\website_one
htdocs\public_html\website_two
Or something similar?
Thanks for any insight into this.
Since you are working with Apache anyway, why not use subdomains or URL rewrites?
Here's a link on setting up subdomains:
http://www.sitepoint.com/forums/showthread.php?394472-Setting-up-Subdomains-on-Localhost-Apache
Each site can live in its own folder; just keep in mind that includes and links should be based on the root of the website, not the root of the web server.
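As a sketch of what that can look like with name-based virtual hosts (the host names and paths below are only examples for a Windows/Apache 2.2 setup like yours, and each .local name also needs an entry in C:\Windows\System32\drivers\etc\hosts pointing at 127.0.0.1), every project folder under C:\websites\ becomes its own local site, so nothing ever has to be copied into htdocs:

# conf/extra/httpd-vhosts.conf
NameVirtualHost *:80

<Directory "C:/websites">
    Order allow,deny
    Allow from all
</Directory>

<VirtualHost *:80>
    ServerName website-one.local
    DocumentRoot "C:/websites/website_one"
</VirtualHost>

<VirtualHost *:80>
    ServerName website-two.local
    DocumentRoot "C:/websites/website_two"
</VirtualHost>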
We have a custom PHP/MySQL CMS running on Linux/Apache that's rolled out to multiple sites (20+) on the same server. Each site uses exactly the same CMS files, with a few files per site being customised.
The customised files for each site are:
/library/mysql_connect.php
/public_html/css/*
/public_html/ftparea/*
/public_html/images/*
There are also a couple of other random files inside /public_html/includes/ that are unique to each site. Other than this, each site on the server uses the exact same files, with each site sitting within /home/username/.

There is obviously a massive amount of replication here, as each time we want to deploy a system update we need to apply it to every user account. Given that the common site files are all stored in SVN, it would make far more sense if we were able to simply commit to SVN and deploy to a single location directly from there. Unfortunately, making a major architecture change at this stage could be problematic.

In my mind the ideal scenario would mean creating an account like /home/commonfiles/ and having each site use these common files unless an account-specific file exists: for example, a request is made for /home/user/public_html/index.php, but as this file doesn't exist the request is then redirected to /home/commonfiles/public_html/index.php. I know that this approach is generally possible, similar to how Zend Framework (and probably others) redirects all requests that don't match a specific file to index.php. I'm just not sure how exactly to go about implementing it, or whether it's actually advisable. I would really welcome any input/ideas people have.
EDIT: AllenJB's comment reminded me that we have previously explored AliasMatch as a potential solution to this; we ended up with a general.conf file for a user that looked something like this:
php_admin_value open_basedir "/home/commonfi:/home/usertes:/usr/lib/php:/usr/local/lib/php:/tmp"
php_admin_value include_path "/home/commonfi"
AliasMatch (.*).php /home/commonfi/public_html/$1.php
AliasMatch (.*).html /home/commonfi/public_html/$1.html
You can set this up via the Apache configuration - you probably want Alias, but there are several options:
http://httpd.apache.org/docs/2.2/urlmapping.html
You certainly can build a "cascading" system as you describe (load local file, if that doesn't exist, load global file). The complexity is that the files are loaded in different ways (using include() in PHP, through the web, ... maybe even more ways?)
Filesystem includes
If the includes/ consist of files containing one PHP class each, you could use an autoloader like Zend Framework does. The autoloader would look first for a custom version of the include file, and if it doesn't find one, include the global version instead. I happen to have such an autoloader handy if you need code to start with.
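A rough sketch of such an autoloader (the LOCAL_DIRECTORY / GLOBAL_DIRECTORY constants and the class-name-to-file-name mapping are assumptions - adapt them to your naming convention): it checks the site-specific tree first and falls back to the shared one.

<?php
// Registered once in each site's bootstrap file.
// Assumes a class such as Foo_Bar lives in Foo/Bar.php under one of the two trees.
spl_autoload_register(function ($class) {
    $relative = str_replace(array('_', '\\'), '/', $class) . '.php';

    foreach (array(LOCAL_DIRECTORY, GLOBAL_DIRECTORY) as $base) {
        $path = $base . '/' . $relative;
        if (file_exists($path)) {
            require $path;
            return;
        }
    }
});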
If the includes don't match the one-class-per-file structure, you would have to build a custom include() function that fetches the local version of the file or, failing that, the global one.
Pseudo-code:
// Returns the path to the site-specific copy if one exists, otherwise the
// global copy, or false if the file exists in neither tree.
function fetch_path($name)
{
    if (file_exists(LOCAL_DIRECTORY . "/$name")) {
        return LOCAL_DIRECTORY . "/$name";
    }
    if (file_exists(GLOBAL_DIRECTORY . "/$name")) {
        return GLOBAL_DIRECTORY . "/$name";
    }
    return false;
}
Web resources
The second part is the web part (i.e. web URLs that serve either local or global files). I think this should be fairly easy to solve using the -f test in an .htaccess file. You would build a rule that rewrites failed requests (!-f) for local web resources (e.g. example.com/css/main_stylesheet.css) to the global copy (/home/commonfiles/public_html/css/main_stylesheet.css). You would need to fiddle around with Apache's server config to be able to rewrite local requests to the commonfiles directory, but it should be possible.
That is maybe worth a separate question.
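As a rough starting point, a sketch of that setup (the Alias name and the paths are assumptions about the layout described above; the shared directory also needs to be allowed in the main Apache config): anything that doesn't exist in the site's own public_html falls through to the shared copy.

# In the site's vhost config: expose the shared tree under an internal URL path.
Alias /shared /home/commonfiles/public_html

# .htaccess in /home/user/public_html/
RewriteEngine On
# If the requested file does not exist locally...
RewriteCond %{REQUEST_FILENAME} !-f
# ...serve the copy from the shared tree instead.
RewriteRule ^(.*)$ /shared/$1 [L]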
I'm running the same php script on many domains on an apache2 server. Only some of the files change between domains, and the rest are always the same. Right now, every time I set up a new domain, I copy all the files, but I'd really like to have these common files in one place so any changes would affect all domains.
I've thought of using a bunch of symlinks to point at common files. Is this an ok approach, or are there some simple edits I can make to the php scripts or apache configuration files to make this more efficient?
Thanks!
The way I do this kind of thing is to create a "common" directory, where I place all the files that can be shared between the sites. Then I simply include them wherever they are needed.
This works well because it lets you add features across multiple sites at once.
I'd suggest abstracting the common code into a set of 'library' scripts, placing these in a common directory, and making that directory available by modifying PHP's include_path setting. This means you most likely won't have to modify your current scripts, while still removing the need to keep more than one copy.
This path could (and probably should) be outside of your public directories. This enhances the security of your websites by not making them directly available to outside users.
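For example, a minimal sketch of the include_path approach (the /home/shared/lib path is just a placeholder): set it in php.ini or the vhost, or prepend it at runtime, and plain include/require calls then find the shared files without any changes to the scripts themselves.

; php.ini or the domain's vhost configuration
include_path = ".:/home/shared/lib"

<?php
// Or at runtime, e.g. at the top of each site's front controller:
set_include_path('/home/shared/lib' . PATH_SEPARATOR . get_include_path());

// Existing includes keep working; PHP now also searches the shared directory.
require_once 'database.php';   // resolved from /home/shared/lib/database.php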
This can be a bit tricky, as the application almost needs to know you're doing this. IME, it works best when you can divide the app into common code and instance code in two separate directory trees. The common code also needs to not do anything silly like include a file that has to be in the instance tree.
A single point of entry for loading the common code is also a big bonus, because then you can chain a few very small files: the instance code includes one file in its own directory; that file includes a file outside the instance code; and that file then either loads the entry-point file for the common code, or loads another file that does. This is only one way to do it, but it means you have just one file that needs to know where the common code is (so you can move it with minimal effort if you have to), and if you do it right, all the various instance code trees load it, albeit indirectly.
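A sketch of that chain (all file names and paths here are invented for illustration): each instance only knows about its own bootstrap, and exactly one file in the whole setup knows where the shared code really lives.

/home/sites/site_a/public_html/index.php   (instance entry point)
/home/sites/site_a/bootstrap.php           (tiny instance bootstrap)
/home/shared/loader.php                    (the one file that knows the real location)
/home/shared/common/entry.php              (entry point of the common code)

<?php
// site_a/public_html/index.php
require __DIR__ . '/../bootstrap.php';

<?php
// site_a/bootstrap.php - points one step outside the instance tree.
require '/home/shared/loader.php';

<?php
// /home/shared/loader.php - the only place the common code's location is written down.
require __DIR__ . '/common/entry.php';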
You could have a library directory that sits above all of your sites, and a config file that states which library files your sites should include by default. You can then have another config file within each site that overrides the global config. These config files can be used to generate include('../../lib/*.php') statements to build the basic function toolkit needed for each site (a small sketch of this follows the layout below).
some_high_level_directory/
-> lib/
->*.php (library files)
-> config.php (global library includes)
-> site_1/
-> config.php (library includes that only relate to site_1)
-> www/
-> site_2/
-> config.php (library includes that only relate to site_2)
-> www/
-> etc, etc
Hopefully that makes sense... :)
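A possible sketch of how those config files could work (the array format and the little loader are made up here, not taken from any framework): the global config lists the default library files, the per-site config adds its own, and a small loader merges the two and includes everything.

<?php
// some_high_level_directory/config.php - the global library includes
return array('db.php', 'html_helpers.php', 'forms.php');

<?php
// site_1/config.php - library includes that only relate to site_1
return array('payments.php');

<?php
// Included at the top of each page under site_1/www/:
$libDir = __DIR__ . '/../../lib';
$global = include __DIR__ . '/../../config.php';
$local  = include __DIR__ . '/../config.php';

foreach (array_unique(array_merge($global, $local)) as $file) {
    require_once $libDir . '/' . $file;
}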