Are symbolic links harmful for understanding and maintaining code? - php

I use symbolic links in my web project. There is a source folder and an additional folder for an email task which is executed by a service.
Both the website and the task are written in PHP and use my util.php, sql_functions.php and config.php files.
Rather than hardcoding their locations, I created symbolic links to these three utility files (and some directories) so that they are available from each of my subdirectories. The code works.
I also created a batch file which automatically creates these links and documented the installation procedure.
In the folder that contains my email task, sql_functions.php loads the configuration data and is used by email.php to access the SQL Server. The symlink to the sql folder also helps.
All of these folders are in the same repository, so there is no real risk that any dependencies might not be loaded.
It just feels like dirty programming.

Although symbolic links by themselves shouldn't be harmful to understanding and maintaining a codebase, your case is kind of brutal. Putting a symlink in every one of your subdirectories looks like overkill. There are more options than just choosing between hardcoded paths and such a violent solution.
For example, you could set the path of these files in a constant that is loaded at the beginning of each file through a require(), among other solutions. In the end, it depends on the way your website works, but I doubt your solution is the most flexible one you could come up with.
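For illustration, here is a minimal sketch of that constant-based approach; the file and directory names are assumptions, not taken from the question:

<?php
// bootstrap.php at the project root - defines the library path once
define('LIB_PATH', __DIR__ . '/lib');

require_once LIB_PATH . '/config.php';
require_once LIB_PATH . '/sql_functions.php';
require_once LIB_PATH . '/util.php';

Each entry script (the website's index.php, the email task) then starts with a single relative require __DIR__ . '/../bootstrap.php'; instead of relying on a symlinked copy of every utility file.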

Related

Why have a "build/" folder with a PHP project and phing

What is the benefit of having a "build/" folder where all the sources will be placed and "built"?
Maybe it's a silly question, but I'm trying to understand Continuous Integration with PHP. Every example of a build.xml for phing uses such a build/ folder, but what's the sense in that for PHP, where a checked-out project doesn't require compilation, only basic configuration? Copying it all into build/ will just complicate things, because you'll have doubled files and one more folder in the web root path (if you'd like to have a web UI to run Selenium tests on).
Particularly I need phing for two cases:
1) let new user setup his first installation (or update old), right on a working copy
2) run unit/func-tests, phpcc, phpcs, phpdoc etc (all that usually on CI server)
Should I have "build/" for the second task? What is the best practice for PHP?
There are several good reasons to have a build directory (e.g., deployment to multiple environments, performing text replacements, minimizing and combining CSS and JS, optimizing images, handling config files, etc.).
However, these may not apply to your use cases. There is no rule saying you need this directory. Depending on your stance on testing in production, though, a build directory may still be worth keeping.
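To make one of those reasons concrete, here is a tiny sketch of a build task that a plain checkout cannot give you: combining CSS files into a single artifact (the paths are assumptions):

<?php
// combine_css.php - a sketch of one build/ task: concatenate all CSS
// files from the source tree into a single file under build/
if (!is_dir('build/css')) {
    mkdir('build/css', 0777, true);
}
$combined = '';
foreach (glob('src/css/*.css') as $file) {
    $combined .= file_get_contents($file) . "\n";
}
file_put_contents('build/css/all.css', $combined);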

Consolidate multiple site files into single location

We have a custom PHP/MySQL CMS running on Linux/Apache that's rolled out to multiple sites (20+) on the same server. Each site uses exactly the same CMS files, with a few files for each site being customised.
The customised files for each site are:
/library/mysql_connect.php
/public_html/css/*
/public_html/ftparea/*
/public_html/images/*
There's also a couple of other random files inside /public_html/includes/ that are unique to each site. Other than this, each site on the server uses exactly the same files. Each site sits within /home/username/.

There is obviously a massive amount of replication here, as each time we want to deploy a system update we need to update each user account. Given the common site files are all stored in SVN, it would make far more sense if we were able to simply commit to SVN and deploy to a single location directly from there. Unfortunately, making a major architecture change at this stage could be problematic.

In my mind the ideal scenario would mean creating an account like /home/commonfiles/ and each site using these common files unless an account-specific file exists. For example, a request is made to /home/user/public_html/index.php, but as this file doesn't exist the request is then redirected to /home/commonfiles/public_html/index.php. I know that this approach is generally possible, similar to how Zend Framework (and probably others) redirect all requests that don't match a specific file to index.php. I'm just not sure how exactly to go about implementing it and whether it's actually advisable. Would really welcome any input/ideas people have got.
EDIT: AllenJB's comment reminded me that we have previously explored AliasMatch as a potential solution to this; we ended up with a general.conf file for a user that looked something like this:
php_admin_value open_basedir "/home/commonfi:/home/usertes:/usr/lib/php:/usr/local/lib/php:/tmp"
php_admin_value include_path "/home/commonfi"
AliasMatch (.*).php /home/commonfi/public_html/$1.php
AliasMatch (.*).html /home/commonfi/public_html/$1.html
You can set this up via the Apache configuration - you probably want Alias, but there are several options:
http://httpd.apache.org/docs/2.2/urlmapping.html
You certainly can build a "cascading" system as you describe (load the local file; if that doesn't exist, load the global file). The complexity is that the files are loaded in different ways (using include() in PHP, through the web, and maybe more).
Filesystem includes
If the includes/ directory consists of files containing one PHP class each, you could use an autoloader like Zend Framework does. The autoloader would look first for a custom version of the include file and, if it doesn't find one, include the global version instead. I happen to have such an autoloader handy if you need code to start with.
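For reference, a minimal sketch of such a cascading autoloader; the two directory constants are assumptions and should match your site-specific and common trees:

<?php
// look for a class file in the site-specific directory first,
// then fall back to the shared one (both paths are assumptions)
define('LOCAL_CLASSES', '/home/user/public_html/includes');
define('GLOBAL_CLASSES', '/home/commonfiles/public_html/includes');

function cascading_autoload($class)
{
    foreach (array(LOCAL_CLASSES, GLOBAL_CLASSES) as $dir) {
        $file = $dir . '/' . $class . '.php';
        if (file_exists($file)) {
            require $file;
            return;
        }
    }
}
spl_autoload_register('cascading_autoload');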
If the includes don't match the one-class-per-file structure, you would have to build a custom include() function that fetches the local version of the file or, failing that, the global one.
Pseudo-code:
function fetch_path($name)
{
    // prefer the site-specific copy, fall back to the shared one
    if (file_exists(LOCAL_DIRECTORY."/$name")) return LOCAL_DIRECTORY."/$name";
    if (file_exists(GLOBAL_DIRECTORY."/$name")) return GLOBAL_DIRECTORY."/$name";
    return false;
}
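A call site would then look something like this (the included file name is illustrative):

$path = fetch_path('header.php');
if ($path !== false) {
    include $path;
}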
Web resources
The second part is the web part (i.e., web URLs to local or global files). I think this should be pretty easily solvable using the -f switch in a .htaccess file. You would build a rule that rewrites failed requests (!-f) for a local web resource (e.g. example.com/css/main_stylesheet.css) to the global one (/home/commonfiles/public_html/main_stylesheet.css). You would need to fiddle around with Apache's server config to be able to rewrite local requests to the commonfiles directory, but it should be possible.
That is maybe worth a separate question.

Is it right to save media files of a project into an SVN repository?

I know that I can save all my project's files into the repository, so that deploying a new version of the software becomes a simple svn export into the proper directory.
But I have a feeling that this is not the right way, and that for media files I should use some other deployment utility like rsync. But then there is also the problem of two-way sync: I like to keep a backup of the full project in some secure space (not only the live server).
So the main question is: what is the right way, and what is the right directory structure for a PHP web application?
Ehh, a complex one.
First of all, if you have the possibility, it's good to split the 'code' and 'web' parts, something like this:
web/
web/css/
web/upload
code/
code/lib/
code/actions
This gets PHP out of the web root. It's safer (an attacker will not be able to access your files by entering a URL in the browser). BUT - this requires appropriate application design (the Symfony framework, for example, gives you a similar layout).
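The design requirement boils down to a single entry script under web/ that pulls everything else from code/; a sketch, where the bootstrap file and run_application() are hypothetical names:

<?php
// web/index.php - the only PHP file reachable by URL; the whole
// code/ tree stays outside the web root
require dirname(__DIR__) . '/code/lib/bootstrap.php'; // hypothetical entry point
run_application(); // hypothetical dispatch function defined by the bootstrap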
Second thing - there's nothing wrong (in my opinion) with binary files inside an SVN repo. It all depends, though, on what files we are talking about. If it's not user-uploaded content - go ahead. The less complex the deployment, the less chance something goes wrong.
BTW: You can always exclude some folders' contents from SVN (via svn:ignore) so user files won't get mixed up with yours.
So one thing to keep in your design is to separate user-entered content from your own content (best to create special folders for user files and exclude them from SVN).
It's absolutely right to put media files into source control (whether svn or something else). It's probably a good idea to put media files somewhere separate from your .php files though.
Why have a two step deployment (svn up and then rsync or similar) when you can do it in one step?
It's not wrong to have your media files in your SVN repository, as your media's version is linked to the rest of your software's.
Besides, if you want backups of your SVN repository, you can use the svnsync command to have some other box mirror your "main" Subversion repository.
As for structure, the best practice mainly depends on the use case you are facing. You most likely want to organize your files by module and content type (keeping media separated from code, and so on).
It's fine to put the media files in svn too.
You could, for example, use svn:externals to bring the media files into your tree, so you only have one checkout to do, you keep full Subversion support, and you don't clutter your source code repository.
In my opinion, it depends on what type of media you are talking about. If it is something static, like images, JavaScript, CSS, and so forth - something that isn't temporary - then put it in Subversion. However, if it is something that will change, such as an ad, I don't think there is much reason to version it. Just set svn:ignore on the directory and upload the files manually with rsync, scp or FTP.

Codeigniter Shared Resources - Opinions Wanted

I run multiple websites all running off of a single installation of CodeIgniter on my server (separate application directories and a single system directory). This has been working fabulously and I don't see any reason to change it at this point.
I find myself writing library classes to extend/override CI all the time, and often, when I find a bug or improve efficiency, I have to go back to several websites to make the same adjustment, at the risk of a typo breaking one of them. Because of this, I have to change each file and then test each site for bugs.
I have been pondering a solution of using a single libraries directory in a central location and symlinking all of my websites to that central directory. Then when I make a file change it will immediately propagate to all of the downstream websites. It will still require that I test each one for errors, but I won't have to make the changes multiple times. Anything that is specific to a single website will either be a non-shared file (still in the linked directory just not used elsewhere) or can be put in a local helper.
Also, I keep separate 'system' directories by CI version so I can migrate my websites independently if necessary; this central libraries directory would be attached to a specific version to reduce possible breaks.
Does anyone see potential issues or pitfalls from taking this approach? Has anyone accomplished this in another direction that I should consider?
Thanks in advance!
I think this actually makes sense :] Go for it. Even on the official CodeIgniter page, they mention it's possible.
Also, I don't see a single reason why there should be any problem.
Edit: they touch the problem of multiple sites here: http://codeigniter.com/user_guide/general/managing_apps.html
also:
http://codeigniter.com/wiki/Multiple_Applications/
http://www.exclusivetutorials.com/setting-multiple-websites-in-codeigniter-installation/
How to Handle Multiple Projects in CodeIgniter?
http://codeigniter.com/forums/viewthread/56436/
I have a single system directory and separate application directories for my CI apps. In order to share libraries and some view templates between my apps, I have created a "Common" directory in the same folder as the CI system, with the same structure as a regular app folder, and used symlinks; alternatively, you can modify the Loader class so that it looks in the Common folder too. My setup looks something like this:
/var/CodeIgniter/
/var/Common/
/var/Common/config/
/var/Common/controllers/
...
/var/Common/libraries/
...
/var/www/someapp/
/var/www/someotherapp/
...
I'm not sure how you handle publishing your sites (assuming you actually do any of that), but I'd look into version control. For example, in SVN you can make an external to another SVN directory (or file) and then just update the current SVN directory, which grabs the external file. This approach gains one benefit over the others: when you modify the common library, the other sites aren't immediately affected. This prevents unwanted breaks before you have time to go test all the sites using the common library. You can then just update each site's folder whenever you are ready to test the changes. This is "more work", but it prevents code duplication AND unwanted breaks.
I wrote a MY_Loader to do exactly that.
http://ellislab.com/forums/viewthread/136321/
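As a starting point, here is a minimal sketch of what such a MY_Loader can look like under CodeIgniter 2.x, using the documented add_package_path() hook; the /var/Common path follows the layout shown above and is an assumption:

<?php
// application/core/MY_Loader.php (CI 2.x)
class MY_Loader extends CI_Loader {

    public function __construct()
    {
        parent::__construct();
        // add_package_path() registers an extra root that the loader
        // searches for libraries, models, helpers, views and config files
        $this->add_package_path('/var/Common/');
    }
}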

What is the best way to set up shared php script files on a webserver?

I'm running the same php script on many domains on an apache2 server. Only some of the files change between domains, and the rest are always the same. Right now, every time I set up a new domain, I copy all the files, but I'd really like to have these common files in one place so any changes would affect all domains.
I've thought of using a bunch of symlinks to point at common files. Is this an ok approach, or are there some simple edits I can make to the php scripts or apache configuration files to make this more efficient?
Thanks!
The way I do this kind of thing is to create a "common" directory, where I place all the files that can be shared between the sites. Then I simply include them wherever they are needed.
This works pretty well because it allows you to add features across multiple sites.
I'd suggest abstracting the common code into a set of 'library' scripts, placing these in a common directory, and making that directory available by modifying PHP's include_path setting. This means you most likely won't have to modify your current scripts, while still removing the need to have more than one copy.
This path could (and probably should) be outside of your public directories. This enhances the security of your websites by not making them directly available to outside users.
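A sketch of that include_path change, done at runtime in a small bootstrap file (the library path is an assumption; the same value can also go into php.ini or a php_value directive):

<?php
// prepend the shared library directory to PHP's include_path
set_include_path('/home/shared/lib' . PATH_SEPARATOR . get_include_path());

// existing scripts keep working unchanged; bare filenames now also
// resolve against /home/shared/lib
require_once 'db_functions.php'; // hypothetical shared library file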
This can be a bit tricky, as the application almost needs to know you're doing this. IME, it works best when you can divide the app into common code and instance code in two separate directory trees. The common code also needs to not do anything silly like include a file that has to be in the instance tree.
A single point of entry to load the common code is also a big bonus, because then you can chain a few very small files: the instance code includes one in its own directory; that file includes a file outside the instance code; that file then either loads the entry-point file for the common code, or loads another that does. This is only one way to do it, but it means you have just one file that needs to know where the common code is (so you can move the common code with minimal effort if you have to), and if you do it right, all the various instance code trees load it, albeit indirectly.
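A sketch of that chain, with hypothetical file names; only common_link.php knows the real location of the common code:

<?php
// instance/index.php - instance code only references its own directory
require __DIR__ . '/common_link.php';

// --- instance/common_link.php ---
// the ONE file that knows where the common code lives; moving the
// common tree means editing this line only
require '/srv/common/bootstrap.php';

// --- /srv/common/bootstrap.php ---
// entry point for the shared code; loads whatever the common tree needs
require __DIR__ . '/lib/functions.php';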
You could have a library directory that sits above all of your sites, and a config file that states which library files your sites should include by default. You can then have another config file within each site that overrides the global config. These config files can be used to generate include('../../lib/*.php') statements to build the basic function toolkit needed for each site (a sketch follows the layout below).
some_high_level_directory/
-> lib/
   -> *.php (library files)
   -> config.php (global library includes)
-> site_1/
   -> config.php (library includes that only relate to site_1)
   -> www/
-> site_2/
   -> config.php (library includes that only relate to site_2)
   -> www/
-> etc, etc
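As a sketch of how the two config files could combine, assuming the global lib/config.php simply returns an array of filenames (all names are illustrative):

<?php
// site_1/config.php - merge the global include list with site extras
$includes = include __DIR__ . '/../lib/config.php'; // e.g. returns array('util.php', 'db.php')
$includes[] = 'site_1_helpers.php';                 // hypothetical site-only library

foreach ($includes as $file) {
    require_once __DIR__ . '/../lib/' . $file;
}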
Hopefully that makes sense... :)
