UNIX wrapper in PHP

For a simple web application, I'd like to be able to take advantage of several UNIX features that are stable, have a long history, and are well proven in production, rather than having to write my own code. Take users, for example. Rather than having an entire user, group, permission infrastructure in my webapp, I'd like to be able to simply piggyback on top of the equivalent features of UNIX.
Is there a PHP library that will allow me to register users, log them in, manage permissions, etc.?

It's really not a good idea to fumble around with the user and permission settings of the actual system that is hosting your site. If you want to protect individual directories of your site, you're better off using .htaccess files. If, on the other hand, you're working with virtual URLs, you'll have a hard time mapping the UNIX directory permissions to them anyway.

Based on your comment to deceze's answer, are you looking for something like PHP's filesystem functions?
Then there are system() and its related functions, which give access to Unix commands, but I'd recommend other ways of doing things if possible.
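If you do shell out, escaping arguments is essential. A small, hedged example (the account name is hypothetical; prefer the PHP-native filesystem functions wherever they cover your need):

    <?php
    // Run a Unix command from PHP: escape every argument.
    $user   = escapeshellarg('alice');          // hypothetical account name
    $output = array();
    exec("id $user 2>&1", $output, $status);    // runs as the web-server user
    echo $status === 0 ? implode("\n", $output) : 'lookup failed';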
Edit: In response to the comments about needing user and group functionality:
Firstly, in case your plan is to let web users have access to the whole file system (or even just their regular login directories), I just want to advise against that - lots of security concerns (e.g. if someone else gets into a user's account, they could delete everything to which they have access).
The little experience I have with handling users in PHP was in some beginner-level training. In that class, we kept our users in an SQL database and used sessions. We didn't use SSL, but I'd advise some sort of crypto when passing passwords around.
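A minimal sketch of that database-plus-sessions approach (the users table and column names are assumptions for illustration; password_hash()/password_verify() need PHP 5.5+):

    <?php
    session_start();

    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
    $stmt = $pdo->prepare('SELECT id, password_hash FROM users WHERE username = ?');
    $stmt->execute(array($_POST['username']));
    $row  = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row && password_verify($_POST['password'], $row['password_hash'])) {
        session_regenerate_id(true);        // defend against session fixation
        $_SESSION['user_id'] = $row['id'];
    } else {
        http_response_code(401);
        echo 'Invalid credentials';
    }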
If you're using Apache, it can handle the authentication for you. Other server software can probably do the same, but Apache is all I've ever worked with. I can't address whether Apache can handle sessions itself and don't have the time to research it right now.

If PHP or your web server is running with root rights, it should be no problem to use these functions.
For security reasons, though, I would strongly recommend reimplementing these things or using an existing PHP library instead!

It seems there are standard functions for interfacing with Kerberos or RADIUS in PHP.
These both have a long history and are well proven in production, while being separate from the system users.
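A rough sketch against the PECL radius extension, assuming it is installed (the server name, port and shared secret are placeholders; double-check the exact function signatures in the extension's documentation):

    <?php
    $r = radius_auth_open();
    radius_add_server($r, 'radius.example.com', 1812, 'shared-secret', 3, 3);
    radius_create_request($r, RADIUS_ACCESS_REQUEST);
    radius_put_attr($r, RADIUS_USER_NAME, $username);
    radius_put_attr($r, RADIUS_USER_PASSWORD, $password);

    switch (radius_send_request($r)) {
        case RADIUS_ACCESS_ACCEPT: echo 'authenticated'; break;
        case RADIUS_ACCESS_REJECT: echo 'rejected';      break;
        default:                   echo 'error: ' . radius_strerror($r);
    }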

Related

Any PHP framework to use SECURELY on shared hosting?

Are there PHP frameworks that would allow me to generate an application and then use it SECURELY on shared hosting, as far as shared hosting security can be achieved? By this I mean, for example, not requiring any app/tmp directory with 777 access.
Not Symfony -> http://trac.symfony-project.org/wiki/SharedHostingNotSecure
Not CakePHP -> http://book.cakephp.org/view/911/Permissions
CodeIgniter -> "If you're a developer who lives in the real world of shared hosting accounts and clients with deadlines..." - looks promising, maybe this one? But I couldn't find anything specific to shared hosting file permissions in the documentation
Maybe Zend Framework? (I am not sure it falls into the same category of PHP framework, but it looks like it does.)
Are there any existing frameworks that can be used SECURELY on shared hosting?
Having to declare directories with 777 permissions is only a problem on cheap, entry-level shared hosting systems. It's common to see the safe_mode hack and open_basedir restrictions in that area, which only prevent access via PHP but not via other CGI interpreters.
Contemporary server setups use suexec/suPHP, where every PHP script runs under the owning account's permissions. Therefore you don't need any world-writable directories, and most PHP applications should at least be secure against cross-account tampering. The framework itself doesn't make a difference here.
You're looking at the right problem from the wrong point of view.
Having a directory with the rights 777 is not insecure per se.
Having a 777'ed directory on shared hosting is insecure, because the HTTP daemon runs under the same system account for all clients.
It is an intrinsic "feature" of shared hosting; that's why it's the cheapest option. It isn't cheap for nothing: it's cheap at the price of security.
If security is that important to you, buy a VPS. Nowadays VPSes are cheap enough.
I suggest Zend Framework. It doesn't require any file permissions as far as I know, just some proper configuration. And yes, it is a PHP framework. My entire library is owned by root, with read permissions for everyone, and it works fine. I've never needed to chmod anything. When it comes to libraries, you can always define your own tmp path if one is needed by a class.
Most successful attacks happen because the user/admin sticks with the defaults, which are well known. See Windows attacks. It's unbelievable how many "administrators" keep not only default URLs and default directory structures but also default user IDs and passwords. On my website I repeatedly see login attempts at /wp-login.php with ID:PW of admin:admin. I don't even know whether those are real defaults, and my site isn't even WordPress, but it seems those attackers will get lucky every once in a while.
Security, I believe, is always about raising the bar and making attacks more difficult. There is never a 100% guarantee of security. I think you can choose whatever framework or application you like, but it is your job to make things harder by changing the defaults.
I can only speak for ZF, where you can do some freaky configuration nobody will ever guess; unless, that is, your application has occasional errors and you show your error messages with full information.

Best methods to clean up a hacked site with no clean version available?

I have been asked to fix a hacked site that was built using osCommerce on a production server.
The site has always existed on the remote host. There is no offline clean version. Let's forget how stupid this is for a moment and deal with what it is.
It has been hacked multiple times and another person fixed it by removing the web shell files/upload scripts.
It continues to be hacked frequently.
What can I do?
Because you cannot trust anything on the web host (it might have had a rootkit installed), the safest approach is to rebuild a new web server from scratch; don't forget to update all the external-facing software before bringing it online. Do all the updating on the happy side of a draconian firewall.
When you rebuild the system, be sure to pay special attention to proper configuration. If the web content is owned by a different Unix user than the web server's userid and the permissions on the files are set to forbid writing, then the web server cannot modify the program files.
Configure your web server's Unix user account so it has write access to only its log files and database sockets, if they are in the filesystem. A hacked web server could still serve hacked pages to clients, but a restart would 'undo' the 'live hack'. Of course, your database contents could be sent to the Yakuza or corrupted by people who think your data should include pictures of unicorns. The Principle of Least Privilege will be a good guideline -- what, exactly, does your web server need to access in order to do its job? Grant only that.
Also consider deploying a mandatory access control system such as AppArmor, SELinux, TOMOYO, or SMACK. Any of these systems, properly configured, can control the scope of what can be damaged or leaked when a system is hacked. (I've worked on AppArmor for ten years, and I'm confident most system administrators can learn how to deploy a workable security policy on their systems in a day or two of study. No tool is applicable to all situations, so be sure to read about all of your choices.)
The second time around, be sure to keep your configuration managed with tools such as Puppet or Chef, or at the very least keep it in a revision control system.
Update
Something else, a little unrelated to coming back online, but potentially educational all the same: save the hard drive from the compromised system, so you can mount it and inspect its contents from another system. Maybe there's something that can be learned by doing forensics on the compromised data: you might find that the compromise happened months earlier and had been stealing passwords or ssh keys. You might find a rootkit or further exploit tools. You might find information to show the source of the attack -- perhaps the admin of that site doesn't yet realize they've been hacked.
Be careful when inspecting hacked data -- that .jpg you don't recognize might very well be the exploit that cracked the system in the first place, and viewing it on a 'known good' system might crack it, too. Do the work with a hard drive you can format when you're done. (A virtualized system or one with a mandatory access control system might be sufficient to confine "passive" data-based hacks, but there's nothing quite like throwaway systems for peace of mind.)
Obtain a fresh copy of the osCommerce version the site was built with, and do a diff between the new fresh osCommerce and the hacked site. Also check for files which exist on the server but not in the osCommerce package.
By manually comparing the differences, you can track down all possible places the hack may have created or modified scripts.
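A rough sketch of that comparison in PHP (both paths are hypothetical; a plain recursive diff from the command line works just as well):

    <?php
    // Hash every file under a directory, keyed by its relative path.
    function list_files($base) {
        $files = array();
        $iter  = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($base));
        foreach ($iter as $f) {
            if ($f->isFile()) {
                $rel = substr($f->getPathname(), strlen($base) + 1);
                $files[$rel] = md5_file($f->getPathname());
            }
        }
        return $files;
    }

    $clean = list_files('/path/to/fresh-oscommerce');   // pristine download
    $live  = list_files('/path/to/hacked-site');        // copy of the server files

    foreach ($live as $path => $hash) {
        if (!isset($clean[$path])) {
            echo "EXTRA (not in the stock package): $path\n";
        } elseif ($clean[$path] !== $hash) {
            echo "MODIFIED: $path\n";
        }
    }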
I know this is a little late in the day to be offering this solution, but the official fix from osCommerce development is here:
http://library.oscommerce.com/confluence/display/OSCOM23/(A)+(SEC)+Administration+Tool+Log-In+Update
Once those code changes are applied, most of the actual work is in cleaning up the website. The admin login bypass exploit is most likely what allowed attackers to upload files (usually via the file manager) into writable directories, often the images directory.
There are other files that are often writable too and can have malicious code appended to them. cookie_usage.php and includes/languages/english/cookie_usage.php are the usual files affected, although on some server configurations all site files can be susceptible.
Even though the official osCommerce fix is linked above, I would suggest making one further change: on that page, scroll down until you see the link that says "Update PHP_SELF Value" and make those changes as well.
This will correct the way $PHP_SELF reports and prevent attackers from using malformed URLs in attempts to bypass the admin login.
I also suggest that you add .htaccess basic authentication to the admin directory.
Also check out an add-on I authored called osC_Sec, an all-in-one security fix which, while it works on most PHP-backed web systems, is specifically designed to deal with the issues that exist in older versions of osCommerce.
http://addons.oscommerce.com/info/8283

Best login management solution without database in PHP

I am seeking a simple solution to manage user logins (username & password) for a web page on a private network (localhost), without setting up MySQL or any such database system. I am running Apache with PHP. It would be helpful if anybody could give a PHP demo.
Depends on your web server. Assuming Apache, htpasswd for basic auth could be all you need.
Authentication, Authorization and Access Control gives the information you need to get started.
If you are using localhost, then wouldn't everyone have access to the filesystem?
I think the simplest solution is .htaccess though.
There are other solutions like that, but I can't find nicer links right now:
http://webscripts.softpedia.com/script/Authentication/AuthMan-Free-42216.html
http://www.phpkode.com./scripts/item/passwdauth/
http://www.hotscripts.com/listing/htaccess-manager-lite/
http://www.hotscripts.com/listing/needlock-access-management-system/
http://www.hotscripts.com/listing/dirlock/ (Ooops. Too many hotscripts links. So just saying: not affiliated with that. :] And certainly not endorsing it!)
Anyway. These scripts store usernames and passwords in a .htpasswd file, which can be used independently of the PHP script. .htaccess and .htpasswd are a common authentication scheme for Apache web servers. See this tutorial: http://www.javascriptkit.com/howto/htaccess3.shtml or the Apache manual: http://httpd.apache.org/docs/2.0/howto/auth.html
You could use the class itself for authentication, but it doesn't look very current. The login management tool might be useful to you. (But again: there are probably easier ones to find.)
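If you would rather check credentials from PHP itself, here is a minimal sketch that validates a login against an .htpasswd file, assuming its entries use crypt()-style hashes (Apache's default APR1-MD5 format would need extra handling):

    <?php
    function htpasswd_check($file, $username, $password) {
        foreach (file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
            list($user, $hash) = explode(':', $line, 2);
            if ($user === $username) {
                // crypt() re-uses the stored hash as its own salt
                return crypt($password, $hash) === $hash;
            }
        }
        return false;
    }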
Have a look at www.mykonosphotographs.com/musas/
I'm still in the process of writing a download page, but I can email you the script if you're interested. Use the contact form on the site.
It's a major modification to a single-password system written by someone else, and I've added lots of extra functionality which you can use or not, as you like.
Currently free. It won't be as soon as I've got the downloads page sorted! ;-)
Edit - too late! The downloads page is now sorted.
New URL - http://www.myksoftware.com
Nope, unless you want to save all the information to files (like CSVs).
But this could be a security threat if anyone got access to the files (which is easier than gaining access to a database).
See the PHP docs for reading CSV files.
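For illustration only, a sketch of that CSV idea, assuming a hypothetical users.csv of "username,passwordhash" rows kept outside the web root, with hashes produced by password_hash() (PHP 5.5+):

    <?php
    function csv_login($csvPath, $username, $password) {
        $handle = fopen($csvPath, 'r');
        while (($row = fgetcsv($handle)) !== false) {
            if ($row[0] === $username && password_verify($password, $row[1])) {
                fclose($handle);
                return true;
            }
        }
        fclose($handle);
        return false;
    }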
You can hide almost anything by various means.
For example, if your webspace provider permits it, you can place files outside of your root directory. You can place files in a directory with some weird name like "jdf83usadj300p", with no link to them other than in PHP, which is parsed by the server-side software without the client-side software ever seeing it. You can place passwords in a config file, if you call it something like "config.php" and make the first line:
; <?php die(); ?>
Only an idiot would use the various JavaScript "protection" methods out there, like making the password the name of the next file. .htaccess can be cumbersome and can put people off; far better to handle it with server-side software.
If passwords are likely to be attacked, ALWAYS hash them. NEVER pass passwords from one page to another by the GET method unless already hashed; POST is more secure. MD5 hashing is adequate for many purposes (strictly speaking MD5 and SHA-256 are hashes, not encryption), with the functions built into PHP and many implementations freely available for JavaScript. SHA-256 is about the best there is for almost any purpose. I don't think the US DoD or the UK MOD would be satisfied with it, but your local sports club's website (or similar) isn't exactly going to be a high-profile target for hackers.
Incidentally, blasteralfred, I sent you an email update. Hope you got it. Good luck!

Methods for caching PHP objects to file?

In ASP.NET, I grew to love the Application and Cache stores. They're awesome. For the uninitiated, you can just throw your data-logic objects into them, and hey presto, you only need to query the database once for a bit of data.
By far one of the best ASP.NET features, IMO.
I've since ditched Windows for Linux, and with it moved to PHP, Python and Ruby for web dev. I use PHP most because I develop several open source projects, all using PHP.
Needless to say, I've explored what PHP has to offer in terms of caching data-objects. So far I've played with:
Serializing to file (a pretty slow/expensive process)
Writing the data to file as JSON/XML/plaintext/etc (even slower for read ops)
Writing the data to file as pure PHP (the fastest read, but quite a convoluted write op)
I should stress now that I'm looking for a solution that doesn't rely on a third party app (eg memcached) as the apps are installed in all sorts of scenarios, most of which don't have install rights (eg: a cheap shared hosting account).
So, back to what I'm doing now: is persisting to file secure? Rule 1 in production server security has always been to disable file-writing, but I really don't see how PHP could cache anything if it couldn't write. Are there any tips and/or tricks to boost security?
Is there another persist-to-file method that I'm forgetting?
Are there any better methods of caching in "limited" environments?
Serializing is quite safe and commonly used. There is an alternative, however, and that is to cache to memory. Check out memcached and APC; they're both free and highly performant. This article on different caching techniques in PHP might also be of interest.
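As a rough illustration of the serialize-to-file approach (the cache path and key names are assumptions; keep the directory outside the web root if you can):

    <?php
    function cache_set($dir, $key, $value) {
        // md5 of the key keeps the filename filesystem-safe
        file_put_contents($dir . '/' . md5($key) . '.cache', serialize($value), LOCK_EX);
    }

    function cache_get($dir, $key, $maxAgeSeconds) {
        $file = $dir . '/' . md5($key) . '.cache';
        if (!is_file($file) || time() - filemtime($file) > $maxAgeSeconds) {
            return false;                          // missing or stale
        }
        return unserialize(file_get_contents($file));
    }

    // Only hit the database when the cached copy is older than 5 minutes.
    $products = cache_get('/path/to/cache', 'products', 300);
    if ($products === false) {
        $products = load_products_from_db();       // hypothetical expensive query
        cache_set('/path/to/cache', 'products', $products);
    }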
Re: Is there another persist-to-file method that I'm forgetting?
It's of limited utility but if you have a particularly beefy database query you could write the serialized object back out to an indexed database table. You'd still have the overhead of a database query, but it would be a simple select as opposed to the beefy query.
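A rough sketch of that idea, assuming a hypothetical cache table such as CREATE TABLE cache (k VARCHAR(64) PRIMARY KEY, v MEDIUMBLOB, created INT):

    <?php
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $stmt = $pdo->prepare('SELECT v FROM cache WHERE k = ? AND created > ?');
    $stmt->execute(array('report_totals', time() - 600));    // 10-minute freshness window
    $row  = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row) {
        $totals = unserialize($row['v']);          // cheap indexed lookup
    } else {
        $totals = run_beefy_report_query($pdo);    // hypothetical expensive query
        $ins = $pdo->prepare('REPLACE INTO cache (k, v, created) VALUES (?, ?, ?)');
        $ins->execute(array('report_totals', serialize($totals), time()));
    }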
Re: "Is persisting to file secure?" and "cheap shared hosting account"
The sad fact is cheap shared hosting isn't secure. How much do you trust the 100, 500, or 1000 other people who have access to your server? For historic and (ironically) security reasons, shared hosting environments have PHP/Apache running as an unprivileged user (with PHP running as an Apache module). The security rationale here is that if the world-facing Apache process gets compromised, the exploiters only have access to an unprivileged account that can't mess with important system files.
The bad part is, that means whenever you write to a file using PHP, the owner of that file is the same unprivileged Apache user. This is true for every user on the system, which means anyone has read and write access to the files. The theoretical hackers in the above scenario would also have access to the files.
There's also a persistent bad practice in PHP of giving directories and files 777 permissions so the unprivileged Apache user can write files out, and then leaving them in that state. That gives anyone on the system read/write access.
Finally, you may think obscurity saves you. "There's no way they can know where my secret cache files are", but you'd be wrong. Shared hosting sets up users in the same group, and most default file masks will give your group users read permission on files you create. SSH into your shared hosting account sometime, navigate up a directory, and you can usually start browsing through other users files on the system. This can be used to sniff out writable files.
The solutions aren't pretty. Some hosts will offer a CGI Wrapper that lets you run PHP as a CGI. The benefit here is PHP will run as the owner of the script, which means it will run as you instead of the unprivileged user. Problem averted! New Problem! Traditional CGI is slow as molasses in February.
There is FastCGI, but FastCGI is finicky and requires constant tuning. Not many shared hosts offer it. If you find one that does, chances are they'll have APC enabled, and may even be able to provide a mechanism for memcached.
I had a similar problem, and thus wrote a solution: a memory cache written in PHP. It only requires the PHP build to support sockets. Other than that, it is a pure PHP solution and should run just fine on shared hosting.
http://code.google.com/p/php-object-cache/
What I always do if I have to be able to write is to ensure I'm not writing anywhere I have PHP code. Typically my directory structure looks something like this (it's varied between projects, but this is the general idea):
project/
    app/
    html/
        index.php
    data/
    cache/
app is not writable by the web server (neither is index.php, preferably). cache is writable and used for caching things such as parsed templates and objects. data is possibly writable, depending on need. That is, if the users upload data, it goes into data.
The web server gets pointed to project/html and whatever method is convenient is used to set up index.php as the script to run for every page in the project. You can use mod_rewrite in Apache, or content negotiation (my preference but often not possible), or whatever other method you like.
All your real code lives in app, which is not directly accessible by the web server, but should be added to the PHP path.
This has worked quite well for me for several projects. I've even been able to get, for instance, Wikimedia to work with a modified version of this structure.
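For completeness, a hypothetical front controller matching that layout (the file and function names are illustrative):

    <?php
    // html/index.php - everything under app/ is reachable via the include path,
    // but is never served directly by the web server.
    set_include_path(realpath(__DIR__ . '/../app') . PATH_SEPARATOR . get_include_path());
    require 'bootstrap.php';   // lives in app/, found through the include path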
Oh... and I'd use serialize()/unserialize() to do the caching, although generating PHP code has a certain appeal. All the templating engines I know of generate PHP code to execute, making post-parse very fast.
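And a sketch of the "cache as pure PHP" variant mentioned above: var_export() writes the data out as PHP source, and a later include() re-creates it (this works for arrays and scalars; objects would need __set_state()). The paths and the build_config() helper are assumptions:

    <?php
    function cache_as_php($file, array $data) {
        $php = '<?php return ' . var_export($data, true) . ';';
        file_put_contents($file, $php, LOCK_EX);
    }

    $file = '/path/to/cache/config.cache.php';
    if (is_file($file)) {
        $config = include $file;        // fast read, helped along by any opcode cache
    } else {
        $config = build_config();       // hypothetical slow build step
        cache_as_php($file, $config);
    }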
If you have access to the Database Query Cache (ie. MySQL) you could go with serializing your objects and storing them in the DB. The database will take care of holding the query results in memory so that should be pretty fast.
You don't spell out -why- you're trying to cache objects. Are you trying to speed up a slow database query, work around expensive object instantiation, avoid repeated generation of complex page, maintain application state or are you just compulsively storing away objects in case of a long winter?
The best solution, given the atrocious limitations of most low-cost shared hosting, is going to depend on what you're trying to accomplish. Going for bottom of the barrel shared-hosting means you have to accept that you won't be working with the best tools. The numbers are hard to quantify, but there's a trade off between hosting costs, site performance & developer time (ie - fast, cheap or easy).
It's in theory possible to store objects in sessions. That might get you past the file-writing-disabled problem. Additionally, you could store the sessions in a MySQL memory-backed table to speed up the queries.
Some hosting places may have APC compiled in. That would allow you to store the objects in memory.
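If APC is available, its user cache is only a couple of calls; a minimal sketch (the key, TTL and load_user_from_db() helper are placeholders):

    <?php
    $user = apc_fetch('user_42', $hit);
    if (!$hit) {
        $user = load_user_from_db(42);     // hypothetical expensive lookup
        apc_store('user_42', $user, 600);  // keep for 10 minutes
    }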

Securing DB and session-data on a PHP shared host

I wrote a PHP web-application using SQLite and sessions stored on filesystem.
This is functionally fine and attractively low-maintenance. But now it needs to run on a shared host.
All web-applications on the shared host run as the same user, so my users' session data is vulnerable, as is the database, code, etc.
Many recommend storing sessions in a DBMS such as MySQL in this situation. So at first I thought I would just do that, and move the SQLite data into MySQL too. But then I realized the MySQL credentials need to be readable by the web application user, so I'm back to square one.
I think the best solution is to use PHP as a CGI so it runs as a different user for each web application. This sounds great, but my host does not do this; it uses mod_php. Are there any drawbacks from an admin's point of view to enabling this (performance, backward compatibility, etc.)? If not, I will ask them to enable it.
Otherwise, is there anything I can do to secure my database and session data in this situation?
As long as your code is running as the shared web user, anything stored on the server is going to be vulnerable. Any other user could write a PHP script to examine any readable file on the server, including your data and PHP code.
If your hosting provider will allow it, running PHP as a CGI under a different user will help, but I expect there will be a significant performance hit, as each request will require a new process to be created. (You could look at FCGI as a better-performing alternative.)
The other approach would be to set a cookie based on something the user provides, and use that to encrypt session data. For instance, when the user logs in, take a hash of their username, password (as just supplied by them) and the current time, encrypt the session data with the hash, set a cookie containing the hash. On the next request, you'll get the cookie back, which you can then use to decrypt the session data. Note however that this will only protect the current session data; your user table, other data, and code will still be vulnerable.
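A hedged sketch of that cookie-keyed idea, assuming the OpenSSL extension (the function names here are illustrative, not from any library; only the encrypted blob stays on the server, and the key lives in the user's cookie):

    <?php
    function encrypt_session_data($plaintext, $keyHex) {
        $key = hex2bin($keyHex);                    // 32-byte key derived at login
        $iv  = openssl_random_pseudo_bytes(16);     // random IV per write
        $ct  = openssl_encrypt($plaintext, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
        return base64_encode($iv . $ct);            // store this server-side
    }

    function decrypt_session_data($blob, $keyHex) {
        $key = hex2bin($keyHex);
        $raw = base64_decode($blob);
        return openssl_decrypt(substr($raw, 16), 'aes-256-cbc', $key, OPENSSL_RAW_DATA, substr($raw, 0, 16));
    }

    // At login: derive the key from the just-supplied credentials plus the time,
    // hand it to the browser in a cookie, and keep only ciphertext on disk.
    $keyHex = hash('sha256', $username . $password . time());
    setcookie('sesskey', $keyHex, 0, '/', '', true, true);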
In this situation, you need to decide whether the tradeoff of the low cost of shared hosting is acceptable considering the reduced security it provides. This will depend on your application, and it may be that rather than trying to come up with a complex (and possibly not even very effective) way to add security, you're better off just accepting the risk.
I don't view security as all or nothing. There are steps you can take: give the web DB user only the permissions it needs, store passwords as hashes, and use OpenID login so users provide their credentials over SSL.
PHP as CGI can be slower, and some hosts may simply not want to support more than one environment.
You may need to stick with your host for some reason, but generally there are so many available that it is a good reminder for people to compare functionality and security as well as cost. I have noticed many companies starting to offer virtual machine hosting -- nearly dedicated server level security in terms of isolating your code from other users -- at what is to me reasonable cost.
A shared host is no way to run a web site if you are conscious about privacy and security of your data from the sites that you share the server with. Anything accessible to your web application is fair game for the others; it'll only be a matter of time before they can access it (assuming they do have incentive to do that to you).
"you can place your DB connection variables in a file below the web root. this will at least protect it from web access. if you're going to use file based sessions as well, you can set the session path in your user's directory and again outside the web root."
I don't have an account so I can't downvote that, but seriously, it is not even relevant to the question.
Duh, you store stuff outside the web root; that goes for any hosting scenario and is not specific to shared hosting. We're not talking about protecting from outsiders here. We're talking about protecting from other applications on the same machine.
To the OP: I think PHP as CGI is the most secure solution, as you already suggested yourself. But, as someone else said, there is a performance hit with this.
Something you might look at is moving your sessions and db to MySQL and using safe_mode and/or open_basedir.
I would solve the problem with an infrastructure change instead of a code one.
Consider upgrading to a VPS. Nowadays VPSes are very inexpensive; I've seen them starting at $10/mo.
