Methods for caching PHP objects to file? - php

In ASP.NET, I grew to love the Application and Cache stores. They're awesome. For the uninitiated, you can just throw your data-logic objects into them, and hey presto, you only need to query the database once for a bit of data.
By far one of the best ASP.NET features, IMO.
I've since ditched Windows for Linux, and therefore PHP, Python and Ruby for webdev. I use PHP most because I dev several open source projects, all using PHP.
Needless to say, I've explored what PHP has to offer in terms of caching data-objects. So far I've played with:
Serializing to file (a pretty slow/expensive process)
Writing the data to file as JSON/XML/plaintext/etc (even slower for read ops)
Writing the data to file as pure PHP (the fastest read, but quite a convoluted write op; sketched below)
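A rough sketch of what that third approach can look like (the helper names and the path are purely illustrative):
// var_export() emits a parsable PHP literal, so the read side is a plain include
// (which an opcode cache can accelerate). Note that var_export() on objects emits
// ClassName::__set_state(), which only round-trips if the class implements it;
// plain arrays are the safest payload.
function cache_write_php($file, $data) {
    $php = '<?php return ' . var_export($data, true) . ';';
    $tmp = $file . '.' . uniqid('', true);   // write to a temp file, then rename,
    file_put_contents($tmp, $php);           // so readers never see a half-written file
    rename($tmp, $file);
}
function cache_read_php($file) {
    return is_file($file) ? include $file : false;
}
cache_write_php('/tmp/settings.cache.php', array('site_name' => 'Example', 'per_page' => 20));
$settings = cache_read_php('/tmp/settings.cache.php');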
I should stress now that I'm looking for a solution that doesn't rely on a third-party app (e.g. memcached), as the apps are installed in all sorts of scenarios, most of which don't have install rights (e.g. a cheap shared hosting account).
So, back to what I'm doing now: is persisting to file secure? Rule 1 of production server security has always been to disable file writing, but I really don't see how PHP could cache anything if it couldn't write. Are there any tips and/or tricks to boost security?
Is there another persist-to-file method that I'm forgetting?
Are there any better methods of caching in "limited" environments?

Serializing is quite safe and commonly used. There is an alternative however, and that is to cache to memory. Check out memcached and APC, they're both free and highly performant. This article on different caching techniques in PHP might also be of interest.

Re: Is there another persist-to-file method that I'm forgetting?
It's of limited utility but if you have a particularly beefy database query you could write the serialized object back out to an indexed database table. You'd still have the overhead of a database query, but it would be a simple select as opposed to the beefy query.
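As a sketch, assuming PDO and an indexed cache table (the table and column names here are made up):
// Hypothetical table: object_cache (cache_key VARCHAR(64) PRIMARY KEY, payload MEDIUMBLOB, created_at INT)
// $pdo is assumed to be an existing PDO connection to MySQL.
function db_cache_set(PDO $pdo, $key, $object) {
    $stmt = $pdo->prepare('REPLACE INTO object_cache (cache_key, payload, created_at) VALUES (?, ?, ?)');
    $stmt->execute(array($key, serialize($object), time()));
}
function db_cache_get(PDO $pdo, $key, $maxAge = 300) {
    $stmt = $pdo->prepare('SELECT payload FROM object_cache WHERE cache_key = ? AND created_at > ?');
    $stmt->execute(array($key, time() - $maxAge));
    $payload = $stmt->fetchColumn();
    return $payload === false ? false : unserialize($payload);
}
The lookup is a primary-key select, so it stays cheap even when the query it replaces is not.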
Re: Is persisting to file secure? (and "a cheap shared hosting account")
The sad fact is that cheap shared hosting isn't secure. How much do you trust the 100, 500, or 1000 other people who have access to your server? For historical and (ironically) security reasons, shared hosting environments have PHP/Apache running as an unprivileged user (with PHP running as an Apache module). The security rationale is that if the world-facing Apache process gets compromised, the exploiters only have access to an unprivileged account that can't screw with important system files.
The bad part is that whenever you write a file from PHP, the owner of that file is that same unprivileged Apache user. This is true for every account on the system, which means anyone else's scripts running as that user have read and write access to your files. The theoretical hackers in the above scenario would also have access to them.
There's also a persistent bad practice in PHP of giving directories and files 777 permissions so the unprivileged Apache user can write to them, and then leaving them in that state. That gives everyone on the system read/write access.
Finally, you may think obscurity saves you. "There's no way they can know where my secret cache files are", but you'd be wrong. Shared hosting puts users in the same group, and most default umasks give other members of your group read permission on the files you create. SSH into your shared hosting account sometime, navigate up a directory, and you can usually start browsing through other users' files on the system. This can be used to sniff out writable files.
The solutions aren't pretty. Some hosts will offer a CGI Wrapper that lets you run PHP as a CGI. The benefit here is PHP will run as the owner of the script, which means it will run as you instead of the unprivileged user. Problem averted! New Problem! Traditional CGI is slow as molasses in February.
There is FastCGI, but FastCGI is finicky and requires constant tuning. Not many shared hosts offer it. If you find one that does, chances are they'll have APC enabled, and may even be able to provide a mechanism for memcached.

I had a similar problem, and thus wrote a solution: a memory cache written in PHP. It only requires the PHP build to support sockets. Other than that, it is a pure PHP solution and should run just fine on shared hosting.
http://code.google.com/p/php-object-cache/

What I always do if I have to be able to write is to ensure I'm not writing anywhere I have PHP code. Typically my directory structure looks something like this (it's varied between projects, but this is the general idea):
project/
    app/
    html/
        index.php
    data/
    cache/
app is not writable by the web server (nor is index.php, preferably). cache is writable and is used for caching things such as parsed templates and objects. data is possibly writable, depending on need; that is, if users upload data, it goes into data.
The web server gets pointed to project/html and whatever method is convenient is used to set up index.php as the script to run for every page in the project. You can use mod_rewrite in Apache, or content negotiation (my preference but often not possible), or whatever other method you like.
All your real code lives in app, which is not directly accessible by the web server, but should be added to the PHP path.
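For example, html/index.php can be little more than a bootstrap; the entry-point filename below is hypothetical, but the directory names follow the layout above:
// html/index.php - the only PHP file the web server can reach directly.
define('PROJECT_ROOT', dirname(dirname(__FILE__)));
// Make everything under app/ loadable without exposing it to the web server.
set_include_path(PROJECT_ROOT . '/app' . PATH_SEPARATOR . get_include_path());
require 'bootstrap.php';   // hypothetical entry point living in app/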
This has worked quite well for me for several projects. I've even been able to get, for instance, Wikimedia to work with a modified version of this structure.
Oh... and I'd use serialize()/unserialize() to do the caching, although generating PHP code has a certain appeal. All the templating engines I know of generate PHP code to execute, which makes rendering after the initial parse very fast.
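A serialize()-based cache against the cache/ directory above might look roughly like this (the helper names are illustrative, and PROJECT_ROOT is the constant from the sketch further up):
// Store values under project/cache, which the web server cannot serve directly.
function cache_put($name, $value) {
    file_put_contents(PROJECT_ROOT . '/cache/' . md5($name) . '.ser', serialize($value));
}
function cache_fetch($name, $maxAge = 600) {
    $file = PROJECT_ROOT . '/cache/' . md5($name) . '.ser';
    if (!is_file($file) || filemtime($file) < time() - $maxAge) {
        return false;   // missing or stale
    }
    return unserialize(file_get_contents($file));
}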

If you have access to the database query cache (e.g. MySQL's), you could serialize your objects and store them in the DB. The database will take care of holding the query results in memory, so that should be pretty fast.

You don't spell out why you're trying to cache objects. Are you trying to speed up a slow database query, work around expensive object instantiation, avoid repeated generation of a complex page, maintain application state, or are you just compulsively storing away objects in case of a long winter?
The best solution, given the atrocious limitations of most low-cost shared hosting, is going to depend on what you're trying to accomplish. Going for bottom-of-the-barrel shared hosting means you have to accept that you won't be working with the best tools. The numbers are hard to quantify, but there's a trade-off between hosting costs, site performance and developer time (i.e. fast, cheap or easy).

It's in theory possible to store objects in sessions. That might get you past the file-writing-disabled problem. Additionally, you could store the sessions in a MySQL memory-backed table to speed up the queries.
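A sketch of that idea (the report object and the function that builds it are hypothetical):
session_start();   // the class of any stored object must be loaded before this call,
                   // or you'll get __PHP_Incomplete_Class back on later requests
if (!isset($_SESSION['report'])) {
    $_SESSION['report'] = build_monthly_report();   // hypothetical expensive call, done once per session
}
$report = $_SESSION['report'];   // PHP serializes/unserializes it behind the scenes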

Some hosting places may have APC compiled in. That would allow you to store objects in memory.
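If APC is there, the user cache is just a couple of calls; the key, TTL and loader function below are arbitrary:
$config = apc_fetch('site_config', $hit);
if (!$hit) {
    $config = load_config_from_db();          // hypothetical expensive call
    apc_store('site_config', $config, 300);   // keep it in shared memory for 5 minutes
}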

Related

Is there a better way to store my functions on one server to use on multiple servers?

Here is my quick code. This allows me to keep one file for my commonly used PHP functions and retrieve it as needed. This is mostly for my private work and not for clients.
// Fetch the shared functions file over HTTP and prepend an opening PHP tag.
$include_contents = '<?php ' . file_get_contents("http://example.com/inc/functions.php");
// Write it to a local file...
$handle = fopen("_functions.php", 'w');
fputs($handle, $include_contents);
fclose($handle);
// ...and include the local copy.
include('_functions.php');
Don't do it!
The better way of doing this is to just host all the files you need on the local filesystem and include/require them normally.
There are no advantages to the approach you're suggesting, and numerous disadvantages of such severity that there is nothing whatsoever to be gained by doing this.
If the host that provides all your includes goes down, then every site that uses them goes down too. Even if it doesn't go down, the extra latency of the additional HTTP request to fetch the remote file adds to the total execution time of your script.
Worse, if the remote server is compromised, or your DNS is co-opted to point the remote site's domain at a malicious server, an attacker can introduce malicious code with ease.
Servers typically have huge amounts of storage available, there's no reason why you can't keep all your PHP scripts on the server that the pages they generate will be served from.
Note: if you insist on persisting with such insanity, you can enable allow_url_include and then include directly from a URL.
But like I said, this is a really really REALLY bad idea, and there's a reason why allow_url_include is off by default.
What you need is a deployment strategy. Composer should be a good starting point to investigate. It is software, written in PHP, that resolves external dependencies, not at runtime but at installation time. Start with the link I gave you.
Besides Composer, there are other deployment strategies and tools for PHP; the most notable (though it needs more effort to make it work) is PEAR, or even PEAR2.

Any PHP framework to use SECURELY on shared hosting?

Are there PHP frameworks that would allow me to generate an application and then use it SECURELY on shared hosting, as far as shared-hosting security can be achieved? By this I mean, for example, not requiring any app/tmp directory with 777 access.
Not Symfony -> http://trac.symfony-project.org/wiki/SharedHostingNotSecure
Not CakePHP -> http://book.cakephp.org/view/911/Permissions
CodeIgniter -> "If you're a developer who lives in the real world of shared hosting accounts and clients with deadlines..." - looks promising, maybe this one? But I couldn't find anything specific to shared hosting file permissions in the documentation
Maybe Zend Framework? (I am not sure whether it falls into the same category of PHP framework, but it looks like it does.)
Any existing possible frameworks to use SECURELY on shared hosting??
Having to declare directories with 777 permissions is only a problem on cheapo, entry-level shared hosting. It's common to see the safe_mode hack and open_basedir restrictions in that area, which only prevent access via PHP but not via other CGI interpreters.
Contemporary server setups use suexec/suPHP, where every PHP script runs under the owning account's permissions. Therefore you don't need any world-writable directories, and most PHP applications should be secure against cross-account tampering, at least. The framework itself doesn't make a difference here.
You're looking at the right problem from the wrong point of view.
Having a directory with 777 permissions is not insecure per se.
Having a 777'ed directory on shared hosting is insecure, because the HTTP daemon is run for all clients under the same system account.
That is an intrinsic "feature" of shared hosting, and it's why it's the cheapest option: it's not cheap for nothing, it's cheap at the price of security.
If security is that important to you, buy a VPS. Nowadays VPSes are cheap enough.
I suggest Zend Framework. It doesn't require any special file permissions as far as I know; it just needs some proper configuration. And yes, it is a PHP framework. My entire library is owned by root, with read permissions for everyone, and it works fine. I've never needed to chmod anything. When it comes to libraries, you can always define your own tmp path if a class needs one.
Most successful attacks happen because the user/admin sticks with the defaults, which are well known; see Windows attacks. It's unbelievable how many "administrators" keep not only default URLs and directory structures but also default user IDs and passwords. On my website I repeatedly see login attempts at /wp-login.php with ID:PW of admin:admin. I don't even know if those are actual defaults, and my site isn't even WordPress, but apparently it looks like one, and I suspect those attackers get lucky every once in a while.
Security, I believe, is always about raising the bar and making attacks more difficult. There is never a 100% guarantee of security. I think you can choose whatever framework or application you like, but it is your job to make things harder by changing the defaults.
I can only speak for ZF, where you can do some freaky configuration nobody will ever guess, unless your application throws occasional errors and you show your error messages in full detail.

PHP Code on Separate Server From Apache?

This is something I've never seen done, and I'm not turning anything up in my research, but my boss is interested in the idea. We are looking at some load-balancing options and wondering whether it is possible to have Apache and PHP installed on multiple servers, managed by a load balancer, but with all the actual PHP code on one server, and the various Apache servers pointing to that one central code base.
NFS mounts, for instance, are certainly possible, but I wouldn't recommend them. A lot of the advantage of load balancing is lost, and you're reintroducing a single point of failure. As for syncing code, an rsync cronjob can handle it very nicely, or a manual rsync can be run on deployment.
What is the reason you want this central code base? I'm about 99% sure there is a better solution than a single server dishing out code.
I believe it is possible. To add to Wrikken's answer, I can imagine NFS could be a good choice. However, there are some drawbacks and caveats. For one, when Apache tries to access files on an NFS share that has gone away (connection dropped, host failed, etc.), very bad things happen. Apache locks up and keeps trying to retrieve the file. The processes attempting to access the share, for whatever reason, do not die, and it becomes necessary to reboot the server.
If you do wind up doing this, I would recommend an opcode cache such as APC. APC will cache the pre-processed PHP locally and eliminate round trips to your storage. Just be prepared to clear the opcode cache whenever you update your application.
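For example, a deploy step can request a tiny script like this one (the filename is arbitrary):
// clear-apc.php - hit this once after each code update.
if (function_exists('apc_clear_cache')) {
    apc_clear_cache();         // opcode cache
    apc_clear_cache('user');   // user/data cache, if apc_store() is also in use
}
echo 'APC cache cleared';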
PHP has to run under something that acts as a web processor, and Apache is the most popular. I've done NFS mounts across servers without problems. Chances are that if NFS is down, the network is down. But it doesn't take long to rsync files across servers to replicate them, which is really a better idea.
I'm not sure what your content is like, but you can separate static files like JavaScript, CSS and images so they live on their own server. lighttpd is a good, lightweight web server for things like this. Then you end up with a "dedicated" PHP server. You don't even need a load balancer for this setup.
Keep in mind that PHP stores sessions on the local filesystem by default. So if you are using sessions, you need to make sure users always return to the same server. Otherwise you need to do something like storing sessions in memcache.
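With the PECL memcache extension installed, that only takes configuration; the host and port below are examples:
// These can also be set in php.ini as session.save_handler / session.save_path.
ini_set('session.save_handler', 'memcache');
ini_set('session.save_path', 'tcp://10.0.0.5:11211');
session_start();   // sessions now live in memcached, shared by every web node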

UNIX wrapper in PHP

For a simple web application, I'd like to be able to take advantage of several UNIX features that are stable, have a long history, and are well proven in production, rather than having to write my own code. Take users, for example. Rather than having an entire user, group, permission infrastructure in my webapp, I'd like to be able to simply piggyback on top of the equivalent features of UNIX.
Is there a PHP library that will allow me to register users, log them in, manage permissions, etc.?
It's really not a good idea to fumble around with the user and permission settings of the actual system that is hosting your site. If you want to protect individual directories of your site, you're better off using .htaccess files. If OTOH you're working with virtual URLs, you'll have a hard time mapping the UNIX directory permissions to them anyway.
Based on your comment to deceze's answer, are you looking for something like PHP's filesystem functions?
Then there are system() and its related functions, which give access to Unix commands, but I'd recommend other ways of doing things if possible.
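For what it's worth, that route looks roughly like this (the command and username are purely illustrative):
// shell_exec() runs a Unix command and returns its output as a string.
$output = shell_exec('id ' . escapeshellarg('alice'));
echo $output;   // e.g. "uid=1001(alice) gid=1001(alice) groups=1001(alice)"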
Edit: In response to the comments about needing user and group functionality:
Firstly, in case your plan is to let web users have access to the whole filesystem (or even just their regular login directories), I just want to advise against that; there are lots of security concerns (e.g. if someone else gets into a user's account, they could delete everything they have access to).
The little experience I have with handling users in PHP came from some beginner-level training. In that class, we kept our users in an SQL database and used sessions. We didn't use SSL, but I'd advise some sort of crypto when passing passwords around.
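A minimal sketch of that pattern; the users table, the $pdo connection and the password_hash()/password_verify() calls (PHP 5.5+) are assumptions on top of what the class covered:
// Registration: store only a hash of the password.
$hash = password_hash($plainPassword, PASSWORD_DEFAULT);
$pdo->prepare('INSERT INTO users (username, pass_hash) VALUES (?, ?)')
    ->execute(array($username, $hash));
// Login: verify, then keep only the user id in the session.
$stmt = $pdo->prepare('SELECT id, pass_hash FROM users WHERE username = ?');
$stmt->execute(array($username));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
if ($row && password_verify($plainPassword, $row['pass_hash'])) {
    session_start();
    $_SESSION['user_id'] = $row['id'];
}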
If you're using Apache, it can handle the authentication for you. Other server software can probably do the same, but Apache is all I've ever worked with. I can't address whether Apache can handle sessions itself and don't have the time to research it right now.
If PHP or your web server is running with root rights, it should be no problem to use these functions.
For security reasons, though, I would strongly recommend reimplementing these things or using an existing PHP library instead!
It seems there are standard functions for interfacing with Kerberos or RADIUS in PHP.
These both have a long history and are well proven in production, while being separate from the system users.

Securing DB and session-data on a PHP shared host

I wrote a PHP web application using SQLite and sessions stored on the filesystem.
This is functionally fine and attractively low-maintenance. But now it needs to run on a shared host.
All web-applications on the shared host run as the same user, so my users' session data is vulnerable, as is the database, code, etc.
Many recommend storing sessions in a DBMS such as MySQL in this situation. So at first I thought I would just do that, and move the SQLite data into MySQL too. But then I realized the MySQL credentials need to be readable by the web-application user, so I'm back to square one.
I think the best solution is to run PHP as a CGI so it runs as a different user for each web application. This sounds great, but my host doesn't do this; it uses mod_php. Are there any drawbacks from an admin's point of view to enabling this (performance, backward compatibility, etc.)? If not, then I will ask them to enable it.
Otherwise, is there anything I can do to secure my database and session data in this situation?
As long as your code is running as the shared web user, anything stored on the server is going to be vulnerable. Any other user could write a PHP script to examine any readable file on the server, including your data and PHP code.
If your hosting provider will allow it, running PHP as a CGI under a different user will help, but I expect there will be a significant performance hit, as each request will require a new process to be created. (You could look at FastCGI as a better-performing alternative.)
The other approach would be to set a cookie based on something the user provides and use that to encrypt the session data. For instance, when the user logs in, take a hash of their username, password (as just supplied by them) and the current time, encrypt the session data with that hash, and set a cookie containing the hash. On the next request you'll get the cookie back, which you can then use to decrypt the session data. Note, however, that this only protects the current session data; your user table, other data and code will still be vulnerable.
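A rough sketch of that scheme; openssl_encrypt() and the exact key derivation are assumptions layered on the description above, not a vetted design:
// At login: derive a key the server never stores, and hand it to the client as a cookie.
$key = hash('sha256', $username . $password . microtime(), true);
setcookie('sess_key', base64_encode($key), 0, '/', '', true, true);
// Encrypt the session payload with that key before storing it server-side.
$iv = openssl_random_pseudo_bytes(16);
$cipher = openssl_encrypt(serialize($sessionData), 'AES-256-CBC', $key, OPENSSL_RAW_DATA, $iv);
$_SESSION['blob'] = base64_encode($iv . $cipher);
// On the next request: rebuild the key from the cookie and decrypt.
$key = base64_decode($_COOKIE['sess_key']);
$blob = base64_decode($_SESSION['blob']);
$iv = substr($blob, 0, 16);
$sessionData = unserialize(openssl_decrypt(substr($blob, 16), 'AES-256-CBC', $key, OPENSSL_RAW_DATA, $iv));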
In this situation, you need to decide whether the tradeoff of the low cost of shared hosting is acceptable considering the reduced security it provides. This will depend on your application, and it may be that rather than trying to come up with a complex (and possibly not even very effective) way to add security, you're better off just accepting the risk.
I don't view security as all or nothing. There are steps you can take: give the web DB user only the permissions it needs, store passwords as hashes, and use OpenID login so users provide their credentials over SSL.
PHP as CGI can be slower, and some hosts may simply not want to support more than one environment.
You may need to stick with your host for some reason, but generally there are so many available that this is a good reminder to compare functionality and security as well as cost. I have noticed many companies starting to offer virtual machine hosting, with nearly dedicated-server-level isolation of your code from other users, at what is to me a reasonable cost.
A shared host is no way to run a website if you are conscious about the privacy and security of your data from the sites you share the server with. Anything accessible to your web application is fair game for the others; it'll only be a matter of time before they can access it (assuming they have an incentive to target you).
"you can place your DB connection variables in a file below the web root. this will at least protect it from web access. if you're going to use file based sessions as well, you can set the session path in your user's directory and again outside the web root."
I don't have an account so I can't downvote that, but seriously, it is not even relevant to the question.
Duh, you store stuff outside the web root. That goes for any hosting scenario and is not specific to shared hosting. We're not talking about protecting from outsiders here; we're talking about protecting from other applications on the same machine.
To the OP: I think PHP as CGI is the most secure solution, as you already suggested yourself. But as someone else said, there is a performance hit with this.
Something you might look at is moving your sessions and DB to MySQL and using safe_mode and/or open_basedir.
I would solve the problem with an infrastructure change instead of a code change.
Consider upgrading to a VPS. Nowadays you can get them very inexpensively; I've seen VPSes starting at $10/mo.
