I'm working on an app using CakePHP, and I have some generic settings that admins can change that affect the whole site, but I'm not quite sure how to handle saving/editing the data. These settings are for site-control related things, such as enabling/disabling new user registration, enabling/disabling the creation of new posts/topics, and so on.
Since these aren't specific to individual users and there is only 1 set of values for the entire site, is it more advantageous to just use an XML file or should I create a table (which would only have 1 row)?
If I go with the XML route, are there any security related issues I should be aware of?
I saw this post but the accepted answer was on a per-user basis. This is only 1 set for the entire site.
This sort of configuration is generally best left inside a config file on disk.
Also, I wouldn't use XML. I would use JSON. It's much easier to deal with, and you get proper data types (booleans, numbers, strings) out of the box.
http://php.net/manual/en/function.json-encode.php
http://php.net/manual/en/function.json-decode.php
There are no special security issues with storing config on disk. Just keep it out of the web server's document root, like everything else that isn't intended to be loaded directly by the user. Your config and application code should be writable only by users who are meant to change them.
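For illustration, here's roughly what that looks like; the path and setting names are made up:

<?php
// Load settings from a JSON file kept outside the document root;
// fall back to defaults if the file doesn't exist yet.
$path = '/var/app/config/settings.json';
$defaults = array('registrations_enabled' => true, 'posting_enabled' => true);
$settings = is_readable($path)
    ? json_decode(file_get_contents($path), true)
    : $defaults;

// Save settings after an admin changes them.
$settings['registrations_enabled'] = false;
file_put_contents($path, json_encode($settings), LOCK_EX);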
I think the settings should go in the database, which makes it easier for administrators to change them. They should then be cached for faster access. You can store the cache wherever you like, and that is all you have to do.
Use http://book.cakephp.org/2.0/en/core-libraries/caching.html
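Something like the read-through pattern below, using CakePHP 2's Cache class; the 'Setting' model and the cache key are just examples:

// Read settings from cache, falling back to the database on a miss.
$settings = Cache::read('site_settings', 'default');
if ($settings === false) {
    $settings = $this->Setting->find('first');
    Cache::write('site_settings', $settings, 'default');
}

// After an admin saves changes, invalidate so the next read repopulates:
// Cache::delete('site_settings', 'default');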
I need some help with the best way to persist an array of "settings" in PHP. The application loads these settings every time the page is refreshed. I can't use a database, and storing the settings in a file seems like a bad idea if I have to read it every time.
I think a good solution might be to store the array (serialized) in a file, read the file only once, and then "freeze" it in a session. But I don't know if that's secure...
Can you help me?
EDIT:
I forgot to say that the settings change frequently, because the stored data acts as a "cache table" for the rendered page modules. For this reason, I can't use a plain PHP file to store the settings.
There is no best way; you should choose based on your needs, since each method has its own pros and cons. With a database you incur network costs, but the ability to read settings from a remote node or service may be beneficial. A file can be cached by a PHP opcode cacher, which will boost your app.
There is no need to "freeze" it in "a session".
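If you go the file route, here is roughly what the serialized-file idea from the question could look like; the path and the atomic-rename trick are my own assumptions:

<?php
// Read the settings once per request.
$file = __DIR__ . '/storage/settings.dat';
$settings = unserialize(file_get_contents($file));

// Rewrite the file atomically when the settings change, so a
// concurrent reader never sees a half-written file.
function save_settings($file, array $settings)
{
    $tmp = $file . '.tmp';
    file_put_contents($tmp, serialize($settings), LOCK_EX);
    rename($tmp, $file); // atomic on the same filesystem
}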
I've read multiple comments about encrypting PHP session data, in case it is stored in a temp directory that is accessible to multiple accounts on a shared server. However, even if the data is encrypted, session_start() still generates filenames containing the session id. For example,
sess_uivrkk2c5ksnv2hnt5rc8tvgi5
, where uivrkk2c5ksnv2hnt5rc8tvgi5 is the same session id I found in the cookie my browser received.
How is this problem typically addressed / could someone point me to an example? All of the simple examples I've found only address encrypting the data, not changing the filename.
Just to see what would happen, I made a SessionHandler wrapper that would do an MD5 hash on the $session_id variable before passing it on to its parent function, but that did not work. Instead, I ended up with two files: a blank one (with session_id as a part of its name) and a full one (with an MD5'ed session_id). Also, there was the problem of close() not accepting session_id as a parameter, so I couldn't pass it on to its parent.
EDIT: I'm learning about PHP sessions; this isn't for a live commercial site, etc.
Yes, in some scenarios (e.g. a very incompetently configured server, although these do unfortunately exist) your session data on a shared server may be readable by other people. Trying to hide the session files by changing their names serves no useful purpose; this is what's known as "security through obscurity". Google the phrase: it is usually described as an oxymoron.
If your question is how do you prevent other customers accessing your session data on a badly configured server then the sensible choices (in order of priority) are:
switch service provider
use a custom session handler to store the data somewhere secure (e.g. a database). There are lots of examples on the web; quality varies
use a custom session handler to encrypt the data and use file storage. Again, you don't need to write the code yourself; just scrutinize any candidates (a rough sketch follows below)
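As an illustration of the third option (not production-hardened; the cipher choice and key handling are assumptions, and the structure follows the SessionHandler wrapper idea from the question):

<?php
// Wrap the default files handler and encrypt the payload before it
// is written to disk; the filenames stay the same, but the contents
// are useless to anyone who can read the temp directory.
class EncryptedSessionHandler extends SessionHandler
{
    private $key;

    public function __construct($key)
    {
        $this->key = $key; // e.g. 32 random bytes loaded from outside the docroot
    }

    public function read($id)
    {
        $raw = parent::read($id);
        if ($raw === '' || $raw === false) {
            return '';
        }
        $iv = substr($raw, 0, 16);
        $data = openssl_decrypt(substr($raw, 16), 'aes-256-cbc',
            $this->key, OPENSSL_RAW_DATA, $iv);
        return $data === false ? '' : $data;
    }

    public function write($id, $data)
    {
        $iv = openssl_random_pseudo_bytes(16);
        $encrypted = openssl_encrypt($data, 'aes-256-cbc',
            $this->key, OPENSSL_RAW_DATA, $iv);
        return parent::write($id, $iv . $encrypted);
    }
}

$secretKey = '...32 random bytes from secure storage...'; // placeholder
session_set_save_handler(new EncryptedSessionHandler($secretKey), true);
session_start();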
If you want to find out whether your provider might be a culprit, just have a look at the value of __FILE__. Does it look as if you have access to the root filesystem? Write a script which tries to read from outside your home directory. If you can't, then the provider may have set an open_basedir restriction (it is possible to get around this; again, Google will tell you how).
I'm using Symfony 2 to generate my pages from data in a MySQL database. For most content, users have to be authenticated, but the content itself does not change often and does not need to be customized for the users. So what's a good caching strategy for avoiding database calls while still maintaining the auth check?
Simply put, use Memcache to cache the SQL result set for an extended period of time.
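As a sketch (server address, key and lifetime are placeholders, and $pdo is assumed to be an existing PDO connection):

<?php
// Check the cache first; only hit MySQL on a miss.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key = 'page_content_' . $pageId;
$rows = $mc->get($key);

if ($mc->getResultCode() === Memcached::RES_NOTFOUND) {
    $stmt = $pdo->prepare('SELECT title, body FROM pages WHERE id = ?');
    $stmt->execute(array($pageId));
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    $mc->set($key, $rows, 3600); // cache for one hour
}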
Maybe this is too big a change, but the following scheme may be useful in this case:
Create several sets of pages: one for not-yet-authenticated users (let's put it in the site root), and others for authenticated users who should see the same content (if two or more users should see the same content when they are authenticated, we create only one set for all of them), each placed in a directory under the root. Then set up simple .htaccess/.htpasswd files for each such 'for-authed-only' directory, and it becomes the web server's problem, not your script's.
Hope you get the idea. It is fuzzy to describe, but easy to implement.
Example: say you want to allow only authenticated users to see the page '/topsecret.html' on the site. Create a directory (/authed), establish HTTP auth on it, and put your topsecret.html into that directory (so it becomes '/authed/topsecret.html'). Now edit '/topsecret.html' and simply replace its main content with a 'sorry, please authenticate yourself' link that points to '/authed/topsecret.html'.
If you use Symfony2, you are using Doctrine2. If you use Doctrine2, caching should be enabled by default. Choose the cache driver that fits your purposes and there should be no problem.
You might also be specifically interested in query result caching.
Do not use Doctrine without a metadata and query cache! Doctrine is highly optimized for working with caches. The main parts in Doctrine that are optimized for caching are the metadata mapping information with the metadata cache and the DQL to SQL conversions with the query cache. These 2 caches require only an absolute minimum of memory yet they heavily improve the runtime performance of Doctrine. The recommended cache driver to use with Doctrine is APC. APC provides you with an opcode-cache (which is highly recommended anyway) and a very fast in-memory cache storage that you can use for the metadata and query caches.
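For illustration, wiring those caches up by hand might look roughly like this (the APC driver is assumed; Symfony's DoctrineBundle exposes the same switches through its configuration):

<?php
// Point Doctrine's metadata, query and result caches at APC.
use Doctrine\Common\Cache\ApcCache;
use Doctrine\ORM\Configuration;

$cache = new ApcCache();

$config = new Configuration();
$config->setMetadataCacheImpl($cache); // parsed mapping information
$config->setQueryCacheImpl($cache);    // DQL-to-SQL conversions
$config->setResultCacheImpl($cache);   // optional: cached result sets

// An individual query can then opt in to result caching, e.g. for an hour:
// $query->useResultCache(true, 3600, 'static_content');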
I solved this by using Zend_Cache inside the cacheable actions to store the rendered template result. I then create a new Response object from the cached content. If the cache is empty, I generate the content.
I thought of creating a plugin that checks for an annotation and stores the Response output automatically, but it turned out that I only have 3-4 display actions that are cacheable, and they have very complex cache ID creation rules, so I put the caching logic directly into the controller code.
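Stripped down, the pattern looks roughly like this (the cache setup and the cache ID are placeholders for the real, more complex rules):

// Inside a Symfony 2 controller action; Response is
// Symfony\Component\HttpFoundation\Response.
$cache = Zend_Cache::factory('Core', 'File',
    array('lifetime' => 3600, 'automatic_serialization' => false),
    array('cache_dir' => '/tmp/app_cache/'));

$cacheId = 'library_page_' . $userGroupId; // simplified ID rule

if (($html = $cache->load($cacheId)) === false) {
    $html = $this->renderView('AcmeDemoBundle:Library:index.html.twig',
        array('items' => $items));
    $cache->save($html, $cacheId);
}

return new Response($html);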
It appears that you have a lot of options for caching with symfony: http://www.symfony-project.org/book/1_2/12-Caching (this is for symfony 1.x, not 2, but my guess is not a lot has changed).
You could put your heavy SQL statements in their own script and turn caching on for that script:
list:
  enabled:     on
  with_layout: false   # Default value
  lifetime:    86400   # Default value
Further, if you are sure that the generated page won't change for a while, you could use symfony to tell the user's browser not even to bother your server for the content, which will cause the page to load nearly instantaneously for the user.
$this->getResponse()->addCacheControlHttpHeader('max_age=1200'); // in seconds; must be less than 1 year
Just make sure your max age is small enough that when something changes (say, a code update) the user doesn't get stuck with the old page, since there is no way to force them to request that page again short of changing the URL.
I'm currently trying to create a CMS using PHP, purely in the interest of education. I want the administrators to be able to create content, which will be parsed and saved on the server storage in pure HTML form to avoid the overhead that executing PHP script would incur. Unfortunately, I could only think of a few ways of doing so:
Setting write permissions on every directory where the CMS might want to write a file. This sounds like quite a bad idea.
Setting write permissions on a single cache directory. A PHP script could then include or fopen/fread/echo the content from a file in that directory at request time. This could perhaps be carried out in a MediaWiki-esque fashion: something like index.php?page=xyz could read and echo content from cached/xyz.html at runtime. However, I'll need to ensure the sanity of $_GET['page'] to prevent nasty variations like index.php?page=http://www.bad-site.org/malicious-script.js.
I'm personally not too thrilled by the second idea, but the first one sounds very insecure. Could someone please suggest a good way of getting this done?
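For the sanity check, I was thinking of something along these lines (names are placeholders):

<?php
// Whitelist simple slugs only, then confirm the cached file exists
// before echoing it; anything else falls back to the home page.
$page = isset($_GET['page']) ? $_GET['page'] : 'home';

if (!preg_match('/^[a-z0-9_-]+$/i', $page)
    || !is_file('cached/' . $page . '.html')) {
    $page = 'home'; // or send a 404 instead
}

readfile('cached/' . $page . '.html');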
EDIT: I'm not in favour of fetching data from the database on every request; the only time I would want to fetch data from the database would be when the content is being cached. Secondly, I do not have access to memcached or any PHP accelerator.
Since you're building a CMS, you'll have to accept that if the user wants to do evil things to visitors, they very likely can. That's true regardless of where you store your content.
If the public site is all static content, there's nothing wrong with letting the CMS write the files directly. However, you'll want to configure the web server to not execute anything in any directory writable by the CMS.
Even though you don't want to hit the database every time, you can set up a cache to minimize database reads. Zend_Cache works very nicely for this, and can be used quite effectively as a stand-alone component.
You should put your pages in a database and retrieve them using parameterized SQL queries.
I'd go with the second option, but modify it so the files are retrieved using mod_rewrite rather than a custom PHP function.
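For example, something like this in .htaccess (a sketch that follows the question's cached/xyz.html convention):

# Serve cached/xyz.html directly for /xyz when it exists;
# otherwise fall back to index.php to generate the page.
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/cached/$1.html -f
RewriteRule ^([a-z0-9_-]+)$ cached/$1.html [L]
RewriteRule ^([a-z0-9_-]+)$ index.php?page=$1 [L,QSA]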
I'm running a web application that allows a user to log in. The user can add/remove content to his/her 'library', which is displayed on a page called "library.php". Instead of querying the database for the contents of the user's library every time they load "library.php", I want to store it somewhere when the user logs in, so that the query is only run once. Is there a best practice for doing this, e.g. storing their library in an array in a session?
Thanks for your time
If you store each user's library in $_SESSION as an array, as you suggested (which is definitely possible), you will have to make sure that any updates the user makes to the library are instantly reflected in that session variable.
Honestly, unless there is some seriously heavy querying going on to fetch a library, or you have tons of traffic, I would just stick to 'execute query whenever the user hits library.php'.
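If you do go the session route, a minimal sketch (table and column names are made up, and $pdo is assumed to be an existing PDO connection):

<?php
session_start();

// Populate the session copy once, on the first visit after login.
if (!isset($_SESSION['library'])) {
    $stmt = $pdo->prepare('SELECT id, title FROM library WHERE user_id = ?');
    $stmt->execute(array($_SESSION['user_id']));
    $_SESSION['library'] = $stmt->fetchAll(PDO::FETCH_ASSOC);
}

// Wherever the user adds or removes an item, invalidate the copy so
// it is rebuilt from the database on the next visit to library.php:
// unset($_SESSION['library']);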
Consider the size of the data. Multiply that by the maximum number of concurrent users.
Then compare that to the memory available on your server. Also consider whether or not this is a shared server; other sites need resources too.
Based on this, it is probably best either to create a file that can be used (as per Remi's comment), or to stay with the default stateless form and read the data every time. I doubt that reading the data each time creates much overhead.
When the user logs in, you can generate an XML file (USER_ID.xml, for instance) that you display with XSLT.
http://php.net/manual/en/book.xslt.php
Each PHP script dies when it completes, so data cannot be kept permanently live in a web application the way you would in a desktop application.
One way could be sessions, but it depends on the amount of data you want to save. From your example you are talking about a library, so it sounds like a large quantity of data needs to be saved; in that case the DB is the way to go, and yes, you have to query it each time.
Another way could be to save the data in an array inside a PHP file, but just as you would have to query the DB each time, you would have to include that PHP file each time.
Since this is a DB performance optimization, I would suggest that you take a look at memcached, which matches your problem perfectly:
memcached is [..] intended for use in speeding up dynamic web applications by alleviating database load.
I think it would be best to store it in a Session.
If the user logs in, the session is created and you can save data in it using the superglobal:
$_SESSION['key'] = "value";
You can also store arrays or anything else there, and it can be cleared when the user logs out.
You care about performance, so please note: sessions may use a database or files to store their data. A database is there to be used instead of files because of its performance and capabilities. Use a database; it is designed for exactly such situations!