It's been a very long time since I've been on here, so I hope I'm doing this properly.
I'm working on a web project that is basically a profile-type site for a Twitch streamer. It's designed to display only that streamer's information about their stream, so no other users should be logging in via Twitch or otherwise.
The problem I'm having is that recent changes to the Twitch API require OAuth to be used, and the token resets after a period of time, as it should.
The question really is this:
How would I go about privately storing a variable on the site? This variable would need to last around 30-60 days, not be stored anywhere other than the server, be inaccessible to anyone else, and be easy to change once the time period is up.
I was looking through APC and realized that, since I'm using PHP 7.2, it has been replaced by APCu. Reviewing the details on APCu, the information wouldn't be stored for the time frame I need and could just up and get cleared, so that ruled it out.
I thought about local file storage, but I need to keep the information I'm gathering secret, so a local file is a nope.
I intend to release the source code once the project is finished, so I don't want to use databases, as that just makes things more complicated for the less technical user.
Sessions put their ID in a cookie, which places a string in the hands of whoever might want to be malicious, so that's a nope too.
Long story short, I'm trying to avoid the following:
local file storage
databases
sessions
any kind of caching that would be unreliable
I just need a step in the right direction.
Sorry about the lengthy post.
If you want minimal setup and infrastructure, SQLite can be a good option. It's still a database, but it works within PHP directly and only requires a file to store the data in. This solution is often used in mobile apps as well, so developers can benefit from the power of SQL while keeping things simple for the user.
sqlitetutorial.net has good tutorials to get you started.
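For instance, a minimal sketch using PDO's SQLite driver; the file name, table name, and 60-day lifetime are just placeholder assumptions:

<?php
// Minimal sketch: keep an OAuth token with an expiry in a local SQLite file.
$db = new PDO('sqlite:' . __DIR__ . '/tokens.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE IF NOT EXISTS tokens (
    name TEXT PRIMARY KEY,
    value TEXT NOT NULL,
    expires_at INTEGER NOT NULL
)');

// $newToken would come from your Twitch OAuth call.
// Store (or replace) the token with a 60-day expiry.
$save = $db->prepare('REPLACE INTO tokens (name, value, expires_at) VALUES (?, ?, ?)');
$save->execute(array('twitch_oauth', $newToken, time() + 60 * 86400));

// Read it back, treating an expired row as "no token yet".
$load = $db->prepare('SELECT value FROM tokens WHERE name = ? AND expires_at > ?');
$load->execute(array('twitch_oauth', time()));
$token = $load->fetchColumn() ?: null;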
I believe I found the solution to the problem. I can still use files for storage: if I place an .htaccess file with deny from all in the directory I wish to restrict, the server can still access and read the files while outside intrusion is blocked.
With a little more .htaccess trickery I can send a 404 response instead, so it looks like there's nothing special there rather than advertising the files' or directory's presence.
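Something along these lines, assuming Apache (2.2-style directives shown, with the 2.4 equivalent commented):

# .htaccess in the directory to protect. PHP on the same server can still
# read the files; only HTTP requests are refused.
Order allow,deny
Deny from all

# Apache 2.4 equivalent:
# Require all denied

# Alternatively, answer every request with a 404 so the directory
# doesn't even appear to exist:
# RedirectMatch 404 ".*"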
Thanks to SystemGlitch and imvain2 for making me think more on that problem so I could realize the solution possibilities.
Related
I am trying to configure a load-balanced environment for Yii 1.1.14 applications, but I'm having a problem where Yii does not keep a user logged in when the load balancer switches to another node. Most of the time, when logging in, it asks the user to log in twice, because the login happens on one node and the next page loads on another. Otherwise, it asks the user to log in again halfway through browsing.
The application uses DB sessions, and I can see that the expire time is being updated in the database. Even when it asks them to log in again straight after they have already logged in, the session expire time is updated in the database. Does Yii do anything server-dependent with sessions?
I've searched around for hours but have been unable to find much on this topic, and I'm wondering if anyone else has come across such a problem.
On the server side, I am using Nginx with PHP-FPM and Amazon's ELB as the load balancer. The workaround (as a last resort) is to use sticky sessions on the load balancer, but that doesn't work well if a node goes offline and forces the user onto an alternative node.
Please let me know if I need to clarify anything better.
The issue was that the base path, which was used to generate the application ID prefixed to the authentication information in the session, did not match on each server. Amazon OpsWorks was deploying the code to the servers using the identical symlinked path, but the real path returned by PHP differed due to versioning and symlinking.
For example, the symlink path on both servers was '/app/current'. However, the actual path on one server was '/app/releases/2014010700' and the other was '/app/releases/2014010701', which was generating a different hash and therefore not working with the session.
Changing the base path in my configuration file to use the symlink path fixed the problem; before, it was using dirname(), which returned the real path of the symlinked contents. I also had to remove the realpath() call in setBasePath in the Yii framework.
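Roughly, the configuration change looked like this (a sketch only; the paths are the ones from the example above):

<?php
// protected/config/main.php (Yii 1.1) -- sketch of the fix.
// Hard-code the shared symlink path so every node derives the same
// application ID, instead of letting dirname()/realpath() resolve the
// release-specific real path.
return array(
    'basePath' => '/app/current/protected',
    // ... rest of the configuration ...
);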
The modifications I made to the Yii framework are quite specific for my issue, but for anyone else experiencing a similar issue with multiple nodes, I would double check to ensure each node contains the application in the exact same path.
Thank you to the following article: http://www.yiiframework.com/forum/index.php/topic/19574-multi-server-authentication-failure-with-db-sessions
Thought I'd answered this before, but took a bit to find my answer:
Yii session do not work in multi server
Short version: if you have Suhosin enabled, it's quite painful. Turn it off and things work much better. But yes, the answer is you can do ELB load balancing with Yii sessions without needing sticky sessions.
I've done some research and found some stuff that may be helpful.
I would like your opinion on my approaches to this.
THE GOAL
I will develop an application in PHP (that's the only language I know, and unfortunately I don't have time to learn another one right now). I want this application to be able to run offline and locally on any PC. I will use WampServer and the CakePHP framework for this.
THE PROBLEM
This application will be for sale, so I will need some activation method to prevent each copy from being used on multiple computers. I don't want something complicated or very secure; I just need something simple, to prevent non-programmers from running this app on any computer. Of course, the more secure, the better! :)
POSSIBLE SOLUTIONS I AM THINKING OF
First of all, I am thinking of forcing users to activate their application by going online during installation. That way they could get a unique KEY from my online database.
I found PHP's shell_exec() function. So I am thinking, during online installation, of getting the host ID (machine ID) of that computer, sending it to my server, and storing it in my database next to a unique KEY. The machine ID and unique KEY could then be stored in a PHP file. (Could I store them somewhere more secure? Maybe encrypt them?)
Every time the user opens the application, PHP will read the machine ID. If it's not the same as the one stored in the PHP file, activation will be required. (Maybe I could store the computer's name too, or some other ID?)
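Roughly what I have in mind; the wmic command, file name, and server URL are all placeholders:

<?php
// Sketch of the activation check (Windows/Wamp assumed).
// Read the machine's UUID via shell_exec().
$raw = shell_exec('wmic csproduct get uuid');
$machineId = trim(str_ireplace('UUID', '', (string)$raw));

$store = __DIR__ . '/activation.php';
$stored = is_file($store) ? include $store : null;

if (!is_array($stored) || $stored['machine_id'] !== $machineId) {
    // Not activated on this machine: ask the licensing server for a key.
    // 'https://example.com/activate.php' is a placeholder endpoint.
    $key = file_get_contents('https://example.com/activate.php?machine=' . urlencode($machineId));
    // Stored as a PHP file so the web server never serves it as plain text.
    file_put_contents($store, '<?php return ' . var_export(
        array('machine_id' => $machineId, 'key' => $key), true
    ) . ';');
}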
Is that a good approach? Would it be possible?
Another approach I am thinking of is to have someone create a non-PHP installation file. When run, it will prompt the Wamp installation and, when that finishes, transfer all the necessary files to the Wamp root folder (automation for the user). I can only guess this will work, though, as my knowledge of other languages is limited...
Could I benefit from this in validation terms? Can a non-PHP file interact with my PHP application and validate it for only one unique computer?
Any info will be very appreciated. I have just started building the application and want to know if there is a good way (or non) to secure it.
Thanks!
There is no point in any of this, because if people want to, they can simply crack any of the copy-protection methods you come up with. This also applies to any app written in any other language: if people want to use it without permission, there are ways to do that.
There are some ways to obfuscate the code (see Is there a code obfuscator for PHP?), but these solutions are just silly, because if people really want to, they can get the code in plain text anyway.
A better idea might be to run the app on your server and allow people to pay for it monthly, Software as a Service like Google Apps for Business.
I asked a recent question regarding the use of readfile() for remotely executing PHP, but maybe I'd be better off setting out the problem to see if I'm thinking the wrong way about things, so here goes:
I have a PHP website that requires users to log in, includes lots of forms and database connections, and makes use of $_SESSION variables to keep track of various things.
I have a potential client who would like to use the functionality of my website, but on their own server, controlled by them. They would probably want to restyle the website using content and CSS files local to their server, but that's a problem for later.
I don't want to show them my PHP code, since that's the value of what I'd be providing.
I had thought to do this with calls to include() from the client's server to mine, which at least keeps variable scope intact, but many sites (and the PHP docs) seem to recommend readfile(), file_get_contents() or similar. Ideally I'd like to have a simple wrapper file on the client's server for each "real" one on my server.
Any suggestions as to how I might accomplish what I need?
Thanks,
ColmF
As suggested, comment posted as an answer & modified a touch
PHP is an interpreted language and as such 'reads' the files and parses them. Yes, it can store cached byte code in certain cases, but it's not like higher-level languages that compile to bytecode and run from it. That means the PHP 'compiler' requires your actual source code to work. Check out zend.com/en/products/guard, which might do what you want, though I believe it means your client has to use Zend Server.
Failing that, sign a contract with the company that includes clauses about not reusing your code, etc. That's your best protection in this case. You should also be careful: if you're using anything under an 'open source' license, your entire app may be considered open source, and then this is all moot.
This is not an unusual practice for many companies. I have produced software I'm particularly proud of, and a company wants to use it. Because they believe in their own information security, whether for 'personal' reasons or because they have to comply with a standard such as PCI, there are times my application must run in their environment. I have offered my products as 'web services' where they query my servers with data and receive responses. In that case my source is completely protected, as this is no different from any other closed API. In every case I have licensed the copy to the client with provisions that they are not allowed to modify or distribute it. This is a legally binding contract and completely expected from the client's side of things. Of course there were provisions that I would provide support, etc., but that's neither here nor there.
Short answers:
Legal agreement, likely your best bet from everyone's point of view
Zend guard like product, never used it so I can't vouch for it
Private API but this won't really work for you as the client needs to host it
Good luck!
If they want it wholly contained on their server then your best bet is a legal solution not a technical one.
You license the software to them and you make sure the contract states the intellectual property belongs to you and it cannot be copied/distributed etc without prior permission (obviously you'll need some better legalese than that, but you get the idea).
Rather than remote execution, I suggest you use a PHP source protection system, such as Zend Guard, ionCube or sourceguardian.
http://www.zend.com/en/products/guard/
http://www.ioncube.com/
http://www.sourceguardian.com/
Basically, you're looking for a way to proxy your application out to a remote server (i.e. your clients'). Using something like readfile() on the client's site is fine, but you're still going to need multiple scripts on their end. readfile() reads what's available at a particular file path or URL and pipes it to the end user. So if I were to do readfile('http://google.com'), it would output the source code of Google's homepage.
Assuming you don't just want a dummy form on your clients' sites, you're going to need some code hanging out on their end. That code will have to intercept the form submissions (so you'll need a URL parameter on the page you're scraping with readfile() to tell your code that the form submission URL is your client's site and not your own). This page (the form submission handler page) will need to make calls back to your own site. Think something like this:
readfile("https://your.site/whatever?{$_SERVER['QUERY_STRING']}");
Your site is then going to process the response and then pass everything back to your clients' sites.
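As a sketch, the wrapper on the client's server for a form handler might be as small as this (the URL is a placeholder):

<?php
// Thin wrapper on the client's server: forward the form POST to the
// real handler on your server and relay the response to the browser.
$ch = curl_init('https://your.site/handler.php');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query($_POST),
    CURLOPT_RETURNTRANSFER => true,
));
echo curl_exec($ch);
curl_close($ch);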
Hopefully I've gotten you on the right path. Let me know if I was unclear; I realize this is a lot of info.
I think you're going to have a hard time with this unless you want some kind of funny wrapper that makes curl-type requests to your server, especially when it comes to handling things like sessions and cookies.
Are you sure a PHP obfuscator wouldn't be sufficient for what you are doing?
Instead of hosting it yourself, why not do what most php applications do and simply distribute the program to your client with an auto-update feature? Hosting it yourself is complicated, from management of websites to who is paying for the hosting.
If you don't want it to be distributed, then find a pre-written license that allows you to do this. If you can't find one then it's time to talk to a lawyer.
You can't stop them from seeing your code. You can make it very hard for them to understand your code, which is a good second best. See our SD PHP Obfuscator for a tool that will scramble the identifiers and the whitespacing in the code, making it much more difficult to understand.
I am doing development work on a site with a strange server setup where sessions basically don't work. It's kind of a long story, but the crux is that it's a cluster of servers synchronized from an FTP server every few minutes: anything written to the filesystem by PHP gets deleted within five minutes.
This means sessions don't work, and I get some strange problems in phpMyAdmin, like it forgetting which page of a table I was on: I click 'next page' and end up back at the start again.
I've also tried SQL Buddy and am getting similar problems.
Are there any equivalents that don't use sessions? Doesn't need to be as full-featured as PMA, it's mainly for adding/editing some stuff.
There's always the MySQL GUI Tools.
You can make phpmyadmin use other authentication methods:
http://www.phpmyadmin.net/documentation/#authentication_modes
Depends how much security you need and how restricted you are, but 'config' authentication mode with a custom .htaccess sounds like it might work for you.
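For reference, 'config' mode is set in phpMyAdmin's config.inc.php along these lines (credentials are placeholders; pair it with .htaccess protection):

<?php
// config.inc.php: 'config' auth mode, so phpMyAdmin logs in
// automatically and no session-backed login form is needed.
$i = 1;
$cfg['Servers'][$i]['auth_type'] = 'config';
$cfg['Servers'][$i]['host']      = 'localhost';
$cfg['Servers'][$i]['user']      = 'dbuser';      // placeholder
$cfg['Servers'][$i]['password']  = 'dbpassword';  // placeholder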
I don't know how hard it would be to plug this into phpMyAdmin, but PHP has functionality that allows sessions to be stored in a way other than files.
In your case you already have a database server, obviously, so maybe you could create a "technical" database and use it to store sessions? This way you would still be able to use phpMyAdmin (which is quite a good tool), but your problem should be solved.
The PHP function you need to know about is session_set_save_handler():
session_set_save_handler() sets the user-level session storage functions which are used for storing and retrieving data associated with a session. This is most useful when a storage method other than those supplied by PHP sessions is preferred, i.e. storing the session data in a local database.
There are a couple of examples on that manual page (and take a look at the comments at the bottom of the page: some might be helpful).
For instance, Drupal uses this solution to store sessions into DB instead of files, by default.
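A minimal sketch of such a handler, assuming a sessions table with id, data, and expires_at columns (the DSN, credentials, table, and one-day lifetime are all placeholders):

<?php
// Minimal sketch of a MySQL-backed session handler.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

session_set_save_handler(
    function () { return true; },   // open
    function () { return true; },   // close
    function ($id) use ($pdo) {     // read
        $stmt = $pdo->prepare(
            'SELECT data FROM sessions WHERE id = ? AND expires_at > NOW()');
        $stmt->execute(array($id));
        return (string) $stmt->fetchColumn();
    },
    function ($id, $data) use ($pdo) {  // write
        $stmt = $pdo->prepare(
            'REPLACE INTO sessions (id, data, expires_at)
             VALUES (?, ?, NOW() + INTERVAL 1 DAY)');
        return $stmt->execute(array($id, $data));
    },
    function ($id) use ($pdo) {     // destroy
        return $pdo->prepare('DELETE FROM sessions WHERE id = ?')
                   ->execute(array($id));
    },
    function ($maxlifetime) use ($pdo) {  // garbage collection
        $pdo->exec('DELETE FROM sessions WHERE expires_at < NOW()');
        return true;
    }
);
session_start();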
Another solution would be to use memcached to store your sessions -- of course, if you don't have a memcached server at your disposal, this might be a bit harder than storing them in DB ^^
Or, of course, if you have access to your DB server via the network, you could install phpMyAdmin on your local computer, or use a tool like MySQL GUI Tools and its MySQL Query Browser.
I found a neat solution! SQLBuddy has a feature where you can put the password in the config file and it will use it automatically without a need to log in.
Obviously this is insecure by default, but coupled with a .htaccess and .htpasswd (which does work on the server) I've now got a secure login.
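For anyone doing the same, the protection boils down to something like this (the AuthUserFile path is a placeholder; keep it outside the web root):

# .htaccess in the SQL Buddy directory: HTTP Basic auth.
AuthType Basic
AuthName "Restricted"
AuthUserFile /home/user/.htpasswd
Require valid-user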
OK, so I've got this totally rare and unique scenario of a load-balanced PHP website. The bummer is, it didn't used to be load balanced. Now we're starting to get issues...
Currently the only issue is with PHP sessions. Naturally nobody thought of this issue at first so the PHP session configuration was left at its defaults. Thus both servers have their own little stash of session files, and woe is the user who gets the next request thrown to the other server, because that doesn't have the session he created on the first one.
Now, I've been reading the PHP manual on how to solve this situation. There I found the nice function session_set_save_handler(). (And, coincidentally, this topic on SO) Neat. Except I'll have to call this function in all the pages of the website. And developers of future pages would have to remember to call it all the time as well. Feels kinda clumsy, not to mention probably violating a dozen best coding practices. It would be much nicer if I could just flip some global configuration option and voilà, the sessions all get magically stored in a DB or a memory cache or something.
Any ideas on how to do this?
Added: To clarify - I expect this to be a standard situation with a standard solution. FYI - I have a MySQL DB available. Surely there must be some ready-to-use code out there that solves this? I can, of course, write my own session saving stuff and auto_prepend option pointed out by Greg seems promising - but that would feel like reinventing the wheel. :P
Added 2: The load balancing is DNS based. I'm not sure how this works, but I guess it should be something like this.
Added 3: OK, I see that one solution is to use auto_prepend option to insert a call to session_set_save_handler() in every script and write my own DB persister, perhaps throwing in calls to memcached for better performance. Fair enough.
Is there also some way that I could avoid coding all this myself? Like some famous and well-tested PHP plugin?
Added much, much later: This is the way I went in the end: How to properly implement a custom session persister in PHP + MySQL?
Also, I simply included the session handler manually in all pages.
You could set PHP to handle the sessions in the database, so all your servers share same session information as all servers use the same database for that.
A good tutorial for that can be found here.
The way we handle this is through memcached. All it takes is changing php.ini along the lines of the following:
session.save_handler = memcache
session.save_path = "tcp://path.to.memcached.server:11211"
We use AWS ElastiCache, so the server path is a domain, but I'm sure it'd be similar for local memcached as well.
This method doesn't require any application code changes.
You didn't mention what technology you are using for load balancing (software, hardware, etc.), but in any case, the solution to your problem is to employ "sticky sessions" on the load balancer.
In summary, this means that when the first request from a "new" visitor comes in, they are assigned a specific server from the cluster: all future requests for the lifetime of their session are then directed to that server. In practice this means that applications written to work on a single server can be up-scaled to a balanced environment with zero/few code changes.
If you are using a hardware balancer, such as a Radware device, then sticky sessions are configured as part of the cluster setup. Hardware devices usually give you more fine-grained control: such as which server a new user is assigned to (they can check health status etc. and pick the healthiest / least utilised server), and more control over what happens when a server fails and drops out of the cluster. The drawback of hardware balancers is the cost, but they are worth it imho.
As for software balancers, it comes down to what you are using. For Apache there is the stickysession property of mod_proxy_balancer, and plenty of articles via Google to get this working with the PHP session (for example).
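A sketch of such a setup, adapted from the Apache mod_proxy_balancer documentation; the hostnames and routes are placeholders:

# Send each client back to the worker that served its first request.
Header add Set-Cookie "ROUTEID=.%{BALANCER_WORKER_ROUTE}e; path=/" env=BALANCER_ROUTE_CHANGED
<Proxy balancer://appcluster>
    BalancerMember http://web1.internal route=web1
    BalancerMember http://web2.internal route=web2
    ProxySet stickysession=ROUTEID
</Proxy>
ProxyPass / balancer://appcluster/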
Edit:
From other comments posted after the original question, it sounds like your "balancing" is done via Round Robin DNS, so the above probably won't apply. I'll refrain from commenting further and starting a flame against round robin dns.
The easiest thing to do is configure your load balancer to always send the same session to the same server.
If you still want to use session_set_save_handler then maybe take a look at auto_prepend.
If you have time and you still want to check more solutions, take a look at
http://redis4you.com/articles.php?id=01..
Using Redis you are fault-tolerant. From my point of view, it could be better than the memcache solutions because of this robustness.
If you are using PHP sessions, you could share the /tmp directory (where I think the sessions are stored) via NFS between all the servers in the cluster. That way you don't need a database.
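On every node that would just be a php.ini change pointing at the shared mount (the path is a placeholder):

session.save_handler = files
session.save_path = "/mnt/nfs/php_sessions"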
Edit: You could also use an external service like memcachedb (persistent and fast), store the session info in memcachedb, and identify it with a hash of the content or even the session ID.
When we had this situation, we implemented some code that lives in a common header.
Essentially, for each page we check whether we know the session ID. If we don't, we check whether we're in the situation you describe by looking for stored session data in the DB. Otherwise we just start a new session.
Obviously this requires all relevant data to be copied to the DB, but if you encapsulate your session data in a separate class, it works OK.
You could also try using memcache as the session handler.
Might be too late, but check this out: http://www.pureftpd.org/project/sharedance
Sharedance is a high-performance server to centralize ephemeral key/data pairs on remote hosts, without the overhead and the complexity of an SQL database.
It was mainly designed to share caches and sessions between a pool of web servers. Access to a sharedance server is trivial through a simple PHP API and it is compatible with the expectations of PHP 4 and PHP 5 session handlers.
When it comes to PHP session handling in a load-balancing cluster, it's best to have sticky sessions. For that, ask the network or datacenter team maintaining the load balancer to enable sticky sessions. Once that is enabled, you won't need to worry about sessions on the PHP end.