I'm currently working through the process of scaling out our server setup and we're to the point where we need to reconfigure the sessions to be stored in a high availability solution (having people logged out if a server goes down is not an option). We're trying to use Redis because we're already using it for other parts of the site. The trouble I run into is that there doesn't appear to be any support for this. Before I create my own session handler class, I thought I would ask if anyone else knows if there is a project for this use case.
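For what it's worth, assuming the phpredis extension is available (an assumption about your stack), PHP's native session machinery can be pointed at Redis from php.ini without writing a handler class at all:

```ini
; php.ini — requires the phpredis extension (check your setup first)
session.save_handler = redis
session.save_path = "tcp://127.0.0.1:6379"
```

With that in place, session_start() reads and writes session data in Redis, so any web server behind the load balancer sees the same sessions.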
Related
I'm developing a web app using Laravel (a PHP framework). The app is going to be used by about 30 of my co-workers on their Windows laptops.
My co-workers interview people on a regular basis. They will use the web app to add a new profile to a database once they interview somebody for the first time and they will append notes to these profiles on subsequent visits. Profiles and notes are stored using MySQL, but since I'm using Laravel, I could easily switch to another database.
Sometimes, my co-workers have to interview people when they're offline. They might visit a group of interviewees, add a few profiles and add some notes to existing ones during a session without any internet access.
How should I approach this?
1. With a local web server on every laptop. I've seen applications ship with some kind of installer including a LAMP stack, but I can't find any documentation on this.
2. I could install the app and something like XAMPP on every laptop myself. That would be possible, but in the future more people might use the app, and not all of them might be located nearby.
3. I could use Service Workers, maybe in connection with a library such as UpUp. This seems to be the most elegant approach.
I'd like to give option (3) a try, but my app is database driven and I'm not sure whether I could realize this approach:
Would it be possible to write all the (relevant) data from the DB to - let's say - a JSON file which could be accessed instead of the DB when in offline mode? We don't have to handle much data (less than 100 small data records should be available during an interview session).
When my co-workers add profiles or notes in offline mode, is there any web-service-style way to insert the data they entered into the DB once they're back online?
I would think of it as building the app in "two parts".
First, the front end uses Ajax calls to the back end (which is just a REST API). If there isn't any network connection, store the data in the browser using local storage.
When the user later has network connection, you could send the data that exists in the local storage to the back end and clear the local storage.
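The "queue locally, flush later" part can be sketched as a tiny helper. This is illustrative code, not a real library: `storage` stands in for `window.localStorage` (anything with `getItem`/`setItem` works, which also makes it easy to test), and `sendFn` stands in for whatever Ajax call posts to the REST back end.

```javascript
// Sketch: queue offline writes, flush them when connectivity returns.
// All names are made up for illustration.
function createOfflineQueue(storage, key) {
  function load() {
    return JSON.parse(storage.getItem(key) || "[]");
  }
  return {
    // Called when a request fails (no network): remember the payload.
    enqueue(payload) {
      const queue = load();
      queue.push(payload);
      storage.setItem(key, JSON.stringify(queue));
    },
    // Called when connectivity returns: send everything, then clear.
    flush(sendFn) {
      const queue = load();
      queue.forEach(sendFn);
      storage.setItem(key, "[]");
      return queue.length; // how many records were synced
    },
  };
}
```

In the browser you'd create it with `window.localStorage` and call `flush` from a listener on the `online` event.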
If you add web servers on the laptops, the databases and info will only be stored on their local laptops and would not be synced.
You can build what you describe using service workers to cache your site's static content to make it available offline, and a specific fetch handler in the service worker to detect a failed PUT or POST and queue the data in IndexedDB. You'd then periodically check IndexedDB for any queued data when your web app is loaded, and attempt to resend it.
I've described this approach in more detail at https://developers.google.com/web/showcase/case-study/service-workers-iowa#updates-to-users-schedules
That article assumes the use of the sw-precache library for caching your site's static assets, and the sw-toolbox library to provide runtime fetch handlers that check for failed business-logic requests. It also uses a promise-based IndexedDB wrapper called simpleDB, although I'd probably go with the more recent idb library nowadays.
I am trying to configure a load-balanced environment for Yii 1.1.14 applications, but I seem to be having a problem where Yii does not keep a user logged in when the load balancer switches to another node. Most of the time, when logging in, it will ask the user to log in twice, because the login only happens on one node and the next page loads from another. Otherwise, it will ask the user to log in again half-way through browsing.
The application is using DB sessions, and I can see that the expire time is being updated in the database. Even in the case when it asks them to log in again straight after they have already logged in, the session expire time is updated in the database. Does Yii do anything server-dependent with the sessions?
I've searched around for hours but have been unable to find much on this topic, and I'm wondering if anyone else has come across such a problem.
On the server side, I am using Nginx with PHP-FPM and Amazon's ELB as the load balancer. The workaround (as a last resort) is to use sticky sessions on the load balancer, but this does not work well if a node goes offline, forcing users onto an alternative node.
Please let me know if I need to clarify anything better.
The issue was that the base path which was used to generate the application ID, prefixed to the authentication information in the session, did not match on each server. Amazon OpsWorks was deploying the code to the servers using the identical symlinked path, but the real path returned by PHP differed due to versioning and symlinking.
For example, the symlink path on both servers was '/app/current'. However, the actual path on one server was '/app/releases/2014010700' and the other was '/app/releases/2014010701', which was generating a different hash and therefore not working with the session.
Changing the base path to use the symlink path in my configuration file fixed the problem; before, it was using dirname(), which returns the real path of the symlinked contents. I also had to remove the realpath() call in setBasePath in the Yii framework.
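In a Yii 1.x main config that looks something like this (the paths here are placeholders for your own deployment layout):

```php
<?php
// protected/config/main.php — hypothetical sketch: hard-code the stable
// symlink path instead of deriving it with dirname()/realpath(), so every
// node hashes the same application ID.
return array(
    'basePath' => '/app/current/protected',
    // ... rest of the configuration
);
```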
The modifications I made to the Yii framework are quite specific for my issue, but for anyone else experiencing a similar issue with multiple nodes, I would double check to ensure each node contains the application in the exact same path.
Thank you to the following article: http://www.yiiframework.com/forum/index.php/topic/19574-multi-server-authentication-failure-with-db-sessions
Thought I'd answered this before, but it took a bit to find my answer:
Yii session do not work in multi server
Short version: if you have Suhosin enabled, it's quite painful. Turn it off and things work much better. But yes, the answer is you can do ELB load balancing with Yii sessions without needing sticky sessions.
I have a website that has a lot of traffic and the nature of the website means that it can have a lot of requests in a specific time period.
I use Amazon Elastic Beanstalk to manage the load balancer and instances.
I can have up to 20 instances running, and because FOSUserBundle uses sessions to hold the data, I am losing user logins etc.
I know EB has stickiness, but due to the nature of the site it gets overwhelmed and sometimes doesn't forward the correct user to the correct instance, so I am losing users again. Amazon is no help at all.
Is there a way to override this to use secure cookies? (I know cookies aren't secure, but I could create my own encrypt/decrypt method.)
Any suggestions would be helpful :)
I found a way to essentially negate the sessions being stored on one server. I remember doing this with a custom PHP system (using a session handler I built a few years ago) but did not think it would work with Symfony. Since posting this question I found PdoSessionStorage, which basically stores your sessions in a database instead of in files on the server or instances.
Please choose your Symfony version, as namespaces sometimes change from version to version.
Link to PdoSessionStorage on Symfony
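For a Symfony 2.x-era app, the wiring looks roughly like this. This is a hedged sketch: the handler class and options vary by Symfony version, and the DSN and credentials below are placeholders, so check the docs for your version.

```yaml
# config.yml — hypothetical example for Symfony 2.6+
framework:
    session:
        handler_id: session.handler.pdo

services:
    session.handler.pdo:
        class: Symfony\Component\HttpFoundation\Session\Storage\Handler\PdoSessionHandler
        arguments:
            - "mysql:host=db.example.com;dbname=mydb"   # placeholder DSN
            - { db_username: myuser, db_password: mypassword }
```

Every instance behind the load balancer then reads and writes sessions from the shared database rather than local files.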
Is there a way to store/manage PHP sessions in a similar way to IIS's Session State Service?
I want to have multiple front-end web servers for a multi-domain e-commerce platform and manage the sessions centrally. The idea being that if a server goes down, users with cart contents will not have to start a new session when they are shifted to another web server.
I know cookies and URL parameters could do it to a point but that's not answering the question.
You can register a SessionHandlerInterface which is backed by a shared database (e.g. MySQL Cluster).
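For reference, a minimal sketch of such a handler. The class and table names are made up for illustration; SQLite is used only to keep the example self-contained, and a production setup would point the PDO DSN at the shared MySQL Cluster instead.

```php
<?php
// Illustrative sketch of a SessionHandlerInterface backed by a shared DB.
class DbSessionHandler implements SessionHandlerInterface
{
    private $pdo;

    public function __construct(PDO $pdo)
    {
        $this->pdo = $pdo;
        // Created here only so the example is self-contained.
        $this->pdo->exec(
            'CREATE TABLE IF NOT EXISTS sessions (
                id TEXT PRIMARY KEY, data TEXT, updated_at INTEGER)'
        );
    }

    public function open($savePath, $name) { return true; }
    public function close() { return true; }

    public function read($id)
    {
        $stmt = $this->pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute(array($id));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        return $row ? $row['data'] : '';
    }

    public function write($id, $data)
    {
        // REPLACE INTO works in both MySQL and SQLite.
        $stmt = $this->pdo->prepare(
            'REPLACE INTO sessions (id, data, updated_at) VALUES (?, ?, ?)');
        return $stmt->execute(array($id, $data, time()));
    }

    public function destroy($id)
    {
        $stmt = $this->pdo->prepare('DELETE FROM sessions WHERE id = ?');
        return $stmt->execute(array($id));
    }

    public function gc($maxLifetime)
    {
        $stmt = $this->pdo->prepare('DELETE FROM sessions WHERE updated_at < ?');
        return $stmt->execute(array(time() - $maxLifetime));
    }
}

// In the real app you would register it before session_start():
// session_set_save_handler(new DbSessionHandler($sharedPdo), true);
```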
For anyone looking for this because they are moving to Amazon Web Services, there are two options/alternatives:
Use the DynamoDB session handler from the AWS SDK for PHP. This essentially has the same effect as session replication. However, there are monetary costs from DynamoDB, especially if you need locking.
Use session stickiness in the load balancer. This is simpler to set up, and free, but is probably not quite as scalable, as requests from old sessions can't just be sent on to newly started servers.
The most scalable option is of course to get rid of server-side sessions, but that is not always easy without huge changes in backends and frontends, and in some cases not even desirable because of other considerations.
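For option 1, the registration step in the AWS SDK for PHP looks roughly like this. Treat it as a sketch: the region and table name are placeholders, it requires the aws/aws-sdk-php package plus AWS credentials, and the exact API should be checked against the SDK docs for your version.

```php
<?php
// Sketch only — requires aws/aws-sdk-php and configured AWS credentials.
require 'vendor/autoload.php';

use Aws\DynamoDb\DynamoDbClient;
use Aws\DynamoDb\SessionHandler;

$client = new DynamoDbClient(array(
    'region'  => 'us-east-1',    // placeholder region
    'version' => 'latest',
));

$handler = SessionHandler::fromClient($client, array(
    'table_name' => 'sessions',  // placeholder table
));
$handler->register();            // takes over PHP's native session functions

session_start();                 // now backed by DynamoDB
```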
We have an old legacy PHP application. Now I want to write a new application module using Ruby on Rails.
Deployment is one problem. I guess it should be possible to run the PHP app (via mod_php) and the RoR app (via mod_proxy / Mongrel) on one Apache server. I don't want to use mod_rails because it requires running PHP via FastCGI, so there is a risk of breaking something. Both PHP and RoR will use the same DB.
The tricky part is how to pass login info from PHP application to RoR app. Users login into PHP and their info is stored in PHP session data. The RoR app will be placed in a subdirectory of main PHP app (eg www.example.com/railsapp). So RoR should receive all HTTP cookies. And the question is how to extract PHP session data from RoR app.
The above is just my first idea, and it's rather bad because of possible race conditions between mod_php and RoR. I can modify the PHP app to store some info in the DB when a user logs in. But I don't know how to handle the case where the PHP session data expires and some data in the DB should be updated (logging the user out).
Has anyone solved a similar problem? Or can anyone at least point me in the most promising direction?
Update: It should be possible to configure mod_php to store session data in an SQL DB. That way there will be no race conditions; the DB engine should prevent them.
Update 2: Actually, it is possible to use mod_rails with Apache in prefork mode, which is what mod_php requires; it is merely recommended to run mod_rails under Apache's worker MPM. So the whole deployment of the PHP / RoR apps is greatly simplified.
First, if you are placing the Rails app in a subdirectory, it is possible to use mod_rails. In your configuration for the PHP site you can map the subdirectory to the Rails app's public directory (note that DocumentRoot isn't valid inside a <Location> block, so use Alias instead):
Alias /railsapp /.../app/public
To get a session over to the Rails side, you could create a connect page in the Rails app and call it from the PHP side, passing in some data to log in. You just need to protect this page from any request not coming from localhost (easy to do).
You could also switch rails to use a database to store its sessions, you should then be able to generate a session id, store it in a cookie with the correct name and secret, and create a session in the database manually with that id.
You can also (which I recommend) have a proxy page on the Rails side which logs the user in and redirects them to their desired page. You could do it like this (not actual working code, but you get the idea):
PHP
$key = md5($user_id . $user_password_hash . $timestamp);
$url = "/railsapp/proxy?userid=" . $user_id . "&key=" . $key . "&page=" . urlencode("home/welcome");
Rails App
map.proxy 'proxy', :controller => 'proxy', :action => 'connect'
class ProxyController < ActionController::Base
  def connect
    key = ...
    if params[:key] == key
      login_user params[:userid]
      redirect_to params[:page]
    else
      render :nothing => true, :status => 403
    end
  end
end
I've done a mixed PHP/RoR app before (old PHP code, new hawt RoR features, as stuff needed fixing it got re-implemented). It's not really all that hard -- you just serve up the PHP as normal files, and use a 404 handler to redirect everything else to the Rails app.
As far as the session data goes, you could stuff it into a DB, but if you're willing to write/find routines to read and write PHP's marshalled data formats, PHP uses flock() to ensure that there are no race conditions in reading/writing the session data file. Do the same thing in your Rails app for minimal pain.
First, it seems like you're asking for trouble by mixing the two technologies. My first suggestion is "don't do that."
However, since you're probably not going to listen to that advice, I'll make a suggestion about your actual question. In the PHP apps I've seen store session data in the database, I've noticed two approaches to cleaning the data. Both involve always timestamping the records so you know how old they are, and refreshing that timestamp from time to time while the user is active (sometimes every page view, sometimes less often, depending on expected load and query count).
If your app does relatively few database calls, and therefore has a little time to spare, you can run an extra cleanup query against your session table on every page view, or at least on certain pages. That means an extra query, which on a busy application is a problem. The alternative tends to be a cron job that runs periodically to clean the table of expired records. These periodic cleaning jobs can also be run only when specific other tasks are done (like a user logging in, which is often a little slow anyway, since you have to set up the session data).
Hey, I've always thought this is actually a pretty common problem for those moving to Ruby from PHP who have legacy apps/data in PHP, and I'm surprised it's not asked more often.
I would be tempted to go with either of the following two approaches, both of which have worked well in the past. The second option is going to require a little more work and both of them will require modifications to your existing login code:
1) Use OpenID to handle your login so you don't have to worry about rolling your own solution. Either run your own OpenID server or use Google, Yahoo, etc. Run each of your apps on a unique subdomain. There are OpenID plugins/code for Rails and PHP, and it is a tried and tested secure standard.
2) Use memcached to store login sessions (as opposed to a DB). Have a login page (either in your PHP app or your Rails app). Whether your Ruby app or your PHP app is accessed, the outcome would be similar to this:
a) User tries to access a login-protected page
b) App checks its own session data to see if the user is logged in
c) If correct session data exists, the user is logged in and the app proceeds
d) If the app can't find a current login session, it checks for a browser cookie
e) The app then checks memcached for this cookie key
f) If the cookie key exists, the user must be logged in (otherwise it redirects to the login page)
g) The app grabs the user_id from memcached (stored under the cookie key) so it knows which user is logged in
h) The app sets its own login session so it doesn't have to go past (c) again unless the session expires
This is an overly simplified version of what is happening but does work.
I would use memcached in this scenario because it has auto-expiration of values (which you define) and is damn fast. Remember: don't pass usernames and passwords between apps, or even user IDs for that matter. Pass a unique key which is merely a pointer to information stored in your DB/memcached.
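Steps (a)-(h) above can be sketched as a single lookup function. This is illustrative only: the function, the `sso_key` cookie name, and the `$cacheGet` callable (standing in for a `Memcached::get` call) are all made up for the example.

```php
<?php
// Hypothetical sketch of the login-resolution flow (a)-(h): trust the local
// session first, then fall back to a shared cache keyed by a cookie.
function resolve_user_id(array &$session, array $cookies, callable $cacheGet)
{
    // (b)/(c) a local session wins if present
    if (isset($session['user_id'])) {
        return $session['user_id'];
    }
    // (d) otherwise look for the shared-login cookie
    if (!isset($cookies['sso_key'])) {
        return null; // (f) not logged in anywhere -> redirect to login page
    }
    // (e)/(g) the cookie key points at the user id stored in memcached
    $userId = $cacheGet($cookies['sso_key']);
    if ($userId === false || $userId === null) {
        return null; // key expired or unknown -> redirect to login page
    }
    // (h) prime the local session so later requests stop at step (c)
    $session['user_id'] = $userId;
    return $userId;
}
```

In a real app, `$session` would be `$_SESSION`, `$cookies` would be `$_COOKIE`, and `$cacheGet` would wrap a Memcached client, so memcached's auto-expiration logs the user out everywhere at once.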
1) I successfully use Passenger and mod_php simultaneously on a single prefork Apache, both on the development machine and on the server.
2) I'd connect the applications via an HTTP challenge/response sequence, or maybe via shared memcached keys.