nodejs & socket.io vs php - auth & integration with a php app

Please, I need a push (or kick) because I am feeling lost.
I have to write some kind of portal, which I would like to build using php+mysql via ajax.
There is no problem with that, but part of the portal should work in realtime - so, because I've been messing around with node.js & socket.io for a while, and I think it's pretty awesome, I am going to use it.
The problematic part is that I would like to get pushed on the right way to solve this:
I'm going to authenticate users in that php "portal thing" by setting and checking php sessions - a really simple way to log a user in, log a user out, save a hash and a microtime hash to the database, etc.
But how should I use this kind of signature and authentication in the socket communication?
Is the approach shown in the diagram below reasonable and valid?
I'd be grateful if anyone could point me somewhere, or point out the risks and things I should be worried about.
It may all be stupid and nonsense. My problem is that I am a newbie with node & socket.io (I've only been coding some simple chats and the like).
Thanks for any suggestions!

It's great that you are using node.js. I have been playing with it recently. It's awesome!
There is a way to use the PHP session in Node.js. All you need to do is use a separate datastore for session storage that can be accessed by both PHP and Node.js. This takes care of user authentication on both sides.
I chose Redis over Memcache for custom session storage because you don't want to lose all your existing session data on a server restart. For Node.js you can easily find, install and configure a Redis client.
Refer here for more useful info http://ericterpstra.com/2013/03/use-redis-instead-of-mysql-for-codeigniter-session-data
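To make that concrete, here is a minimal sketch of the PHP side, assuming the phpredis extension is installed (its built-in session handler stores each session under a PHPREDIS_SESSION:<session_id> key by default; the user_id value is just an example):

<?php
// Store PHP sessions in Redis instead of files, so a Node.js process
// can read the same data. With phpredis, the session for id "abc123"
// lives under the Redis key "PHPREDIS_SESSION:abc123".
ini_set('session.save_handler', 'redis');
ini_set('session.save_path', 'tcp://127.0.0.1:6379');

session_start();
$_SESSION['user_id'] = 42; // example value, readable from Node via the shared key

On the Node.js side you would parse the session cookie from the socket.io handshake, look up the matching key in Redis, and treat a hit as an authenticated user.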

Related

Electron desktop app with online server and database?

I am working on a desktop application with Electron and I am considering online storage to store data. I would like to get some ideas on the approach, as I couldn't find reliable answers through a Google search.
Approach 1: Electron app (front end) + PHP (e.g. purchase a hosting package from GoDaddy with a domain such as www.mysite.com).
With this approach I am planning to create API calls in PHP to perform basic CRUD.
Is this a good way?
Will this affect the speed/load time?
Are there better ways for this situation?
Thank you very much in advance for the help.
Well, this is not an easy topic. Your solution could work: your Electron app asks your server for data and stores data to it. In any case, the best solution depends on your application.
The most important points that you have to ask yourself are:
How often do you need to reach your server?
Could your users work without the data from the server?
How long does it take to read and store data on your server? (It's different if you store a few KB or many GB of data.)
Must the data stored online be shared with other users, or does every user have access only to their own data?
If all the information is stored on your server, your app's startup has to wait for the request to complete, but you can show a loader or something like that to mitigate the waiting.
In my opinion you have many choices, from the simplest (and slowest) to the most complex (but that mitigate network lag):
Simple AJAX requests to your server: as you described, you will make some HTTP requests to your server to read and write the data to be displayed in your application. Your users will have to wait for the requests to complete; show them some loading animations to mitigate the wait (see the sketch after this list).
There are some solutions that save the data locally in your Electron installation and then sync it online; have a look at PouchDB for an example.
Recently I've been looking at GraphQL. GraphQL is a query language for your API. It's not that easy to set up, but it has some interesting features: client libraries typically come with a cache, and it is well suited to optimistic updates - you update your application immediately, assuming your POST will succeed, and then reconcile if something goes wrong.
I'd also like to suggest trying some solutions offered as a service. You don't have a server yet and you would have to sign a new contract anyway, so why not check out a dedicated service like Firebase? Google's Firebase Realtime Database lets you work in JavaScript (just one language involved in the project) and syncs your data online automatically and between devices, without the need to write any web service. I have only played with it for some prototypes, but it looks very interesting and it's cheap. It also has a free plan that is enough for many users.
Keep in mind that if each user has access only to their own data, the fastest and easiest solution is to use a database inside your Electron application: an SQLite database, an IndexedDB database, or even serializing to JSON and storing everything in localStorage (if your data fits the size limits).
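As a rough illustration of the first (simple AJAX) option, the PHP side could be a single JSON endpoint. A minimal sketch, assuming a hypothetical notes table with id and body columns and your own PDO credentials:

<?php
// notes.php - a minimal JSON CRUD endpoint the Electron app can call
// with fetch()/XHR. Table name, columns and credentials are examples.
header('Content-Type: application/json');

$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');

switch ($_SERVER['REQUEST_METHOD']) {
    case 'GET': // read: return all notes
        $rows = $pdo->query('SELECT id, body FROM notes')->fetchAll(PDO::FETCH_ASSOC);
        echo json_encode($rows);
        break;
    case 'POST': // create: insert a note from a JSON request body
        $data = json_decode(file_get_contents('php://input'), true);
        $stmt = $pdo->prepare('INSERT INTO notes (body) VALUES (?)');
        $stmt->execute([$data['body']]);
        echo json_encode(['id' => $pdo->lastInsertId()]);
        break;
    default:
        http_response_code(405);
        echo json_encode(['error' => 'method not allowed']);
}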
Hope this helps

Best approach for sharing sessions between CodeIgniter and NodeJS?

I'm trying to get Node.js and CodeIgniter to share data with each other (sessions & DB data). I've googled around quite a bit for a solution to this problem, but I still haven't found the most convenient way to do things. The most appropriate way seems to be some sort of caching software, such as memcached or redis, as wrappers are available for both Node and PHP.
This is what I've thought of so far:
Client logs in as normal on the CodeIgniter-powered website. A session is created and added to memcached.
Client connects to a secure socket.io server using SSL.
Client sends its raw cookie data to the socket.io server.
Server does some string splitting on the cookie data and gets the session ID.
Server checks the cache to see if the session ID exists. If yes, the user is logged in!
User logs out on the CodeIgniter site. Session data is destroyed.
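On the PHP side, steps 1 and 6 might look roughly like this. A sketch assuming the Memcached extension and a hypothetical sess_ key prefix (the real CodeIgniter cookie and key layout may differ):

<?php
// On login: mirror the session into memcached, keyed by session ID,
// with a TTL so expired entries clean themselves up (see point 1 below).
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

session_start();
$mc->set('sess_' . session_id(), json_encode(['user_id' => 42]), 7200); // 2h TTL

// On logout: destroy both the PHP session and the cached copy, so the
// socket.io server's lookup fails immediately:
// $mc->delete('sess_' . session_id());
// session_destroy();

The socket.io server then only has to extract the session ID from the cookie and check whether the key exists in the cache.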
However, there are a few problems that I can think of with this approach:
1. How would I clean up expired sessions from memcached? From what I can tell, there is no way of detecting expired sessions in CodeIgniter. Edit: I just realized that I could set a timeout on the memcached data to solve this.
2. The CodeIgniter docs say that session IDs change every five minutes. Wouldn't this kind of ruin my approach?
Is there a better solution out there? I'd like to hear what other options there are before I start implementing this.

Web Application Activation | Computer, Local Server

I have done some research and found some things that may be helpful.
I would like your opinion on my approaches to this.
THE GOAL
I will develop an application in PHP (that's the only language I know, and unfortunately I don't have time to learn another one right now). I want this application to be able to run offline and locally on any PC. I will use WAMP server and the CakePHP framework for this.
THE PROBLEM
This application will be for sale, so I will need some activation method to prevent each copy from being used on multiple computers. I don't want something complicated or extremely secure; I just need something simple to prevent non-programmers from running this app on any computer. Of course, the more secure, the better! :)
POSSIBLE SOLUTIONS I AM THINKING OF
First of all, I am thinking of forcing users to activate their application by going online during installation. That way they could get a unique KEY from my online database.
I found PHP's shell_exec function. So I am thinking, during online installation, of getting the host ID (machine ID) of that computer, sending it to my server and storing it in my database next to a unique KEY. The machine ID and unique KEY could then be stored in a PHP file. (Could I store them somewhere more secure? Maybe encrypt them?)
Every time the user opens the application, PHP will read the machine ID. If it is not the same as the one stored in the PHP file, activation will be required. (Maybe I could store the computer's name too, or some other ID?)
Is that a good approach? Would it be possible?
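For what it's worth, the startup check described above might look something like this. A rough sketch where the wmic command, the license-file format and the hash scheme are all assumptions for illustration, not a hardened solution:

<?php
// check_activation.php - compare the current machine ID against the
// one recorded during online activation. Everything here is easy to
// bypass for a determined user; it only deters casual copying.
function currentMachineId() {
    // On Windows (WAMP), the motherboard serial is one possible machine ID.
    $raw = shell_exec('wmic baseboard get serialnumber');
    return trim(str_replace('SerialNumber', '', (string) $raw));
}

$licenseFile = __DIR__ . '/license.json';
if (!file_exists($licenseFile)) {
    exit('Not activated: please run the online activation first.');
}

$license = json_decode(file_get_contents($licenseFile), true);

// Store a salted hash rather than the plain machine ID, so the file
// is at least not trivially editable.
$expected = hash('sha256', currentMachineId() . $license['key']);
if (!hash_equals($license['machine_hash'], $expected)) {
    exit('Activation invalid for this computer.');
}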
Another approach I am thinking of is to have someone create a non-PHP installation file. When run, it would prompt the WAMP installation and, when the installation finishes, transfer all the necessary files to the WAMP root folder (automation for the user). I can only guess whether this would work, as my knowledge of other languages is limited...
Could I benefit from this in validation terms? Can a non-PHP file interact with my PHP application and validate it for only one unique computer?
Any info will be very much appreciated. I have just started building the application and want to know if there is a good way (or none) to secure it.
Thanks!
There is no point in any of this, because if people want to, they can simply crack any of the copy protection methods you come up with. This also applies to apps written in any other language. If people want to use software without permission, there are ways to do that.
There are some ways to obfuscate the code (see Is there a code obfuscator for PHP?), but these solutions are rather pointless because if people really want to, they can get the code in plain text anyway.
A better idea might be to run the app on your server and allow people to pay for it monthly, Software as a Service like Google Apps for Business.

Flash browser game - HTTP + PHP vs Socket + Something else

I am developing a non-real-time browser RPG (think Kingdom of Loathing) which would be played from within a Flash app. At first I just wanted to handle communication with the server by simply using URLLoader to tell PHP what I am doing, and using $_SESSION to store the data needed between requests.
I wonder if it wouldn't be better to base it on a socket connection, with an app residing on the server written in Java or Python. The problem is I have never written such an app, so I have no idea how much I'd have to "shift" my thinking from simply responding to requests (like PHP) to a continuously running application. I won't hide that I am also concerned about the memory and CPU usage of such a server app when, for example, there would be hundreds of users connected.
I have tried to do some research, but thanks to my nil knowledge of the sockets subject I haven't found anything helpful. So, considering the fact that I don't need real-time data exchange, would it be wise to develop the server-side part as a socket server rather than in plain ol' PHP?
Since your game isn't something that works in realtime, you probably don't need to go down the socket route, though it's certainly a viable option. The nice thing about sockets is that updates would be instant, without requiring a page refresh (or server poll), so you're right to at least consider it.
If you do want a more real-time server setup, you might consider something like Electroserver - it abstracts much of the setup for you so you don't have to write your own server from scratch, plus it's free up to a certain number of concurrent users, if I recall correctly.
Finally, a third option is a modified POST approach using AMF. Look into AMFPHP - it lets you call methods on a PHP back-end directly from your Flash application. A little faster and easier than plain POST requests, but not quite as seamless as a socket connection or a purpose-built gaming server.
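To give an idea of the AMFPHP route: services are plain PHP classes whose public methods Flash can call over AMF. A minimal sketch with hypothetical class and method names:

<?php
// PlayerService.php - an AMFPHP-style service class. AMFPHP exposes
// public methods of plain PHP classes to Flash; names are examples.
class PlayerService
{
    public function getPlayer($playerId)
    {
        // In a real game this would come from MySQL; hardcoded here.
        return array(
            'id'    => (int) $playerId,
            'name'  => 'Adventurer',
            'level' => 3,
        );
    }
}

// In ActionScript, a NetConnection call to "PlayerService.getPlayer"
// would receive this array as a native object via a Responder.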
Lots of options out there, it sounds like you are aware of this and kudos for trying to come up with the best approach rather than just rolling with what you know! I hope this helps, let me know if you have any questions.
Here's a link to Electroserver - http://www.electro-server.com/

PHP sessions in a load balancing cluster - how?

OK, so I've got this totally rare and unique scenario of a load-balanced PHP website. The bummer is - it didn't use to be load balanced. Now we're starting to get issues...
Currently the only issue is with PHP sessions. Naturally, nobody thought of this at first, so the PHP session configuration was left at its defaults. Thus both servers have their own little stash of session files, and woe to the user whose next request gets thrown to the other server, because it doesn't have the session he created on the first one.
Now, I've been reading the PHP manual on how to solve this situation. There I found the nice function session_set_save_handler(). (And, coincidentally, this topic on SO.) Neat. Except that I'll have to call this function on every page of the website, and developers of future pages would have to remember to call it as well. That feels kind of clumsy, not to mention that it probably violates a dozen best coding practices. It would be much nicer if I could just flip some global configuration option and - voilà - the sessions would all get magically stored in a DB or a memory cache or something.
Any ideas on how to do this?
Added: To clarify - I expect this to be a standard situation with a standard solution. FYI - I have a MySQL DB available. Surely there must be some ready-to-use code out there that solves this? I can, of course, write my own session-saving stuff, and the auto_prepend option pointed out by Greg seems promising - but that would feel like reinventing the wheel. :P
Added 2: The load balancing is DNS based. I'm not sure how this works, but I guess it should be something like this.
Added 3: OK, I see that one solution is to use auto_prepend option to insert a call to session_set_save_handler() in every script and write my own DB persister, perhaps throwing in calls to memcached for better performance. Fair enough.
Is there also some way that I could avoid coding all this myself? Like some famous and well-tested PHP plugin?
Added much, much later: This is the way I went in the end: How to properly implement a custom session persister in PHP + MySQL?
Also, I simply included the session handler manually in all pages.
You could set PHP to handle sessions in the database, so all your servers share the same session information, since they all use the same database for it.
A good tutorial for that can be found here.
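If you'd rather see the shape of it right away, here is a minimal sketch of a DB-backed handler; the sessions table layout (id VARCHAR primary key, data TEXT, expires INT), the 1440-second lifetime and the credentials are all assumptions:

<?php
// prepend.php - register once via auto_prepend_file so every page
// shares sessions through MySQL instead of local files.
class DbSessionHandler implements SessionHandlerInterface
{
    private $pdo;

    public function __construct(PDO $pdo) { $this->pdo = $pdo; }

    public function open($savePath, $name) { return true; }
    public function close() { return true; }

    public function read($id)
    {
        $stmt = $this->pdo->prepare('SELECT data FROM sessions WHERE id = ? AND expires > ?');
        $stmt->execute([$id, time()]);
        $data = $stmt->fetchColumn();
        return $data === false ? '' : $data; // empty string means "no session"
    }

    public function write($id, $data)
    {
        $stmt = $this->pdo->prepare('REPLACE INTO sessions (id, data, expires) VALUES (?, ?, ?)');
        return $stmt->execute([$id, $data, time() + 1440]);
    }

    public function destroy($id)
    {
        return $this->pdo->prepare('DELETE FROM sessions WHERE id = ?')->execute([$id]);
    }

    public function gc($maxlifetime)
    {
        // Rows carry their own expiry, so GC just sweeps stale ones.
        return $this->pdo->prepare('DELETE FROM sessions WHERE expires < ?')->execute([time()]);
    }
}

$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');
session_set_save_handler(new DbSessionHandler($pdo), true);
session_start();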
The way we handle this is through memcached. All it takes is changing php.ini along the lines of the following:
session.save_handler = memcache
session.save_path = "tcp://path.to.memcached.server:11211"
We use AWS ElastiCache, so the server path is a domain, but I'm sure it'd be similar for local memcached as well.
This method doesn't require any application code changes.
You didn't mention what technology you are using for load balancing (software, hardware, etc.), but in any case the solution to your problem is to employ "sticky sessions" on the load balancer.
In summary, this means that when the first request from a "new" visitor comes in, they are assigned a specific server from the cluster; all future requests for the lifetime of their session are then directed to that server. In practice this means that applications written to work on a single server can be scaled up to a balanced environment with zero or few code changes.
If you are using a hardware balancer, such as a Radware device, then the sticky sessions is configured as part of the cluster setup. Hardware devices usually give you more fine-grained control: such as which server a new user is assigned to (they can check for health status etc. and pick the most healthy / least utilised server), and more control of what happens when a server fails and drops out of the cluster. The drawback of hardware balancers is the cost - but they are worth it imho.
As for software balancers, it comes down to what you are using. For Apache there is the stickysession property on mod_proxy, and plenty of articles via Google on getting this working with the PHP session (for example).
Edit:
From other comments posted after the original question, it sounds like your "balancing" is done via round-robin DNS, so the above probably won't apply. I'll refrain from commenting further and starting a flame war against round-robin DNS.
The easiest thing to do is configure your load balancer to always send the same session to the same server.
If you still want to use session_set_save_handler then maybe take a look at auto_prepend.
If you have time and you still want to check more solutions, take a look at
http://redis4you.com/articles.php?id=01..
With Redis you are fault-tolerant. From my point of view it could be better than the memcache solutions because of this robustness.
If you are using PHP sessions, you could share the /tmp directory, where I think the sessions are stored, between all the servers in the cluster via NFS. That way you don't need a database.
Edit: You could also use an external service like memcachedb (persistent and fast) and store the session info under a memcachedb key, identified by a hash of the content or even by the session ID.
When we had this situation we implemented some code that lives in a common header.
Essentially, for each page we check whether we know the session ID. If we don't, we check whether we're in the situation you describe by looking for stored session data in the DB. Otherwise we just start a new session.
Obviously this requires all relevant data to be copied to the DB, but if you encapsulate your session data in a separate class then it works OK.
You could also try using memcache as the session handler.
Might be too late, but check this out: http://www.pureftpd.org/project/sharedance
Sharedance is a high-performance server to centralize ephemeral key/data pairs on remote hosts, without the overhead and the complexity of an SQL database. It was mainly designed to share caches and sessions between a pool of web servers. Access to a Sharedance server is trivial through a simple PHP API, and it is compatible with the expectations of PHP 4 and PHP 5 session handlers.
When it comes to PHP session handling in a load-balancing cluster, it's best to have sticky sessions. Ask the datacenter network team who maintains the load balancer to enable sticky sessions. Once that is enabled, you won't need to worry about sessions on the PHP end.
