I would like to understand the concept of website replication, where a subdomain is created when a user signs up and the user can then build his website online, the way it is done on www.empowerkit.com.
There are lots of ways to structure multi-tenant SaaS.
There is the single-site, single-database approach:
Basically you have the website binding listen on a dedicated IP address with a wildcard (*) host header.
In DNS, bind *.yourdomain.com to that IP address with a wildcard record.
Have your signup function create a customer database record containing the subdomain.
Have routing code that can determine what the customer id is for a specific request on a subdomain, then use row-level filtering in your data repository to only return data for that specific customer id.
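For illustration, here is a minimal sketch of that routing and filtering with PDO. The customers and pages tables and their columns are hypothetical names, not a prescribed schema:

<?php
// Resolve the subdomain to a customer id, then row-level-filter every
// query by that id. Table and column names are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// "sub.yourdomain.com" -> "sub"
$subdomain = explode('.', $_SERVER['HTTP_HOST'])[0];

$stmt = $pdo->prepare('SELECT id FROM customers WHERE subdomain = ?');
$stmt->execute([$subdomain]);
$customerId = $stmt->fetchColumn();
if ($customerId === false) {
    http_response_code(404);
    exit('Unknown tenant');
}

// Every repository query filters on customer_id.
$pages = $pdo->prepare('SELECT * FROM pages WHERE customer_id = ?');
$pages->execute([$customerId]);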
You could also have your code create a whole separate website on the filesystem, and add the appropriate virtualhost entries for the new site.
Pros: Less database overhead than running multiple instances, easy to back up and restore (a single database)
Cons: A bit more work getting the data access right, harder to shard
Then there is single-site, multi-database:
Same as above, but the subdomain maps to a specific customer database (see the lookup sketch after the pros and cons).
Pros: Easier to shard, customer data is better segregated
Cons: Higher memory overhead, requires lots of database instances, harder to back up
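A rough sketch of that lookup follows; the master tenants table, its db_name column, and the credentials are placeholder assumptions:

<?php
// The master database maps each subdomain to its own tenant database.
$master = new PDO('mysql:host=localhost;dbname=master', 'app_user', 'secret');
$master->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$subdomain = explode('.', $_SERVER['HTTP_HOST'])[0];

$stmt = $master->prepare('SELECT db_name FROM tenants WHERE subdomain = ?');
$stmt->execute([$subdomain]);
$dbName = $stmt->fetchColumn();
if ($dbName === false) {
    http_response_code(404);
    exit('Unknown tenant');
}

// All further queries run against the tenant's own database, so no
// per-row customer_id filtering is needed.
$tenant = new PDO("mysql:host=localhost;dbname=$dbName", 'app_user', 'secret');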
Finally, there is multi-site, multi-database:
You create separate physical sites and add the appropriate virtualhost records for your clients during signup, as well as creating a separate database for each client.
Pros: Simple application code, easy to scale, better user segregation, easier to customize "per user" if required.
Cons: Complex "signup code", requires lots of resources, requires your application to have write access to important configuration files, wastes lots of disk space.
It depends on how your application works and what per user customization you offer.
Well, for the subdomain part you could:
Use a database and .htaccess to redirect
Use your control panel API (like DirectAdmin API)
The user can create his website with a WYSIWYG (What You See Is What You Get) editor (as a CMS) and save all the content in a SQL database, which you can retrieve later on to display the page.
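As a minimal sketch of that retrieval step (the pages table keyed by subdomain and slug is a hypothetical schema; stored WYSIWYG HTML should also be run through a sanitizer such as HTML Purifier before output if you don't fully trust it):

<?php
// Serve a page the tenant built in the WYSIWYG editor.
$pdo = new PDO('mysql:host=localhost;dbname=cms', 'cms_user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$subdomain = explode('.', $_SERVER['HTTP_HOST'])[0];
$path      = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$slug      = trim($path, '/') ?: 'home';

$stmt = $pdo->prepare('SELECT html FROM pages WHERE subdomain = ? AND slug = ?');
$stmt->execute([$subdomain, $slug]);
$html = $stmt->fetchColumn();

echo $html !== false ? $html : 'Page not found';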
I don't think there's a simple answer to what you are asking. Most effective solutions actually have the concept of dealing with the subdomain built into the product.
It all depends on the environment you are going to deploy in and what runs your site.
Many CMSes can deal with serving different content for the subdomain by nature.
Specifically, I am working with PHP and user data in a web application.
As far as I can tell, the main reasons to access user data are:
For logging in
For communication, including shipping products
It seems to me that the sort of data most likely to be of interest to hackers is user data, so it deserves extra protection.
Here is a possible technique. I normally use PDO for database work, so the following should apply to any supported database.
Create a database, users, with a single table of users.
Create a database user restricted to that database.
Create the PDO object accordingly.
From the main login script, call an included script which authenticates the user.
For added security, this script could perhaps live outside of the web root.
Do the normal stuff using session variables. In particular, store the relevant retrieved user info.
Go ahead with the rest of the data from the main database.
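Putting the steps above together, a minimal sketch; the users table with id, email, and password_hash columns and the read-only login_only account are assumptions:

<?php
// Dedicated connection for authentication only, using a database
// account that can do nothing but read the users database.
$usersDb = new PDO('mysql:host=localhost;dbname=users', 'login_only', 'secret');
$usersDb->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

function authenticate(PDO $usersDb, string $email, string $password): ?array
{
    $stmt = $usersDb->prepare(
        'SELECT id, email, password_hash FROM users WHERE email = ?'
    );
    $stmt->execute([$email]);
    $user = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($user && password_verify($password, $user['password_hash'])) {
        unset($user['password_hash']); // never keep the hash around
        return $user;
    }
    return null;
}

session_start();
$user = authenticate($usersDb, $_POST['email'] ?? '', $_POST['password'] ?? '');
if ($user) {
    session_regenerate_id(true); // guard against session fixation
    $_SESSION['user'] = $user;   // the relevant retrieved user info
}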
I don't think all hosting setups make it easy to store anything outside of the web root, though.
My wild guess is that the user database is less likely to be compromised if it’s not part of the data most frequently accessed.
I also imagine that separating the user data from the password into separate tables is also a good idea, but I’ll put that into another question.
I’m not suggesting it’s a perfect solution, but I am looking for better protection. The question is: would a technique such as this provide more security than keeping the user details in the main database?
You could write an authentication & authorisation service altogether. Look at OAuth: you authenticate with a username and password against this service, and the auth server can generate tokens which enable access to your service.
https://en.wikipedia.org/wiki/OAuth
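Full OAuth is a fairly involved protocol, but as a much-simplified illustration of the token idea (this is not real OAuth, and the tokens table is a placeholder), the auth service could do something like:

<?php
// Issue a random token after verifying credentials; later requests
// present the token instead of the password. Schema is a placeholder.
$pdo = new PDO('mysql:host=localhost;dbname=auth', 'auth_user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

function issueToken(PDO $pdo, int $userId): string
{
    $token = bin2hex(random_bytes(32));
    $stmt  = $pdo->prepare(
        'INSERT INTO tokens (user_id, token_hash, expires_at)
         VALUES (?, ?, DATE_ADD(NOW(), INTERVAL 1 HOUR))'
    );
    $stmt->execute([$userId, hash('sha256', $token)]); // store only a hash
    return $token;
}

function userIdForToken(PDO $pdo, string $token): ?int
{
    $stmt = $pdo->prepare(
        'SELECT user_id FROM tokens
         WHERE token_hash = ? AND expires_at > NOW()'
    );
    $stmt->execute([hash('sha256', $token)]);
    $id = $stmt->fetchColumn();
    return $id === false ? null : (int) $id;
}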
Yes, actually that is not a bad idea, but I would do it differently:
Create a database, a user account for it, and the necessary user data tables.
Create a different webhost (e.g. userapi.myhost), which could even be a local host. The new webhost should have its own web root in a completely different location from your main app's.
Create an API to get/change user data with a script (or use a framework) on the new host.
Auth-protect HTTP requests to your API and restrict access to a specific IP (even your local IP if the API is on the same server).
Your main app uses the new API to access/change user data.
So, e.g., a user tries to log in to the main app with username and password -> the main app sends the login to the API -> if the credentials are correct, it gets the user's info back.
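A minimal sketch of the main app's side of that flow, assuming a hypothetical endpoint at userapi.myhost/login that requires HTTP basic auth and returns the user's data as JSON:

<?php
// Forward the login to the user API over HTTP; the main app never
// touches the user database directly.
function apiLogin(string $username, string $password): ?array
{
    $ch = curl_init('http://userapi.myhost/login');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query([
            'username' => $username,
            'password' => $password,
        ]),
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_USERPWD        => 'mainapp:apisecret', // HTTP basic auth
        CURLOPT_TIMEOUT        => 5,
    ]);
    $body   = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($body === false || $status !== 200) {
        return null; // wrong credentials or API unreachable
    }
    return json_decode($body, true); // user info, minus the password
}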
This can protect against SQL injection, since your main app may include a lot of libraries and services that run all sorts of queries against your database, and some of them may have security holes.
I would suggest:
Keep the user API simple and tested.
Keep the user admin on the secondary app (additionally, password-protect it and restrict access to your IP).
Unfortunately, this does not secure you completely. There is a case where a hacker might place his own scripts or change your code, "see" the login details, and access user data through your main app. There are a lot of other attack routes, and a lot of different mitigations (Apache open_basedir restriction, removing write permissions on your code for the "apache" user, removing execution handlers from directories the "apache" user can write to, e.g. images, etc.).
Since you are looking for better protection, I would certainly suggest going for it, if you do not mind investing a little more work and time in it.
I'm in the process of building a SaaS application with a master database for all transactions and the user base, and separate databases for each tenant. Each tenant is given a unique subdomain, and based on it the request is routed to the tenant's database. The web application is built using PHP, and MySQL is used for data storage.
I have a couple of problems; to begin with:
When a user visits a tenant's home page (e.g. sub.abc.com), there will be login functionality available. Since the username, password, and user base are in the master database, how would I authenticate a user? Is it through a separate database connection to the master database, or via web APIs? What is the best way?
There will be roles created by each tenant. So assume the staff for a specific tenant is retrieved by calling a web API, and the role is stored in the tenant database along with the user ID. Is this the right way forward, given that there is no direct foreign key mapping? Also, once the users are retrieved, should I store them in the tenant database or just hold them in memory?
In case the business owner wants a report about something specific from each tenant database, what would be the best way to grab all the data from every individual tenant database?
How can we capture each tenant's database usage and file storage and show them in the super admin?
how would I authenticate a user? Is it through a separate database connection to the master database, or via web APIs? What is the best way?
Be careful with the "best way" questions - this would point this question towards being off topic as opinion based.
If your application is completely closed source (which I imagine it is), then it doesn't matter. You can access your master database directly if you want to, because all of your client implementations will be part of your overall software package.
That being said, for scalability and maintainability it's probably a better idea to build an authentication API interface and use that, so you don't need to manage the database connection component of your client software.
...the role is stored in the tenant database along with the user ID. Is this the right way forward, given that there is no direct foreign key mapping? Also, once the users are retrieved, should I store them in the tenant database or just hold them in memory?
This is fine. The users are common to all of your implementations, but the roles are unique to each client implementation. This indicates that the way you're approaching it is correct in that you authenticate first, then assign the client's role to the user once logged in. You should check this against your local database (obviously), then assign it to the user's session so it sticks around for the duration of their visit.
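As a small sketch of that last step (the user_roles table and the connection details are placeholders, and the user id is assumed to be in the session already from authentication):

<?php
// After the master database has authenticated the user, look up the
// user's role in this tenant's own database and keep it in the session.
session_start();

$tenantDb = new PDO('mysql:host=localhost;dbname=tenant_abc', 'app', 'secret');
$tenantDb->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $tenantDb->prepare('SELECT role_id FROM user_roles WHERE user_id = ?');
$stmt->execute([$_SESSION['user_id']]);
$roleId = $stmt->fetchColumn();

// Sticks around for the duration of the visit.
$_SESSION['role_id'] = $roleId === false ? null : (int) $roleId;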
In case the business owner wants a report about something specific from each tenant database, what would be the best way to grab all the data from every individual tenant database?
This is too broad to answer. You should also be careful here, because the client's data may not be yours to report on!
How can we capture each tenant's database usage and file storage and show them in the super admin?
Again, quite broad, but I assume that you could achieve something like this by querying the MySQL server for database statistics on each client database and doing the same for the file storage by querying the server that hosts the client's files.
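For the database side, something along these lines should work, since MySQL exposes per-table sizes in information_schema (the tenant_% naming scheme is an assumption):

<?php
// Per-database size in MB, summed over each tenant schema.
$pdo  = new PDO('mysql:host=localhost', 'monitor', 'secret');
$rows = $pdo->query(
    "SELECT table_schema AS db,
            ROUND(SUM(data_length + index_length) / 1024 / 1024, 1) AS size_mb
     FROM information_schema.tables
     WHERE table_schema LIKE 'tenant\_%' -- hypothetical naming scheme
     GROUP BY table_schema"
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    printf("%s: %s MB\n", $row['db'], $row['size_mb']);
}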
I hope this is helpful, please narrow down your questions a little more if you'd like more specific help.
Be careful with the "best way" questions - this would point this question towards being off topic as opinion based.
If your application is completely closed source (which I imagine it is), then it doesn't matter. You can access your master database directly if you want to, because all of your client implementations will be part of your overall software package.
That being said, for scalability and maintainability it's probably a better idea to build an authentication API interface and use that, so you don't need to manage the database connection component of your client software.
Agreed. The thing is, for these kinds of scenarios it is very difficult to find answers, especially on multi-tenancy.
What do you mean by closed source?
If I can have authentication APIs to retrieve data from the master database, how would I host them so that access is granted only within the application and outside parties cannot access them? Later on we need to expose services to outside users as well. In this case, is it better to expose this from now on, or to work on separate services later? What neat PHP web service libraries do you know other than NuSOAP?
This is fine. The users are common to all of your implementations, but the roles are unique to each client implementation. This indicates that the way you're approaching it is correct in that you authenticate first, then assign the client's role to the user once logged in. You should check this against your local database (obviously), then assign it to the user's session so it sticks around for the duration of their visit.
That is correct! The users are common to all implementations, and this is stored in the master database, not in the individual tenant databases. By roles I meant that, per tenant application, the tenant can create roles within the tenant application, such as billing unit, accounts, front desk, etc. So these user_id and role_id pairs are stored in the tenant's database. That's why I asked whether it's okay that the user_id is not directly mapped to the master database's user table.
This is too broad to answer. You should also be careful here, because the client's data may not be yours to report on!
For now, let's assume the super admin wants to know the roles of every tenant. In this case, how would I retrieve the data from all the tenant databases? Do we use views for this, or, since we are using MySQL, how else can I achieve this?
Again, quite broad, but I assume that you could achieve something like this by querying the MySQL server for database statistics on each client database and doing the same for the file storage by querying the server that hosts the client's files.
Extremely sorry about this. We basically want to track the usage of tenant databases to see if they exceed the amount we agreed on. I hope there is a way to achieve this.
Also, for the site images, is it better to store them on the hard disk or in the database as BLOBs? At the end of the day we need to think about scalability, as well as a way for images to load with less bandwidth consumption.
I am looking at building an external site with a CMS, probably Drupal or ExpressionEngine. The problem is that our company already has a membership database that is designed to work with our existing enterprise software; currently the membership database consists of over 400k rows.
Migrating data from the database manually is not an option, as modifications and new data must be accessible in real time. Because the design of the external database differs from the CMS's own, I have decided the best way forward is to use two databases, forcing the CMS to use the external one to read user information (it cannot write to it) and a local one for everything else the CMS needs to do (read + write).
Is this feasible with Drupal or ExpressionEngine? Ideally I need to be able to use hooks, as I do not want to modify core CMS files. Sifting through the docs, I am not able to find what I would hook into for either CMS.
(Note: I know it is possible, but I want to know if it's feasible).
Finally, if there is a better way of handling this situation, please also chime in. Perhaps there is something at the database level that can reference a column in an external database?
I'm clutching at straws; I'm sure someone can point me in the right direction.
Edit: Moodle has this functionality built in. Moodle is not suitable for my needs, but perhaps their documentation will help you understand my issue better: Moodle - External database authentication
If the data is to remain in an external database, then you probably want to look at creating a web service like an API that allows your external site to access the data. Queries to an external database are notoriously slow.
I'm not a Drupal (D)/ExpressionEngine (EE) expert.
For TYPO3 (which I know in depth), and I think also for D & EE, you can write your own extension which fetches the user data from your membership database and stores it in the local frontend or backend user table for temporary authentication.
We have built such a thing for a big financial institution, even with single sign-on, setting the corresponding rights depending on the remote groups.
There are plenty of TYPO3 extensions which can be used as an example.
The other possibility is to give up the real-time requirement and use a synchronization script at the database level which checks the membership tables against the D or EE user table in, e.g., a 5-minute rhythm.
There is also the option of using LDAP authentication.
If the databases are all MySQL, you can create a view which points to the remote membership table; you only need to map the correct columns.
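For example, if both schemas live on the same MySQL server, a cross-database view could look like the sketch below; all table and column names are placeholders, and for a genuinely remote server you would need something like the FEDERATED storage engine instead:

<?php
// Expose the membership table to the CMS under the column names the
// CMS expects; the account running this needs rights on both schemas.
$pdo = new PDO('mysql:host=localhost;dbname=cms', 'admin', 'secret');
$pdo->exec(
    "CREATE OR REPLACE VIEW cms.external_users AS
     SELECT m.member_id  AS uid,
            m.login_name AS username,
            m.email_addr AS email
     FROM membership.members AS m"
);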
It is always a bad idea to modify the core. ;-)
There's an EE extension for working with external data in MySQL: http://devot-ee.com/add-ons/external-entries . I don't know what db you'll need to access or if this might be made to work with it, though.
In ExpressionEngine there are six tables that contain Member and Member Group data (EE's lingo for users/user groups).
exp_members
exp_member_groups
exp_member_bulletin_board
exp_member_data
exp_member_fields
exp_member_homepage
In all likelihood you'd need to sync your user database with exp_members and exp_member_groups at a regular interval to make this happen. Trying to have EE connect to the external DB will likely get you in trouble fast.
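A rough sketch of such a sync job, run from cron: exp_members and the columns used here exist in EE, but the external membership schema and the hard-coded group id are placeholder assumptions, and a production version would have to fill in every NOT NULL column EE expects.

<?php
// Upsert external members into EE's exp_members table.
$external = new PDO('mysql:host=ext-host;dbname=membership', 'ro_user', 'secret');
$ee       = new PDO('mysql:host=localhost;dbname=ee', 'ee_user', 'secret');

$upsert = $ee->prepare(
    'INSERT INTO exp_members (member_id, username, email, group_id)
     VALUES (?, ?, ?, ?)
     ON DUPLICATE KEY UPDATE username = VALUES(username),
                             email    = VALUES(email),
                             group_id = VALUES(group_id)'
);

foreach ($external->query('SELECT id, login, email FROM members') as $m) {
    $upsert->execute([$m['id'], $m['login'], $m['email'], 5]); // 5 = Members
}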
If you are simply wanting access to your external database, you can write plugins that connect behind the scenes and make your data available inside your templates.
I worked on a Rails app once that pushed the Rails users into EE as members and shared sessions between them, so users could view both systems and stay logged in.
My client has a host of Facebook pages that have become very successful. In order to move away from Big Brother Facebook, my client wishes to create a large dynamic site that incorporates the more successful parts of the Facebook empire.
One of my client's spin-off sites has been created and is getting a lot of traffic. I'm not sure exactly how much, but it hit 90 GB in a month and the allocated space needed to be increased.
In any case, my client has dreamed up a massive website with its own community, looking to put the whole community under one banner. However, I am concerned that it will get thrashed: bottlenecks, long load times, etc.
My questions:
Will a managed dedicated server be able to handle a potentially large amount of traffic?
Is it going to be better to create the various parts of the empire on their own separate hosting and domains (normal hosting or VPS), or is it better to have them all under one hood (i.e. using subdomains)?
If they were all together, would it be better for SEO and easier to manage? If they are separate, they may be quicker, but would that require some sort of Passport-style user system so people can log into any of the websites with the same user details?
What's the best way to implement a Passport-style user system? Do you connect remotely to databases? Or run a regular cron job that updates each individual user's details on each domain? Maybe run a cURL request to the other sites with any new data?
Any other pros/cons to keeping all the sections together or separating them?
Large sites like Facebook manage to have everything under one root, while sites like eBay have separate domain names but let you use the same login across all of them.
I'm not sure what the best option is and would appreciate any guidance.
It is a very general question, but to give some hints:
1. Measure, measure and measure again. Know which parts are used heavily and which are not.
2. Fix things and go back to 1.
Really: without knowing what takes a lot of time, what is used most heavily, etc., you cannot say anything useful.
VPS or dedicated servers is not the right question. You start with: what do I have to do for the users? Then: how am I going to do it (for example: in the database, in scripts, in a message queue)? Only then do you see how much hardware you need.
Whether you use one or multiple domains doesn't really matter, with one exception: if you have lots of static content, it might be interesting to use a CDN like Amazon's. Read for example: http://highscalability.com/blog/2011/12/27/plentyoffish-update-6-billion-pageviews-and-32-billion-image.html where you can read about the possibilities of a CDN.
In general, serving static content from a separate static domain is useful; most other things don't really need it, so you could just keep them on one domain.
I've been given the task of connecting multiple sites of the same client into a single network, so I would like to hear architectural advice on connecting these sites into a single community.
These sites include:
1. Invision Power Board Forum (the most important site)
2. 3 custom-made CMSes (changes to code allowed)
3. 1 Drupal site
4. 3-4 WordPress blogs
Requirements are as follows:
1. Connecting all users of all sites into a single administrable entity, with the ability to change permissions, ban users, etc.
2. Later on, based on this implementation, I have to implement a "Facebook-like" chat, which will be available to all users regardless of where they log in.
I have a few ideas in mind on how to approach this, but I would like to hear from people with more experience and expertise than myself.
Cheers!
You're going to have one hell of a time. Each of those site platforms has a very disparate user architecture: there is no way to "connect" them all together fluidly without numerous codebase changes. You're looking at making deep changes to each of those platforms to communicate with a central database, likely modifying thousands (if not tens of thousands) of lines of code.
On top of the obvious (massive) changes to all of the platforms, you're going to have to worry about updates: what happens when a new version of WordPress is released? You'd likely have to update all of your code manually (since you can't just drop in the changes). You'd also have to make sure that all of the code changes are compatible with your current database. God forbid one of the platforms starts storing user information differently; you'd have to make more massive code changes. This just isn't maintainable.
Your alternative (and best bet) is to have some sort of synchronization job that runs every hour or so: iterate through each user in each database and compare it to see whether it both exists and is up to date in the other databases. If not, push the changes out. The problem with this is that it will get significantly slower as you gain more and more users.
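A rough sketch of one pass of such a job, under the simplifying (and in practice false) assumption that every platform exposes a comparable users table; in reality each platform's schema would need its own adapter:

<?php
// Take the most recently updated copy of each user, keyed by email,
// and push it to every other database.
$dbs = [
    'forum'  => new PDO('mysql:host=localhost;dbname=ipb', 'sync', 'secret'),
    'drupal' => new PDO('mysql:host=localhost;dbname=drupal', 'sync', 'secret'),
];

$latest = [];
foreach ($dbs as $pdo) {
    foreach ($pdo->query('SELECT email, display_name, updated_at FROM users') as $u) {
        if (!isset($latest[$u['email']])
            || $u['updated_at'] > $latest[$u['email']]['updated_at']) {
            $latest[$u['email']] = $u;
        }
    }
}

foreach ($dbs as $pdo) {
    $up = $pdo->prepare(
        'UPDATE users SET display_name = ?, updated_at = ?
         WHERE email = ? AND updated_at < ?'
    );
    foreach ($latest as $u) {
        $up->execute([$u['display_name'], $u['updated_at'],
                      $u['email'], $u['updated_at']]);
    }
}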
Perhaps another alternative is to simply offer a custom OpenID implementation. I believe that Drupal and WordPress both have OpenID plugins that you can take advantage of. This way, you could allow your users to sign in with a pseudo-single-sign-on service across your sites. The downside is that users could opt not to use it.
Good luck