I've created a couple of small, few-page websites for one-time projects or conferences, mostly in WordPress, and I'm thinking about what will happen to those websites in the future. And I think I'm not alone: there are a large number of sites out there that are now kept only as archives, but unlike in the '90s, when everything was static HTML, these websites now use some software to provide CMS functionality, even if it's only for a few pages plus search.
My problem is that with all this modular software (WordPress, Joomla, etc.) you need various plugins and themes to make it usable and nice, but all this functionality breaks sooner or later. Which means that if you want to keep the website as is, you need to keep running the old versions of the software. I mean forever.
On the other hand, these platforms are so popular (WordPress has more than 100 million downloads now) that I would be surprised if they did not become a target for the most popular exploits in the near future. I don't know how secure this software is, but I have experienced what it means to continuously clean up and fix an osCommerce website suffering about seven successful hacker attacks every month, until the site's owner agreed that it was better to close the site entirely and start building a new one.
As an alternative solution (but I really don't know if it's possible), is there any way to put a whole site into a read-only mode? I mean something like making the database read-only, making the file system read-only, disabling the admin interface and all the comment fields, and just leaving the site as an archive, the only dynamic part being the search function.
Is it possible at the file-system/database level? Will it help at all to keep hackers out? Is there any other solution? Please understand that my point is that it is not possible to keep CMS sites updated forever, and even if some of us are fanatic enough to spend a night fixing a theme or plugin that just broke after a core upgrade, 99% of sites will end up in a "fixed" state: using a working but old CMS/plugins/theme combination forever.
I think 99% is a very generous estimate, but that's beside the point. The majority of sites that end up in the state you are referring to only last as long as their domain registrations (especially since most WordPress or osCommerce deployments are set up at the root domain and serve the entire web presence). So generally speaking, if the domain itself is in a state of neglect and abandonment, the natural expiration process will decommission it and it will no longer be accessible at all.
As for locking an entire CMS-driven site into a read-only state, it could in theory be possible if one removed all write privileges on the server files and revoked every database user privilege except SELECT. In most cases this would defeat the purpose of leaving the CMS software there at all, since none of the records would be updatable any longer (items in the case of osCommerce, posts in the case of WordPress). But this would be highly dependent on the environment required by the particular CMS, and WordPress, for one, is pretty particular about having read/write permissions in order to work at all. It would make for an interesting experiment, but probably isn't a practical solution for what you're describing.
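For what it's worth, the database half of that lockdown is only a couple of statements. Here is a rough sketch of the idea, assuming MySQL and entirely invented names (cms_db, cms_user, /var/www/site); you would run it once as a privileged account, and, as noted above, WordPress may well complain once it loses write access:

<?php
// Sketch only: freeze a MySQL-backed CMS by reducing its database user
// to SELECT and stripping write permission from the files.
// All names (cms_db, cms_user, /var/www/site) are hypothetical.

$pdo = new PDO('mysql:host=localhost', 'root', 'secret');

// Strip every privilege from the CMS's database account...
$pdo->exec("REVOKE ALL PRIVILEGES ON cms_db.* FROM 'cms_user'@'localhost'");
// ...then hand back SELECT only, so pages (and search) still render.
$pdo->exec("GRANT SELECT ON cms_db.* TO 'cms_user'@'localhost'");

// Make the document root read-only for everyone as well
// (the shell equivalent would be: chmod -R a-w /var/www/site).
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/var/www/site', FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);
foreach ($it as $file) {
    // Directories keep the execute bit so they remain traversable.
    chmod($file->getPathname(), $file->isDir() ? 0555 : 0444);
}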
Taking the rendered content and building a static mirror is another option, and it can be fairly easily automated with a script that fetches the HTML of the rendered pages and builds static, linked alternatives. But this too is a bit impractical, especially in the case of search (which by its very definition requires database access).
In short, it's an interesting idea, but ultimately sites that are neglected, and whose owners are not committed to sustaining proper updates, are doomed to expire; the natural course of Internet business and domain registration pretty often Darwinizes them.
Yes, you can take a snapshot of a website using wget or a similar tool, essentially replacing the CMS-driven site with static HTML pages.
# -m mirrors the whole site; -k (--convert-links) rewrites links so the static copy browses correctly
wget -mk http://www.example.com/
That way you wouldn't need to update it forever.
As an alternative solution (but I really don't know if it's possible), is there any way to put a whole site into a read-only mode? I mean something like making the database read-only, making the file system read-only, disabling the admin interface and all the comment fields, and just leaving the site as an archive, the only dynamic part being the search function.
WP Super Cache has a "Lockdown" function that serves static HTML files to almost every visitor.
It's not exactly what you're looking for, but it is a simple workaround, as I don't know of a "read only" function for WordPress.
http://wordpress.org/extend/plugins/wp-super-cache/
So I'm trying to make my own mini CMS, just for my own learning, and once I get it good enough and I know enough, I'd like to sell it. Now for licensing: I know there are tons of licensing scripts you can pay for, but would the following be advisable?
I'd like to plant a hidden script in my CMS that, instead of checking for some sort of key, checks whether your domain is allowed to run the CMS by running it past the main CMS database. Now I have two questions.
1.) Could I encrypt the code so that, if I wanted it to redirect to a page that just says "CMS Deactivated", for example, people couldn't go through the code just Ctrl-F searching for the key text?
2.) I was going to get the domain name using $_SERVER['SERVER_NAME']. Is that going to be a reliable way of checking the domain? I.e., will IIS pick up on it?
I'm not trying to completely extinguish cracking of the CMS; I know that is impossible.
Maybe you should consider housing the whole thing on your own servers and making the content accessible via a REST API. You can certainly restrict and control access that way.
Providing a CMS with its source code to any client opens you up to inspection and to having your checks cleaned out. I'm not saying there's no way, but I am saying it may be easier for you to provide the content via REST than to write perfect security, especially if you're asking this question.
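To make that suggestion concrete, a minimal sketch of such an endpoint might look like the following; every name here (the clients/pages tables, the key parameter, content.php) is invented, and a real version would need HTTPS and rate limiting at minimum:

<?php
// Hypothetical REST endpoint (content.php) living on *your* server:
// client sites only ever receive JSON, never the CMS source code.
header('Content-Type: application/json');

$pdo = new PDO('mysql:host=localhost;dbname=cms', 'cms_user', 'secret');

// Refuse requests that don't carry a key you issued.
$stmt = $pdo->prepare('SELECT id FROM clients WHERE api_key = ?');
$stmt->execute([isset($_GET['key']) ? $_GET['key'] : '']);
$client = $stmt->fetch(PDO::FETCH_ASSOC);
if (!$client) {
    http_response_code(403);
    exit(json_encode(['error' => 'invalid key']));
}

// Return the requested page for the client site to render.
$stmt = $pdo->prepare('SELECT title, body FROM pages WHERE client_id = ? AND slug = ?');
$stmt->execute([$client['id'], isset($_GET['page']) ? $_GET['page'] : 'home']);
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));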
As I said in my comment, I think worrying about money is irrelevant for now, but here's some information for you to learn from.
1.) I haven't found an encryption solution that works. Any of them will require you to install additional PHP components (and no one wants to deal with that when there are plenty of free CMSs out there). There is code obfuscation, but that's iffy at best.
2.) According to this page, that should work on IIS!
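For illustration, the check itself can be tiny. In this sketch the validation URL and its "ok" reply are invented; note that written in plain PHP like this, anyone can simply delete the block, which is why the other answers talk about encoding it. It also fails closed: if your license server is down, every installation locks out.

<?php
// Hypothetical license check: ask a server you control whether this
// domain may run the CMS. URL and response convention are made up.
$domain = $_SERVER['SERVER_NAME'];
$response = @file_get_contents(
    'https://license.example.com/check?domain=' . urlencode($domain)
);

if (trim((string) $response) !== 'ok') {
    // Send the visitor to a static "CMS Deactivated" page and stop.
    header('Location: /cms-deactivated.html');
    exit;
}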
My client has a host of Facebook pages that have become very successful. In order to move away from Big Brother Facebook, my client wishes to create a large dynamic site that incorporates the more successful parts of the Facebook empire.
One of my client's spin-off sites has already been created and is getting a lot of traffic. I'm not sure exactly how much, but it hit 90 gigs in a month and the allocated space needed to be increased.
In any case, my client has dreamed up a massive website with its own community, looking to bring the whole community under one banner. However, I am concerned that it will get thrashed: bottlenecks, long load times, etc.
My questions:
Will a managed dedicated server be able to handle a potentially large amount of traffic?
Is it going to be better to create the various parts of the empire with their own separate hosting and domains (normal hosting or VPS), or is it better to have them all under one hood (i.e., using sub-domains)?
If they were all together, would it be better for SEO and easier to manage? Or, if they are separate, they might be quicker, but would that require some sort of Passport-style user system so people can log into any of the websites with the same user details?
What's the best way to implement a Passport-style user system? Do you connect remotely to the other databases? Or run a regular cron job that updates each individual user's details on each domain? Maybe run a cURL request to the other sites with any new data?
Any other pros/cons to keeping all the sections together or separating them?
Large sites like Facebook manage to have everything under one root, while sites like eBay have separate domain names but let you use the same login across all of them.
I'm not sure what the best option is and would appreciate any guidance.
It is a very general question, but to give some hints:
1. Measure, measure, and measure again. Know which parts are used heavily and which are not.
2. Fix things and go back to 1.
Really: without knowing what takes a lot of time, what is used most heavily, etc., you cannot say anything useful.
VPS or dedicated servers is not the right question. You start with: what do I have to do for the users? Then: how am I going to do that (for example, in the database, in scripts, in a message queue)? And then, finally, you see how much hardware you need.
One or multiple domains doesn't really matter, with one exception: for static content, if you have lots of it, it might be worth using a CDN like Amazon's. Read, for example, http://highscalability.com/blog/2011/12/27/plentyoffish-update-6-billion-pageviews-and-32-billion-image.html, where you can read about the possibilities a CDN offers.
In general, serving static content from a separate static domain is useful; most other things don't really need their own domain, so otherwise you could just keep everything on one domain.
OK, I am in the process of creating a CMS. There will be a free version and a premium version; obviously the premium version will have modules and such that the free version does not have. Does anyone have an idea how I can prevent my premium version from being shared across the web? I've looked into using a license key with remote server validation, as well as encryption and encoding of the premium scripts. I don't want to use Zend Guard or ionCube, because I don't want users to have to have that software installed just to use the CMS. I also want the CMS to be customizable, which rules out encoding. Does anyone have ideas for preventing the scripts from being nulled? If it's possible to maybe just encode a single page that does remote validation... just something... It doesn't have to be bullet-proof, but something that prevents novice crackers from nulling it and releasing it.
ENCODING PAGES:
Personally, I have tried a few techniques to avoid PHP encoders, but nothing was really effective in a commercial environment.
Based on my experience, though, I wouldn't worry so much about ionCube and Zend not being installed on servers, because most managed environments will most likely already have both; that is what I have found, anyway. This reduces the problem of users having to install them for a single application.
That said, it also depends on your target market: if you're going head-to-head with the likes of Joomla! or WordPress, for example, then your target market typically uses a managed environment, so it's no big issue.
If, however, you're going for, say, the intranet market, this could be a minor problem, but any server admin worth a grain of salt will be able to install it easily and without fuss, and they will also understand why you put it in place. Note that the intranet market is a bit harder, as you will need to specify port settings for the license check in your licensing module.
SIDE NOTE: As your product is going to be distributed with its source code available, you do need to be careful and pay attention to your Intellectual Property (IP); this generally means putting a readable legal disclaimer on every page. Also, don't forget to respect the requirements of other IP owners whose scripts you may be using in your project.
LICENSING & ENCODING (THE SUGGESTION):
Encoding a single page with licensing functions is one way of going about it, but you will find it fairly easy to bypass if the rest of the source code is available.
What I would look at is encoding a single page with licensing functions, but also encoding your login validation, half of your authentication checks for each protected page, and some basic functions for posting to the database. This way, if they try to remove your encoded page with the licensing script, there is no login and no updating of content; plus they will get kicked out of the system, as only half of your session checking will be valid. I hide a kill function nested inside another function that is required for each page to operate; this might be a menu (which is great, because you can hide the function within the logout code). It just looks like part of the log-out routine, but in reality it is a function that destroys the session if not all variables are present.
When choosing values for your authentication checks on each protected page (that function should be encoded), try using what appear to be random variables with non-descriptive names, and encode the variable names themselves (I like MD5 hashes for this). It is another way to add a little more protection against the "hacking" of your script.
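A minimal sketch of that hidden kill function, with an MD5-style variable name and invented session keys; remember, this only slows people down if the function ships encoded:

<?php
// Looks like a menu/logout helper, but quietly enforces the second half
// of the session check. Session keys and names here are invented.
function render_menu() {
    // Assumes session_start() has already run on this page.
    $a87ff679 = isset($_SESSION['uid'], $_SESSION['tok'], $_SESSION['lic']);
    if (!$a87ff679) {
        // The kill switch: missing variables mean the encoded licensing
        // page was removed, so destroy the session and bail out.
        session_destroy();
        header('Location: /login.php');
        exit;
    }
    echo '<a href="/logout.php">Log out</a>';
}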
I hope this helps, and I'm sorry that I cannot recommend a better solution.
A colleague and I were discussing the best way to build a website last week. We both have different ideas about how to store content on the website. The way I have always approached this has been to store any sort of text or image link (not the image file itself) in a database. This way, if I needed to change a letter or a sentence, I would just need to go into the database; I wouldn't have to touch the actual web page itself.
My colleague agreed with this up to a point. He thinks there are performance issues involved in retrieving content from the database, especially if every character of content is coming from the database. When he builds a website, any content that won't be changed often (if at all) is hard-coded into the page, and any content that would be changed or added regularly comes from the database.
I can't see the benefit of doing it like this, if only because every time we make a change to an ASPX page we need to re-compile the site to upload it. So if a page has a misspelled "The" (so it'd be like "Teh"), we have to change it on the page, then recompile the site (the whole site), and then upload it.
Likewise, my colleague thinks that if everything came from the database, there would be performance issues with the site and the database, and the overall loading speed of the web page in the browser would suffer.
What we were both left wondering is: if a website drew everything from the database (not the HTML code as such, more the content for the headers, footers, links, etc.), would it slow down the website? And if there is a performance issue, which is better: a 100% database-driven website with its performance issues, or a website with hard-coded content, which means 10-20 minutes spent compiling and uploading the site just for the sake of a one-word or one-letter change?
I'm interested to see if anyone else has come across this, or has their own thoughts on the subject.
Cheers
Naturally, it's a bit slower to retrieve information from a database than directly from the file system. But do you really care? If you design your application correctly, then:
a) you can implement caching so that the database is not hit for every page
b) the performance difference will be tiny anyway, particularly compared to the time it takes to transmit the page from the server to the client.
A 100% database approach opens up the potential for more flexibility and features in your application.
This is a classic case of putting caching/performance considerations before features/usability. Bottlenecks rarely occur where or when you expect them to, so focus on developing a powerful application and implement caching later, when it's needed and where it's needed.
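To make point (a) concrete: whole-page caching can be a dozen lines. The question is about ASPX, where the framework ships its own output caching, but the idea is language-neutral; here is a minimal sketch in PHP, with an invented cache path, an arbitrary five-minute lifetime, and a stand-in build_page() function:

<?php
// Minimal whole-page file cache: serve the stored copy if it is fresh,
// otherwise rebuild the page and store it.

function build_page() {
    // Stand-in for the real work (database queries, templating, ...).
    return '<html><body>rendered at ' . date('H:i:s') . '</body></html>';
}

$cacheFile = __DIR__ . '/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';

if (file_exists($cacheFile) && time() - filemtime($cacheFile) < 300) {
    readfile($cacheFile); // cache hit: the database is never touched
    exit;
}

$html = build_page();
@mkdir(__DIR__ . '/cache'); // make sure the cache directory exists
file_put_contents($cacheFile, $html);
echo $html;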
I'm not suggesting that storing templates as static files is a bad idea, just that performance shouldn't be your primary driver in making these assessments. Static templates may be more secure or easier to edit with your development tools, for example.
Hardcode the strings in the code (unless you plan to support multiple languages).
It is not worth the extra code required for maintaining the strings, the added complexity, and the possible performance penalty.
Would you extract the string "Cancel" from a button?
If so, would you be using the same string on multiple cancel buttons? Or one for each?
If you decided to rename one button to "Cancel registration", how do you identify which "Cancel" to update in the database? You would be forced to set up a working process around how to deal with this, and in my opinion it's just not worth it.
I've been given the task of connecting multiple sites of the same client into a single network, so I would like to hear some architectural advice on connecting these sites into a single community.
These sites include:
1. Invision Power Board Forum (the most important site)
2. 3 custom-made CMSs (changes to the code are allowed)
3. 1 Drupal site
4. 3-4 WordPress blogs
Requirements are as follows:
1. Connecting all users of all sites into a single administrable entity, with the ability to change permissions, ban users, etc.
2. Later on, based on this implementation, I have to implement a "Facebook-like" chat that will be available to all users regardless of where they log in.
I have a few ideas in mind on how to go about this, but I would like to hear from people with more experience and expertise than myself.
Cheers!
You're going to have one hell of a time. Each of those site platforms has a very different user architecture: there is no way to "connect" them all together fluidly without numerous codebase changes. You're looking at making deep changes to each of those platforms to communicate with a central database, likely modifying thousands (if not tens of thousands) of lines of code.
On top of the obvious (massive) changes to all of the platforms, you're going to have to worry about updates: what happens when a new version of WordPress is released? You'd likely have to update all of your code manually (since you can't just drop in the changes). You'd also have to make sure that all of the code changes are compatible with your current database. God forbid one of the platforms starts storing user information differently; you'd have to make more massive code changes. This just isn't maintainable.
Your alternative (and best bet) is to have some sort of synchronization job that runs every hour or so: iterate through each user in each database and check that it both exists and is up to date in the other databases. If not, push the changes out. The problem with this is that it will get significantly slower as you get more and more users.
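Sketched out, that job is essentially a nested loop. In this sketch the forum database is assumed to be the master copy, every credential, table, and column name is invented, and matching users by e-mail address is just one possible choice:

<?php
// Hourly sync sketch: push newer user records from the master database
// out to the others. All connection details and schemas are invented.
$master = new PDO('mysql:host=localhost;dbname=forum', 'sync', 'secret');
$others = [
    new PDO('mysql:host=localhost;dbname=drupal', 'sync', 'secret'),
    new PDO('mysql:host=localhost;dbname=wp_blog', 'sync', 'secret'),
];

foreach ($master->query('SELECT email, name, updated_at FROM users') as $user) {
    foreach ($others as $db) {
        $stmt = $db->prepare('SELECT updated_at FROM users WHERE email = ?');
        $stmt->execute([$user['email']]);
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        if (!$row) {
            // Missing entirely: create the user there.
            $db->prepare('INSERT INTO users (email, name, updated_at) VALUES (?, ?, ?)')
               ->execute([$user['email'], $user['name'], $user['updated_at']]);
        } elseif ($row['updated_at'] < $user['updated_at']) {
            // Stale: push the newer master record out.
            $db->prepare('UPDATE users SET name = ?, updated_at = ? WHERE email = ?')
               ->execute([$user['name'], $user['updated_at'], $user['email']]);
        }
    }
}
// As noted above: this is O(users x sites) and slows down as you grow.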
Perhaps another alternative is to simply offer a custom OpenID implementation. I believe that Drupal and Wordpress both have OpenID plugins that you can take advantage of. This way, you could allow your users to sign in with a pseudo-single sign-on service across your sites. The downside is that users could opt not to use it.
Good luck