Wordpress staging environment [closed] - php

I work for a company that builds sites for the pharma industry, and we often need legal approval before we push any changes live. I'd like to migrate much of our work to a CMS, specifically WordPress, but we need the ability to have a staging environment. Instead of publishing a page directly, is it possible to publish it to a staging environment that someone can browse via a link, as a full site? So basically have 2 sites, one staging and one live?

Generally speaking, the other answers here are correct: there are alternatives to WordPress with better built-in support for staging environments and build migrations. However, the suggested alternatives aren't exact substitutes for the WordPress platform, so I think it's best to answer the question at hand instead.
WordPress does not natively support hosting the same site from two different hosts. The core relies on absolute URLs stored in the database, and they are used in just about every aspect of the core logic. This results in a number of avoidable bugs, like the 500 or so related to SSL access, because the core tries to dynamically rewrite all http:// schemes to https:// on the fly.
As a result, when you host on dev.example.com, migrate to staging.example.com, and again to www.example.com, you have to do very careful search & replace manipulations on the database export each time you switch hosts. This causes additional problems when you discover that many popular WordPress plugins store the URL inside serialized values in the database. When you search & replace dev.example.com with staging.example.com, the serialized data, which recorded the character length of the original value, no longer deserializes with the new, longer string. Some core contributors believe the solution to this latter problem is to only ever set up staging sites whose hostnames have the same number of characters as the production one...
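To make the serialization problem concrete, here is a minimal, hypothetical PHP sketch (the option value and hostnames are invented for illustration) showing why a plain text replace corrupts serialized data:

<?php
// A WordPress-style option value: serialized strings embed their byte length.
$option = serialize(array('url' => 'http://dev.example.com'));
// a:1:{s:3:"url";s:22:"http://dev.example.com";}  -- the "22" is the string length.

// A naive search & replace leaves the old length behind...
$broken = str_replace('dev.example.com', 'staging.example.com', $option);
var_dump(@unserialize($broken)); // bool(false): the length no longer matches.

// ...so you must unserialize, edit, and re-serialize instead (which is what
// serialization-aware migration tools do).
$data = unserialize($option);
$data['url'] = str_replace('dev.example.com', 'staging.example.com', $data['url']);
echo serialize($data); // a:1:{s:3:"url";s:26:"http://staging.example.com";}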
In a similar vein, they also suggest swapping host mappings and only ever using the production.com URL on all hosting environments. Depending on your requirements, that is probably not a valid solution if you need to provide access to off-site clients or tech-illiterate users.
But WordPress itself has a number of great features and is a very adaptable, powerful rapid-development platform, so you can extend the core framework to do much of what you need from it. When I was presented with this situation, I had to develop a solution that was viable in all circumstances. Traditionally this problem is solved with root-relative URLs: they work across hosting environments, and they don't suffer from the scheme changes, port changes, or subdomain swaps that are common with staging migrations.
With this plugin: http://wordpress.org/extend/plugins/root-relative-urls/
(Biased? Yes, I wrote this plugin.) You get root-relative URLs where they matter and dynamic hosts where root-relative URLs don't work (like RSS feeds). All that remains when migrating the site to different hosts is to move the wp-config.php file outside the www root (one level up is supported natively by WordPress) so you can maintain different copies on different servers. Alternatively, you can use basic if-statements to distinguish hosts by server name and define key WordPress constants per server, as in the sketch below. In the end your content, code, and data will transition seamlessly.
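As a rough sketch of that if-statement approach (the hostnames, database names, and credentials below are placeholders, not a recommendation for any particular setup), wp-config.php can branch on the host serving the request:

<?php
// Pick environment-specific constants based on the requesting host.
switch (isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '') {
    case 'staging.example.com':
        define('DB_NAME', 'wp_staging');
        define('WP_DEBUG', true);
        break;
    case 'www.example.com':
    default:
        define('DB_NAME', 'wp_live');
        define('WP_DEBUG', false);
        break;
}

// Settings shared by every environment.
define('DB_USER', 'wp_user');
define('DB_PASSWORD', 'change-me');
define('DB_HOST', 'localhost');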
As a note of concern, the referenced plugins require write access to the wp-config.php file, which is a very bad practice from a security perspective on production or publicly accessible servers. You may be comfortable with that in a restricted staging environment, but you'd then need to disable and remove the plugin when transitioning to production.
Long story short: yes, you can host WordPress across multiple host environments. The long-touted solutions are very case-specific and restrictive because of the core architecture, but the framework is flexible enough to overcome that deficit. This core design decision will probably change at some point, given the amount of effort the core developers continually spend overcoming the cascading issues, but there are also devout defenders of the absolute-URL religion who will keep the practice in place for the time being. Maybe a different platform that supports server migrations natively (pick just about any of them, because most do) would be a better option for you now.

It's possible: take a look at this GitHub Gist to see an example of how to switch environments in your wp-config.php file. Also have a look at wordpress.stackexchange.com for other questions that give a more in-depth look at the things you should consider.

Greg,
An even better CMS with a built-in staging environment would be SilverStripe (silverstripe.org). This CMS lets you browse an entire staging copy of the site.

I think you can try some plugins.
For example (from a quick search of the official WordPress plugin repo): wp-deploy or the Dev and Staging Environment Plugin (possibly outdated).
Alternatively, you can use different wp-config.php files, one for production and one for the dev environment, and switch between them by checking the requested URL, as sketched below.
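One way to do that (the file names here are just an illustration) is to keep one small wp-config.php that delegates to a per-environment file based on the requested host:

<?php
// wp-config.php: hand off to an environment-specific config file.
$host = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '';

if (strpos($host, 'dev.') === 0 || substr($host, -6) === '.local') {
    require __DIR__ . '/wp-config-dev.php';        // local/dev settings
} else {
    require __DIR__ . '/wp-config-production.php'; // live settings
}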

If you need to use WP and only need to publish one or a few pages from staging to the live site, why not implement some kind of tag on pages that are ready to be published (i.e., visible to live-site visitors)? Simply tune your templates to display tagged or untagged pages accordingly (a rough sketch follows) and you're done. Then you can use a single site and maintain both publicly available and not-yet-approved pages on it.
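A minimal template sketch of that idea, assuming a hypothetical 'legal_approved' custom field as the flag (logged-in editors still see unapproved pages so they can review them on the same site):

<?php
// Inside the page template loop: only render approved pages to visitors.
if (get_post_meta(get_the_ID(), 'legal_approved', true) || current_user_can('edit_pages')) {
    the_content();
} else {
    echo '<p>This page is awaiting approval.</p>';
}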
You can also maintain a local copy of your site (the stage) and have a script upload it wholesale to your hosting, which is an easy task to automate. In that case, you may want to consider rendering the whole site into static HTML pages and uploading those (a simple rsync) to the host; the live site will be hard to break, since no dynamic scripts will be running there.
But maybe you really shouldn't choose WP at all? There are plenty of CMSs that support a write-approve-publish workflow.

You can create a WordPress staging environment with just two clicks with the help of this plugin: https://de.wordpress.org/plugins/wp-staging/
Disclosure: I am the author of this plugin. So ask me anything about it.

Related

Using Laravel for serving multiple domains with one codebase

I'm new to Laravel and am now figuring out how to use it for the project I'm working on. I have some ideas about how to approach it and would like to know how the experts would do the job.
The case
I'm working on a project that has its own CMS, an informative site, and about 35 specialized websites. The CMS, the informative site, and the specialized sites differ in content and design, but the 35 specialized websites are pretty much the same: the design and layout are identical, only the content differs. Currently every site has its own codebase, which makes maintenance very hard.
I'm now looking for a solution that uses just one codebase. The CMS and the sites share some functionality, and each also has functionality of its own.
I'm just not sure about how to do this.
The sites run on a dedicated web server with DirectAdmin installed. Within DirectAdmin it's possible to create a domain pointer, so I can serve the correct content by checking the http_host variable. If one specialized site someday needs different functionality than the others, I can also check http_host for that. I'm not sure whether this has any consequences for SEO or whether it's a clean solution.
I've read about bundles and am thinking of using them for the informative and specialized sites. The "main" application directory would then host the CMS: site.com/ would be the CMS, site.com/info would show a bundle hosting the informative site, and site.com/special would show a bundle for the specialized sites.
When using a domain pointer, one domain will show the site of another domain, but I don't think this will work with a bundle located at site.com/(bundle).
I already have a database that distinguishes content by http_host. I would now like to know how to structure the code (and how to point 35 domains to the right pieces of code).
I hope the experts here can share some of their best practices for this job.
Thanks in advance.
I would recommend keeping 3 projects (one for each kind of site).
You could combine the main site and the informative site the way you proposed (as a bundle), but the only reason to do so is to share something between them, which is probably not the case since they have different designs and content. If it's only a matter of serving it from "/info", that's achievable with a well-designed virtual host configuration (I can't give much direction on that, though).
That said, I would recommend keeping the informative site at http://info.site.com instead. It feels "right" to me to have different websites on different domains (or subdomains). Some people even advocate that this is the correct thing to do, but who knows...
If you have libraries that you may reuse across all projects, I would recommend putting them in a bundle and including that bundle in all projects.
Now, for the 35 specialized sites, you should point all URLs to the same application through the virtual host's ServerAlias.
In that application, you should define environments, which let you configure aspects of it based on the URL. For instance, you can define a different database for each website, separate log files, languages, anything that is configurable under /application/config. You can also create your own configuration items; a plain-PHP sketch of the idea follows.
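Framework specifics aside, the underlying idea is just per-host configuration; a plain-PHP sketch (the hostnames and array contents are invented) might look like this:

<?php
// Map each specialized site's host to its own settings; everything else
// (layout, code) stays shared in the single codebase.
$environments = array(
    'site-one.example.com' => array('database' => 'site_one', 'log' => 'site_one.log'),
    'site-two.example.com' => array('database' => 'site_two', 'log' => 'site_two.log'),
);

$host = isset($_SERVER['HTTP_HOST']) ? strtolower($_SERVER['HTTP_HOST']) : '';
$settings = isset($environments[$host])
    ? $environments[$host]
    : array('database' => 'default', 'log' => 'default.log'); // fallback site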

PHP Application Structure/Pattern - 2 sites with shared libraries and assets

I'm having a bit of an application structure design dilemma.
I have created a web app that creates online surveys. It all works fine, but I would now like to create a new site that does different types of online surveys. This new site will be pretty much 95% similar in terms of layout, logic, functions, etc.
Rather than duplicate all the code from the current web app, I'd like the new app to share in the "fountain of knowledge" created by the current app - so to speak.
Can anyone enlighten me with their experiences of doing this sort of thing? Their best practices?
As a rough guide, I'm currently thinking of using symlinks for all the major logic files (library.php, functions.php, etc), and then deciding which logic to use based on which URL the user logged-in from.
Does that sound like a good or bad idea?
Would it be any better or worse to divide the whole system in to 3 sites, with the site in the middle containing all the common elements and logic? This middle site would have no independent use - it would be used from either of the 2 applications looking for functionality and assets, etc.
Any help and experience on this matter is very much appreciated indeed.
I'm very wary of going down a dead-end solution.
Kind Regards,
Seb
Good solution if:
you host your website yourself and creating symlinks between different virtual hosts is not a problem
you won't have to make significant changes between the 2 websites
But instead of using symlinks, I would take advantage of PHP's include_path directive and put the common libraries in that path. This way, you just write your includes relative to this path and the files will be accessible from any site you want on the same server.
The second advantage of using include_path is that you can bypass any open_basedir directives that would otherwise not allow you to include files outside the same virtual host's base dir.
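A short sketch of that include_path approach (the shared directory path is a placeholder):

<?php
// Prepend the shared library directory so both sites resolve common includes.
set_include_path('/var/www/shared/lib' . PATH_SEPARATOR . get_include_path());

// Includes written relative to the shared path now work from either site.
require_once 'library.php';
require_once 'functions.php';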
This is how I'd do it...
Create a core library.
Create your 2 site directories.
Create site-specific code folders in each site.
Create core library folders in each site that symlink to the main core library created.

Any downsides to using a CMS for a high-risk website?

I'm helping a client with their website (it's manually written using a Dreamweaver template and a ton of quadruple-nested table elements for design. Ouch), and I want to offer them a break from using Dreamweaver to write things.
I was thinking of using Wordpress or a similar CMS to do the job, as Wordpress is clean, fast, and really easy to design for. I've done it a few times, and it's almost as easy as just coding pure HTML.
My main concern is that the site has been hacked a few times before, even though it was pure HTML with no server-side code whatsoever. I can set up and manage a Linux server for them myself, because the hosting company they use is one I've never heard of.
The site owners are completely technologically impaired, so I don't want to scare them off by showing them a dynamic CMS with tons of features; they think pure HTML is so much safer that they go out of their way to work with it.
I know this is a lot of writing, but what would be the most appropriate CMS (hard-coding or dynamically generating content) for such a setup? I don't want this person to keep manually writing non-standards-compliant quadruple-nested table layouts, but I don't want to be responsible for having their site hacked either...
Thanks!
A solution that allows for local editing and the uploading of only static HTML files would be the safest way to go. If it's a high-risk site, I would consider staying on that track.
If a site containing only static HTML was hacked, it was most likely through some problem at the web server or even operating-system level; I am not aware of any exploits targeting static HTML resources. Problems usually come up when dynamic languages are involved.
Whatever you do, don't use WordPress. It is bound to be the subject of exploits and attacks simply due to its popularity.
If the site is pure HTML, then the insecurity is in the server, or the connection made between the server and the client.
I'd look into how to make the server more secure before making changes to the site, although doing both is a good idea. CMSs like WordPress use MySQL databases to store posts and so on, which means client -> server connections. A way to make those data transfers more secure is to use https:// instead of vanilla http://. You can redirect using an .htaccess file if need be, or in PHP as sketched below.
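If you'd rather do the redirect in PHP than in .htaccess, a minimal sketch (not specific to any CMS) looks like this:

<?php
// Force HTTPS: redirect plain-HTTP requests to the same URL over https://.
if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    header('Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}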
To summarise, I'd look at the server side of things for any vulnerabilities.
James
WordPress has become a pretty capable CMS. If the site is high-risk you might want to shy away from it, but I haven't yet had a site that I thought was too high-risk for WP myself. The site should keep up with regular updates and regular backups, and there are some security tips you can follow to keep it more secure and less of a target.
First: hide WP on the front end.
Add this to your functions.php:
remove_action('wp_head', 'wp_generator');     // drop the WordPress version meta tag
remove_action('wp_head', 'rsd_link');         // drop the Really Simple Discovery link
remove_action('wp_head', 'wlwmanifest_link'); // drop the Windows Live Writer manifest link
That removes default header information that scripts can scan for.
Install WP in a subdirectory to help obscure its location, and obscure the admin URL.
Change the name of the wp-content folder to something else and move it outside the main WP directory. For instance, you could name it "includes" and put it in the root folder; links to template files will then not contain wp-content.
On top of that, use a secure host and lock down your file permissions (especially on shared hosting). You can look at something like VaultPress, but if you use a solid backup plugin and a good host, that is probably unnecessary. You can also look at some of the security-audit plugins, but don't keep them running after you get their feedback.
This code in your wp-config.php file will help when installing in a subdirectory and moving wp-content out of it into an "includes" folder:
define('WP_HOME', 'http://domain.com');                            // public site URL
define('WP_SITEURL', WP_HOME . '/admin');                          // WordPress core lives in /admin
define('WP_CONTENT_DIR', $_SERVER['DOCUMENT_ROOT'] . '/includes'); // renamed wp-content directory on disk
define('WP_CONTENT_URL', WP_HOME . '/includes');                   // ...and its public URL
WordPress is good for blogs.
Typo3 is a good CMS but hard to learn at the start.
Joomla and Drupal can also be used as a CMS.

Wordpress Multiple Developer Setup

Is there a good option for having more than one person develop a WordPress application alongside a testing site?
The biggest hurdle I have encountered is path issues when developing locally and integrating into a testing environment.
Does anyone have a good process for maintaining developer environment(s), keeping working content and links, and keeping the code in source control?
To clarify, I would like to develop locally, have a testing environment, and avoid path issues. I am open to other solutions or ideas.
It comes down to three main concepts:
The development environment should be as close to production as possible.
Use source control!
Automated deployment scripts: take as much human error as possible out of deploying.
The development environment/process I prefer looks like this.
Dev/Local
SVN To Check Out Code Locally
VirtualBox running Ubuntu as the LAMP environment, or XAMPP
Deploy scripts (automation, e.g. NAnt/Ant) for staging/QA
Module Development
Theme Development
Etc
QA
Initial Content Setup / QA
After Initial Development Staging is Used Less
Production
Live Content Entry
Blogging Etc
As for path issues: after initial content development they become less relevant, since most content entry is performed live. If a backup of production is used to create a dev site, SQL scripts and manual changes can be used as necessary. Switching to a VirtualBox setup also helps ensure everything lives at the web root. The answer by FractalizeR helps as well.
Maybe I don't get the problem as a whole, but what is the problem with putting the whole WordPress source code into version control and checking it out to a single test server for testing?
If you have problems with site names, have your developers check out to their own machines and serve the site as www.yourwpdomain.local (mind the .local part). They can use DNS or a simple hosts file to resolve the .local address to 127.0.0.1. The Apache setup is pretty straightforward.
The rule is separation of concerns. You should not place compiled components (DLLs, EXEs) or a system's data in source control. WordPress falls under this for two reasons: 1. the site's content is stored in the database; 2. the database and the site should be on a nightly backup schedule anyway. Would you ever need to restore the DB or WP core from Git? Heck no! Additionally, any WP customization should go in child themes, and those do belong in source control. The parent theme? NO WAY! Never customize the parent theme in WP. If you make changes to your base WordPress install and/or parent theme, you risk losing your customizations when WordPress or the theme is updated.

How to efficiently manage multiple installations of a web application?

In my experience, one of the bigger problems we come across in our web development process is keeping different setups updated and secure across different servers.
My company has its own CMS, which is currently installed on 100+ servers. At the moment we use a hack-ish FTP-based approach, combined with upgrade scripts at specific locations, to upgrade all of our CMS setups. Efficiently managing these setups becomes increasingly difficult and risky when several custom modules are involved.
What is the best way to keep multiple setups of a web application secure and up-to-date?
How do you do it?
Are there any specific tips regarding modularity in applications, so that we can stay flexible towards our clients while still efficiently managing multiple "branches" of an application?
Some context: we mainly develop on the LAMP stack. One of the main selling points of our CMS is that we can plug in pretty much anything our client wants; this can vary from 10 to 10,000 lines of custom code.
A lot of the custom work consists of very small pieces of code, and managing all these small pieces in Subversion seems quite tedious and inefficient to me (since we deliver around 2 websites every week, this would result in a lot of branches).
If there is something I am overlooking, I'd love to hear it from you.
Thanks in advance.
Roundup: first of all, thanks for all of your answers. All of these are really helpful.
I will most likely use an SVN-based approach, which makes benlumley's solution closest to what I will use. Since the answer to this question might differ in other use cases, I will accept the answer with the most votes at the end of the run.
Please examine the answers and vote for the ones that you think have the most added value.
I think using a version control system and "branching" the parts of the code that you have to modify could turn out to be the best approach in terms of robustness and efficiency.
A distributed version control system could be best suited to your needs, since it would allow you to update your "core" features seamlessly across different "branches" while keeping some changes local where need be.
Edit: I'm pretty sure that keeping all of that up to date with a distributed version control system would be far less tedious than you seem to expect: you can keep local the changes you're sure you'll never need elsewhere, and the distributed aspect means each deployed application is independent from the others, so only the fixes you mean to propagate will propagate.
If customizing your application involves changing many little pieces of code, that may be a sign the application's design is flawed. Your application should have a stable core, extension points for custom libraries to plug into, the ability to change appearance using templates, and the ability to change behavior and install plugins using configuration files (a toy sketch follows). That way you don't need a separate SVN branch for every client: keep the core code and extension plugin libraries in source control as normal, and in another repository create a folder per client holding just their templates and configuration files.
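As a toy illustration of "stable core + per-client configuration" (the file layout, ini format, and the register() convention are all invented for the example, not an existing API):

<?php
// Load only the plugins a client's config file enables; the plugin code itself
// lives in the shared core, so there is a single copy of it to maintain.
$enabled = parse_ini_file('/var/www/clients/foo.com/plugins.ini', true);

foreach ($enabled as $name => $options) {
    $file = '/var/www/core/plugins/' . basename($name) . '.php';
    if (!is_file($file)) {
        continue; // the client asked for a plugin the core doesn't ship
    }
    require_once $file;
    $class = ucfirst($name) . 'Plugin';         // hypothetical naming convention
    if (class_exists($class)) {
        $plugin = new $class($options);         // client-specific options from the ini
        $plugin->register();                    // hypothetical hook-up method
    }
}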
For now, creating SVN branches may be the only thing that keeps you sane. In your current state it's almost inevitable that you'll make a mistake and break a client's site; at least with branches you are guaranteed a stable code base per client. The one gotcha with SVN branches is that if you move or rename a file in a branch, merging that change back down to the trunk is painful (you'd have to do it manually).
Good luck!
EDIT: For an example of a well-designed application using all the principles I outlined above, see Magento E-Commerce. Magento is the most powerful, extensible and easy to customize web application I've worked with so far.
I may be wrong, but it seems to me that what Aron is after is not version control. Versioning is great, and I'm sure they're using it already, but for managing updates on hundreds of customized installations you need something else.
I'm thinking something along the lines of a purpose-built package system. You'll want every version of a module to keep track of its individual dependencies and 'guaranteed compatibilities', and use this information to automatically update only the 'safe' modules.
E.g. let's say you've built version 3 of your 'Wiki' module. You want to propagate the new version to all the servers running your application, but you've changed one of the interfaces within the Wiki module since version 2. For all default installations that's no problem, but it would break installations with custom extensions built on top of the old interface. A well-planned package system would take care of this.
To address the security question, you should look into using digital signatures on your patches. There are lots of good libraries available for public-key-based signatures, so just go with whatever seems to be the standard for your chosen platform.
I'm not sure whether someone has already said this; there are a lot of long responses here, and I've not read them all.
I think a better approach to your version control would be to have your CMS sit in its own repository and each project in its own (or all of these could be subfolders within one repo, I guess).
You can then use its trunk (or a specific branch/tag if you prefer) as an svn:external in each project that requires it. This way, any updates you make to the CMS can be committed back to its repository and will be pulled into the other projects as and when they are svn updated (or the external is svn switch'ed).
To make this easier, you will need to make sure the CMS and the custom functionality sit in different folders, so that svn externals work properly.
i.e.:
project
project/cms <-- cms here, via svn external
project/lib <-- custom bits here
project/www <-- folder to point apache/iis at
(you could have cms and lib under the www folder if needed)
This will let you branch/tag each project as you wish. You can also switch the svn:external location on a per branch/tag basis.
In terms of getting changes live, I'd suggest that you immediately get rid of ftp and use rsync or svn checkout/exports. Both work well, the choice is up to you.
I have the most experience with the rsync route, rsyncing an svn export to the server. If you go down this path, write some shell scripts; using the -n (dry-run) flag you can create a test script that shows you the files it would upload without actually uploading them. I generally use a pair of scripts per environment: one to test, and one to actually do it.
Shared-key authentication, so you don't need a password to push uploads, may also be useful, depending on how tightly the target server needs to be secured.
You could also maintain another shell script for doing bulk upgrades, which simply calls the relevant shell script for each project you want to upgrade.
Have you looked at Drupal? No, not to deploy and replace what you have, but to see how they handle customizations and site-specific modules?
Basically, there's a "sites" folder with a directory for every site you're hosting. Each directory contains its own settings.php, which lets you specify a different database. Finally, you can (optionally) have "themes" and "modules" folders within each site directory.
This allows you to make site-specific customizations of particular modules and limit certain modules to certain sites. You end up with sites where the vast majority of everything is identical and only the differences are duplicated. Combine that with the way Drupal handles upgrades and updates, and you might have a viable model; a minimal sketch of the same per-host pattern follows.
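This isn't Drupal's actual bootstrap code, just a minimal sketch of the same sites/<host> pattern applied to any PHP application:

<?php
// Load per-site settings (database credentials, overrides) by host name,
// falling back to a shared default.
$host = preg_replace('/[^a-z0-9._-]/i', '', isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '');
$settings = __DIR__ . '/sites/' . $host . '/settings.php';

if (is_file($settings)) {
    require $settings;                              // site-specific configuration
} else {
    require __DIR__ . '/sites/default/settings.php'; // shared defaults
}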
Build into the code a self-updating process.
It will check for updates and run them when/where/how you have configured it for the client.
You will have to create some sort of a list of modules (custom or not) that need to be tested with the new build prior to roll-out. When deploying an update you will have to ensure these are tested and integrated correctly. Hopefully your design can handle this.
Updates are ideally a few key steps.
a) Backup, so you can back out. You should be able to back out of the entire update at any time, which means creating a local archive of the application and database first.
b) Update Monitoring Process - Have the CMS system phone home to look for a new build.
c) Schedule the update on availability - Chances are you don't want the update to run the second it's available, so you'll need a cron job/agent of some kind to run the update automatically in the middle of the night. You can also consider client requirements such as updating on weekends or on specific days, and you can stagger your roll-outs so you don't update 1000 clients in one day and land in tech-support hell. A staggered roll-out of some kind will probably be beneficial.
d) Add maintenance mode to update the site -- Kick the site into maintenance mode.
e) SVN checkout or downloadable packages - Ideally you can deploy via svn checkout; if not, set up your server to produce svn-generated packages as archives that can be deployed on client sites.
f) Deploy DB Scripts - Backup the databases, update them, populate them
g) Update site code - All this work for one step.
h) Run some tests on it. If your code has self-tests built in, it would be ideal.
Here's what I do...
Client-specific include path
Shared, common code is in shared/current_version/lib/
Site specific code is in clients/foo.com/lib
The include path is set to include from clients/foo.com/lib first, and then shared/current_version/lib
The whole thing is in a version control system
This ensures the code uses the shared files wherever possible, but if I need to override a particular class or file for some reason, I can put a client-specific version in their folder (see the sketch below).
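A condensed sketch of that override order (the directory paths are placeholders):

<?php
// Client folder first, shared library second: a client-specific copy of a
// file wins, otherwise the shared version is used.
set_include_path(implode(PATH_SEPARATOR, array(
    '/var/www/clients/foo.com/lib',
    '/var/www/shared/current_version/lib',
    get_include_path(),
)));

require_once 'SomeLibraryClass.php'; // resolved through the path order above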
Alias common files
My virtual host configuration will contain a line like
Alias /common <path>/shared/current_version/public_html/common
Which allows common UI elements, icons, etc to be shared across projects
Tag the common code with each site release
After each site release, I tag the common code by creating a branch, effectively freezing that point in time. This allows me to deploy /shared/version_xyz/ to the live server. A virtual host can then use a particular version of the common files, or stay pointed at current_version if I want it to pick up the latest updates.
Have you looked at tools such as Puppet (for system administration incl. app deployment) or Capistrano (deployment of apps in RubyOnRails but not limited to these)?
One option would be to set up a read-only version control repository (Subversion). You could integrate repository access into your CMS and invoke updates through a menu, or automatically if you don't want the user to have a choice about an update (which could be critical). Using a version control system would also let you keep different branches easily.
As others have already mentioned, using version control (I prefer Subversion for its functionality) and branching would be the best option. Another open-source tool, available on SourceForge, is CruiseControl. It's amazing: you configure CruiseControl with Subversion so that any code modification or new code added to Subversion is picked up automatically and CruiseControl does the build for you. It will save you a hell of a lot of time.
I have done it the same way at my company. We have four projects and have to deploy them on different servers. I set up CruiseControl so that any modification to the code base triggers an automatic build, and another script deploys that build to the server. You're good to go.
If you use a LAMP stack, I would definitely turn the solution's files into a package for your distribution and use that to propagate changes. For that I recommend Red Hat/Fedora because of RPM, which is what I have experience with, but you can use any Debian-based distribution too.
Some time ago I built a LAMP solution for managing an ISP's hosting servers. They had multiple servers handling web hosting, and I needed a way to deploy changes to my manager application, because every machine was self-contained and ran its own online manager. I made an RPM package containing the solution's files (mostly PHP) and some deployment scripts that ran with the RPM.
For automated updating we had our own RPM repository configured in yum.conf on every server. I set a crontab job to update the servers daily with the latest RPMs from that trusted repository.
Trust can be achieved too, because you can use the trust settings in RPM packages, like signing them with your key and having the servers accept only signed packages.
Hmm, could it be an idea to add configuration files? You wrote that a lot of small scripts are doing something; if you built them into the sources and steered them with configuration files, shouldn't that ease things?
On the other hand, having branches for every customer looks like exponential growth to me. And how would you "know" which areas you've changed in one branch, and not forget to make the same changes in all the other branches? That looks quite ugly to me.
A combination of revision control, configuration options, and/or deployment recipes seems like a "good" idea...
With that many variations on your core software, I think you really need a version control system to stay on top of pushing updates from the trunk to the individual client sites.
So if you think Subversion would be tedious, you've got a good sense of where the pain points will be... Personally, I wouldn't recommend Subversion for this, since it's not really that good at managing and tracking branches. Although benlumley's suggestion to use externals for your core software is a good one, it breaks down if you need to tweak the core code for individual client sites.
Look into Git for version control, it's built for branching, and it's fast.
Check out Capistrano for managing your deployments. It's a Ruby tool, often used with Rails, but it can be used for all sorts of file management on remote servers, even non-Ruby sites. It can get content to the remote end through various strategies including FTP, SCP, and rsync, as well as by automatically checking out the latest version from your repository. Nice features include callback hooks for every step of the deploy process (e.g. so you can copy site-specific configuration files that might not be in version control) and a release-log system, done through symlinks, so you can quickly roll back to a previous release in case of trouble.
I'd recommend a config file with the list of branches and their hosted locations, then run through that with a script that checks out each branch in turn and uploads the latest changes (a rough sketch follows). This could be cron'd to do nightly updates automatically.
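A rough, hypothetical sketch of such a script in PHP (the branch names, repository URL, and rsync targets are made up; a shell script would work just as well):

<?php
// For each client branch, export a clean working copy and rsync it to its host.
$sites = array(
    'branches/client-a' => 'deploy@client-a.example.com:/var/www/html',
    'branches/client-b' => 'deploy@client-b.example.com:/var/www/html',
);

foreach ($sites as $branch => $target) {
    $export = sys_get_temp_dir() . '/' . basename($branch);
    // Export the branch without .svn metadata, then push it to the live host.
    passthru(sprintf('svn export --force %s %s',
        escapeshellarg('http://svn.example.com/repo/' . $branch),
        escapeshellarg($export)));
    passthru(sprintf('rsync -az --delete %s/ %s',
        escapeshellarg($export),
        escapeshellarg($target)));
}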
