I'm rather new to the SilverStripe framework, but I have experience with Magento, which is also PHP-based.
I have read several topics about deploying SilverStripe. It seems that a lot of users prefer to download a fresh copy of SilverStripe, install it on the live server, then transfer the template and mysite folders and run /dev/build.
I guess this can work if you have access to the live server, but what if you have a client who just wants his site on a USB stick (mainly for security reasons)? What then?
So my question is this: what is the recommended way of migrating SilverStripe from a dev server to a live server?
Will it work if you just copy-paste all the files and the database?
As long as the live server meets SilverStripe's requirements, it will work.
You'll need to ensure you update the database details if they change...
define('SS_DATABASE_SERVER', '');
define('SS_DATABASE_NAME','');
define('SS_DATABASE_USERNAME', '');
define('SS_DATABASE_PASSWORD', '');
...along with the environment type.
define('SS_ENVIRONMENT_TYPE', 'live');
Unlike some other systems, the IP or URL isn't baked into the database, although it's still advisable to run a /dev/build on the live server.
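For example (the domain here is just a placeholder), you would visit:
http://www.example.com/dev/build?flush=1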
The client should know which files on the server should be writable by the web server / PHP (namely assets/ and the cache dir) and which should not (all the rest, imho).
I also recommend setting up all server-specific configuration (see Phill's answer) in a file called _ss_environment.php (see the docs), which can reside outside the webroot.
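For illustration, a minimal _ss_environment.php might look like this (the values are placeholders, not recommendations):
<?php
// _ss_environment.php - can live one level above the webroot.
define('SS_ENVIRONMENT_TYPE', 'live');
define('SS_DATABASE_SERVER', 'localhost'); // placeholder host
define('SS_DATABASE_NAME', 'ss_live');     // placeholder database
define('SS_DATABASE_USERNAME', 'ss_user'); // placeholder user
define('SS_DATABASE_PASSWORD', 'secret');  // placeholder password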
Then grab the database dump and set up the site on the other server.
Can someone give me some general guidelines, or point me to a source, on how to deploy a Yii 2 advanced template application on OpenShift?
The problem is that I don't have much experience with that platform and cannot find specific documentation.
I can't access appdns/frontend or /backend (it gives me a 404 error).
I don't use OpenShift, but I do use another platform that doesn't provide shell access, so I can't run the normal Yii2 installation commands (Composer and so on). I assume you are in the same situation. If it's useful: I start from a copy of the entire application, correctly configured on my local server, and transfer what is necessary using FTP. During the initial installation I of course transfer everything; after that, with version control set up around my local copy, I manage the updates that need to be transferred to the server in the cloud. I hope this helps.
I'm developing a web app using the CodeIgniter PHP framework. The server I'm working with does not support any type of source control (e.g. Subversion) unless you go to a higher price tier.
I would still like to put the code under some sort of source control. Does it make sense to do the following:
Install git or SVN on my local machine and develop there
Copy changes from my local machine to development directory on the server (using FileZilla, WinSCP, etc.) and test
Copy changes from development directory to production directory on the server
Does that sound reasonable? Are there better alternatives? Thanks!
Using SVN only on your local machine is a bit dangerous, because then you also need to protect your computer against data loss. I think the best way is to work with one of the commercial services offering fully supported SVN/Git hosting, like http://www.beanstalkapp.com/ or http://www.github.com
You could use source control on your local machine (SVN, Git, etc.) and use an open source tool like Capistrano to deploy the code from your local source control repo to your server via SSH. Or if you're limited to FTP, this blog post has a potential solution.
An advantage of using a tool like Capistrano instead of directly mirroring the files on your local machine to the server via FileZilla or WinSCP is that Capistrano will version your deployed files so that, if you end up breaking something and need to roll back quickly to the previously-deployed version, it can be as easy as changing a symlink to the previous deployment directory.
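For illustration, the layout Capistrano maintains on the server looks roughly like this (the timestamped directory names are invented):
releases/20130101120000/
releases/20130108093000/
current -> releases/20130108093000/
The web server serves from current, so rolling back is just repointing that symlink at the previous release directory.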
Does that sound reasonable?
Partially: point 1 is fine. But even in this case I'd suggest also keeping your repository at a repository-hosting service (Bitbucket, GitHub, Assembla).
For points 2-3: your deploys must be automatic and non-interactive, so you'll have to pick other tools (to be run from the post-commit hook of your SCM of choice).
A somewhat better alternative to 2-3 may be:
Use two different branches (DEVEL and PROD) as the sources of the DEVEL and PROD directories
A post-commit hook, which uploads only the files changed in the committed revision to the corresponding directory (ncftp for FTP, scp with a script for SSH); see the sketch after this list
Main development happens in the DEVEL branch
PROD receives only merges from DEVEL
The workflow is SCM-agnostic and scales to any reasonable number of branches and developers
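As a rough sketch of that hook (using Subversion here; the host, credentials and repository layout are all invented for illustration):
#!/usr/bin/env php
<?php
// post-commit hook: SVN calls this with the repository path and revision.
list(, $repos, $rev) = $argv;

// Ask svnlook which files this revision touched.
exec(sprintf('svnlook changed -r %d %s', (int)$rev, escapeshellarg($repos)), $changes);

$ftp = ftp_connect('ftp.example.com'); // invented host
ftp_login($ftp, 'deploy', 'secret');   // invented credentials
ftp_pasv($ftp, true);

foreach ($changes as $change) {
    // svnlook prints lines like "U   branches/DEVEL/index.php"
    $flag = $change[0];
    $path = trim(substr($change, 4));
    if ($flag === 'D' || strpos($path, 'branches/DEVEL/') !== 0) {
        continue; // upload only additions/updates on the DEVEL branch
    }
    // Stream the committed content straight out of the repository.
    $tmp = tmpfile();
    fwrite($tmp, shell_exec(sprintf('svnlook cat -r %d %s %s',
        (int)$rev, escapeshellarg($repos), escapeshellarg($path))));
    rewind($tmp);
    ftp_fput($ftp, '/htdocs/devel/' . substr($path, strlen('branches/DEVEL/')), $tmp, FTP_BINARY);
    fclose($tmp);
}
ftp_close($ftp);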
I'm part of a team of 3 (2 developers and 1 designer) who sometimes work in the office and sometimes remotely, and I'm looking at a way of using Git to develop our websites seamlessly. I've got a managed account with Rackspace and have 3 servers set up on the account: development, staging and production.
I'm looking at the best way for our team to develop daily on our websites without having to FTP the files up to the server each time we make any changes. I've used SVN in the past, but I'm looking to use Git for version control. The workflow I had in mind for an example website called 'test' was the following:
The development server would have a directory (called trunk, though I'm not sure if it should be called something else?) for each user as well as a central directory, e.g. /var/www/test/jbloggs/, /var/www/test/asmith/, /var/www/test/rjohnson/ and /var/www/test/central/trunk/.
The central repository would be installed within /var/www/test/central/trunk/, and then /asmith/, /rjohnson/ and /jbloggs/ would clone the trunk, which would mean they become /var/www/test/asmith/trunk/, /var/www/test/rjohnson/trunk/ and /var/www/test/jbloggs/trunk/.
Each user would then have a copy of /trunk/ containing all the website files, each with a subdomain configured (i.e. jbloggs.test.development, rjohnson.test.development, etc.), and would configure their IDE to automatically SFTP to the server so that they are working directly within their directory on the development server. The central directory's domain would be test.development. When they come to committing any changes to the central repository, they will SSH into the server and commit their changes, and when we want to update the central repository we will pull those changes to get the latest version, which can then be viewed at test.development.
Is this the right method of doing things, or should we all have a local LAMP stack installed (apart from the designer, who uses Windows) and have our repositories locally? If so, should the central repo still be on the Rackspace server? The developers will be using PhpStorm and the designer Dreamweaver.
Hope the above makes sense.
Thanks
I strongly advise you to work locally and then push to the shared server. This is what Git is made for. Development will be more responsive and easier for everybody. Make sure all the devs master Git so they can organise their own internal workflow however they want. If one dev destroys the database, the others can keep on working. But you'll also need a convenient way to sync databases so developers work with an up-to-date local database.
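A crude sketch of that database sync (host, database and credentials are invented; it assumes SSH access to the shared server):
<?php
// sync-db.php - pull a fresh dump from the shared dev server into the
// local database, so local work starts from up-to-date data.
shell_exec('ssh dev.example.com "mysqldump -u dev -psecret test_db"'
    . ' | mysql -u root -psecret test_db');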
The rest of your chain is OK; you can still have two test steps, a dev server for the dev team and a test server for testers. This lets testers work on a more stable version, and it also forces you to test the upgrade process when you copy changes from the dev server to the test server. Lots of errors arise because of untested upgrade procedures.
You can push changes to the test and production servers either by installing Git on them or just by using a simple script that FTPs the changed files. I don't like having Git on a production server, but that's a personal opinion.
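If you do put Git on the servers, the update script can be as small as this sketch (the path and branch are assumptions):
<?php
// deploy.php - run on the test or production box to update its checkout.
chdir('/var/www/test');
echo shell_exec('git fetch origin 2>&1');
echo shell_exec('git reset --hard origin/master 2>&1'); // deploy latest master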
Basically, I've got various projects, all version-controlled using Subversion. This is for many reasons: backup of files in case of bugs or issues in the future; backup of files in case of local system failure; collaboration with others in the company; and so on.
One of the systems we work with is WordPress, which performs updates and installs plugins through its administration panel, and which creates various files on installation (including a wp-config.php file and a .htaccess file). This means that after install there are files on the server integral to running the system which aren't on the local systems and aren't in SVN. Plus, any installed plugins and updates aren't mirrored in version control or the local copy.
Plus it feels wrong (specifically when you compare with data normalisation in databases and such) to be working with two copies of the same code - one in version control and one on the server.
So my question is am I using the tools in the right way? Is there any way that the public_html folder from the server can "point" to the latest version in the repo? Or can SVN be configured to read from the public_html folder and automatically add+commit any files created/edited on the server?
Or do people just literally download anything that gets changed or created and add it to SVN manually? Or do people not care? Maybe I've misinterpreted what SVN is for? I'm effectively using it for backup.
Thanks
Tom
I have only versioned my own WordPress theme. All the other stuff, including the data, lives on the server and is backed up solely from there.
The WordPress code and the plugins used are developed elsewhere; they have their own repositories, and I don't clutter mine with code I will never touch.
The question is how to deal with configuration. I currently run a wiki where I document all the plugins installed live and which configuration properties I have set up.
A sync from live to local then goes like this:
Update the WordPress version and plugins to the versions recorded in the wiki
Set all configuration options as written in the wiki
Import the database (except wp_options), converting the static URLs of wp_content files to the local scheme (see the sketch after this list)
Synchronise the wp_content directory
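The URL conversion in step 3 can be done with a snippet like this (hostnames and credentials are placeholders):
<?php
// rewrite-urls.php - after importing the live dump locally, rewrite
// absolute wp_content URLs in post bodies to the local scheme.
$db = new mysqli('localhost', 'wpuser', 'secret', 'wordpress'); // assumed local DB

$live  = 'http://www.example.com/wp-content';   // assumed live URL
$local = 'http://local.example.dev/wp-content'; // assumed local URL

$stmt = $db->prepare('UPDATE wp_posts SET post_content = REPLACE(post_content, ?, ?)');
$stmt->bind_param('ss', $live, $local);
$stmt->execute();
printf("Rewrote URLs in %d posts\n", $stmt->affected_rows);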
In many cases your hosting provides regular backups, but if you use a VPS you have more freedom to do whatever you want. I have put my public_html folder under version control and created a small script that commits every night, so I have a complete version history of my site with all changes traced. You can also create a script that simply copies this folder elsewhere. There may be better solutions for enterprises, but this may be enough for a small project.
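The nightly script can be as simple as this sketch (the paths are assumptions; schedule it from cron):
#!/usr/bin/env php
<?php
// nightly-commit.php - e.g. crontab entry: 0 3 * * * /home/user/bin/nightly-commit.php
chdir('/home/user/public_html');
shell_exec('svn add --force --quiet .'); // pick up newly created files
shell_exec(sprintf('svn commit --quiet -m %s',
    escapeshellarg('Nightly snapshot ' . date('Y-m-d'))));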
So we are pushing to create good processes in our office. I work in a web shop that has been doing web sites for over a decade. And we don't use version control. I know! It's bad, not my fault. I'm the guy with a SoftE background pushing for this at a minimum.
The tech lead has been looking into it. We all use Mac workstations and mostly use Coda for editing since it is a great IDE. It has SVN support built in but expects it to work on local files. We're trying to explore mounting the web directory as a local network drive with an SFTP tool.
We are a LAMP shop, BTW.
I am wondering what the model is here. I think the typical model would be to check out the whole site to our local machines, where we have Apache running, and then test it there? This isn't how we work yet; we do everything on the server. We've looked at checking things in and out, but some files are owned by apache, and the ownership changes when I check them in, because I'm not apache.
I just want to know a way to do this that works given my circumstances. It would be nice not to have to run Apache locally.
You might want to check out the Coda mailing list and ask there. There are lots of Coda enthusiasts there with specific experience.
If you don't want to have to run locally, you could make Apache on your server run a copy of the site for every developer, on a different port per person, and then mount those web roots on the local Macs and make them the working directories. If you're a small shop, that's not hard to manage. I find it pretty easy to set up, and it saves a lot of resources on the local machines. The one-site-per-person approach helps to avoid conflicts when multiple people work on the same files at the same time.
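A per-developer vhost along those lines might look roughly like this (the name, port and path are placeholders):
Listen 8081
<VirtualHost *:8081>
    ServerName alice.dev.example.com
    DocumentRoot /var/www/dev/alice
</VirtualHost>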
What I'd additionally recommend is having a script that gets the latest changes from SVN and deploys the entire site to the production server when you're ready. You could have that script change permissions on the appropriate files/folders so they are owned by Apache where needed. The idea, once you're using source control, is to never manually edit the production files -- you should have something that deploys the site from SVN for you.
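Such a deploy script might look like this sketch (the repo URL, paths and the apache user are assumptions):
<?php
// deploy.php - export a clean copy (no .svn metadata) into the web root.
$repo   = 'svn://svn.example.com/site/trunk';
$target = '/var/www/site';
shell_exec(sprintf('svn export --force %s %s',
    escapeshellarg($repo), escapeshellarg($target)));

// Hand the writable paths back to the web server user.
foreach (['cache', 'uploads'] as $dir) {
    shell_exec(sprintf('chown -R apache:apache %s',
        escapeshellarg($target . '/' . $dir)));
}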
A few notes:
Take a look at MacFuse / MacFusion (the latter is the application, the former is the library behind it) to mount remote directories via SSH / FTP as local ones.
Allow your developers to check out into their local environment (with their own LAMP stack if they're savvy), or look into a shared dev environment with individual jails. This way your developers can run their own LAMP stack (which you could deploy for them on the machine) without interfering with others.
The idea is to let them use the workflow that works best for them, to minimize the pain of adapting to this change (if change management might be an issue!).
Just as an example, we have a shared dev server where jails are created with a single command for new developers. They have a full LAMP stack ready to go, and we can upgrade and re-deploy jails easily to keep the software up to date. Developers have individual control to add custom settings / extensions if they need them for work, while the sysadmins have the ability to reset everything when someone accidentally breaks their environment :)
Those who prefer not to use jails, and are able to, manage their own local environments (typically through MacPorts or MAMP).