What is the best way to deploy a PHP website with Bower components and Gulp tasks?
Is it a good idea to install npm on the production server to run the Gulp tasks?
If not, is it a good idea to commit the minified files (CSS, JavaScript, compressed images) to Git?
If not, is it a good idea to use rsync to prepare and (push) deploy the project to the remote server, and then use a post-deploy script to set chmods, update the database, and so on?
Or is there a better way to do this?
The safest way to make sure everything works on your production server is to deploy to a staging environment first. This can very well be the same server, just not accessible to everyone.
As an example: we have a website, www.website.com, which is the live environment.
Assume we have another subdomain, staging.website.com, which points to the same server but has its own DocumentRoot, and which is protected by some form of authentication (or an IP firewall, in our company's case; anyone not authorized to see staging is simply sent to www.website.com, which is very convenient).
You would first deploy to staging, run all the build scripts there (Composer, Gulp, minification), and test whether everything still works. If it does, you can simply point the DocumentRoot of the live domain at the staging directory and you're done.
As for committing minified files: no. You should commit the original files and the build scripts. The minified files are rebuilt every time you deploy the application.
You could use rsync, but as suggested above, you should build your application (chmods, database, etc.) during deployment, not afterwards.
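To make that concrete, here is a minimal deploy sketch along those lines; the releases/current layout, repository URL, gulp task name, and post-deploy script are assumptions for illustration, not part of the setup described above:

    #!/usr/bin/env bash
    # Hypothetical deploy: build a fresh release, then atomically switch it live.
    set -euo pipefail

    RELEASE="/var/www/releases/$(date +%Y%m%d%H%M%S)"   # assumed directory layout
    git clone --depth 1 git@example.com:acme/website.git "$RELEASE"
    cd "$RELEASE"

    composer install --no-dev --optimize-autoloader     # PHP dependencies
    npm install                                         # tooling needed by Gulp
    npx gulp build                                      # assumes a "build" task: compile LESS, minify JS

    ./bin/post-deploy.sh                                # assumed script: chmods, DB updates, etc.

    # The live DocumentRoot points at /var/www/current, so flipping the symlink
    # is the "go live" step; the previous release stays on disk for rollback.
    ln -sfn "$RELEASE" /var/www/current

Because ln -sfn swaps the symlink in a single step, visitors never see a half-built site, and npm only ever needs to exist on the machine that runs the build.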
I'm developing a website, and I'm doing it on its final destination domain rather than on localhost; it's almost finished.
Now I've come to the point where I'm starting to worry about what to do when users start using the site and problems occur, or when I want to add features to it.
Are there any best practices that would let me minimize the risk of ruining the website and the customers' UX during updates? How do I do this correctly?
If your website is small and simple:
Create a development domain/subdomain
Code and test there
Record all database structure changes (make database changes on a copy of the DB)
Record the actions you use to test your website
As soon as you are ready to release a new version, there are two options:
Update the DB replica and switch the domains
Turn the main domain off, update the code and DB, then turn it back on
If the website is not that simple, there should be local development, testing, staging, and production environments, each set up independently. You develop, then you test what you did, then you copy and install your code against real data before pushing it live to production.
To track changes and easily deploy new versions to each environment, there are many tools built around version control systems like Git.
And there is a good answer on how to use dev-test-stage-production environments with Git: git with development, staging and production branches
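Following that branch-per-environment model, promoting a release might look roughly like this (the branch names are the conventional ones from the linked answer, assumed here rather than prescribed):

    # Day-to-day commits land on the development branch.
    git checkout development
    # ...commit changes...

    git checkout staging
    git merge development      # promote to staging and test there
    git checkout production
    git merge staging          # promote to production once staging passes
    git push origin development staging production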
First of all, you should work on localhost while developing any new feature or fixing a bug.
I also recommend using Git branches, so you can create a new branch for each feature or bug fix.
When you're finished, merge that branch back into your website's master branch.
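In practice that workflow is only a few commands; the branch name below is made up for illustration:

    git checkout -b fix/login-redirect    # hypothetical bugfix branch
    # ...fix, test on localhost...
    git commit -am "Fix login redirect loop"

    git checkout master
    git merge fix/login-redirect          # bring the finished work into master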
What is the best way to connect a team of developers working on a project that runs on a VPS? We have Mac and Windows development environments, and the VPS runs CentOS.
I am a total newbie at setting up a shared codebase.
I am a Laravel, Ionic, JavaScript, Angular, and Node developer.
Doing this is nearly impossible when the code changes continuously and multiple people have to update different script files in the project from Mac and Windows development environments.
Using the technologies described above, we want to connect to the servers so that the code we write is reflected on them. Right now we have to copy all of the code and then paste it onto the servers.
Should we go with SVN or Git, and how do we set up the shared codebase environment?
Develop locally
You should definitely go with Git to synchronize the code between all of your developers.
Git will handle most of the synchronization and merging work. All you have to do is get familiar with it and its workflow. I know it can seem like a lot at first, but since nearly every company uses it, it's not a waste of time.
Each developer works on their own machine (localhost) and commits / pushes to Git when the code is working properly. The other developers pull the changes to get the latest updates from the rest of the team.
Don't shy away from using the branch system to work simultaneously on different features or parts of your app, and use pull requests to review the other developers' code.
Worry about deploying to the VPS later: set up your Git repository (GitHub, Bitbucket) and work locally.
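The daily cycle described above comes down to a handful of commands; the repository URL and branch name below are placeholders:

    git clone git@github.com:acme/app.git     # one-time setup per developer
    git checkout -b feature/payment-form      # hypothetical feature branch

    # ...edit, then share your work...
    git add -A
    git commit -m "Add payment form validation"
    git push -u origin feature/payment-form   # open a pull request from this branch

    # ...and pick up everyone else's merged work...
    git checkout master
    git pull origin master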
Deploy later
Then, once you've reached a working version of your app or website, you can push the code to the VPS using Git as well and make it available to everyone for testing. You will need SSH access to the server.
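A common way to wire this up (a sketch, assuming SSH access and made-up paths) is a bare repository on the VPS with a post-receive hook that checks each push out into the web root:

    # On the CentOS VPS: create a bare repository to push to.
    ssh user@vps.example.com 'git init --bare /home/user/app.git'

    # Save this as /home/user/app.git/hooks/post-receive on the VPS
    # (and chmod +x it) so every push deploys automatically:
    #   #!/bin/sh
    #   git --work-tree=/var/www/app --git-dir=/home/user/app.git checkout -f master

    # On each developer machine: add the VPS as a remote and push to deploy.
    git remote add vps user@vps.example.com:/home/user/app.git
    git push vps master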
I'm thinking about the right deployment strategy for a PHP (or any) web application.
I have versioned code (Git) which contains sources such as LESS, non-minified JavaScript, etc.
The ideal steps as I see them:
Build the app into a /build directory, including compiling the LESS, minifying the JavaScript, and so on. The /build directory will contain everything ready for deployment.
Run the other necessary scripts (tests, etc.) in the /build directory.
After these steps I'm a bit confused about what to do next.
The whole /build directory should be copied to the staging/production server, but before I started actually building the app, I used to copy only the files that had changed since the last Git commit. Copying all of the files seems inefficient to me. However, versioning the /build directory doesn't seem right either.
The other possibility is to keep the /build directory in the repository, but it seems too messy to have built and non-built files versioned together.
How do you build and deploy web applications?
If you use Jenkins/Hudson CI, you can use various publish plugins that run after a successful build. Most plugins let you specify exactly what you want published and over which protocol.
Basically, you run the full build on Jenkins, then deliver your build files and artifacts to the production server.
Look at Publish Over SSH or Publish Over FTP. Both plugins let you configure a "Transfer Set", and if you uncheck the "Clean remote" checkbox, the plugins will only send files that differ from those on your production host.
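What those plugins do is conceptually close to an incremental sync; if you wanted roughly the same effect from a plain shell build step instead, it might look like this (host and paths are made up):

    # Send only files that differ from the production copy; nothing is deleted
    # remotely, which mirrors leaving "Clean remote" unchecked.
    rsync -avz --checksum build/ deploy@prod.example.com:/var/www/app/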
P.S. I'm assuming you use a CI server of some sort, given the continuous-integration tag.
Hello fellow Programmers,
I am still a relatively new programmer and have recently gotten my first on-campus programming position. I am the sole dev responsible for 8 domains as well as 3 small PHP web apps.
The campus has its web environment divided into staging and live servers; we develop on staging via SFTP and then push updates to the live server through a web GUI.
I currently use Sublime Text 2 with the Sublime SFTP plugin for all my dev work (it's my preferred editor). If I'm just making an edit to a page, I'll open that individual file via the FTP browser. If I'm working on the PHP web app projects, I have the app directory mapped to a local folder, so that when I save locally the file is auto-uploaded through Sublime SFTP.
I feel like this workflow is slow and sub-optimal. How can I improve my workflow for working with remote content? I'd love to set up a local environment on my machine, as that would eliminate the constant SFTP upload/download, but as I said, there are many sites, and a local copy of the entire domain would be quite large and complex; not to mention that keeping it in sync with whatever is latest on the staging server would be a nightmare.
Does anyone know how I can improve my general web dev workflow from what I've described? I'd really like to stop editing over FTP, but I'm not sure where to start, other than ripping out the entire directory and dumping it into XAMPP.
Are you using source control? If not, you should be. I suggest Git, hosted on GitHub for example.
For a simple setup like this you don't need any special deployment tools; you can use Git for deployment as well.
Developing directly on the staging server is not a great idea. Try to set up a development environment on your laptop.
You can push from your development machine to GitHub. Then, on either staging or the live server, you can connect via SSH and pull from GitHub.
This lets you use all the power of Git to create branches and tags, and to roll back to an earlier version if you make a mistake.
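A minimal sketch of that pull-based deploy, assuming SSH access and a made-up path and tag on the server:

    # From your laptop: publish your work.
    git push origin master

    # On staging (or live): pull the published code into the web root.
    ssh you@staging.campus.example 'cd /var/www/site1 && git pull origin master'

    # Made a mistake? Check out a known-good tag to roll back.
    ssh you@staging.campus.example 'cd /var/www/site1 && git checkout v2.1'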
I'd like to deploy using Git, but some members of the team don't like Git and would prefer to be able to use FTP. I want us all to be able to access the files and make changes. Is there any way this can happen?
Keep an updated repository on an FTP server, then use that repository to deploy via FTP.
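One way to bridge the two worlds is the git-ftp tool, which uploads only the files changed since the last push; a rough usage sketch, with placeholder credentials and host:

    # First-time upload of the whole working tree to the FTP server.
    git ftp init -u ftpuser -p ftppass ftp://ftp.example.com/public_html

    # From then on, upload only files changed since the last git-ftp push.
    git ftp push -u ftpuser -p ftppass ftp://ftp.example.com/public_html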
You could use a synchronizer and exclude the Git-specific files with a filter, for example FreeFileSync: http://sourceforge.net/projects/freefilesync/
You could map the FTP folder as a local drive.
NetDrive by Novell has a GUI for setting up FTP servers as drives on your computer.
Set up a test server on which people can go crazy if they want to.
Require everyone to commit into a central repository. Seriously, this is a requirement.
Declare a handful of guys who know what they're doing as "deploy masters" and allow only them to touch the production server, through a clearly defined automated deploy mechanism (git can be fine here).
Hack off the hands of anyone who still messes directly with production via FTP.