Need advice on my PHP development solution

Here is how our current PHP development solution is set up:
Each developer works on their local machine.
Each developer commits their changes to a common SVN server (intranet).
A commit hook uploads the changes to the staging server and performs validation tasks.
When the product is ready, we manually deploy it to the production server via SFTP.
Note: Most, if not all, of the time I don't have SSH access to the server, only SFTP.
I could automate the deployment to the production server in the same way the staging server is updated, but that solution only works one way. How could I revert to a previous revision in case of problems?
How can I improve this solution?
Thanks and sorry for my English.

If you can set up the production server to access the SVN repo via a secure channel, such as HTTPS with WebDAV, maybe try the following:
Create a script on the production server that lets you enter a tag directory and/or a revision number/date and performs an svn export. This way, the prod server pulls the changes from SVN.
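For illustration, a minimal sketch of such a script, assuming the svn client is available on the production host (the repo URL is the example one used further down; the web root path is hypothetical):

#!/bin/sh
# deploy.sh <tag> -- export the given release tag over the web root
TAG="$1"
svn export --force "https://mySvnRepo/yourWebSite/tags/releases/$TAG" /var/www/htdocs

Running it with an earlier tag is then also your rollback path.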
Now, if you have a way to call this script securely from, say, a commit hook: voilà, you have automation.
Most importantly, you do not want an automatic update performed on the prod server that you were not planning for.
To solve this:
The commit hook should only call the prod update script when something is committed to "/path/to/tags/release/dir" (see the hook sketch after the example below).
Make sure only appropriate change control staff (or whoever currently controls the manual prod deployment) have the ability to perform an svn copy to this directory in the repo.
For example, say your repo is set up as:
/yourWebsite
--> /branches
--> /trunk
--> /tags
----> /releases
The commit that would trigger the auto deployment to prod would be something like:
svn copy https://mySvnRepo/yourWebSite/trunk \
https://mySvnRepo/yourWebSite/tags/releases/x.y \
-m "Tagging for production deployment"
Rolling back can be achieved by making a commit to a previous release directory, re-triggering the deployment for that tag. Note, however, that this will not remove new files that were added since that release.
Of course, your mileage may vary; this is only a suggestion for your investigation.
You should take time to consider the security implications and potential for disaster if set up incorrectly.
Hope this helps, even if only to get you thinking of other solutions.

Related

git push to var/www/application

My friends and I want to make a website using the CodeIgniter PHP framework. We rent a VPS which runs CentOS 6.4.
As we developed the website, we ran into a problem: after user A changed something in var/www/application, A had to inform B and C that he had changed something, and B and C then had to download the new version and try to find the difference between the new file and the old one (especially when that file had also been changed by B or C at the same time). We managed to solve this problem with git: we installed gitolite on the server.
However, we now have another problem: after we push changes, we want the pushed files to land directly in var/www/application, so we can test them directly in browsers.
My questions are:
Is it possible to push directly into var/www/application?
How can I do this?
How can I do this push with Eclipse? I mean, what kind of information should I put in the form below? (I can't upload a picture.)
location
url:
host:
repository path:
user:
password:
It sounds like you want to use Git for deployment. git push is what you use to push to a remote repo, not to some directory somewhere. What you really need is a script that does a git pull, and then possibly a git checkout, to fetch the updated code from the repo and switch to whatever branch you're using for deployment (unless you deploy from master, in which case you can just stay on master all the time).
Depending on where you host your repo, you can set up a hook so that it will hit a URL on your server when new code is ready to grab.
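Since you run gitolite on your own VPS, one common pattern is a post-receive hook in the server-side repo that checks the pushed branch out into the web root. A minimal sketch (the web root path is hypothetical; with gitolite the hook lives in the repo's hooks directory on the server):

#!/bin/sh
# post-receive: deploy master straight into the web root after each push
GIT_WORK_TREE=/var/www/application git checkout -f master

Make the hook executable, and make sure the user gitolite runs as can write to /var/www/application.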

Creating a safe dev environment

We are a small team developing a WordPress site. Till now we have been editing the same files online, which inevitably led to mistakes. We thought of using Git / GitHub to stop stepping on each other's toes and to manage source code efficiently. However, I cannot run the site locally on XAMPP, as I am getting numerous PHP errors. What would you recommend in this case?
Maybe creating another folder on the server with identical content, just for testing? Is it possible then to run Git on the server?
I operate in a team of devs, and this is what we do:
Locally, we each set up our own WordPress install under a separate vhost with locally modified DNS. Get plain-jane WordPress running perfectly first.
Create a dev testing site on a live server on the internet (often the real URL prefixed, e.g. dev-www.mysite.com).
Create a Git repository and have it auto push-deploy to the dev testing site (we create the folder structure in Git from wp-content down, and only track custom themes/plugins).
Configure Git to MANUALLY push changes to the production server (for going live).
On all systems we install the wp-migrate-db-pro plugin. Locally, devs only PULL from the shared dev site to their local copy (which means planning what/when/where you create content).
Copy around the /uploads folder, in case there are a bunch of images/media uploaded (see the sketch after this list).
This works even though some of us run Windows/IIS/WordPress, some Mac/Apache/WordPress, and some Linux/Apache/WordPress.
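For the uploads folder, a one-liner along these lines is usually enough (hypothetical host and paths; assumes rsync is available on both ends):

# mirror the dev site's uploads down to a local working copy
rsync -av --delete user@dev-www.mysite.com:/var/www/wp-content/uploads/ wp-content/uploads/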
As a general idea, a deployment process could go like this:
developers push their git changes to the server 'staging' area
either manually or via git's post-receive processing, changes are pushed as needed to a virtual WordPress web host that acts as a test or 'staging' server. This can be password- or geolocation-protected
once it's tested properly, the code can be moved to the production web host files; this can be accomplished with a simple script that goes: stop web server - back up files - nuke files - move files from staging - start web server (sketched below)
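A minimal sketch of that go-live script, assuming Apache and hypothetical paths (adapt service names and directories to your server):

#!/bin/sh
set -e  # abort on the first error
sudo service apache2 stop
# back up the current production tree before touching it
tar czf "/var/backups/site-$(date +%Y%m%d%H%M%S).tar.gz" -C /var/www/production .
# nuke and replace with the tested staging tree
rm -rf /var/www/production
cp -a /var/www/staging /var/www/production
sudo service apache2 start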

Continuous deployment over a cluster of servers

I am currently working with a startup that is in a transitional phase.
We have a PHP web application and use continuous integration, with the standard unit and regression tests (Selenium) run on Jenkins. We have a development server which hosts newly committed code, and a staging server that holds the build ready for deployment to the production server. The way we deploy to the production server is through a rudimentary script that pulls the latest SVN copy and overwrites the changes in the htdocs directory. Any SQL changes are applied via the sync feature of MySQL Workbench.
This setup works fine for a very basic environment, but we are now transitioning from single-server setups to clusters due to high traffic, and I have come up against a conundrum.
My main concern is: how exactly do we switch deployment from a single server to a cluster of servers? Each server will have its own htdocs and SQL database, and under the current setup I would need to execute the script on every server, which sounds like an abhorrent thing to do. I was looking into Puppet, which can be used to automate sysadmin tasks, but I am not sure whether it is a suitable approach for deploying new builds to a cluster.
My second problem is to do with the database. My assumption is that the code changes will be applied immediately, but since we will have master/slave DB replication, my concern is that the database changes will take longer to propagate and thus introduce inconsistencies during deployment. How can the code AND the database be synchronised at the same time?
My third problem is related to automation of database changes. Does anyone know of a way I can automate the process of updating a DB schema without having to run the synchronisation manually? At the moment I have to run the Workbench sync tool by hand, whereas I am really looking for a commit-and-forget approach: I commit, and the DB changes are automatically synchronised across the dev and QA setups.
I am running a similar scenario, but I use a cloud provider for my production environment, so that I do not need to care about DB replication, multiple server instances, etc. (I am using Pagoda Box, but AWS would also work perfectly fine).
I would recommend creating real database migrations, so that you can track them via SVN or something else. That way you can also record how to roll back. I am using https://github.com/doctrine/migrations, but mainly because I use Doctrine as my ORM.
If you have a migration tool, you can easily add a command to your deployment script to run those migrations after deployment.
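For example, with doctrine/migrations the deployment script could end with something like this (the exact binary path and command name vary between versions of the library):

# run any pending migrations non-interactively once the new code is in place
php vendor/bin/doctrine-migrations migrations:migrate --no-interaction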
I don't think that database synchronisation is a big issue during deployment. That might depend on the actual infrastructure you're using; cloud providers like Pagoda or AWS take care of it for you.

Team Environment Setup and Remote Working

We are working on a project which is getting bigger and bigger.
Till now our work looked like this:
I have a web server; I code in Vim directly over SSH.
Another programmer sends me new files, and I add and integrate them (he has a copy of the project and a local server).
The designer sends me the design; I integrate that too.
This is pretty much a waste of time, because every time they make a change I have to integrate it.
Now we have a host which doesn't support SSH (so I cannot use Vim there anymore). How should we work on a project like this? How should I set up Vim to work on a remote project (I don't want to download and upload the file every time I want to change it)?
You didn't mention your OS. That would certainly be helpful if you want a precise answer.
The first thing would be to find a better host. You can rent very decent dedicated servers for as low as 40 or 50 € or less. If your project is big and serious, 50 €/month, or 100 €, or 200 € is perfectly acceptable, and you can install/enable whatever you need. Depending on the size of your project, a VPS could be enough. Whatever the price, a web host without SSH access is worse than shit.
But you may not have any power on that area.
Since your server doesn't support SSH, a proper VCS is not an option. The only practical solutions I see are rather "old-school" but they work:
Solution A:
Download the whole site on your local machine with an FTP client.
Edit locally.
Test your changes with a local web server.
Upload the changed files when your tests are OK.
Solution B:
Connect to your server with an FTP client.
Use its "Edit Locally" feature to open the files in Vim.
Write your changes, the file is automatically updated on the server.
Solution C:
Use Vim's bundled netrw plugin: :e ftp://host/path/to/file. See :h netrw.
Note that the process will always be download -> edit -> save -> upload, whether you notice it or not. Depending on the solution you choose, the process can be horribly repetitive and inefficient, or almost completely invisible.
But, seriously, get another server and use a VCS and a local server.
I recommend using version management software like git with SSH hooks that automatically upload changes to your server.
You can use a version manager like git and do a git pull on the web server every time you have a stable version.
Your collaborators can push the new content, and you don't need to manage the files yourself.

What is a good solution for deploying a PHP/MySQL site via FTP?

I am currently working on a web application that uses PHP and MySQL, but I do not have shell access to the server (working on that problem already...). Currently, I have source control with Subversion on my local computer, and I have a database on the local computer that I make all changes to. Then, once I've tested all the updates on my local computer, I deploy the site manually: I use FileZilla to upload the updated files, then dump my local database and import it on the deployment server.
Obviously, my current solution is not anywhere near ideal. For one major thing, I need a way to avoid copying my .svn files... Does anyone know what the best solution for this particular setup would be? I've looked into Capistrano a bit, and Ant, but for both of those my lack of shell access looks like a problem...
I'm using Weex to synchronize a server via FTP. Weex is basically a non-interactive FTP client that automatically uploads and deletes files/directories on the remote server. It can be configured to not upload certain paths (like SVN directories), as well as to keep certain remote paths (like Log directories).
Unfortunately I have no solution at hand to synchronize MySQL databases as well...
Maybe you could log your database changes in "SQL patch scripts" (or use complete dumps), upload those with Weex, and call a remote PHP script that executes the SQL patches afterwards.
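In shell terms, what that remote script needs to do is essentially the following (a sketch with hypothetical names; on this host you would translate it into PHP, e.g. with mysqli, since there is no shell access):

# apply all SQL patch scripts in order against the live database
for f in patches/*.sql; do
    mysql -u dbuser -p'secret' dbname < "$f"
done

In practice you would also record which patches have already been applied, so a patch never runs twice; the next answer describes exactly that.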
I use rsync in production, but you could do this:
Add a config table to your site to hold what level of DB you are currently at.
During development, store each set of SQL changes in a single file (I use something like delta_X-up.sql). These stay in your SVN as well. So, for example, if you are at delta_5 and add a table between the current release and the new release, all the SQL needed is put in delta_6-up.sql.
When it comes time to build, export the repo instead of using a checkout. This lets you ignore all the SVN cruft that comes along, since you won't need that in production.
Use Weex to push those changes into production (this is where I would use rsync, but you don't have that option). Call a remote script that checks your config DB to see what delta level you are currently at, parses the directory with your delta_X-up.sql files, and sees if there are any new ones. If there are, it reads them and runs the SQL inside. (A sketch of the build side follows below.)
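For illustration, the local build side of this could be as small as the following (hypothetical repo URL and site name; the Weex invocation depends on how your Weex config is set up):

# export a clean copy (no .svn directories) and push it with Weex
svn export https://mySvnRepo/yourWebSite/trunk /tmp/release
weex yoursite   # host, credentials and ignore rules come from Weex's config file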
You can do a subversion export (rather than a checkout) to a different directory from your working copy; that will remove all the .svn stuff for you.
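For example (paths hypothetical):

# produce a clean, deployable copy of your working copy without .svn directories
svn export /path/to/working-copy /path/to/clean-copy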
