Development and production environment in PHP - php

I would like to set up two environments for my new website written in PHP: one for developing and testing new versions, and a second, production environment where the stable version of the site will be available.
The website will consist of many PHP and other files (JS, images, and so on), so I am wondering how best to prepare these environments to make source control easy, to copy the site quickly from development to production, and to make the development version available on the web so people can see the current progress, suggest changes, or report bugs.
Could you please give me some advice on where to go from this starting point? Are there books about this (from a practical point of view), or do you have experience or tips on what to watch out for and what is important to make this process easy and pleasant for me and the other people involved in developing a new project?

For starters, use the following three tools:
SVN - this will give you source control and allow you to track changes. You may want to get GUIs on top of this (Tortoise is a popular one) to ease the learning curve.
RSYNC - this will allow you to streamline your syncing between local and remote site with a single command. RSYNC uses a diff engine to sync which means that incremental syncs happen in a matter of seconds. During intense programming, I will sometimes sync 4-5 times in one hour pushing out little changes real fast just because I can so easily.
MySQLDump - This will allow you to import/export data from your production site. I usually do this once a week to get production data on my local servers which not only gives me a local backup but also lets me toy around with production data on a local test environment.
Those three alone will save you a lot of time in the long run and allow you to scale. Later on you can look into automated build tools, unit testing frameworks, XML documentation frameworks and the like to build some serious products.
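To give a feel for how little glue is needed, here is a rough sketch of a weekly sync script in PHP that shells out to mysqldump and rsync; the host, credentials and paths are placeholders, not anything you have to use:

    <?php
    // weekly-sync.php - illustrative only; run from cron on the development box.
    // Grabs a dump of the production database and mirrors the remote files locally.
    $remote = 'user@example.com';          // placeholder SSH login
    $dbUser = 'dbuser';                    // placeholder credentials
    $dbPass = 'secret';
    $dbName = 'mysite';

    // Dump the production database over SSH into a dated local file.
    exec(sprintf(
        'ssh %s mysqldump -u%s -p%s %s > backup-%s.sql',
        escapeshellarg($remote),
        escapeshellarg($dbUser),
        escapeshellarg($dbPass),
        escapeshellarg($dbName),
        date('Y-m-d')
    ));

    // Mirror the remote document root into the local working copy; rsync only
    // transfers what changed, so repeat runs take seconds.
    exec(sprintf(
        'rsync -avz %s:/var/www/mysite/ /home/me/sites/mysite/',
        escapeshellarg($remote)
    ));

The same two commands run by hand are just as effective; the script only saves you from retyping them.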

I work with a setup like this, so I can give you some tips on how to do this. I've been doing this for a while now, working out the kinks here and there, and feel like this is a setup I can honestly say is pretty darn productive.
Small note: I work on OS X, so the specific applications used might be a bit different for you if you're a Linux/Windows user.
I run a local development 'server' on my Mac, using MAMP (www.mamp.info) to easily supply me with an Apache server with PHP and MySQL. You could use a similar tool such as XAMPP or install everything manually; it's really up to you.
Then I have my live servers, where my websites and customer websites are hosted. For each new website project (let's take abc.com as an example) I create a subdomain called staging.abc.com, on which I do my testing. It's always a good thing to test on the exact same hardware and software before actually going live.
I use Subversion (or in short, SVN) for my versioning needs, with the added bonus that I can easily add 'hooks' to automatically update my online production server whenever I send my newly updated version to the SVN server. SVN also allows you to easily work with more than one person on the same project. For more information on SVN and how to use it, I suggest the great (and free) online book found here: http://svnbook.red-bean.com/
So in short: I work locally with MAMP providing me with a local 'working' server. After that, I test online on a staging.abc.com location to see if everything works well, and to possibly allow others to see the project (in case you want your client to see what's going on, for example), and after that I actually publish the project by putting it on the actual domain.
There are many more things that can be done to optimize your workflow, but this should get you started.
Hope this helps!
-Dave

I prefer to have development occur on the developer's local box if possible. If other developers are involved, you probably want to set up your version control so that the database schema, JavaScript, CSS, and PHP code can all be checked out and set up on a developer's personal box fairly easily (assuming they have the correct LAMP/WAMP setup).
I've also seen people maintain a test website on a server where active development occurs. I would avoid this for active development, but use it for black-box testing of the latest checked-in code (the latest build).
Once your test website checks out, it's then a matter of exporting the code from your version control to the location where the live website lives. With SVN, you can really just update the live code with svn update, specifying a revision or tag that indicates the current live version.
I would further recommend keeping some settings, like db access/username/pass, in a separate included file that is not version controlled. Keep this elsewhere, let developers plug in the access rights to their local database on their PC. On your server, plug in everything you need to access the database there. This should be really trivial code (defining a few variables) so not having it version controlled shouldn't be a big deal. If you like, you could version control a templated version, but I wouldn't put the real database info into version control.
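As an illustration (file name and constants invented here, not a convention you have to follow), the un-versioned include can be as small as this:

    <?php
    // config.local.php - kept OUT of version control; each developer and the
    // production server has its own copy. A commented template such as
    // config.local.php.dist can be versioned instead.
    define('DB_HOST', 'localhost');
    define('DB_NAME', 'mysite');
    define('DB_USER', 'dev_user');
    define('DB_PASS', 'dev_password');

The application then just does require __DIR__ . '/config.local.php'; before opening its database connection.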

Here is a pretty good starting place if you want to use MAMP or WAMP to develop locally, push that to GitHub, and then update your live site from GitHub:
http://www.mybringback.com/bringers/14509/git-local-and-shared-server-development-environment-with-ssh-setup/
Hope that helps.

Related

How do bigger & smaller web devs test out their code before deployment?

I've been trying to learn more about PHP and server-side code in general, and I hit a point where I realized that if I want to upgrade my website, I would need to take it down and copy the code over while users were online.
Furthermore, since I haven't launched my website yet and it isn't indexed by Google, it rarely gets any visits, so I'm free to upload and test whatever I want using PhpStorm + FTP. But I realize that once I finish my project and have users, I wouldn't want to change things while they were using them.
How do people write code, debug, and set things up before deployment to verify that their website will function fine? Also, how would you copy over the code? Both for a large website (e.g. apple.com, cnn.com, sites that need to be up 24/7) and for smaller blogs/websites.
You're asking about Continuous Integration / Continuous Deployment ('CI/CD').
Basically you'll want to make use of a testing server to test your code on before the public get to see it. You deploy out your code to this server and test it thoroughly, confirming that everything is ready to go out to production. With more in-depth setups this testing can be automated with tools like Selenium or TestComplete.
In theory, you should be deploying a 'release cut' of a GitFlow workflow out to your testing environment. This ensures that all planned changes have been finalised and are ready to ultimately be shipped out to the public without any other changes getting mixed up in the release.
When you're finally ready to ship out to production, you'll want to plan the release, and meet with any associated parties. I'd recommend a proper release checklist, where you confirm that all of the additional features are working as expected, and that you're not accidentally removing any existing functionality (regression testing).
Note that a deployment out to production should be no different from a deployment out to a test environment; the only things that should change in the production environment are the server on which it is hosted and the database configuration. This ensures that you're not accidentally testing with, or using, any data that your customers might also interact with.
The production deployment itself should (essentially) just be a 'copy-paste' of the files to disk, and is often done with a Continuous Integration tool like Jenkins or TeamCity. Assuming you only have a small website, this should be an almost instantaneous procedure, and shouldn't even require any downtime on your production environment. If the process is likely to take a while longer (such as with a complex deployment), you may want to implement a maintenance page. This will inform your users that you're working on the website, and let them know when they can expect functionality to resume.
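The maintenance page itself can be trivial. A minimal sketch, assuming a single front controller (the flag file name and static page are made up here):

    <?php
    // At the top of index.php: if the flag file exists, answer every request
    // with a 503 and a static maintenance page, then stop.
    if (file_exists(__DIR__ . '/maintenance.flag')) {
        http_response_code(503);
        header('Retry-After: 600'); // suggest retrying in ten minutes
        readfile(__DIR__ . '/maintenance.html');
        exit;
    }

Deploying then becomes: create the flag file, copy the new code, delete the flag file.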
I'm a developer for an e-commerce website. When we were smaller (50-75 users online at a time) I would just use FTP (e.g. FileZilla) to update the files I had changed with my latest edits; people would be served my changes and the website would not go down at all. For small edits, I would also just SSH in and manually edit the file if it needed to be done quickly.
Now we have grown much larger (100-200 users at a time) and have other developers. To be fair I should have been doing it this way before, but now I deploy directly from PhpStorm/version control. I have working branches, and once my work is ready to be deployed I'll merge it into the master branch.
Once work is pushed into the master branch, I have a script that checks for changes and clones the repo (this happens automatically).
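A rough sketch of what such a script might look like in PHP (the token, paths and branch are placeholders, and it assumes the web server user is allowed to run git in that checkout):

    <?php
    // deploy.php - hit by a webhook (or cron) after a push to master.
    // Illustrative only; adjust paths, branch and authentication to taste.
    $secret = 'replace-with-a-long-random-token';
    if (!isset($_GET['token']) || !hash_equals($secret, $_GET['token'])) {
        http_response_code(403);
        exit('Forbidden');
    }

    $repoDir = '/var/www/mysite';
    chdir($repoDir);

    // Fast-forward the checkout to the latest master and keep a log of the output.
    exec('git pull --ff-only origin master 2>&1', $output, $status);
    file_put_contents(
        $repoDir . '/deploy.log',
        date('c') . "\n" . implode("\n", $output) . "\n",
        FILE_APPEND
    );

    echo $status === 0 ? 'Deployed' : 'Deploy failed, see deploy.log';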
There is a lot to cover here, and some of it is opinion based (which I will try to avoid).
How do people write code, debug, and set things up before deployment to verify that their website will function fine?
There are many ways to create a local development site, but most modern approaches involve containers. Docker is the most popular, which allows you to create replicas of a web server, database, etc. and run them locally on your machine. You can also use virtual machines to create a replica of your production server, or simply install the needed packages (i.e. php, apache, mysql) on your local machine.
Larger companies will typically have a QA (Quality Assurance) department who test all changes before they are deployed. As others have mentioned, some of this can be automated with testing software. When there is a QA team, you have a staging server where changes are first deployed. Once the changes pass QA and any automated tests (i.e. PHPUnit) they are deployed to production.
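To make "automated tests" concrete: a PHPUnit test is nothing more than a class of assertions. In this sketch, slugify() is a hypothetical helper from the application under test:

    <?php
    // tests/SlugTest.php - run with: vendor/bin/phpunit tests
    use PHPUnit\Framework\TestCase;

    class SlugTest extends TestCase
    {
        public function testTitleIsConvertedToUrlSlug()
        {
            // slugify() is assumed to exist in the application being tested.
            $this->assertSame('hello-world', slugify('Hello, World!'));
        }
    }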
Also, how would you copy over the code?
This can get very opinionated, so I will just give you different options and not argue for or against any of them:
Use a tool like rsync to transfer the files. With this approach, you typically have a staging server where you check out your code from version control (or manually copy it if you aren't using version control), run any build scripts such as composer, gulp, etc., and then sync the files to production. In a larger setup, you wrap all of this in a script to automate it.
If you are using version control (let's use Git as an example), you can also pull the code down to your production server (as opposed to pushing it up to production) by running git pull on your production server. This will get all the latest changes; then you run any build scripts such as composer, gulp, etc.
There are also tools for continuous integration (e.g. Jenkins) that facilitate a lot of the work mentioned above. You can go as far as setting up hooks on GitHub so your code is automatically built and deployed when you merge into master.
On a small or personal project, it might be simpler to just use FTP and manually transfer the files.

Control multiple versions of PHP project

I've stumbled upon the following problem and can't figure out a decent solution. I make websites in PHP for various clients. Like all clients, some will find bugs that need fixing, some will request updates to live sites after a few months and some do both.
When working on updates for clients, I like to preview them to the clients before putting them live.
I've used a couple of different solutions in the past, but none I'm happy with. What I've done so far:
Define VERSIONs and CURRENT_VERSION in the index. Visitors see the approved version. I send the client a specific link that sets a $_SESSION variable, which lets them see the new version. In the code I work with if's and switches to show new things depending on CURRENT_VERSION. This works, keeps all the code in one place and easily allows bugfixing, but riddles the code with stupid if(CURRENT_VERSION >= V2) statements. I also can't use this in CSS files.
Put all files in a "build1" folder, when starting on big updates that need previewing, I copy everything to a "build2" folder. I upload the "build2" folder for the client and password protect it. This works pretty well, however, if I have to bugfix build1 while working on build2, I have to make sure I copy fixes from build1 to build2.
Use a development server (some clients can provide this) this works the best so far, as the dev server is separate from the live site. It works much the same as the second solution, where I have to make a copy of the project, but it feels cleaner to me.
I am, however, looking for a better way to manage my code, possibly with the use of Git/SVN, but I don't know enough about these tools to tell whether they could help me.
A fairly typical paradigm these days is Development/Staging/Production. You don't need an entire development server for this approach either, VirtualHosts/nginx equivalent will suffice.
I'd suggest that the first thing you need to do is get your projects into Git, the quicker the better!
Disclaimer: This is my workflow, there are many like it, but this one is mine.
Here's an example of my current work flow.
GitHub
A separate repository for every project I've taken on
Development Server
Bare Git Repositories, replicating my GitHub (I'll explain this shortly)
/opt/git-bare
VirtualHosts of all my projects
/var/www/vhosts
Local Machine
I clone my bare repositories as needed, for quick editing and commits. I'm not worrying about FTPing files back and forth or mounting anything locally onto my machine. I find this the BEST way to work on a project. When I am ready to check out some code on my development server, I simply commit and push my work to the bare repository, where I have a post hook that then tells my development VirtualHost to update itself.
This means that within seconds of committing/pushing my work from my local machine, I can see it on my development server through my browser.
When I am happy with what I have seen on my development server, I then push my bare repository up to GitHub. Git is a wonderful tool and all my local commits are also available within the logs on GitHub.
Staging
This is a clone of my master branch on GitHub. This is what I use for showing to clients and getting changes signed off.
Production
My production server is a clone of a tag from GitHub. No matter what I do within my master branch, production will never be affected, and should anything ever go wrong with one of my servers I have this tag readily available for rebuilds.
If you have any questions about this, please just fire away.
Before my long answer: if you're looking for a host that would kind of do this for you, stackable supports the concept of multiple 'environments'. I'm sure other hosting platforms offer similar features that allow essentially the same thing (AWS Elastic Beanstalk, for example), but I don't know of one where it is as core to the offering. Note: I don't have any connection to stackable, I'm not even a customer.
Define VERSIONs and CURRENT_VERSION in the index...but riddles the code with stupid if(CURRENT_VERSION >= V2) statements. I also can't use this in CSS files.
If I recall correctly, this is actually similar to how Facebook rolls out changes. You're right, it adds that additional logic; however there's an advantage as you're able to 'preview' the changes to more than a single user (say, all users that are admins, or all users in a specific geographic location).
And of course, the preview uses the same data - which means the user previewing the site will use it like they normally do (instead of odd interaction with contrived data).
While you're right as to the disadvantage, there are cases where this is a useful way to test new features.
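For reference, the whole pattern fits in a few lines. This sketch follows the constant names from the question; the session key and query parameter are arbitrary:

    <?php
    // bootstrap.php - decide which version of the site this visitor sees.
    define('V1', 1);
    define('V2', 2);

    session_start();

    // The preview link sent to the client carries ?preview=2 once; the choice
    // is then remembered in the session for subsequent requests.
    if (isset($_GET['preview'])) {
        $_SESSION['preview_version'] = (int) $_GET['preview'];
    }

    define('CURRENT_VERSION',
        isset($_SESSION['preview_version']) ? $_SESSION['preview_version'] : V1);

    // ...and later, in templates or controllers:
    if (CURRENT_VERSION >= V2) {
        // show the new feature
    }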
Put all files in a "build1" folder, when starting on big updates that need previewing...however, if I have to bugfix build1 while working on build2, I have to make sure I copy fixes from build1 to build2.
Here you're essentially deploying two versions of the project to the same server. In the example you give, you're putting the second copy under the original webroot - but depending on hosting, you could just assign a subdomain and work from two different web roots.
The advantage is similar to the first, as both installs could easily share the same data, and if all requests pass through some kind of front controller you can add logic to only show changes to select users (or use some kind of Basic Auth as you describe).
In this case putting your project into version control (as I see it, Git would be better than SVN for this) can make this much easier. On your development system, simply switch between branches to work on the existing version and the new version.
If you fix a bug in the old version, you should be able to easily (or more easily than your current workflow) merge that fix into the new version with a few commands. If you fix a bug in the new version which was also in the old version, doing a cherry-pick can allow you to just merge that single change back into the old version.
Deploying your code can be as basic as logging into your web server and doing a git pull, or you could use tools to automate the deployment. Essentially your deploy of the old version would be based on the 'master' branch of your repository (or something similar to that), and the new version would be based off whatever you've called that branch.
Use a development server (some clients can provide this) this works the best so far, as the dev server is separate from the live site. It works much the same as the second solution, where I have to make a copy of the project, but it feels cleaner to me.
As this is very similar to your second method, adding version control will certainly make this easier as well.
There are plenty of resources explaining how to deploy from various version control systems to various hosting platforms, but hopefully this illustrates how that will fit into what you're already doing and make things easier for you.

PHP Code Deployment Tips

In the past, I have been developing in a very amateurish fashion, meaning I had a local machine where I developed and tested code and a production machine to which I copied the code when I was done. Recently I modified this slightly to where I developed locally, checked the code into SVN and then updated the production machine through SVN.
Now I would like to start a new project and improve my workflow. Ideally I had the following in mind:
Have one or more local dev environments
Develop and test on local machine(s)
Use SVN (or Git) as code repository
Use a build tool to set up new environments (either dev, staging or production) and deploy code
Since I am not very familiar with this process, I am looking for suggestions on how to best set this idea up and the tools to use, especially when it comes to the build tools. I was looking into Ant and Phing (possibly make), but I am so new to this that I would really like to get some guidance. Are there any good tutorials or books about PHP deployment, especially for beginners? What I am especially interested in are the following topics:
Deployment to different types of servers with different settings (e.g. dev uses different db, db passwords, PHP error reporting than production or staging).
Deployment that automatically pulls code from SVN.
Deployment that temporarily sets a "Maintenance" page for production environment.
Once I mastered the above, maybe even do some testing in the build process.
I know my question might sound quite confused... I admit, I am new to this and might be a little off the target in what I really need. That's why any help is greatly appreciated.
I would suggest making your testing deployment strategy a production-ready install-script -- since you're going to need one of those anyway eventually.
A few tips that may seem obvious to some, but are worth pointing out:
Your config file saved in your VCS should be a template, and should be named differently from the file that will eventually contain the actual settings. E.g. config-dist.php or config-sample.conf or sample/config-mysql.php or something along those lines. Otherwise you will end up accidentally checking in a server-specific configuration file over your template.
For PHP deployment, anticipate that some users will not be able to run server-side scripts through any mechanism other than the web server itself. A PHP-based installer is almost non-negotiable.
You should include a consumer-friendly update mechanism, and for that, WordPress is a great example of a project to emulate. A PHP script can (a) download the latest build, (b) use the FTP functions to update your application's files, and (c) execute an update script which makes the appropriate changes to the database, etc.
For heaven's sake don't do like [redacted] and make your users download and install separate patches for each point release. Have them download the latest (final) release which contains all the updates to date, and applies the correct ALTER TABLE functions in sequence.
Whether the files are deployed via SVN or through FTP, the install/update mechanism should be the same: get the latest files, run the update script. The updater uses the version listed in the PHP script and the version listed in the DB, and uses that knowledge to apply the appropriate DB patches in order. As for how to generate those patches, there are other questions here that you can refer to for more info.
As for the "Maintenance" page, just use the version trick mentioned above to trigger it (compare the version in the DB against the version in the PHP code). It's also useful to be able to mark a site as "down" to the public but make it visible to admins (like Joomla does), which you can trigger through database or filesystem flags.
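A stripped-down sketch of that version check, with the table name, patch layout and credentials invented for illustration:

    <?php
    // update.php - compare the version shipped with the code against the
    // version recorded in the database and apply any missing patches in order.
    define('APP_VERSION', 5); // bumped with every release

    $pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');
    $dbVersion = (int) $pdo->query('SELECT version FROM schema_version')->fetchColumn();

    // While $dbVersion < APP_VERSION, the front controller can show the
    // maintenance page mentioned above.
    for ($v = $dbVersion + 1; $v <= APP_VERSION; $v++) {
        // patches/patch-2.sql, patches/patch-3.sql, ... each holds the
        // ALTER TABLE statements for one version step.
        $pdo->exec(file_get_contents(__DIR__ . "/patches/patch-$v.sql"));
        $pdo->exec('UPDATE schema_version SET version = ' . $v);
    }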
As for automatically pulling code from SVN, I'd say you're better off with either a cron script or with commit triggers than working that into your application, since it wouldn't be relevant to end users.
This isn't exactly part of your question, but it's relevant:
If you go into distributing code intended for a wide audience, I would advise you to go with building and distributing OpenSSL-signed PHAR packages. You can distribute them over HTTP without a problem, and because they're OpenSSL-signed, you're also mitigating the risk of man-in-the-middle attacks and protecting end-users/customers/clients from someone injecting code if you want to setup an automatic or one-click update.
There's a set of tools I've contributed to in the past that work great for this, but you'll either need PHP 5.3, or you'll need PHP 5.2 with PHAR installed via PECL. https://github.com/koto/phar-util
As far as testing goes, PHPUnit is the de facto standard.
If you are interested in using Git then you should check out this build system from CodeMeme. From what you described it sounds like it would be a good fit. You can add it to any project as a submodule, and with the included code you can tailor a build script that will deploy to multiple servers in multiple environments. It uses Git to build the code for deployment, but unfortunately SVN is not supported.
https://github.com/CodeMeme/Phingistrano

Multi-Developer-Setup with SVN-VC and remote test-servers for each developer. Best practices?

I would like to have some input on how a professional development setup with the following requirements might look like.
several PHP developers
each developer belongs to one group
each group has one team-leader who delegates tasks
each developer works on one Windows 7 machine
and develops either with NetBeans or Eclipse
each developer 'owns' one virtual test-server where he can run the code
the VCS in use is SVN
there is a staging server where the product is ultimately tested before it gets released/deployed
I named specific technologies so as not to be too abstract, and because I would also be interested in concrete suggestions for plug-ins etc.
There are several questions coming to my mind in that setup.
1) So every developer will work on a personal branch.
2) This branch is checked out into a working copy.
Now ... this working copy is edited locally on the PC with the dev's IDE and executed/tested on the server.
What would be in that case the best/usual way to do that? I mean - how do you get your edited code on the server without causing too much overhead?
Would the dev have the code on his local disk at all? Or would it be better to have the IDE write on the remote virtual server through a tunnel or via a specific protocol?
3) Every day a dev will commit his work into his personal branch which resides in a central repository.
Is there a best practice on where the repository is supposed to be located? A separate server?
4) Then after a dev finished his task either s/he or the team-leader merges the new code into the respective main-branch or trunk.
The most confusing part is what I wrote between 2) and 3), because so far I have only worked with a local server - for example a VM running the code from a shared folder, so I can edit it directly. I'm not sure how to bridge the gap efficiently when the server is actually remote. Efficiently means not having to upload manually via FTP, for example.
Also external sources or book recommendations are welcome.
edit
My questions are aimed at a quasi-standard / best practice. I think this is a pretty standard development scenario, so there must be a 'usual' solution.
edit 2
Okay ... so let's try with a picture:
V is the virtual test-server for one or more developers D. C and C' are the two code-versions. They should be kept as identical as possible.
Two solutions come to my mind:
1 : Edit C, then upload it to C', then execute C', then commit C.
2 : No C exists; just C', which is edited through some tunnel technology and then executed and committed.
My gut tells me both solutions are semi-optimal. So what would be "professional" / most efficient / fastest / most convenient / most friction-less / least error-prone / best practice / industry standard?
Any questions?
Maybe it's not of great help, but Git sounds like a perfect fit for your problems; I recommend taking a look at the Git features. And if you have time, check out Linus Torvalds himself talking about Git: http://www.youtube.com/watch?v=4XpnKHJAok8
The standard procedure is more or less as you describe; I also use this approach with my team. It can also be called staged application development.
Here is how I am doing it: I use a remote SVN host (e.g. assembla.com, unfuddle.com) to store all my code, and my team members commit their work to these remote SVN servers. You can also buy a VPS, set up SVN there and use the same approach.
Best practice is to test locally and commit as often as you can, but every commit should solve a problem or add a significant piece of a new feature.
Once everyone has committed, the lead developer can log in to the staging server via SSH using a tool like PuTTY. The first time, the lead developer has to check out the code into the folder where it is to be located. Conflicts may arise during this phase if multiple developers have edited the same part of a file; the lead developer should resolve these first and then proceed with the checkout. From then on, the lead developer only needs to run svn update on the staging server to bring the code up to date.
The basic idea is to get the code working on the local setup, commit, update staging to test the application in a simulated scenario, and then push it to the live site.
There are a lot of ifs and buts here that would need a whole chapter to cover :) but in short this is the gist.
Tools (you can use to work under this setup):
- Tortoise SVN Manager
- PuTTY
- NetBeans
hope it helps :)
I don't like working with personal branches. I worked with ClearCase for almost 15 years, and even though ClearCase probably handles personal branching better than most, it was still a big pain. Even worse, personal branches encourage people not to commit their work until the last minute, usually a day or two before a major release.
For that reason, and to force developers to stay on track with each other, I highly recommend everyone working together on a single branch (or on the trunk) as much as possible. I keep telling developers to take small bites when they make changes.
What you sound like you need is a way to automate the deployment. That is, I make changes on my local machine, and with a single command, I make sure that the server has a duplicate copy of the code. You also want the deployment to be efficient. If you change a single 2-kilobyte file in a 2-gigabyte, 10,000-file deployment, you only want to copy over that one file, not all 10,000 files. For that, I would recommend you write a deployment script in Ant.
Your developers can modify files, then deploy those files via an Ant script. The developers don't have to remember which files they updated, because Ant will automatically handle that. In fact, Ant can even modify files to make sure they contain the right environment information as they get copied over. And, of course, Ant can rearrange the files if the setup on the server is different from the setup in the source repository. Both NetBeans and Eclipse can execute Ant scripts right in the IDE.
So:
Have your developers modify code on their local machine.
Run an Ant script to make sure the server and the local machine are in sync.
Test on the server.
Then, check in their changes once they're happy with the results on the server.
Someone mentioned a continuous build system like Jenkins. That would actually be a good idea anyway, even though it doesn't solve this particular issue. Jenkins can have its own server and database. Then when you commit your code, Jenkins updates the server, runs automated tests and creates a report. It all gets displayed on the Jenkins web page. Plus, you can archive your deployments on Jenkins, so if you tell someone to test "Build #20", they can simply pull it off of Jenkins where it's easy to find.
I'm sure everyone has different ways of doing things but here are my thoughts.
"Best Practice" is probably "Continous Integration" ie each developer doesn't have their own branch but checks in to a common development branch. This forces them to handle conflicts and coordinate with each other early and often to avoid the lead developer from managing a huge train wreck merge later down the road. Take a look at cruisecontrol if you really want to go that route.
The best way is if they have a local apache web server and full php stack. You can use the Zend_Server community edition to get up and running on windows fast. Most standard php code will run just fine on both Windows and Linux, but if you are doing lots of file manipulation or cron job or cli stuff, or need memecache, etc you'll run into incompatabilities. If thats the case and the Linux only stuff is going to bite you use VMWARE or VirtualBox to run local linux instances and install the IDE inside those and just make sure they have goobs of RAM to deal with it.
Each developer needs to run a syncronize inside of Eclipse, basically an svn update, deal with any conflicts with the other developers right then and there, do local testing and commit their changes.
I setup a post_commit hook on the svn server that calls and /autobuild.php on my web server. autobuild.php runs svn update and gets the latest code changes as well as does any chown or chmod file permissions stuff and resets any server specific config files config.php. Its a little tricky to get it setup so that the apache user can run svn update, but once you do your beta/testing server always has the latest committed code. CruseControl, and several others can also help you do this sort of thing and add unit testing, etc
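For the curious, a bare-bones autobuild.php along those lines could look something like this; the paths, permissions and config copy are placeholders, and it assumes the Apache user has been granted the right to run svn update on that working copy:

    <?php
    // autobuild.php - called by the SVN post-commit hook on the test server.
    // Updates the working copy, fixes permissions and restores the
    // server-specific config. Purely illustrative.
    $webroot = '/var/www/beta';

    exec("svn update $webroot 2>&1", $output, $status);

    if ($status === 0) {
        // Make cache/upload directories writable by the web server again.
        exec("chmod -R 775 $webroot/cache $webroot/uploads");
        // Restore the config that belongs to this server, not to the repo.
        copy('/etc/mysite/config.beta.php', "$webroot/config.php");
    }

    file_put_contents(
        '/var/log/autobuild.log',
        date('c') . ' status=' . $status . "\n" . implode("\n", $output) . "\n",
        FILE_APPEND
    );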
Now your lead developer still has a job to do merging the development branch into the production one, testing on the dev server, reviewing the others' commits and deciding how and when to push out a release, but you're not putting the burden on him of resolving every conflict and merging every change.
Your developers are not FTPing files or SSHing into servers; they just work locally in their IDE and interact with each other through SVN (and email, phone, chat, etc.), updating to get the new code and committing as they finish things.
I don't see any good coming out of having a separate branch for each developer using SVN. Merging those branches might work in Git, but with SVN your lead developer will be hating life very quickly with that type of setup.

What is the most effective dev/test/prod environment setup for PHP development?

Unfortunately, I've never had a senior developer or mentor to show me some of the best practices. So I develop sites (PHP/MySQL) on my Windows machine with WAMP, test in hidden (password-restricted) folders on the production server, and finally move sites to the production folder.
I would like to have a more fluid/practical/error-proof setup so that from development > test > production there are no hiccups.
The important points/questions are (you probably can come up with a lot more):
Ease of use
Easy to dev/test modifications after site is live (to avoid tests on production site)
No server difference between local/test/prod (error reporting, apache setting, etc)
Avoid problems with DB differences (e.g. if columns were added, how do you add them to the prod DB?)
Do you skip the test environment, or do dev AND test on the same?
etc...
How do you guys develop PHP/MySQL websites?
Do you use SVN? Do you use IDEs? Do you use VMs?
Thanks.
This is a fairly common kind of question - which is why most seasoned devs don't reply to it - and it generally ends up in a flame war of heated opinions. So, be careful about that.
But you seem to be a nice guy who intends to get on the right path and is looking for something really productive, and I recognize a little of myself in this from a few years ago.
OK, the first thing to keep in mind is: do not blindly follow anyone on anything. Anyone can claim to be a great master, but you can find at least 10,000 people far better and completely anonymous. So, for anything you hear about, do the following: listen, test, and draw your own conclusions. If there is just one golden rule, this is it. Everything else is noise until your own conclusions appear. You are your own final judge.
That said, let me begin with one of the most common questions: the IDE. What should you use? Use the one that lets you produce the most and makes you most comfortable. NetBeans, Eclipse, Vim, Notepad++, Notepad, gedit, Kate, Quanta Plus... you have many options, and each person has their own opinion. Test what you find interesting and go ahead with the one you choose.
The same is true for any methodology, framework or tool. Use it, learn it, and be critical about it. Stick with the one that makes you most comfortable and productive.
The same goes for the development environment. It does not matter that much whether you develop on Windows, Mac or Linux; the important thing is to have the resources you need available. The resources you need can, and generally do, change from one project to another.
So the best environment in which to develop a given project is one that reflects the real environment where production will run. What if you develop with PHP 5.3 OOP features and in the end you deploy to PHP 5.1? That's the point: the final environment is what tells you the best environment to develop in, not the other way around.
For testing, you should map out a strategy. I say this having been a test team lead at IBM for five years. There is a LOT of testing you can perform, but not all of it is really relevant to the current project.
First decide, according to the project's needs, what you are going to test: security, performance, UI display, UI effects, error handling, load and balancing, usability, accessibility...
Take notes on what you are going to test (what, when, where, success criteria), and make a report of successes and failures.
As I said before, the project's needs are what guide you at every step, and testing is no different. If you just need to check the display in different browsers, feel free to use different machines or VMs.
Generally this is sufficient. But if the project requires performance or load testing, then you will need specific load-testing software. I will not go deep into this subject as it is very extensive.
It takes some time to find an ideal match of process and tools, and after achieving that you will always discover a new tool to try or a process tweak that saves you a little time. That's IT for you.
Here's my recommendations:
Have a dev environment purely for development. Keep a staging and/or a live environment based on the resources you have at your disposal. The staging environment is where you test and ensure there are no serious issues with your application. The live environment is basically your production setup; in fact, staging and live should ALWAYS be the same. It is useful to be able to reproduce issues on staging and do a bit of troubleshooting without modifying the code. Bear in mind this also holds true for any associated databases.
Use SVN or some form of version control. This way you will have the ability to fall back to any stable version of the application if someday the world falls apart!
If you are using Linux environments, you can write simple scripts to synchronize the setup with your latest (STABLE) development environment. Ideally, you do your development and run unit tests to ensure everything works as designed. Run a script and the staging environment is updated with the latest codebase. Conduct functional tests on staging and ensure that everything works as per the specs. Run another script and your latest changes are moved into the live/production environment.
My development process is still a bit rough, and I am looking forward to the answers also.
What I do for large projects is set up a Git repo on my Linux desktop and my Windows desktop. I test locally if possible. As components are finished I push my changes to the centrally hosted Git repo (usually a private GitHub account), or pull them to dev (I set up dev as a repo and pull over SSH). All MySQL updates are stored in update files, and I use NetBeans for development (although I have used Eclipse and others, NetBeans just works for me).
I think you hit on all the important points. Personally, I
run the same OS and server software on the production server and my development system. Same versions of PHP, Python, MySQL, Django, etc.
I don't change the DB structure often. I set up the DB tables on the dev system, then use mysqldump to produce the table-creation SQL. I install it on the server using mysql < name_of_sql.file. When I do make changes, I back up the DB and then just do it through the command-line interface. For PHP, I use Doctrine just for the table structure/migration support.
I write everything in Kate (Linux), Komodo Editor(Mac), or Notepad++ (Windows). I don't like IDEs very much, I much prefer to-the-point text editors.
I upload files to a staging dir, and check them with diff before copying them to the actual location.
This isn't the most sophisticated set up, but it's worked quite well for me. I'm the only dev working on the projects I'm involved with, which probably explains a lot. The most important part, that isn't really just personal preference, is the first one - match your dev/test system as closely to the production system as possible, including the OS.
