Creating a safe dev environment - php

We are a small team developing a WordPress site. Until now we have all been editing the same files directly on the live server, which inevitably led to mistakes. We would like to use Git / GitHub to stop stepping on each other's toes and manage the source code properly. However, I cannot run the site locally on XAMPP because I get numerous PHP errors. What would you recommend in this case?
Maybe creating another folder on the server with identical content, just for testing? Would it then be possible to run Git on the server?

I work in a team of devs and this is what we do:
Locally, each of us sets up our own WordPress install under a separate vhost with locally modified DNS, and gets a plain-Jane WordPress running perfectly.
Create a dev testing site on a live server on the internet (often by prefixing the real URL, e.g. dev-www.mysite.com).
Create a Git repository and have it auto push-deploy to the dev testing site (we keep the folder structure in Git from wp-content down, and only track custom themes/plugins).
Configure Git to MANUALLY push changes to the production server (for going live).
On all systems we install the wp-migrate-db-pro plugin. Locally, devs only PULL from the shared dev site to their local install (which means planning what/when/where you create content).
Copy around the /uploads folder, in case there are a bunch of images/media uploaded (a sketch of one way to do this follows this list).
This works even though some of us run Windows/IIS/WordPress, some Mac/Apache/WordPress, and some Linux/Apache/WordPress.
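For the /uploads step, a minimal sketch of pulling the folder down from the shared dev site, assuming SSH access to that server (the host name and paths below are placeholders based on the list above; without SSH, any FTP sync client does the same job):

    # Sketch: mirror the dev site's uploads folder into the local install.
    # user@dev-www.mysite.com and both paths are assumptions - adjust to your setup.
    rsync -avz user@dev-www.mysite.com:/var/www/mysite/wp-content/uploads/ wp-content/uploads/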

As a general idea, a deployment process could go like this:
Developers push their Git changes to the server's 'staging' area.
Either manually or via Git's post-receive hooks, changes are pushed as needed to a virtual WordPress web host that acts as the test or 'staging' server. This host can be password- or geolocation-protected.
Once it has been tested properly, the code can be moved to the production web host's files. This can be accomplished with a simple script that goes: stop web server - backup files - nuke files - move files from staging - start web server.
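A minimal sketch of such a script, assuming a Debian-style Apache service; the backup location and the staging/production web roots are all placeholders:

    #!/bin/sh
    # Sketch of the "stop - backup - nuke - move - start" sequence described above.
    systemctl stop apache2
    tar -czf /var/backups/site-$(date +%F).tar.gz -C /var/www production   # backup files
    rm -rf /var/www/production/*                                           # nuke files
    cp -a /var/www/staging/. /var/www/production/                          # move files from staging
    systemctl start apache2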

Related

Gitstack git server - push to apply changes

(Sorry for the unclear title.) This is the scenario: I have a local server and I installed GitStack on it. I am able to push and pull to this local Git server. The same server also runs the web app that I am working on. My plan is to push commits to this local Git server and have those changes reflected in the web app as well. So instead of using FileZilla to copy files from my machine to the server, I would just push the changes via Git. How can I do that, and is it possible?
I looked inside GitStack's installation folder. I expected to see the actual project files there, but only Git's own files are present.
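What you are seeing is most likely a bare repository, which stores only Git's internal data and no working files. The usual way to get push-to-deploy is a post-receive hook in that bare repository that checks the pushed branch out into the web app's directory. A minimal sketch, with a hypothetical web root and branch name (on a Windows GitStack install the paths will look different, but the mechanism is the same):

    #!/bin/sh
    # Sketch of hooks/post-receive inside the bare repository.
    # /var/www/webapp and "master" are placeholders for your web root and branch.
    GIT_WORK_TREE=/var/www/webapp git checkout -f master

Make the hook executable and every accepted push will refresh the files in the web root, replacing the manual FileZilla copy.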

Deploy a media folder without storing it on github with capistrano

I work on a PHP website that deploys to our servers with Capistrano. The site has a very large media folder that needs to be moved into the current deploy every time. Currently I have a shell script that does a 'mount --bind' on every deploy, and before the deploy it does an 'unmount' of that folder. The problem is that this isn't reliable, and sometimes on cleanup it rms my media folder. I thought about putting the media folder in the GitHub repository, but it is 500 MB and needs to change as users create accounts.
So here are the options I have thought of; I would like your opinions on them, or any option better than what I can come up with.
An .htaccess rewrite so that whenever a request looks for the media folder it is rerouted to a subdomain that hosts the folder. I just don't know if this rewrite will work for creating files and directories, or only for reading them.
I tested this today and it worked for reading from the subdomain, but I could not get creating or writing to work.
Find a better way to deploy the media folder without having to rely on shell commands that seem to fail and destroy the folder.
Restarting the server on every deploy (this unmounts all folders), then doing the mount again after the deploy and restart. This is time-consuming if I am doing many deploys, and it wouldn't work well because I have staging and production on the same server; restarting while testing on staging also restarts my production site, unmounting production while I am testing on staging.
Those are the only options I could think of. Any help would be great.
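One common alternative to bind mounts, offered here as a hedged sketch rather than a definitive fix: keep the media folder once in a shared directory outside the release folders and symlink it into each new release from a deploy hook. Newer Capistrano versions can manage exactly this through their linked_dirs setting; the paths below are hypothetical:

    # Sketch: run after each deploy; assumes current/media does not already
    # exist as a real directory. The media lives permanently in shared/ and is
    # never part of a release, so cleaning up old releases cannot delete it.
    ln -sfn /var/www/myapp/shared/media /var/www/myapp/current/media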

How do I create a beta (testing) website?

How do I create a beta (testing) website that uses the same webroot and Cake folder?
That beta (testing) website would probably live at http://beta.example.com or http://example.com/beta.
A method I have been using for a couple of years is to set up staging server instances. These could be either separate physical servers, or on the same server using hostname checks. However, it is good practice to have separate web roots and separate databases under each instance. You'll be asking for trouble if different aspects of your site are shared between staging instances!
My setup is the following:
Development (a computer with the source code on it, set up to serve http://websitename.dev, a local domain).
Preview (a separate server, used to provide a preview of a website, or of a change to a website, before doing the extra work of putting it live): http://websitename.preview.mycompanyname.com
Next (on the same server as the live website, under a different web root and connected to a different database; the reason for this instance is that SO MANY TIMES a site has worked on the development machine, but when it is put live, something on the live server makes the site DIE): http://websitename.next.mycompanyname.com
Live (the usual live server setup): http://websitename.com
This is all achieved by assigning DNS records correctly (to point to the correct servers), and using the config script of my web server application, listening to the hostnames and serving the correct web root.
A testing or "staging" server should be set up completely independently of the production server. You should not reuse any component of the live system, which even includes the database. Just set up a copy of the production system with a separate database, separate files, if possible a separate (but identical) server.
The point of a test system is to test code that is possibly buggy and may delete all your live data, shoot your dog and take your lunch hostage. Also, your test system may not be compatible with the production system, depending on what you're going to change down the road.
As such, create a new virtual host in your Apache config (or whatever you're using) and set it up exactly like the production system. Done.
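As a rough illustration of that last step on a Debian-style Apache layout (the site and file names are hypothetical, and other distributions or web servers will differ):

    # Sketch: clone the production vhost into a staging vhost.
    sudo cp /etc/apache2/sites-available/example.com.conf /etc/apache2/sites-available/staging.example.com.conf
    # Edit ServerName, DocumentRoot and the DB credentials in the new file, then:
    sudo a2ensite staging.example.com.conf
    sudo systemctl reload apache2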

What is the best development environment for Drupal when the site has to move to a different server at go-live?

When I've worked on Drupal sites before, if there was internal access to the server, or if remote desktop access was available, I've always developed on the machine the site would run from when live, and just not made it public on the server.
However, what is the best thing to do if you don't have access to the server yet, for example if the client hasn't got anything in place?
I need to be able to build and test the solution on my local machine, or on my VPS which I have RDP access to, and be able to move it over to the client's server with as much ease as possible when ready.
Any tips or best practices? As far as I'm aware, Drupal doesn't have any specific migration tools, but I could be wrong.
I don't work with Drupal, but for PrestaShop, WordPress, Zen Cart, etc. I always use the same workflow:
I set up a vhost on my virtual server, usually using a subdomain of my own domain (like customer.mydomain.com), install the software with its DB etc. on the server, and set up FTP access.
I keep a local copy of the files, which I maintain in a local Git repository, pushing to GitHub mainly for backup purposes.
I work with Zend Studio, configure a remote server, and set it up to upload the files when I save them, so I can check them pretty much as if I were working locally. The main advantage of this approach is that I can share the project with the customer as it progresses.
When I have to move to the final server, at least with WordPress, I have to search/replace the domain name, which WordPress stores in the DB. But I do it locally: I download the entire DB as an SQL file through phpMyAdmin, open it, search-replace, and upload it again via phpMyAdmin to the permanent server.
With Zen Cart and others the problem is the config file, which stores some paths. For long projects or long-term customers I modify the config file to use one set of config details or another depending on the server name.
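A command-line sketch of that dump / search-replace / import step, equivalent to doing it through phpMyAdmin (credentials, database names and domains below are placeholders). One caveat: a plain text replacement can break PHP-serialized values that WordPress stores, because the recorded string lengths no longer match; if WP-CLI is available on the target server, wp search-replace handles serialized data for you.

    # Sketch only - credentials, database names and domains are assumptions.
    mysqldump -u dbuser -p devsite_db > site.sql
    sed -i 's/customer\.mydomain\.com/www.finalsite.com/g' site.sql
    mysql -u dbuser -p livesite_db < site.sql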
Adding to the above comment...
Check the "Backup and Migrate" module and the "backup files" module. "Backup and Migrate" is useful in any setup...
With this I was able to do a bare-bones Drupal install and then migrate/replace the database with the one backed up from my local system. If the databases are named differently you will still need to edit settings.php.
"Backup files" is useful for themes and content assets like images etc., but it is essentially just a wrapper around gzip.
I typically develop on my local machine and then upload to the server once complete.
All you need to do is change the folder name in /sites/ and change the settings.php file to reflect the server settings/domain.
Something you should be aware of:
If you upload files to your local installation, the file paths will be wrong on the server and you will need to execute a one-off MySQL replace query (sketched after this answer).
Make sure you use relative paths in any hard-coded links.
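A sketch of that one-off replace query. The table and column names vary by Drupal version (Drupal 6 stores paths in files.filepath, Drupal 7 in file_managed.uri), and the site folder names below are hypothetical, so treat this as illustrative only:

    # Sketch: rewrite stored file paths after renaming the folder under /sites/.
    mysql -u dbuser -p drupal_db -e \
      "UPDATE files SET filepath = REPLACE(filepath, 'sites/websitename.dev/files', 'sites/websitename.com/files');"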

What is a good solution for deploying a PHP/MySQL site via FTP?

I am currently working on a web application that uses PHP and MySQL, but I do not have shell access to the server (working on that problem already...). Currently, I have source control with Subversion on my local computer, and I have a database on the local computer that I make all changes to. Then, once I've tested all the updates on my local computer, I deploy the site manually: I use FileZilla to upload the updated files, and then dump my local database and import it on the deployment server.
Obviously, my current solution is nowhere near ideal. For one major thing, I need a way to avoid copying my .svn files... Does anyone know what the best solution for this particular setup would be? I've looked into Capistrano and Ant a bit, but with both of those the lack of shell access looks like it would be a problem...
I'm using Weex to synchronize a server via FTP. Weex is basically a non-interactive FTP client that automatically uploads and deletes files/directories on the remote server. It can be configured to not upload certain paths (like SVN directories), as well as to keep certain remote paths (like Log directories).
Unfortunately I have no solution at hand to synchronize MySQL databases as well...
Maybe you could log your database changes in "SQL patch scripts" (or use complete dumps), upload those with Weex and call a remote PHP script that executes the SQL patches afterwards.
I use rsync in production but you could do this:
Add a config table to your site's database to record what level (delta) your DB is currently at.
During development, store each set of SQL changes in a single file (I use something like delta_X-up.sql). These stay in your SVN as well. So, for example, if you are at delta_5 and add a table between the current release and the new release, all the SQL needed goes into delta_6-up.sql.
When it comes time to build, export the repo instead of using a checkout. This lets you ignore all the SVN cruft that comes along, since you won't need that in production.
Use Weex to push those changes into production (this is where I would use rsync, but you don't have that option). Call a remote script that checks your config table to see what delta level you are currently at, scans the directory with your delta_X-up.sql files to see if there are any new ones, and, if there are, reads them and runs the SQL inside.
You can do a Subversion export (rather than a checkout) to a different directory from your working copy; that will leave out all the .svn stuff for you.
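A quick sketch of that export step, with placeholder paths; the resulting directory contains no .svn folders and can be uploaded as-is with FileZilla or Weex:

    # Export a clean copy of the working copy, ready for upload.
    svn export /path/to/working-copy /path/to/clean-export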
