I'm not sure if I'm missing the point here...
Our devs want the following...
On a LAMP server with SVN/WebDAV, they want the root Apache directory to be a repository that they can all work on. However, setting the default Apache directory to a repo doesn't work, because the files aren't stored there as plain HTML/PHP files; they live inside SVN's internal database structure, which handles changes/revisions/etc.
Is there any way to do this? Or would we have to have a separate repo from which they copy files to/from the web root when developing?
You have to set up a separate SVN repository and hook scripts. A hook script can check out the code to your Apache root on every commit. This is quite common, and is also used for automatic testing, etc.
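For example, a minimal post-commit hook might look like this (a sketch; the web-root path is an assumption, and SVN passes the repository path and revision as arguments):
#!/bin/sh
# hooks/post-commit -- called by SVN after every commit
REPOS="$1"   # repository path, passed by SVN
REV="$2"     # revision number, passed by SVN
# export a clean copy (no .svn metadata) of HEAD into the Apache root
svn export --force "file://$REPOS" /var/www/html
Drop it into the repository's hooks/ directory and make it executable.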
Related
More and more projects are piling up, and I want some workflow and also version control over the different projects. I'm trying to set up the "right" workflow with Xampp, Git, GitDesktop and PhpStorm on a Windows 2012 R2 machine.
Xampp base: d:\xampp
http://localhost = d:\xampp\htdocs
Dev repositories: d:\xampp\htdocs\repositories\dev\GitDemo
Live repositories: d:\xampp\htdocs\repositories\live\GitDemo
Live folder: d:\xampp\htdocs\GitDemo
Live URL: http://servername/GitDemo (intranet use only)
Right now I have my repositories folder inside the htdocs folder; otherwise I would need another alias/copy action to be able to see what I'm developing. But at the same time the repositories folder is exposed. How do I hide it? .htaccess?
I've run git init --bare inside the live folder for this project. With GitDesktop I've created the repository for GitDemo inside d:\xampp\htdocs\repositories\dev.
Using PhpStorm I've created a project based on local files and pointed it at d:\xampp\htdocs\repositories\dev\GitDemo. I'm able to see the changes made using git status, add them using git add . and commit them successfully with git commit -m "my commit..".
I've added a remote with git remote add live master and created a post-receive hook to check the files out from d:\xampp\htdocs\repositories\live\GitDemo to d:\xampp\htdocs\GitDemo.
This all feels like a "ton" of work to set up initially and somewhat redundant (having the same files in 3 locations).
Is this the ideal way to set this up, or do you suggest an alternative approach? Thanks for sharing your thoughts!
I've been thinking about the most logical solution (in my opinion, at this moment). Here's how I resolved the unclear items described above:
I've optimized it by:
Moving the repositories directory outside htdocs, to d:\repositories\dev and d:\repositories\live.
I've set up a symlink so that http://localhost/dev/GitDemo serves d:\repositories\dev\GitDemo. This way I don't need to place the repositories folder inside the htdocs folder, and Apache can still serve the content, which actually resides outside the htdocs folder.
The live version is now served at http://localhost/GitDemo; a post-receive hook deploys it from d:\repositories\live\GitDemo.git to d:\xampp\htdocs\GitDemo.
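For reference, the post-receive hook itself is tiny (a sketch; Git for Windows runs hooks through its bundled bash, so forward-slash Windows paths work):
#!/bin/sh
# hooks/post-receive in d:/repositories/live/GitDemo.git
# check the pushed master branch out into the live web folder
GIT_WORK_TREE="d:/xampp/htdocs/GitDemo" git checkout -f master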
If you think I've made a mistake or misunderstood how it's supposed to work, please correct me; I'm still not sure this is the correct way, but it seems right to me.
How we do things now
We have a file server (using NFS) that multiple web servers mount and use as their web root. When we deploy our codebase, we SCP an archive (tar.gz) to the NFS server and unarchive the data directly into the "web directory" of the file server.
The issue
During the deploy process we are seeing some I/O errors, mostly when a requested file cannot be read: Smarty error: unable to read resource: "header.tpl". These errors go away once the deploy has finished, so we assume it's because unarchiving the data directly into the live web directory isn't the safest of things. I'm guessing we need something atomic.
My Question
How can we atomically copy new files into an existing directory (the web server's root directory)?
EDIT
The files that we are uncompressing into the web directory are not the only files in that directory. We are adding files to a directory that already has files in it, so copying the whole directory or using a symlink is not an option (that I know of).
Here's what I do.
DocumentRoot is, for example, /var/www/sites/www.example.com/public_html/:
cd /var/www/sites/www.example.com/
svn export http://svn/path/to/tags/1.2.3 1.2.3
ln -snf 1.2.3 public_html
You could easily modify this to expand your .tar.gz before changing the symlink, instead of exporting from svn. The important part is that the cut-over is a single, atomic symlink change.
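For example, the tar.gz version might look like this (a sketch; the archive name and its layout are assumptions):
cd /var/www/sites/www.example.com/
# unpack the new release into its own versioned directory
mkdir 1.2.4
tar -xzf /tmp/release-1.2.4.tar.gz -C 1.2.4
# flip the symlink to the new release
ln -snf 1.2.4 public_html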
I think rsync is a better choice than scp, since only the changed files would be synchronized. But deploying code by hand-rolled script is not convenient for a development team, and the errors during deployment are not human-friendly.
You could consider Capistrano, Magallanes, or Deployer, but they are scripts too. I'd recommend trying walle-web, a deployment tool written in PHP with Yii2 out of the box. I have hosted it at our company for months, and it works smoothly for deploying to test, staging, and production environments.
It depends on a group of standard tools (rsync, git, ln), but its web UI is generally easier to operate. Give it a try. :)
Why not just have two dirs with two different versions of the site? When you've finished deploying into site_2, you switch the site dir in your web server config (for example, Apache) and copy all the files over to the site_1 dir. Then you can deploy into site_1 and switch from site_2 back to it the same way.
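A sketch of one round of that flow, assuming an Apache vhost whose DocumentRoot you flip between the two dirs (the paths are illustrative):
# 1. deploy the new version into the inactive dir
rsync -a /tmp/new-release/ /var/www/site_2/
# 2. edit the vhost (DocumentRoot /var/www/site_1 -> /var/www/site_2), then reload
apachectl graceful
# 3. copy the now-live files into site_1, ready for the next deploy
rsync -a --delete /var/www/site_2/ /var/www/site_1/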
RSync was born to run... er... I mean to do this very thing
rsync works over local file systems and SSH; it's very robust and fast, sending/copying only changed files.
It can be configured to delete any files that have been deleted (or are simply just missing from the source), or it can be configured to leave them alone. You can set up exclude lists to exclude certain files/directories when syncing.
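A typical invocation might look like this (a sketch; the paths and exclude patterns are illustrative):
# -a preserves permissions/times, -z compresses over the wire,
# --delete removes files that vanished from the source,
# --exclude keeps runtime data out of the sync
rsync -az --delete --exclude 'cache/' --exclude 'uploads/' \
    ./build/ deploy@webserver:/var/www/site/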
Here's a link to a tutorial.
Re: atomic - link to another question on SO
I like the NFS idea. We deploy our code to an NFS server that is mounted on our frontends. In fact we run a shell script whenever we want to release a new version. What we do is use a current symlink pointing to the latest release dir, like this:
/fasmounts/website/current -> /fasmounts/website/releases/2013120301/
And apache document root is:
/fasmounts/website/current/public
(in fact apache document root is /var/www which is a symlink to /fasmounts/website/current/public)
The shell script updates the current symlink to the new release AFTER everything has been uploaded correctly.
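One subtlety: ln -snf on its own is not guaranteed to replace the link atomically on every platform, so a safer pattern is to create the new link under a temporary name and rename it into place; rename() is atomic on POSIX filesystems. A sketch using the paths above (mv -T is the GNU coreutils spelling):
# release.sh 2013120301
cd /fasmounts/website
ln -s "releases/$1" current.tmp   # create the new symlink under a temp name
mv -T current.tmp current         # atomically rename it over the old link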
Between me and a Network Architect we manage a bunch of Web Servers (FreeBSD). He's responsible for all server/network related stuff (IPs, Firewalls, users/groups, etc.) and I'm responsible for all web-related stuff (Apache, PHP, MySQL). Sometimes the responsibilities overlap.
It has happened a few times that changes were made to the config files which more or less affected the server, and we were not able to figure out which of us made the changes and why.
I, being a Web Developer, think it'd be good practice to put the files under version control (we currently use Subversion), so that whenever we change anything we have to commit and comment the changes. It'll solve all the problems of wondering who did what and why.
The particular config files I was thinking of were:
firewall config
apache config (with extras)
php config (php.ini)
MySQL config (my.cnf)
I already know that the idea of version-controlling server config files is sound, based on another question asked here. My only worry is how to do it properly on the web server side, since the files are in different locations. Putting the whole /usr/local/etc under version control seems pointless, as it contains more than just the config files.
I was wondering whether to create a new folder, say /config, which would be under version control and would contain all the config files we need, and then replace the originals with symlinks to the files in the /config folder. E.g.:
/usr/local/etc/apache22/httpd.conf -> /config/apache22/httpd.conf
So the question is: Is this a good idea and if not, what is a better solution?
If you use Git, then putting the whole /usr/local/etc under version control is not pointless at all:
you can track only a handful of files if you so choose (see the sketch below)
the working directory is hardly any bigger with all the config files tracked
Just install git, then go to /usr/local/etc and run git init. This will create a .git folder in your current location (basically making this folder a repository).
Then add the config files you want to track, for example:
git add firewall/firewall_config.conf apache2/httpd.conf
and commit: git commit -m "Initial Configuration"
Your config files are now being tracked.
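To make the handful-of-files approach stick, a whitelist-style .gitignore keeps everything else out of git status (a sketch; the un-ignored paths are just examples):
# /usr/local/etc/.gitignore
*
!.gitignore
!apache22/
!apache22/httpd.conf
Note that a parent directory must be un-ignored before a file inside it can be.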
Since you are versioning sensitive configuration files, I would recommend setting up an internal Git server such as GitLab. Create a Git repository for each server (or server template/image/etc.). Go to the / directory and run git init.
You can be selective about what you put under version control by running git add /path/to/file only for the files that you will be customizing.
Then git commit -m 'commit comment' and git push -u origin master.
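Strung together, the whole thing is roughly (the tracked paths and GitLab URL are hypothetical):
cd /
git init
git add /etc/pf.conf /usr/local/etc/apache22/httpd.conf
git commit -m 'initial server configuration'
git remote add origin git@gitlab.internal:ops/webserver01.git
git push -u origin master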
Have you looked at any of the configuration-management tools made for just this sort of thing? I recommend Puppet or Chef, having used them both in previous jobs for exactly this.
I am trying hard to use version control, but I am so used to the old method of editing files directly via FTP that I am confused about what to do. So I am thinking of one solution; please help me figure out whether it's possible or not.
I have this user folder on a Linux VPS (a single VPS only):
/home/user/public_html2
Now that will be linked to http://demo.server.com.
That directory will be version-controlled with a repo in the folder /usr/bin/repo/ so that I can commit the changes.
Now I will have another directory called
/home/user/public_html, whose contents are updated whenever a commit is made in /home/user/public_html2.
That /home/user/public_html will be linked to http://main.server.com.
That way I keep working and editing files in public_html2 via FTP as normal and test the code as it is, but the main site is version-controlled.
Is it possible??
When you are using a version control system, why do you need to go to an FTP location to edit the files? It defeats the whole purpose.
Rather, check out the files from the repository onto your local machine and, after the changes are done, commit the code.
If you don't want the version-control metadata while working with the files locally, simply export the code instead of checking it out.
The problem I see with your approach is that the files/directories may go out of sync. Also, if there are multiple users working, this approach would bomb.
I would suggest keeping the working copy of the files on your local machine and using that instead of mirroring it to another folder.
Maybe it's possible. But once you start using a version control system, you should keep accessing your version-controlled files through that system.
Or better: if you want to access the files the "normal way", as you said, reading them without committing any change should be fine. But if you want to modify them, you should always go through your version control system; otherwise you could end up with file-version problems.
You need a deployment script. Create a script to upload the files to the appropriate locations. Run this script whenever you wish to see your changes reflected on the site. If you wish to maintain two separate locations on a server running different versions, simply add a command line argument to the script to choose which location to upload to.
Use rsync instead of FTP if you can, to avoid transferring unchanged files every time.
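A sketch of such a script, wired to the two locations from the question (the server name is hypothetical):
#!/bin/sh
# deploy.sh demo|main -- upload the working tree to the chosen site
case "$1" in
  demo) DEST="user@server:/home/user/public_html2/" ;;
  main) DEST="user@server:/home/user/public_html/" ;;
  *)    echo "usage: $0 demo|main" >&2; exit 1 ;;
esac
# rsync only transfers changed files; add --delete to mirror removals
rsync -az ./ "$DEST"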
Hey, I have an SVN repository with some PHP files and I want to run them in the browser, just like a normal page. But if I go to the repo URL (http://server-URL/svn), enter my username/password, navigate to my PHP project and select index.php, it just shows me the contents of index.php, not the real rendered page.
How can I run my project?
Presumably you're using something like mod_dav_svn in Apache to serve your SVN repository.
If you want to run the PHP, you'll need to set up another virtual host in Apache that isn't using mod_dav_svn, and give it a document root containing a checked-out copy of the PHP files.
Basically, you can't manage the repository and parse the files from the same URL.
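A sketch of that second virtual host, assuming a working copy has been checked out to /var/www/myproject (the path and ServerName are illustrative):
<VirtualHost *:80>
    ServerName php.server-URL
    # plain DocumentRoot pointing at a checked-out copy, so PHP executes here
    DocumentRoot /var/www/myproject
    <Directory /var/www/myproject>
        Require all granted
    </Directory>
</VirtualHost>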
That's what I'd expect when viewing the project contents through the SVN interface. What you want is to actually deploy the PHP project to a PHP-enabled website/host.
SVN is just a repository. You would want to "check out" the code to your local machine and have a PHP web server set up.
Of course, you could also host it on a real web server.