I'm using Phing as my build tool for a website I'm developing. I have a server running on localhost to test things on my own system, and I have a test environment on the server it's eventually going to run on. Deploying to that test environment is currently done by tarring all the built files, uploading the tar to the server, and extracting it there.
However, since I'm also using quite a few images, this takes pretty damn long: 10 seconds for a local deploy vs. 4 minutes for a remote deploy. Is there any way to either compare files in two directories and only tar the ones that are newer in one dir (so I can keep a shadow copy of the build dir to compare file dates; a rough sketch of that idea is below), or is there another best practice?
Something else I've been thinking about trying is uploading the site using git. Any ideas on that?
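For the shadow-copy idea, a rough sketch of what a shell step could look like (the directory names build/ and last-deploy/ and the archive name are just placeholders):

    #!/bin/bash
    # Sketch: tar only the files in build/ that are new or newer than the shadow copy,
    # then refresh the shadow copy so the next run only picks up later changes.
    BUILD=build            # built site (placeholder path)
    SHADOW=last-deploy     # shadow copy of what is already on the server (placeholder path)

    mkdir -p "$SHADOW"

    # Collect files that are missing from, or newer than, the shadow copy.
    ( cd "$BUILD" && find . -type f ) | while read -r f; do
        if [ ! -e "$SHADOW/$f" ] || [ "$BUILD/$f" -nt "$SHADOW/$f" ]; then
            echo "$f"
        fi
    done > changed-files.txt

    [ -s changed-files.txt ] || { echo "Nothing changed, nothing to deploy."; exit 0; }

    # Tar just those files (paths are relative to build/), then update the shadow copy.
    ( cd "$BUILD" && tar -czf ../deploy.tar.gz -T - ) < changed-files.txt
    rsync -a --files-from=changed-files.txt "$BUILD/" "$SHADOW/"

Keeping the shadow copy up to date after each deploy is what lets the next run see only the newer files.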
Yesterday I had the same problem; this answer resolved it:
Phing - Deploy with FTP but only overwrite when size has changed
I have a Laravel project. I'm deploying it on an Apache server by copying files to the server. Is there an alternative using PHAR that works like JAR files?
The simplest way would be for you to rsync the files to the server. A bit more complex, but something that worked out great for me, was using Capistrano on the target servers: the deploy script from Jenkins would upload the build file to S3, then SSH into the server and run
bundle exec cap deploy
on the desired servers.
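For the rsync route, a minimal sketch (the user, host, and paths are placeholders); -a preserves timestamps and permissions, -z compresses over the wire, and --delete removes files on the server that are no longer in the build:

    rsync -az --delete ./build/ deploy@example.com:/var/www/mysite/

Adding -n (--dry-run) first shows what would be transferred without changing anything.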
This way the servers deploy themselves. In the case where I used it, Capistrano on the machines did both the build and the deploy using git archive, but you could split your build step and deploy step.
Basically, you would have a private S3 bucket for your builds; the uploads would always be named application-latest.zip.
Your Jenkins server would always build, upload, and trigger the deploy (a sketch of that side follows below).
Your application servers would have Capistrano download the zip, unzip it, and 'deploy' it.
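A rough sketch of the Jenkins side under those assumptions (the bucket, host names, and paths are placeholders, and it assumes the aws CLI is installed on the build server):

    #!/bin/bash
    set -e
    # Package the finished build under the fixed name the servers expect.
    zip -r application-latest.zip build/

    # Upload it to the private bucket (bucket name is a placeholder).
    aws s3 cp application-latest.zip s3://my-private-builds/application-latest.zip

    # Trigger the deploy on each application server.
    for host in app1.example.com app2.example.com; do
        ssh deploy@"$host" 'cd /var/www/app && bundle exec cap deploy'
    done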
Why am I mentioning Capistrano so much? Because it relieves you of a lot of the deploy struggles. For example:
- it performs the steps above in a folder named application-timestamp
- this folder is kept alongside other folders containing your previous deploys
- there is a symlink called release that points to your latest deployment
- once Capistrano finishes your steps, it switches the symlink to point at your freshly deployed folder (the resulting layout is sketched below)
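Roughly, the layout on a server would then look like this (the timestamps are made up, and the folder/symlink names just follow the wording above; stock Capistrano uses a releases/ directory and a current symlink, but the idea is the same):

    /var/www/app/
        application-2017-01-01-1200/   # older deploy, kept for rollback
        application-2017-01-02-0930/   # older deploy
        application-2017-01-03-1015/   # latest deploy
        release -> application-2017-01-03-1015

    # Switching (or rolling back) is just re-pointing the symlink:
    ln -sfn /var/www/app/application-2017-01-03-1015 /var/www/app/release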
Benefits of this approach and these tech choices:
- you can choose to keep the last 'x' deployments, which lets you roll back instantly, because a rollback is just a symlink change from your current deployment to the previous one
- this also controls how much disk space your deployments take, since the oldest folders are cleaned up: if you chose to keep 5 deployments for rollback, then on the sixth deploy the oldest folder is removed
- it prevents the white screen of death caused when nginx/Apache/PHP can't access a file that is currently being overwritten by the deploy script
- it lets you create an image of the machine, say on AWS, put it in an autoscaling group, and have each new instance update itself by downloading the latest zip from S3 without Jenkins getting in the way; you just have to set up the autoscaling init script to trigger the deploy on whichever provider you chose (a sketch follows this list)
- you have a very clear boundary between build and deploy
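A sketch of the self-update step such an init script (for example AWS user data) could run on boot, reusing the placeholder bucket and layout from above:

    #!/bin/bash
    set -e
    # Self-update on boot: pull the latest build, unpack it into a timestamped folder,
    # and re-point the symlink the web server serves from.
    STAMP=$(date +%Y%m%d%H%M%S)
    TARGET="/var/www/app/application-$STAMP"

    aws s3 cp s3://my-private-builds/application-latest.zip /tmp/application-latest.zip
    mkdir -p "$TARGET"
    unzip -q /tmp/application-latest.zip -d "$TARGET"
    ln -sfn "$TARGET" /var/www/app/release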
hope that helps!
Recently I investigated the continuous integration development process, so I installed a TeamCity server, set up the build, and tried to make it automatically deploy the build artifacts (my web app) to the web hosting server through FTP. However, I failed on the last step (deployment) because deploying thousands of PHP files takes a very long time. So I'm wondering if there is a way to do it more quickly, perhaps using zip archives or something else.
So my question is: is there a common way to solve this problem?
Use git for deployment. It transfers only the changes.
Example using hooks: gist.github.com/noelboss/3fe13927025b89757f8fb12e9066f2fa – Łukasz Gawrylu
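That gist describes the bare-repository-plus-hook style of deployment; a minimal sketch of that kind of setup (the paths, remote name, and host are placeholders) looks something like this:

    # On the server: a bare repository to push to, and the directory the site is served from.
    git init --bare /srv/site.git
    mkdir -p /var/www/mysite

    # /srv/site.git/hooks/post-receive (mark it executable with chmod +x):
    #!/bin/sh
    git --work-tree=/var/www/mysite --git-dir=/srv/site.git checkout -f

    # On your machine: add the server as a remote, then every push deploys the changes.
    git remote add live ssh://deploy@example.com/srv/site.git
    git push live master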
Last week, I tried to deploy a simple Symfony app on Azure.
I chose the App Service B2 plan (2 cores / 3.5 GB RAM).
PHP version: 5.6.
First, the composer install took forever to complete. (I tried moving up to the S3 plan; it was a little faster, but not very different.)
So I tried to optimize the PHP config: OPcache, realpath_cache_size, etc. (Xdebug was already disabled.)
I even tried enabling WinCache, but with no real improvement.
So now my app is deployed, but it is too slow to be usable.
A simple php app/console (in dev mode) takes ~23 seconds.
It seems to recreate the cache every time. On my local Unix environment (similar specs), it takes 6 seconds when the cache is cold and 500 ms when the dev cache is warm.
I think the main problem is a filesystem issue, because it takes 16 seconds just to remove the dev cache folder.
On my local Unix environment (similar specs), it takes ~200 ms to remove the same folder.
As I said, I tried the S3 plan with a small improvement, but not enough to explain this slowness.
One weird thing: if I rerun php app/console right after it finishes, the command takes 5 seconds (much better). But if I rerun it 5 seconds after it finishes, it takes 23 seconds.
I already tried these solutions (even though the environment is different):
https://stackoverflow.com/a/17021255/6309878
Update: I tried pointing the Symfony app/cache folder to the local filesystem (D:\local\cache), but there was no improvement; it may even be worse.
Please try the steps below and let me know if they improve the performance:
1) In the wwwroot directory of your site, create a .user.ini file (if it doesn’t already exist) and add “wincache.fcenabled=0”. This will disable Wincache.
2) Go to Azure Portal and go to the Application Settings for your app. In the App Settings section, add “WEBSITES_DYNAMIC_CACHE” with a value of 1.
3) Restart the site.
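If you would rather script steps 2 and 3 than click through the portal, a rough sketch with the Azure CLI (the resource group and app name are placeholders; step 1 still means dropping a .user.ini into wwwroot, for example via the Kudu console):

    # Step 1: contents of wwwroot/.user.ini
    # wincache.fcenabled=0

    # Steps 2 and 3 via the Azure CLI:
    az webapp config appsettings set --resource-group my-rg --name my-symfony-app \
        --settings WEBSITES_DYNAMIC_CACHE=1
    az webapp restart --resource-group my-rg --name my-symfony-app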
I'm sure there are answers for this all over Stack Overflow, but I was unable to find anything specific.
I have a PHP project which I am revisiting. It's running on a RHEL5 box. I have SVN on the same box.
Out of curiosity, I recently added Jenkins to the machine and set up the Jenkins PHP template from...
http://jenkins-php.org/
It took a bit of playing around with the setup, but I more or less have this all running and doing continuous inspection builds when something is committed to SVN.
What I want to do now is have Jenkins copy my updated files across to the server when the build completes.
I am running a simple LAMP setup and would ideally like to copy across only the files that have actually changed.
Should I just use Ant and its sync task? Currently the files reside on the same box as the server, but this may change, in which case I will need to sync these files across to multiple remote boxes.
Thanks
Check these: the Copy Artifact Plugin and the job's environment variables.
Now set up 2 jobs: one on the source machine and one on the destination server (make it a slave). Use the plugin to copy the required artifacts by using the environment variables.
Do you have your project (not the Jenkins one, but the one with the LAMP setup) under SVN? If so, I'd recommend creating a standalone job in Jenkins that just does an svn up, and tying it to your main job so that when you run your main job and the build is OK, Jenkins automatically runs the job that updates your project.
For copying to other servers, take a look at the Publish Over plugins.
It is very easy to set up the server and the rules. The bad thing is that you can't set it up to copy only the files that are new in the current build, which means the entire project is uploaded on every build.
If your project is too big, another solution is to use rsync as a post-build action (a sketch follows).
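For example, a post-build "Execute shell" step along these lines (the host and paths are placeholders; $WORKSPACE is provided by Jenkins); rsync only transfers files whose size or modification time changed, which covers the "only the files that changed" requirement:

    # Jenkins "Execute shell" step, run after a successful build.
    rsync -az --delete "$WORKSPACE/build/" deploy@web01.example.com:/var/www/mysite/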
OK, this seems like something that would be obvious, but I haven't been able to figure it out.
I just started using the Solar PHP5 framework (http://solarphp.com). It is a great PHP5 framework, but as with any new framework there is a learning curve.
Issue: Solar uses many pre-written scripts to make directories and files for you, making it easy to rapidly deploy a site. Because it uses these scripts, it creates symbolic links to files and directories (example: Chapter 1 in the manual). This is great until you need to export your entire root directory to upload it to your server or to make another instance on another development computer. The problem for me is that when I do this, the files are editable but don't reflect any changes when I refresh a page; it's like the code never updates. The only way I can get changes or updates through is to (essentially) run the site setup each time, which involves running all the setup scripts, setting up the DB connections, etc. This is a total pain.
Question: Is there any advice out there on how I can just export the working root directory, to easily upload it to the server or another dev machine, without having to run those scripts over and over again? I know it's something easy, but I don't know exactly what to search for.
Is there a certain method for exporting directories/files that use symbolic links?
You might try using rsync instead of FTP to deploy the site; rsync will respect symlinks. Of course, you will need to have SSH access, or mount the server over FTP/SFTP with FUSE. If you're using SVN, you could also SSH into the server and do an svn export or checkout.
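A couple of hedged examples of what that could look like (the host and paths are placeholders). With rsync, -a keeps symlinks as symlinks, while adding -L uploads the files the links point to instead, which is usually what you want if the link targets don't exist on the server:

    # Keep symlinks as symlinks on the server:
    rsync -az ./myproject/ deploy@example.com:/var/www/mysite/

    # Or dereference them, i.e. upload the files the links point to:
    rsync -azL ./myproject/ deploy@example.com:/var/www/mysite/

    # Over SSH on the server, a clean export (no .svn folders) straight from the repository:
    svn export http://svn.example.com/myproject/trunk /var/www/mysite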