Azure using cached deployment script - php

I have a PHP website that is automatically being deployed to Azure, this works well.
However, I would like to use a custom deployment script to automate moving certain files to Azure Storage/CDN on deploy. I've set up azure-cli using npm and created a deployment script using azure site deploymentscript --php. The deployment scripts were created and I've added my own script at the end of deploy.cmd.
My problem is that the generated deploy script does not appear to be used. The Azure deploy log says this:
Generating deployment script
Using cached version of deployment script (command: 'azure -y --no-dot-deployment -r "D:\home\site\repository" -o "D:\home\site\deployments\tools" --basic --sitePath "D:\home\site\repository"').
Running deployment command...
Command: "D:\home\site\deployments\tools\deploy.cmd"
etc...
That deploy.cmd is not the one it should be using, and not the one I can change.
So, how do I get it to stop using a cached deployment script, or to use my custom deployment script?

Try adding the .deployment file to the root folder, and the deploy.cmd inside the specific project. This is how I got Azure to skip the cached deployment. Just ensure that you update the path to the .cmd file inside the .deployment file.
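For reference, a minimal .deployment file looks something like this; the path below is illustrative and must point at wherever your deploy.cmd actually lives relative to the repository root:

```ini
; .deployment (repository root) - path is an example, adjust to your layout
[config]
command = myproject\deploy.cmd
```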


Git Post-Push hook replacing build folder

I'm working with an AngularJS app that I am hosting on Azure as a Web App.
My repository is on Bitbucket, connected as a deployment slot. I use TortoiseGit as a Windows shell for commit/pull/push. Grunt is my task runner.
Problem: I am not managing to replace the build folder after a push has been made. This is what is being exposed on Azure (see picture):
What I've tried:
Batch file using Windows ftp.exe replacing the folder after push using mput
Following this guide, taking advantage of Azure's git deployment engine that is behind every web app.
WebHook on Bitbucket calling URL with simple PHP script doing a FTP upload
Currently my workaround is: after a push, I build using grunt build, connect through FTP with FileZilla, and replace the build folder manually.
How can I automate this?
Thanks in advance.
I solved this by refactoring my initial batch script, which calls:
npm install
bower install
grunt clean
grunt build
winscp.com /script=azureBuildScript.txt
and my azureBuildScript.txt:
option batch abort
option confirm off
open ftps://xxx\xxx:xxx#waws-prod-db3-023.ftp.azurewebsites.windows.net/
cd site/wwwroot
rmdir %existingFolder%
put %myFolder%
exit
This is triggered post-push as a hook job in TortoiseGit.
It turns out ftp.exe is restricted regarding passive mode and cannot keep the session going, whereas WinSCP handles this with ease.

Handling File ownership issues in a PHP apache application

Env: Linux
PHP apps runs as "www-data"
PHP files in /var/www/html/app owned by "ubuntu". Source files are pulled from git repository. /var/www/html/app is the local git repository (origin: bitbucket)
Issue: Our developers and DevOps team would like to pull the latest sources (frequently), and would like to initiate this over the web (rather than connecting with PuTTY and running the git pull command).
However, since the PHP app runs as "www-data", it cannot run a git pull (the files are owned by "ubuntu").
I am not comfortable with both alternatives:
Running Apache server as "ubuntu", due to obvious security issue.
The git repository files to be "www-data", as it makes it very inconvenient for developers logging into the server and editing the files directly.
What is the best practice for handling this situation? I am sure this must be a common issue for many setups.
Right now, we have a mechanism where DevOps triggers the git pull from the web (a PHP job, running as "www-data", creates a temp file), and a cron job, running as "ubuntu", reads the temp-file trigger and then issues the git pull command. There is a time lag between the trigger and the actual git pull, which is a minor irritant for now. I am in the process of setting up Docker containers, with the requirement to update the repo running in multiple containers on the same host. I wanted to use this opportunity to solve the problem in a better way, and am looking for advice on this.
We use Rocketeer and groups to deploy. Rocketeer deploys with the user set to the deployment user (ubuntu in your case) and read/write permission for it, and the www-data group with read/execute permission. Then, as a last step, it modifies the permissions on the web-writable folders so that php can write to them.
Rocketeer executes over ssh, so can be triggered from anywhere, as long as it can connect to the server (public keys help). You might be able to setup your continuous integration/automated deployment to trigger a deploy automatically when a branch is updated/tests pass.
In any case, something where the files are owned by one user that can modify them and the web group can read the files should solve the main issue.
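As a concrete sketch of that scheme (the user, group, and path names are taken from the question; the chown you would run on the real server needs root, so it is shown as a comment and the permission bits are demonstrated on a scratch directory instead):

```shell
# On the real server (needs root): chown -R ubuntu:www-data /var/www/html/app
# Demonstrated here on a scratch directory so it runs as any user.
APP=$(mktemp -d)
mkdir -p "$APP/src" "$APP/var/cache"
touch "$APP/src/index.php"
find "$APP" -type d -exec chmod 2750 {} +  # setgid dirs: new files inherit the group; group can traverse
find "$APP" -type f -exec chmod 640 {} +   # owner edits, web group reads, others get nothing
chmod g+w "$APP/var/cache"                 # group-writable only where PHP must write
stat -c '%a' "$APP/src/index.php"          # prints 640
```

With this layout "ubuntu" (and git) can modify everything, while "www-data" can only read, except in the explicitly web-writable folders.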
If you are planning on using docker, the simplest way would be to generate a new docker image for each build that you can distribute to your hosts. The docker build process would simply pull the latest changes on creation and never update itself. If a new version needs to be deployed, a new immutable image with the latest code is created and distributed.
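A hypothetical Dockerfile for that immutable-image approach (the base image tag and paths are assumptions, not taken from the question):

```dockerfile
# Bake the current checkout into the image; no git pull at runtime.
FROM php:apache
COPY . /var/www/html/
RUN chown -R www-data:www-data /var/www/html
```

Each deploy then builds and distributes a fresh image rather than mutating running containers.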

git aws.push uploaded new php application to Elastic Beanstalk

I have a similar question to Pushing to an existing AWS Elastic Beanstalk application from the command line and Git pushes entire project even if few files are changed. AWS but did not see the answer I am looking for.
There have been comments about the confusing changes to Amazon's documentation (different versions claim to be the latest even though some functions have actually been replaced), so I think a new question is warranted now.
I used the Deploying a Symfony2 Application to AWS Elastic Beanstalk guide to setup my dev app and it works great. After I make several changes and want to update the aws app, I use git aws.push which creates a new version of my app and restarts the server.
I do not have my configuration files finalized (this is just a dev app) and need to manually run several commands on the remote server before my app can be viewed. For very minor temporary changes, I connected to the remote server via SSH and edited the PHP files directly, which works fine. That way the server does not need to be restarted, because every time I use git aws.push the server restarts. I would like a method to update those files using git without restarting the entire server/app.
Main question - Is there any way I can push only the files that were changed in the recent commit and not have the server restart?
Side question for new aws commands - Should I use the eb commands Getting Started with EB CLI 3.x and use eb deploy instead of the git command?
No, currently there is no scenario where Elastic Beanstalk pulls changes without redeploying and restarting the server.
You can try to write your own workaround, but you will need to specify which files should be updated manually and make sure those files are delivered to each EB instance. If you are pushing from Windows, don't forget to convert line endings with dos2unix.
eb deploy is the canonical replacement for git aws.push.
If you experience "full upload" issue instead of sending "only changes", please read my fresh answer here:
Elastic Beanstalk "git aws.push" only commited difference?

Issues Deploying Site with Git & PHP (Cpanel/WHM Server)

We are trying to automatically deploy our web application using Git, GitHub, and PHP on a Cpanel/WHM server.
I've tried, using the information in the article below, to set up a deploy script on our server that GitHub posts to when we push to the repo.
https://gist.github.com/1809044
Unfortunately, it seems that the fact that apache is running scripts as "nobody" is preventing the script from running. We created SSH keys as the account's user, and the git pull command is not running.
Is there any way to successfully pull a git repo from GitHub on a deploy hook and have it update without installing something complex like Jenkins?
Do you have control over how Apache runs? While dealing with some e-mail/spam issues of my own, I've been reading that if you run suPHP, Apache will run scripts as the user of that account. Sounds like that might be what you need.

Using PHP to execute `git pull` command - deploy with github

I have been able to connect a local repository to a private Github repo (using the Github for Windows GUI), which should then automatically deploy to a development server. Everything is running smoothly until I make an attempt at the following:
<?php `git pull -f`;
Source: http://net.tutsplus.com/tutorials/other/the-perfect-workflow-with-git-github-and-ssh
Once I commit and then sync/push through GitHub for Windows, this script is supposed to run the Linux command git pull -f. I have used an SSH key without a passphrase in an effort to make it easier, but to no avail. However, if I execute the command via the command line, everything works perfectly.
File and folder permissions are set to 777 throughout the root directory (including the /.git folder), since I thought permissions were the reason I could not see updates when viewing the website I am developing.
If anyone has had a similar experience with this I would be grateful for a solution.
Note: I have searched high and low on SO and elsewhere on the web for a potential fix to this but any resources will be met with open arms.
Thanks,
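One detail worth ruling out: the PHP backtick operator returns only stdout, so anything git prints to stderr never reaches you, and a failing pull looks like nothing happened. Below is a minimal shell harness (throwaway paths, illustrative only) that stands up a repository and runs the same pull with stderr captured, which is what the PHP script should also do (i.e. git pull -f 2>&1):

```shell
set -e
# Stand up a throwaway origin and clone to mimic the deployment checkout.
ORIGIN=$(mktemp -d); WORK=$(mktemp -d)
git init -q --bare "$ORIGIN"
git clone -q "$ORIGIN" "$WORK/site" 2>/dev/null
cd "$WORK/site"
git -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "initial"
git push -q origin HEAD
# 2>&1 is the key detail: without it, errors from git never reach the caller.
OUT=$(git pull -f origin HEAD 2>&1)
echo "$OUT"
```

If the pull succeeds from your shell but the same command fails from PHP, the difference is almost always the executing user (the web server user versus your login user): HOME, the SSH keys that get picked up, and file ownership all change with it.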
