How to FTP PHP code including the vendor folder using a Bitbucket Pipeline - php

I have a PHP project in Bitbucket.
I am able to install Composer and generate the vendor folder using the pipeline.
Currently there are no unit test cases, so no script has been added to execute tests.
Next, I need to FTP both the files and the vendor folder to my server. Below is the current bitbucket-pipelines.yml:
image: php:7.2.0
pipelines:
  default:
    - step:
        caches:
          - composer
        script:
          - apt-get update && apt-get install -y unzip
          - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
          - cd src
          - composer install
The following is suggested to push files, but it is supposed to push only changed files:
- apt-get -qq install git-ftp
- git ftp push --user $FTP_USERNAME --passwd $FTP_PASSWORD ftp://YOUR_SERVER_ADDRESS/PATH_TO_WEBSITE/
I am blocked on:
Using "git ftp push" will only push files changed since the last commit.
How do I FTP the vendor folder as well? The complete folder needs to be uploaded. This folder is generated while executing the pipeline script; it is not checked into the repository.
Any input is appreciated!

I had the same issue. In order to deploy the vendor directory as well, just remove it from .gitignore, add it to your project, and commit it. Pipelines will pick it up and deploy it as a normal directory.
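A minimal sketch of that approach (assuming vendor/ is currently listed in .gitignore; the sed pattern may need adjusting to match how the entry is written in your file):

# stop ignoring the generated dependencies
sed -i '/^\/\?vendor\/\?$/d' .gitignore
# track and commit them so git-ftp uploads them like any other files
git add .gitignore vendor/
git commit -m "Track vendor/ so it gets deployed"
git push

The downside is a noticeably larger repository; if you'd rather keep vendor/ out of Git, the .git-ftp-include approach described in an answer further down avoids committing it.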

Related

Symfony 6 AWS Beanstalk run npm run build

I'm new to AWS and I've gotten as far as getting the following error in Symfony:
Asset manifest file "/var/app/current/public/build/manifest.json" does not exist.
Locally, this would be fixed by running npm run build. I've tried adding NPM_CONFIG_PRODUCTION=true in the environment variables, but I think that might just be for Node.js apps?
I've also tried SSHing onto the EC2 instance and installing Node on there, but I ran into errors trying to install either npm or nvm. I feel like this is the wrong approach anyway, since the idea of Beanstalk seems to be that you shouldn't need to SSH onto the instance.
Perhaps I should just include the node_modules folder in the zip uploaded, but since one of the recommended ways to produce the zip is to use git, this doesn't seem correct either.
After a lot of digging around, it seems like there are 3 options here:
SSH onto the instance(s); the following worked for me (Amazon Linux 2 - ARM chip):
curl --silent --location https://rpm.nodesource.com/setup_16.x | sudo bash -
sudo yum -y install nodejs
(cd /var/app/current/;sudo npm add --dev @symfony/webpack-encore)
(cd /var/app/current/;sudo npm install)
(cd /var/app/current/;sudo npm run build)
The problem with this, is if you have multiple instances that scale up and down with a load balancer, it isn't really practical to do this.
Add the above as a hook:
The following shell script could be put in the directory .platform/hooks/predeploy:
#!/bin/bash
curl --silent --location https://rpm.nodesource.com/setup_16.x | sudo bash -
sudo yum -y install nodejs
(cd /var/app/current/;sudo npm add --dev @symfony/webpack-encore)
(cd /var/app/current/;sudo npm install)
(cd /var/app/current/;sudo npm run build)
However, I've since learnt that it's generally advised to just include node_modules in the zip that gets uploaded. I guess this way the time to get the server up is reduced.
Include the node_modules folder in the zip that gets uploaded.
To include the node_modules folder, since it is naturally ignored by Git, I used the EB CLI and added a .ebignore file, which is a clone of the .gitignore file but without the node_modules and public entries, so those folders end up in the upload. Also be careful in your build process that you're not including the Node dev dependencies.
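A rough sketch of that setup with the EB CLI, assuming the project was initialised with eb init, that .gitignore contains node_modules/ and public/ lines, and that the assets are built locally before deploying:

# derive .ebignore from .gitignore, minus the entries for the built assets
cp .gitignore .ebignore
sed -i '/^node_modules\/\?$/d; /^public\/\?$/d' .ebignore
# install, build, then drop the dev dependencies so they are not shipped
npm install
npm run build
npm prune --production
# with an .ebignore present, the EB CLI packages the working directory instead of the git archive
eb deploy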

Bitbucket pipeline deploy ignores vendor folder with ftp upload

I am trying to deploy a PHP project using a Bitbucket pipeline, with this code:
init: # -- First time init
  - step:
      name: build
      image: php:7.1.1
      caches:
        - composer
      script:
        - apt-get update && apt-get install -y unzip
        - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
        - composer install
        - vendor/bin/phpunit
      artifacts: # defining vendor/ as an artifact
        - vendor/**
  - step:
      image: samueldebruyn/debian-git
      name: deployment
      script:
        - apt-get update
        - apt-get -qq install git-ftp
        - git ftp init -u "$FTP_DEV_USERNAME" -p "$FTP_DEV_PASSWORD" ftp://$FTP_DEV_HOST/$FTP_DEV_FOLDER
But it ignores the vendor folder. I had assumed that artifacts would add this folder to the deploy too.
What is wrong, or what can I do better?
This happens because you probably have a .gitignore which includes the vendor directory. The artifacts are in fact passed to the next step by Bitbucket, but they are ignored by git-ftp. In order to upload these files with git-ftp you need to create a file called .git-ftp-include, where you will need to add the following line: !vendor/. The ! is required, as stated in the docs:
The .git-ftp-include file specifies intentionally untracked files that Git-ftp should
upload. If you have a file that should always be uploaded, add a line beginning with !
followed by the file's name.
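Applied to the pipeline above, the deployment step could look roughly like this (a sketch; .git-ftp-include would normally be committed to the repository rather than generated in the script, and the variables are the ones already used in the question):

# always upload the untracked vendor/ directory produced by the build step
echo '!vendor/' > .git-ftp-include
# first deployment uploads everything
git ftp init -u "$FTP_DEV_USERNAME" -p "$FTP_DEV_PASSWORD" ftp://$FTP_DEV_HOST/$FTP_DEV_FOLDER
# later deployments push only changed files, plus whatever .git-ftp-include lists
# git ftp push -u "$FTP_DEV_USERNAME" -p "$FTP_DEV_PASSWORD" ftp://$FTP_DEV_HOST/$FTP_DEV_FOLDER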

Composer Server won't load vendor file into server root

I started looking into using Composer for my PHP project. I have MAMP Pro and would like to continue to use that for my development hosting. The problem is that I can't get Composer to run anywhere except in my /Users/***/ folder (where it runs really well). How would I get it to run in the root of my MAMP server? I have messed with the .bash_profile file, adding the following lines:
alias composer='php composer.phar'
export PATH=/Applications/MAMP/bin/php/php7.0.10/bin:$PATH
That does not work (and neither does the alias). The vendor folder still shows up in the /Users/***/ folder.
Any help would be greatly appreciated!
You can do a global installation:
Since Composer works with the current working directory it is possible to install it in a system-wide way.
1. Change into a directory in your path, e.g. cd /usr/local/bin
2. Get Composer: curl -sS https://getcomposer.org/installer | php
3. Make the phar executable: chmod a+x composer.phar
4. Change into a project directory: cd /path/to/my/project
5. Use Composer as you normally would: composer.phar install
Update: Sometimes you can't or don't want to download into /usr/local/bin (some have experienced user permission issues or restricted access). In this case you can try this:
Open terminal
curl -sS http://getcomposer.org/installer | php -- --filename=composer
chmod a+x composer
sudo mv composer /usr/local/bin/composer
OK, after looking at it a little bit more, here's what I figured out using @Lokesh Gamot's answer and the Composer site. Again, this is geared toward use with MAMP.
Find the path of your development server. I would suggest putting var_dump(getcwd()); at the top of your index.php file to get the path to show up on your index page. For me, it was /Applications/MAMP/htdocs/.../web-root
Because I wanted Composer in my server root, I entered cd "/Applications/MAMP/htdocs/" into the terminal.
I then downloaded the file using curl -sS https://getcomposer.org/installer | php -- --filename=composer. This installs Composer into "/Applications/MAMP/htdocs/" as a file named composer.
When I begin a new project, I have to move the composer file into the project root using mv composer "/Applications/MAMP/htdocs/*Destination Root*" and then navigate the terminal to that same root using cd "/Applications/MAMP/htdocs/*Destination Root*". What I was missing was that I need to move the composer file to each project root when I need it.
After that, I can use Composer with php composer. Again, thanks!
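Condensed into one terminal session, the workflow looks roughly like this (my-project is just a placeholder for whatever project root you are working in):

cd /Applications/MAMP/htdocs
curl -sS https://getcomposer.org/installer | php -- --filename=composer
# move the composer file into the project you're currently working on
mv composer /Applications/MAMP/htdocs/my-project
cd /Applications/MAMP/htdocs/my-project
php composer install    # vendor/ now ends up here instead of in /Users/***/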

Paypal core SDK without Composer

I'm trying to get the PHP Core SDK to work without Composer. There doesn't seem to be a simple way of working with the SDK without Composer (https://github.com/paypal/sdk-core-php).
Any chance someone has an autoloader script or another solution to get this working?
I've been scanning for other information throughout the web, but it seems I'm the only person alive trying to get this to work without Composer.
Any chance? Thanks!
Alright, so it seems that I am really the only person on this planet who wants to do this. Well, then I'll answer my question myself. It seems like this is the guide for running any Composer package without Composer. Yihaa \o/. Probably easy stuff for most people using Composer, but I've never used it because I'm on a shitty Windows shared host.
This is based on Debian, but replace every apt-get with yum for Red Hat or whatever.
So, I'm doing this in my root directory; don't whine about it :)
SSH into your Linux box (a local Mac or Windows machine will work as well, but I'm not covering that).
# cd into the root directory (or user directory)
cd ~/
# install php5, php5-curl and unzip (because the package we're
# getting is from GitHub). There might be other stuff your package asks for,
# so just include it at the end
apt-get install php5 php5-curl unzip
# install composer
curl -sS https://getcomposer.org/installer | php
# get the master archive
wget https://github.com/paypal/sdk-core-php/archive/master.zip
# unzip it (GitHub extracts it to sdk-core-php-master)
unzip master.zip
# cd into the directory
cd sdk-core-php-master
# move the files back to the ~/ directory
mv * ..
# go back up and remove the now-empty directory
cd ..
rm -r sdk-core-php-master
# install package using composer
php composer.phar install
# now we have the lib directory and the vendor directory; let's tar those up
tar -cf package.tar lib/ vendor/
# we now have a tar file called package.tar; copy that to your computer via scp, FTP, whatever
You can now create a directory in the place where you include all your stuff, called lib-package (or whatever fancy name you'd like), and add the following line to your project:
require_once('/path/to/your/package/lib-package/vendor/autoload.php');
Voila, you're done.

Continuous Integration using composer

I have a PHP project in which I load packages through Composer. I also run Continuous Integration using Jenkins on a dedicated CI server. Once every hour, Jenkins queries my repository for changes and, if there are any, it executes a test run.
The first step of the test run is making a fresh checkout of the repository and performing a build of the application using Phing. One of the steps of the build is performing a
composer install
Since Jenkins always works with a fresh checkout, Composer will fetch all packages on every test run, even if none of the packages have changed since the previous run. This has a couple of disadvantages:
It takes a relatively long time to complete a test run (Composer needs to fetch, for example, Zend Framework, which is rather large).
It puts unnecessary strain on the Packagist server if packages are fetched anew every hour.
If, for some reason, the composer install fails, so does my test run.
I was thinking of storing the packages that Composer fetches in a central spot on the CI server, so Jenkins would be able to access the packages at that location for every test run. Of course, I would then have to rewrite part of my application to handle the fact that the vendor folder is in a different location on the CI server. Secondly, I would have to tell Jenkins to keep track of changes to the composer.lock file, to see if it needs to run Composer anyway. I'm afraid neither of those two things is really trivial.
Does anyone have suggestions for another/better way to do this, or is it best to just fetch all packages through Composer on every test run? Admittedly, it's the best way to make sure you always use the correct packages, but it sort of feels like a waste of bandwidth, certainly in later stages of development, when the list of packages will hardly change anymore.
One way to speed it up is to use composer install --prefer-dist, which only downloads zips, even for dev packages. This is preferred for one-off builds since it skips the whole history of the project.
As for sparing Packagist, don't worry about it too much: one build every hour isn't going to make a huge difference compared to all the open source libs that build on Travis at every commit.
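For a CI build, the install line is often tightened a little further; a sketch using standard Composer flags:

# dist archives only, no prompts, and no progress output cluttering the build log
composer install --prefer-dist --no-interaction --no-progress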
One thing you could do is to store the vendors in a location outside of the project's workspace in Jenkins, so that they remain between the builds. You don't necessarily need to change your application; just update the build script so that it creates a symbolic link to the vendors location.
I use capifony for deployment and it uses this approach to keep the vendors between releases.
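A sketch of that idea as a Jenkins shell build step, assuming a writable shared directory such as /var/lib/jenkins/composer-vendors (the path is just an example; $JOB_NAME is Jenkins' built-in job name variable):

# keep the vendors per job, outside the workspace, so they survive fresh checkouts
SHARED_VENDOR="/var/lib/jenkins/composer-vendors/$JOB_NAME"
mkdir -p "$SHARED_VENDOR"
# expose them inside the workspace under the usual name
ln -sfn "$SHARED_VENDOR" vendor
# Composer now only downloads whatever changed in composer.lock
composer install --prefer-dist --no-interaction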
One thing to note is that Composer caches the packages that it downloads. So once they are downloaded the first time, they should work even if Packagist is down (not 100% sure), and network bandwidth is spared (100% sure).
Second thing: why are you running tests on a fresh checkout of the repository? It is entirely possible to keep a copy of your code in the workspace in Jenkins and just make sure you wipe the caches, logs and other artifacts on every test run. This will speed up not only composer install, but also the git pulls, especially for big repos!
Side note: on our own Jenkins platform, where workspaces are not cleaned between tests, the main drawback we found with Composer is the sheer amount of disk space taken by having the full vendor dir in each workspace. I tried to work around this by using symlinks and sharing the vendors (named based on hashes of composer.lock), but then the Composer autoloader had a bit of trouble finding where to load classes from.
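If you only want to reuse Composer's download cache between builds, rather than sharing vendor/ itself, pointing the cache at a stable location outside the workspace is enough; a sketch (the path is an example):

# keep Composer's package cache out of the throw-away workspace
export COMPOSER_CACHE_DIR=/var/lib/jenkins/composer-cache
composer install --prefer-dist --no-interaction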
Steps to install a ZF2 project on Jenkins
mkdir /path/to/your/project
1. Install Composer
curl -sS https://getcomposer.org/installer | php
mv composer.phar /usr/local/bin/composer
Note: If the above fails due to permissions, run the mv line again with sudo.
A quick copy-paste version including sudo:
curl -sS https://getcomposer.org/installer | sudo php -- --install-dir=/usr/local/bin --filename=composer
Create a composer.json file in the root directory of the project and add all the packages you require:
{
    "name": "amarjitsingh",
    "description": "amarjitsingh",
    "license": "BSD-3-Clause",
    "keywords": [
        "framework",
        "zf2"
    ],
    "homepage": "http://domain.com/",
    "require": {
        "php": ">=5.5",
        "zendframework/zendframework": "~2.5",
        "phpoffice/phpword": "dev-master",
        "doctrine/doctrine-orm-module": "0.7.0",
        "imagine/Imagine": "0.5.*",
        "zf-commons/zfc-user": "dev-master"
    },
    "autoload": {
        "psr-0": {
            "": "module/"
        }
    }
}
run 'composer install' to install these packages.
Set up Git on your machine.
If you are using Ubuntu you can set up Git using the following commands:
sudo apt-get update
sudo apt-get install git
Set Up Git
git config --global user.name "Your Name"
git config --global user.email "youremail@domain.com"
check the config list
git config --list
Once you have set up Git:
cd /path/to/your/project
Once you have the packages installed, create a '.gitignore' file in the document root and add 'vendor' inside it.
git init
git remote add origin https://username@bitbucket.org/username/zf2ci.git
Apply the commands below to add, commit, and push the files:
git add .
git commit -m 'Initial commit with contributors'
git push -u origin master
git pull
For the cloud you can use AWS; I am using DigitalOcean.
1. Create a droplet
2. Name it as you wish; in my case it is zf2ci
3. Choose a package
4. Choose the OS; in my case it is Ubuntu 14.04
5. In the Applications tab choose LAMP
6. Once you are done with that you will get an IP address, the username root, and a password
7. Log in to the IP using PuTTY
8. User: root
9. Password: pass
10. Once you get in, it will prompt you to change the password
11. Go to the web root, e.g. /var/www/html
12. Install Git:
13. apt-get install git
14. Clone the repo:
15. git clone https://username@bitbucket.org/username/zf2ci.git
16. Install Composer on this machine:
curl -sS https://getcomposer.org/installer | php
mv composer.phar /usr/local/bin/composer
Note: If the above fails due to permissions, run the mv line again with sudo.
A quick copy-paste version including sudo:
curl -sS https://getcomposer.org/installer | sudo php -- --install-dir=/usr/local/bin --filename=composer
Go to the app path /var/www/html/zf2ci.
Run 'composer install --no-dev'; we install with the --no-dev option because we only want well-tested code on the app server.
Step 3
Create a Jenkins server
1. Set up another droplet for Jenkins
2. Image: Ubuntu
3. Install LAMP
Install Jenkins
Installing Jenkins
Before we can install Jenkins, we have to add the key and source list to apt. This is done in 2 steps, first we'll add the key.
1.1
wget -q -O - http://pkg.jenkins-ci.org/debian/jenkins-ci.org.key | apt-key add -
Secondly, we'll create a sources list for Jenkins.
1.2
echo deb http://pkg.jenkins-ci.org/debian binary/ > /etc/apt/sources.list.d/jenkins.list
1.3
Now, we only have to update apt's cache before we can install Jenkins.
apt-get update
1.4
As the cache has been updated, we can proceed with installing Jenkins. Note that Jenkins has a large set of dependencies, so it might take a few moments to install them all.
apt-get install jenkins
1.5 Open the IP on port 8080,
e.g. http://127.0.0.1:8080
1.6 Install Git on the Jenkins server:
sudo apt-get update
sudo apt-get install git
1.7 install composer
curl -sS https://getcomposer.org/installer | php
mv composer.phar /usr/local/bin/composer
1.8 Enable user authentication
1.9
Enable the Bitbucket plugin for Jenkins
1.9.1
Manage Jenkins -> Manage Plugins -> Bitbucket Plugin -> download and install
1.9.2
Create a job:
create job ->
project name (e.g. zf2ci) ->
source code management (git): provide the SSH URL (git@bitbucket.org:username/zf2ci.git) ->
branches to build (*/master); this is the branch where, each time any user commits and merges code into the master branch, Jenkins gets invoked
1.9.3
Build Triggers
Choose the option "Build when a change is pushed"; this will work once we add a POST hook on Bitbucket
1.9.4
Build->Execute shell
composer install
./vendor/bin/phpunit ./tests
our tests sit in the tests dir
1.9.5
set up an SSH key pair
log in to the Jenkins server through PuTTY
su jenkins
cd
ls -la (check what is in the Jenkins home directory)
ssh-keygen -t rsa (DSA by default, but choose an RSA key; it is faster)
press enter (on the path)
press enter (leave the passphrase empty; the whole point here is to avoid passwords in the automated jobs)
press enter
cd .ssh
ls -la (you will find the id_rsa.pub file there)
cat id_rsa.pub
(select all and copy the contents of the file)
1.9.6
go to Bitbucket
switch to the repo zf2ci
go to Settings
click Deployment keys -> Add key
add a label (jenkins)
Key* (paste the contents of the id_rsa.pub file here)
save the key
summary
`zf2ci -> Settings -> Deployment keys -> Add key` -> type a label, paste the id_rsa.pub key -> save
1.9.7
register a POST hook for the repo
Settings ->
Integrations ->
Hooks ->
POST (search for the POST hook) ->
Add the URL/IP of the Jenkins server (`172.62.235.100:8080/bitbucket-hook/`)
(the body of the POST contains information about the repository, branch, list of recent commits, and user)
1.9.8
log in to the Jenkins server
su jenkins
cd
cd .ssh
git ls-remote -h git@bitbucket.org:username/zf2ci.git HEAD
1.9.9
save project on Jenkins
1.9.10
add the following commands in the
Execute Shell -> command box:
rsync -vrzhe "ssh -o StrictHostKeyChecking=no" --exclude vendor/ . root@ipaddress:/var/www/html/zf2ci   # path on the app server
ssh root@ipaddress <<EOF
cd /var/www/html/zf2ci
composer install --no-dev
EOF
