I've installed Composer and added some packages via 'composer install'. It installed them under the "my_project\vendor" path, but some of the packages were cloned using git, so when I committed "my_project", those cloned packages were ignored.
The problem is that when other developers are cloning "my_project", they are missing the packages that were ignored. Is there a way to automatically add the packages to "my_project" so other developers will fetch them from me?
I think this should be done using submodules, but I don't know how to automatically add every new package from composer as a submodule to my project.
Preface: Jordi - I love composer, keep up the great work, and if I'm mistaken on any of this let me know and I'll both update my workflow and edit the post :D
Unfortunately, whether this is the "general recommendation" depends on who you ask; it's very much a developer-only perspective. And the caveats to the practice prescribed in the Composer FAQ involve many more considerations than I can cover here, so I'll leave a couple of major points for the consideration of others.
By @Seldaek's own admission, composer isn't really 100% stable yet; it's far better than a year ago, but it's still a very active project. So relying on composer to produce an identical environment on a dev server vs a staging server vs a production server wouldn't be a general recommendation from any QA / deployment group. This is not meant as a slight to Jordi, but rather an expression of the meticulous nature of QA people.
From the FAQ, it states when merging vendor libs into your own repository you should:
Limit yourself to installing tagged releases (no dev versions)
However, if you use composer to manage your CI or automated deployments, the same constraint would apply - only more so - because deploying a master or dev tag to your production environment could give you a very different package from the one you tested in staging only a day or even an hour ago.
Even setting aside changes introduced in third-party libs (which using only tagged versions would solve, regardless of dev or production deployments), unless you can rely on composer doing the exact same thing every time, you risk introducing bugs into production. This is not really a risk case I would concern myself with, but then again I'm a developer too ;) Still, issues can result from simple changes like this, where unless you maintain the exact same version of composer.phar on all environments, you could really muck up a staging or production server.
The other major issue I have is really related to all of the points listed under this heading:
While it can be tempting to commit it in some environment, it leads to a few problems:
I don't see the consequences as problems, but as benefits! A large VCS repository isn't that big a deal in modern high-bandwidth environments, and unless you are using a very active vendor lib, your diffs won't be that big either. Even if they were big, git/hg/dvcs systems are all capable of re-using chunks, compressing chunks, and keeping all your ducks in a row. More importantly, committed vendor libs alert the developer when changes are introduced to those packages, and diff -w is a great summary view of the total changesets, especially if you are on dev/master tags.
Duplication of the history of all your dependencies in your own VCS.
This is worded a little incorrectly: it won't duplicate the entire commit history of the vendor lib, just a single commit (your commit) covering the full delta between now and the last time you ran a composer update that resulted in changes. You're probably not updating all of your libs every time you update, even if you don't specify individual packages. And if you did accidentally update a vendor lib, you can easily revert, whereas if you did so on a dev/master tag and it broke your environment, you'd have to figure out which version you were previously using, specify that tag in composer.json, and update again to revert it. git checkout -- vendor/3rdpartylib just seems easier to me.
Adding dependencies installed via git to a git repo will show them as submodules. This is problematic because they are not real submodules, and you will run into issues.
Ideally, composer would give you a config option: it could automatically delete the .git directory from git pulls, and automatically rm the directory (or temporarily mv it) before updating a lib, when and only when an updated version exists. Doing so would be far more reliable than leaving that manual process up to individual developers. There are an equal number of reasons to have vendor libs integrated into your version control repo, so the choice really depends on the details of your situation.
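In the meantime you can script that cleanup yourself. A minimal sketch, assuming a POSIX shell run from the project root, that strips the nested .git metadata so the vendor files commit as ordinary content rather than broken submodule pointers:

# Remove every nested .git directory under vendor/ so git tracks
# the packages as plain files instead of fake submodules.
find vendor -type d -name .git -prune -exec rm -rf {} +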
The biggest reason for versioning all of your files is being able to reliably deploy the exact package you tested in development to staging and then production, a key purpose of VCS and automated deployments to begin with. Unless you configure your development environment to use specific tags for every package, and you version control your composer.phar, you should not rely on composer to deploy your software.
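For what it's worth, a sketch of what that looks like in practice (the package name and version are placeholders, not taken from the post):

# Require an exact tagged version instead of a branch:
composer require monolog/monolog:1.25.1
# And version the composer binary itself, as suggested above:
git add composer.phar
git commit -m "Pin composer.phar"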
You should ideally just add vendor/ to your .gitignore, and then every developer of the project would run composer install to get the vendors on their setup.
You can read the FAQ entry on committing vendors for more details.
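A minimal sketch of that setup, using the file names Composer conventionally expects:

# Ignore the vendor directory, but track the manifests:
echo "/vendor/" >> .gitignore
git add .gitignore composer.json composer.lock
git commit -m "Track composer manifests, ignore vendor"
# Every other developer, after cloning the project:
composer install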
I am reading/learning about Composer, the application-level package manager for PHP.
In this blog post written by lead dev Jordi Boggiano, he writes:
Composer on the other hand forces you to declare your project dependencies in a one-stop location (composer.json at the root). You just checkout the code, install dependencies, and they will sit in the project directory, not disturbing anything else on the machine.
Another related feature is the composer.lock file that is generated when you install or update dependencies. It stores the exact version of every dependency that was used. If you commit it, anyone checking out the project will be able to install exactly the same versions as you did when you last updated that file, avoiding issues because of minor incompatibilities or regressions in different versions of a dependency.
If I understand Composer properly, when we're talking about packages downloaded/installed by Composer, we are talking about PHP code packages, i.e. programming code written in PHP, and not system-level packages, e.g. extensions to the PHP runtime installed on the server. So once these PHP code packages have been downloaded and added to a PHP project, I would have thought those packages become part of the PHP application source code, e.g. to be checked in to whichever version control system is being used for the project. If another developer comes along and checks out the code, why would they need to then "install the packages", as is stated in the blog post? Wouldn't they get a copy of all the code packages when they check out the code from source control? This line in the blog post is confusing me, and making me think I don't understand Composer.
Any clarity on this would be greatly appreciated. Thanks.
The dependencies themselves should not be committed to source control. The composer.json and composer.lock files, on the other hand, should. There are various reasons for this, amongst them:
Every time you update the dependency you would have to commit the changes. That kind of tightly couples your code to the dependency, when it should be exactly the other way around.
The packages themselves are already in their own repository with their own history. Why repeat that in your project's history?
Those repositories can be huge, just muddling the waters around your project. Why carry around all that weight?
Instead, having each developer just run composer install (very important: not composer update) whenever they check out the project is much more efficient. Composer will install the dependencies from composer.lock, making sure everyone running the same commit is on the exact same page. The same goes for deploying.
You can read more about this here.
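To make the install/update distinction concrete, a short sketch:

composer update    # re-resolves composer.json and rewrites composer.lock;
                   # run deliberately, on a development machine
composer install   # installs exactly what composer.lock records;
                   # run after checkout and when deploying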
On the other hand, there might be situations where you have to commit your packages to get around a problem, for example when you know you won't be able to run composer install on your production server (shared hosting).
Normally packages installed via composer don't get checked in to source control, only the code you write and the composer.json and composer.lock files.
This way the repository for your project does not get bloated with code you did not write and possibly don't really care that much about.
Yes, it's normal: after cloning your repository, a developer will need to run the "composer install" command. The composer.lock file will ensure they get the same modules, and the same versions of them, that you used when creating your project.
Not including the composer modules in your source control also allows you to easily update the modules to get bug fixes and new features in their new versions.
I'm still new to coding and I'm learning everything on my own. This may be a silly question for you, but after reading a dozen articles I am still confused.
I have a php based website on a shared host. After reading the various articles on benefits of using repositories and Composer, I decided to give it a try. These are my difficulties so far:
Which operating-system version of Composer should I download to be able to install/update packages on my cPanel-based shared hosting?
If I am to install the Windows version, how do I connect to my shared hosting to install/update the packages?
My apologies for my silly questions, but it would really help.
If you are using shared hosting, you are unlikely to be able to use Composer on the host itself. Furthermore, you are not encouraged to use Composer "on production".
I would recommend you use Composer locally (on the O/S of your local machine) to compose your project and install your dependent packages. Once it's all working and tested with your own code, you upload your entire development directory tree, including the resulting vendor library, as one big FTP/SCP upload of "flat files".
Once you get more advanced you could adventure into automated deployment techniques, but I feel for now you would be best to stick to using Composer as a local development tool to manage your codebase.
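A rough sketch of that flow (the host name and paths are invented for illustration):

# Build the dependency tree locally:
composer install --no-dev --optimize-autoloader
# Then upload everything, vendor/ included, as flat files:
scp -r ./my_project user@shared-host:/home/user/public_html/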
Update, further details:
Composer is really a tool to help you manage your codebase in development. It's not intended as a "deployment" tool. Previously you used to find a library you liked, download it, unzip it into your codebase somewhere random like "lib/stuff", link to it, and commit it into your version control system (VCS). OK, but then a year later you want to update it, and you have to download it again, figure out where you saved it and how to overwrite the files, or delete old ones... it gets hard. Also your VCS repository gets full of 3rd-party components - even duplicates of the same one! Composer solved this by bringing order to this long-term dependency-management chaos.
The reason you don't want to run Composer "on production" (i.e. your live website) is that during the download/update/composition process your website will probably be broken. Even if the composer process works, this could mean several minutes of a broken site. And after the update has finished, you have a completely new set of 3rd-party packages: how do you know they are compatible with your codebase?
So you only do composer updates locally, test everything, amend your code to work with the shiny new updates, and only then decide to upload the whole new site to the server - just as if you'd cobbled it all together manually. The deployment is independent.
Introduction
This is quite a lengthy question; the question I asked in the title is probably ambiguous, and I may change it to something more suitable.
A similar question has already been asked and answered here, although I do not think it entirely answers the question.
Synopsis
I am working with a team of developers on a project. We are using a framework (for argument's sake, the framework is irrelevant), and one of the requirements is that we use Composer.
These packages are essentially de-coupled from the application and from each other; however, one package may depend on another package.
These packages have their own git repository, and during development of the application have branch aliases set to dev-master.
Problem #1
In order for the application to work with my packages I need to register them in composer.json, which is fine, except that I have to commit the in-progress work on my package to its repository before I can run composer update.
Problem #2
I have committed the initial package architecture and the composer.json. I run composer update, which completes, and my package is available to the application. Yet I continue to develop this package, and at the same time another developer has already committed a different change to it, e.g. a bug fix.
I need to update another package in order for my work to continue, yet I can't, because doing so would throw a warning similar to:
Loading composer repositories with package information
Updating dependencies (including require-dev)
- Removing hu/admin (dev-master)
The package has modified files:
M composer.json
M src/...
Discard changes [y,n,v,?]?
If I respond with y, my changes are blown away and lost forever. If I choose n, composer aborts and my project is broken because a package was not updated in parallel with my changes.
Problem #3
Another problem with this way of developing is that if I am working on my package outside of vendor and commit my changes, I can run composer update, but my application is broken due to a fatal error. I fix the missing ; or other small syntax error, commit the change, and run composer update again. Yet I don't want my git history full of little typo and parse-error fixes just because I can't work on my package within the application in parallel with other developers working on the application, this package, or other packages.
Problem #4
I discovered a package on GitHub called franzliedke/studio which seems to partly solve my problem. But once a package has been published because it is complete/functional, it can no longer remain inside the vendor/bin directory, causing the initial problems to arise once more.
Conclusion
I am wondering what the best way to work around this is, or what best practices exist for working on packages with a team of developers, without having to commit everything every time before I run composer update.
Laravel did have a workbench feature, which was pretty cool, but it was removed in version 5.0. So was that considered bad practice?
Here's what we do in huge projects consisting of several little composer components which are developed at the same time:
Develop your application 'in one piece', as described in the other answer you mentioned, by keeping all components separate inside their own namespaces and directory structures.
/Application
-composer.json (Application json)
-/src
--/Component1
----composer.json (Component json)
--/Component2
----composer.json (Component json)
--/ApplicationFeature
----Class1.php
----Class2.php
The whole application is developed in a single git repository, which eliminates most of your aforementioned problems. Then occasionally you split the application repo into individual component repositories using git subtree. I wrote a little PHP CLI script which splits the project into smaller components and pushes them to the corresponding component repositories. There are a lot of advantages compared to git submodules: the whole commit history of a component is kept and pushed to the component repository. More information on subtrees here
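For reference, a minimal sketch of the splitting step with plain git subtree (the directory prefix and remote URL are assumptions):

# Extract the history touching one component into its own branch:
git subtree split --prefix=src/Component1 -b component1-split
# Push that branch to the component's own repository:
git push git@example.com:vendor/component1.git component1-split:master
# Drop the temporary branch:
git branch -D component1-split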
In case you are interested, please let me know; I am happy to share the script, which splits/tags and finally pushes the individual components based on a directory <-> componentName mapping defined as JSON.
In my project, the deployable version needs to have a copy of each of the external libs, a different config file, and install and setup files. For security reasons, the main project is set to refuse to run if they are present. Thus the upstream copies of the other projects need to be committed to the repo. How can I work on code running on localhost when the file layout, and sometimes the file contents, used in dev and testing differ from what I need to commit?
Background
I am working on a project hosted on GitHub, and my main IDE is NetBeans, which has imperfect git support (good enough for >99% of my needs). The project is in PHP and uses several other projects as libraries.
As NetBeans does not have the best support for sub-repos, I have chosen to keep each additional project as a separate project. This is fine, as the central project looks at the config data for where to find these outside libs.
Half an answer
My instinct is to suppose that there will need to be some "build stage" prior to committing to the GitHub repo, but how on earth do I go about setting all that up?
I could write some sort of homebrew thing, but then when I pull other people's contributions I would need to reverse the process, unless we had a branch for builds and a branch for working copies, which seems needlessly complex and could leave the devs' config data on public display (not to mention updates being a mess).
I have seen that others have wrestled with somewhat similar problems without reaching a conclusion (at the time of asking) (How to push and pull from github without sharing sensitive information? Smudge & clean?), so I am looking for anything that might help me come up with a solution.
my main IDE is netbeans which has imperfect git support
Most devs just use the command line. I switch to the NetBeans conflict resolver occasionally, which is very good, but for normal stuff the console is usually faster.
My instinct is to suppose that there will need to be some "build stage" prior to committing to the github repo
... unless we had a branch for builds and a branch for working copies
No, there is only ever one repository. It is better to think of your repo as your code history, rather than your deployment state. Branches should just be for features or large changes, which merge into your mainline/master.
There are a number of options available to you when deploying. The first is Composer, which Mark points out: when deploying you issue an install or update command, which recursively fetches the dependencies that satisfy your library requirements. You can use Bower to do the same thing for your JavaScript dependencies.
Some deployment strategies prefer to build locally and then scp/rsync to a remote server. Composer and Bower are still probably a good idea, but you write a build script (using Ant or Phing, for example) to create a build copy in a local temporary folder, and then send it to the server. It is common here also to push it to a new release folder on the server, and then swap a symlink or Apache config file when it's ready to go live.
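A bare-bones sketch of such a build-and-swap deployment (the server, paths and naming scheme are invented for illustration):

# Build locally, sync to a fresh release folder, then flip a symlink:
RELEASE=release-$(date +%Y%m%d%H%M%S)
composer install --no-dev
rsync -az --exclude='.git' ./ deploy@example.com:/var/www/releases/$RELEASE/
ssh deploy@example.com "ln -sfn /var/www/releases/$RELEASE /var/www/current"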
the deployable version needs to have a copy of each of the external libs, a different config file and install and setup files, for security concerns
Assuming this is a web project, have you tried adding your sensitive environment data to your Apache configuration file? This can be trivially read in PHP, and of course PHP does not care that this information is different according to whether you are developing, testing, demoing a branch or operating live.
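For instance, a hypothetical sketch (the variable name is made up): set the value in the vhost and read it from PHP.

# In the Apache vhost (requires mod_env):
#     SetEnv DB_PASSWORD "environment-specific-secret"
# In the application's PHP code:
#     $dbPassword = getenv('DB_PASSWORD');   // or $_SERVER['DB_PASSWORD']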
Further reading: an excellent PHP deployment book, free of charge, that suggests Phing and Capistrano.
Here's our situation :
We have 3 different Laravel projects and all 3 projects rely on our Core project.
This Core project is a separate Laravel package hosted on our private repo and is used as a dependency for other projects.
Before, whenever something would change in the Core project, we would just run a composer update ourvendor/ourcorepackage on our servers for each project to pull in the core changes. However, lately composer seems to suffer from serious memory issues when we try to run the update on our DigitalOcean staging environment with 512 MB RAM. See: https://github.com/composer/composer/issues/1898
The solution I always come across is people saying that you should always run composer install on your production servers. I can relate to that in terms of security, because it can be dangerous to update to a new version of some 3rd-party package that could possibly break your code. But in our case we only update our own core package, so we know what we're doing; still, this memory issue forces us to use the composer install method because it is less memory-demanding.
So basically this is our current workflow:
When something changes in our core package, we need to run a composer update ourvendor/ourcorepackage on each project LOCALLY. This generates a composer.lock file.
We commit the composer.lock file to our repo.
On the servers for each project we run a git pull and then a composer install. This only updates our core package, runs much faster, and has no memory issues compared to composer update.
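In commands, that workflow looks roughly like this (package name as used above):

# Locally:
composer update ourvendor/ourcorepackage   # rewrites composer.lock
git add composer.lock
git commit -m "Update core package"
git push
# On each server:
git pull
composer install   # applies exactly the committed lock file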
However, this solution raises 2 issues:
Since we're working with multiple devs on the same project, we sometimes end up with merge conflicts on the composer.lock file when pulling in the changes locally.
Running a git pull on the server gives an error: "Your local changes to the following files would be overwritten by merge: composer.lock. Please commit your changes or stash them before you merge."
So what am I supposed to do here? Should I remove the composer.lock file before the pull on the server?
How should we handle the merge conflicts for the composer.lock file?
It's a shame that composer update suffers from memory issues, because that method seems much more logical: just update the package you want, with no hassle over the composer.lock file.
Please advise what a correct workflow with git and Composer should look like in our case, and how to solve the conflicts above.
Many thanks for your input
How can you test that a core update (or an update of any other dependency) doesn't break things in the projects using it if the developers don't do this step themselves?
That's why the usual workflow expects composer update to be run on a development machine with enough RAM (i.e. probably more than 1 GB set as the memory limit for PHP), and the update should be triggered manually by the developer (if it is triggered automatically by a continuous integration build, the memory requirements apply to that machine as well).
There is no way around this memory requirement. A web server with only 512 MB RAM installed might be able to function as a staging server with barely any concurrent users present, but it shouldn't be used to update the Composer dependencies.
Personally, I fix merge conflicts in composer.lock with a very simple system: delete the lock file and run composer update. This updates all dependencies to the latest versions that satisfy the version requirements, and creates a new working composer.lock file that gets committed during the merge.
I am not afraid of potentially updating everything, because either it works as expected or my tests will catch errors quickly.
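In practice the resolution is just this (assuming composer.json itself has already been merged by hand):

rm composer.lock
composer update    # rebuilds the lock from the merged composer.json
git add composer.lock
git commit         # completes the merge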
I do select the 3rd party packages I use carefully:
they have to tag their versions, preferably using semantic versioning.
I do not use any branches for release versions (the rare occasions when someone used them during development were painful)
they should ship a new major version if they make backwards incompatible changes
the locally developed packages also follow these requirements
This works with around 270 packages served by our local Satis instance (probably also a factor to consider when trying to reduce memory footprint: only the packages known to Composer can end up in memory; compare the ten thousand packages potentially available on packagist.org with 270 local packages). 60 of the 270 packages are locally developed by 20 developers who release new versions at random times. Update failures in the last 2 years have been very rare, and should be handled like other bugs: if a tagged version is detected to be incompatible, we release a bugfix release reverting the change, and tag the original change as a new major release if the incompatible change is necessary.
So the workflow you ask for is probably like this:
Anytime, any developer should be able to run composer update on their local machine.
They should be able to detect if this breaks things on their local machine.
If nothing is broken, they commit the changes including the composer.lock file to Git
The staging server only runs composer install and will use exactly the versions that the developer used on their machine.
If nothing is broken on staging, that version is ready to be used on production.
Merging an already committed version on another developer's machine will likely show merge conflicts with composer.lock.
Resolve conflicts on all other files.
The composer.lock file should be deleted.
From here, the workflow is like above, i.e.:
The developer should be able to run composer update on their local machine.
They should be able to detect if this breaks things on their local machine.
If nothing is broken... and so on.
Another approach (without doing composer update):
Copy your new (and deleted) lines from composer.json into a separate text file.
Take the entire remote versions of the composer.json and composer.lock files.
While in merge-conflict mode, do:
composer install
For every new package you wrote down in step 1, run composer require vendor/package:version
For every removed package you wrote down in step 1, run composer remove vendor/package
Test, commit, done!
This method keeps the lock entries from the remote branch (maybe master or develop), and only updates your new packages.
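Spelled out as commands, with placeholder package names, the approach looks something like this:

# Take the remote side of both files during the conflict:
git checkout --theirs composer.json composer.lock
composer install                          # install from the remote lock
# Re-apply only your own changes, from the notes made in step 1:
composer require "somevendor/newpackage:^1.2"
composer remove somevendor/oldpackage
git add composer.json composer.lock
git commit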
Sometimes composer update can break things.
What I do is:
Discard all of my changes on composer.lock
Merge composer.json
Run composer install so that packages get installed according to composer.lock.
Take each of your packages and run composer require vendor/package if you added that package, or composer remove vendor/package if you removed it.
If there are multiple packages, they will be installed automatically from composer.json.