In my company, we build most of our projects using Composer, which means a lot of repeated packages (the same library at the same version) get downloaded from the internet across our different teams.
I have tried the Satis Composer repository server, but the problem is that the cache is not generated on demand.
I want to implement a central caching service that provides runtime, on-demand caching.
Is this possible to implement?
I've developed a solution for this exact problem, called Velocita:
https://github.com/isaaceindhoven/velocita-proxy
It works together with a Composer plugin. It improves the reliability and performance of Composer installs, and you can configure which locations you want to mirror.
Satis is still great for local repositories or proactively generated caches based on composer.json contents, but Velocita allows for a more dynamic pull-through cache.
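The client side is just the Composer plugin, installed globally and pointed at your Velocita instance. The exact commands may have changed, so treat the following as an approximation and check the plugin's README; the URL is a placeholder:

    # install the Velocita Composer plugin globally (verify the package name in the README)
    composer global require isaaceindhoven/composer-velocita
    # point Composer at your own Velocita instance (placeholder URL)
    composer velocita:enable https://velocita.example.com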
Today, GitHub faced a short outage, and that made me look into your question. I've got multiple webservers and I'd like to set up one of them as a Git proxy server. There is no need for a webserver to download from GitHub (or Bitbucket, GitLab, etc.) if it can get the same package from the internal network.
I found this blog post, explaining two (not actively maintained) options:
Gitpod
https://github.com/sitaramc/gitpod
local caching server for git when the actual server is on the other side of a (possibly slow) WAN link
broker
https://github.com/researchgate/broker
A full proxy for composer repositories
Related
I'm still new to coding and I'm learning everything on my own. This may be a silly question for you, but after reading a dozen articles I am still confused.
I have a php based website on a shared host. After reading the various articles on benefits of using repositories and Composer, I decided to give it a try. These are my difficulties so far:
Which operating system's version of Composer should I download so that I can install/update packages on my cPanel-based shared hosting?
If I install the Windows version, how do I connect to my shared hosting to install/update the packages?
My apologies for my silly questions, but it would really help.
If you are using shared hosting, you are unlikely to be able to use Composer on the host itself. Furthermore, you are not encouraged to use Composer "on production".
I would recommend you use Composer locally (on the OS of your local machine) to compose your project and install your dependent packages. Once it's all working and tested with your own code, you upload your entire development directory tree, including the resulting vendor directory, as one big FTP/SCP upload of "flat files".
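As a rough sketch of that local-first workflow (the package name, paths and hostname below are placeholders, not something taken from your setup):

    # on your local machine
    composer require monolog/monolog      # example: add a dependency
    composer install --no-dev             # resolve everything into vendor/ and composer.lock
    # ...run and test the site locally...
    # then upload the whole tree, vendor/ included, as plain files
    scp -r ./my-site user@shared-host.example.com:public_html/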
Once you get more advanced you could venture into automated deployment techniques, but I feel that for now you would be best to stick to using Composer as a local development tool to manage your codebase.
Update, further details:
Composer is really a tool to help you manage your codebase in development. It's not intended as a "deployment" tool. Previously you used to find a library you liked, download it, unzip it into your codebase somewhere random like "lib/stuff", link to it, and commit it to your version control system (VCS). Fine, but then a year later you want to update it, and you have to download it again, figure out where you saved it, how to overwrite the files, and which old ones to delete... it gets hard. Also, your VCS repository fills up with third-party components - even duplicates of the same one! Composer solved this by bringing order to that long-term dependency-management chaos.
The reason you don't want to run Composer "on production" (i.e. your live website) is that during the download/update/composition process your website will probably be broken. Even if the Composer run works, that could mean several minutes of a broken site. And after the update has finished, you have a completely new set of third-party packages: how do you know they are compatible with your codebase?
So you only do Composer updates locally, test everything, amend your code to work with the shiny new updates, and only then decide to upload the whole new site to the server - just as if you'd cobbled it all together manually. The deployment is independent.
In my project the deployable version needs to have a copy of each of the external libs, a different config file, and install and setup files; for security reasons, the main project is set to refuse to run if the install/setup files are present. Thus the upstream copies of the other projects need to be committed to the repo. How can I work on code running on localhost, where the file layout and sometimes the file contents used for dev and testing differ from what I need to commit?
Background
I am working on a project hosted on GitHub, and my main IDE is NetBeans, which has imperfect Git support (good enough for >99% of my needs). The project is in PHP and uses several other projects as libraries.
As NetBeans does not have the best support for sub-repos, I have chosen to keep each additional project as a separate project. This is fine, as the central project looks at the config data for where to find these outside libs.
Half an answer
My instinct is to suppose that there will need to be some "build stage" prior to committing to the GitHub repo, but how on earth do I go about setting all that up?
I could write some sort of homebrew thing, but then when I pull other people's contributions I would need to reverse the process - unless we had a branch for builds and a branch for working copies, which seems needlessly complex and could leave the devs' config data on public display (not to mention updates being a mess).
I have seen that others have wrestled with somewhat similar problems to no conclusion (at the time of asking) (How to push and pull from github without sharing sensitive information? Smudge & clean?), so I am looking for anything that might help me come up with a solution.
my main IDE is netbeans which has imperfect git support
Most devs just use the command line. I switch to the NetBeans conflict resolver occasionally, which is very good, but for normal stuff the console is usually faster.
My instinct is to suppose that there will need to be some "build stage" prior to committing to the github repo
... unless we had a branch for builds and a branch for working copies
No, there is only ever one repository. It is better to think of your repo as your code history, rather than your deployment state. Branches should just be for features or large changes, which merge into your mainline/master.
There are a good many options available to you when deploying. The first is Composer, which Mark points out: when deploying, you issue an install or update command, which fetches the dependencies that satisfy your library requirements recursively. You can use Bower to do the same thing for your JavaScript dependencies.
Some deployment strategies prefer to build locally and then scp/rsync to a remote server. Composer and Bower are still probably a good idea, but you write a build script (using Ant or Phing, for example) to create a build copy in a local temporary folder, and then send it to the server. It is common here also to push it to a new release folder on the server, and then swap a symlink or Apache config file when it's ready to go live.
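A minimal sketch of that second strategy, assuming SSH access and made-up paths (a real build script would also generate the release name and prune old releases):

    # build locally
    composer install --no-dev --optimize-autoloader
    # copy the build into a fresh release directory on the server
    rsync -az --delete ./ deploy@example.com:/var/www/releases/r42/
    # switch the live site by swapping the symlink your vhost points at
    ssh deploy@example.com 'ln -sfn /var/www/releases/r42 /var/www/current'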
the deployable version needs to have a copy of each of the external libs, a different config file and install and setup files, for security concerns
Assuming this is a web project, have you tried adding your sensitive environment data to your Apache configuration file? This can be trivially read in PHP, and of course PHP does not care that this information is different according to whether you are developing, testing, demoing a branch or operating live.
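For instance, a SetEnv APP_DB_PASSWORD "..." line in the vhost or .htaccess makes the value visible to PHP; the variable name here is just an illustration:

    <?php
    // Read the value Apache's SetEnv injected into the request environment;
    // depending on the SAPI it may only show up in $_SERVER, hence the fallback.
    $dbPassword = getenv('APP_DB_PASSWORD') ?: ($_SERVER['APP_DB_PASSWORD'] ?? null);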
Further reading: an excellent PHP deployment book, free of charge, that suggests Phing and Capistrano.
Currently nearly all packagist.org-based dependency loading relies on GitHub-hosted repos. But GitHub users can delete their public repositories, which leads to the question:
What to do if a necessary Composer-loaded dependency does not exist anymore (or gets deleted by vandalism etc.)? Are there archives somewhere to provide long-term service?
AFAIK packagist.org does not host any package data (yet), and GitHub also does not keep public copies of deleted or renamed repositories.
That's where Satis comes into play. With Satis you can create a local copy of the packages you need from packagist.org, and also create locally downloaded ZIP archives of all the versions found online.
This comes with the added benefit of being hosted in your local network, so it is much faster to access, and you have a local copy available whenever your internet connection goes down, or GitHub experiences issues, or whatever.
These locally created versions are yours alone to back up and take care of, and if you install something from them, that location will be persisted in the composer.lock file (it records the URL each ZIP was downloaded from, which is not the GitHub API URL but your local HTTP server hosting the Satis files).
Using Satis you can better ensure that every piece of software you use is accessible when you need it in your local environment. This comes at the small cost of maintaining a list of all the software packages you need, running the Satis update once in a while, having a local HTTP server host everything, and adding your Satis repo to every composer.json file you create. Note that this last step makes your software unusable for anyone who does not have access to your Satis-hosted files - it's a closed user group solution.
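As an illustration (names and URLs are placeholders), a minimal satis.json that mirrors one package plus its dependencies and archives the dist ZIPs could look like this:

    {
        "name": "acme/package-mirror",
        "homepage": "http://packages.example.local",
        "repositories": [
            { "type": "vcs", "url": "https://github.com/acme/some-library" }
        ],
        "require": { "acme/some-library": "*" },
        "require-dependencies": true,
        "archive": { "directory": "dist", "format": "zip" }
    }

Rebuilding is then a matter of running php bin/satis build satis.json web/ once in a while, serving the generated web/ directory over HTTP, and listing that URL under "repositories" in each project's composer.json.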
Although it's very unlikely to happen if you use popular third-party components, you will have a copy in your development/production space, so if the upstream goes down you can create a new repo and upload your copy of that library.
If a third-party component is not very popular and you are concerned about its continuity, you can fork it just in case.
I am currently interacting with my sites via FTP, which can get quite annoying and impossible to manage at times.
Is it possible to use GitHub as a version control system for my PHP-based websites, which are on a different domain name?
If not, can anyone give me some advice on what tool I should look into using in order to set up version control?
You can create a Git repository with all of your code for your website and host it on GitHub. Then you can make changes, commit them to your local repository, and push them to GitHub. Afterwards, when you want to deploy your changes, do a git pull from your GitHub repository on the remote server.
Git is a distributed version control system. You have local repositories sitting on your local machine (your laptop, desktop, etc.). GitHub is a remote repository hosting service (sort of like Dropbox). When you want to sync the repo on your computer with the one hosted by GitHub, you do a push/pull. Then you can sync the repository hosted by GitHub with the third repository on the server that hosts your website, again using a push/pull. No FTP is needed.
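In command form, that round trip looks roughly like this (remote name, branch and paths are placeholders):

    # on your machine: record the change and publish it to GitHub
    git add -A
    git commit -m "Update homepage"
    git push origin master
    # on the web server, which holds a clone of the same GitHub repo:
    ssh user@example.com 'cd /var/www/mysite && git pull origin master'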
After a bit of research, I found the solution to my question:
http://net.tutsplus.com/tutorials/tools-and-tips/how-to-use-git-with-ftp/
When putting together a PHP project with Composer, on installation/deployment Composer usually fetches the dependencies from their original sources.
This can lead to problems when deploying if a source becomes (perhaps only temporarily) unavailable.
Is there any built-in mechanism to keep at least the current, stable versions of the dependencies somewhere, so that the current version can always be deployed to other instances?
Right now there is no one-click solution for this, but I plan to work on something soon that will give you more reliability.
Broker looks like a tool that could serve as a proxy to keep files, and its functionality is now integrated into Satis (see https://github.com/researchgate/broker):
broker is a full repository proxy for composer. It takes a composer file, downloads all requirements and all dependencies, and then publishes a new repository with all these packages. Instead of packagist or satis, all packages, including dist and source files, will be served directly by broker.
Note: this project is not actively maintained anymore. Since satis supports similar functionality now, you should use satis instead.