It's a bit of a strange setup, but I've come across a project where Composer was used in a local environment to get the project started. The original developer did not have SSH access to the production server, so he ran Composer locally and uploaded the 'vendor' directory from his desktop to the server using FTP.
I now need to add the PHPMailer package, so I have done the following locally on my Mac:
cd Desktop/
composer require phpmailer/phpmailer
This has created the following structure on my desktop:
Desktop/composer.json
Desktop/composer.lock
Desktop/vendor/autoload.php
Desktop/vendor/composer/*
Desktop/vendor/phpmailer/*
Which of these do I need to upload through FTP? I realise vendor/phpmailer/* is the package I want, so it will need uploading.
What about the others? I already have an autoloader configured, so I'm guessing vendor/autoload.php is not required here?
As for composer.json, I could add the package to what's already there, e.g.
"require": {
*other packages here*
"phpmailer/phpmailer": "^5.2"
}
But I wasn't sure if that's necessary, because I'm not going to be using SSH/Composer on the server to run any updates.
The usual workflow would be (a rough shell sketch follows the list):
1. Check out the current version from version control.
2. Add dependencies via the command line: composer require new/package.
3. This will download the new package and update the autoloading.
4. Test the result locally or in a test website environment.
5. If happy with the result, upload the whole folder to the production server.
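In shell terms, roughly (the repository URL and package name are placeholders, not taken from the question):
git clone ssh://example.com/project.git && cd project   # 1. check out the current version
composer require new/package                            # 2./3. add the dependency; vendor/ and the autoloader are updated
# 4. test locally or on a staging copy of the site
# 5. if happy, upload the whole project folder (including vendor/) to production, e.g. over SFTP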
There may be several exceptions to this general workflow:
ad 1: If there is no version control, you'd probably be better off starting a local git repo right now and downloading the current production state into it as the first commit. Not having version control will make things harder, especially going back to known working versions. And because the files on the production server are probably unmanaged, you should also check the vendor folder into your newly created version control, just to avoid accidentally reverting any changes that may have been made to those files.
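A minimal sketch of that first step, assuming you've already downloaded the production files into the current directory (the commit message is just illustrative):
git init                                          # start a fresh local repository
git add .                                         # include vendor/ too, since the production copy is unmanaged
git commit -m "Import current production state"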
ad 2: Manually editing the composer.json file is sometimes a faster way to get what you want if you know what you are doing, but you'd have to edit the JSON correctly. For me it's usually too much hassle if I already have a command line ready. The command will also select a matching version that fits the already installed dependencies; manual editing may lead to version conflicts that you'd have to untangle. Remember to only install dependencies that work with the PHP version in production. You probably should run composer config platform.php X.Y.Z in order to add the production version of PHP to the composer.json file, which prevents Composer from selecting dependency versions based on your development PHP. Adding the -g switch will add this setting to your global (user) configuration instead, which will affect all Composer operations you start, including those for other projects.
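For example (the version number here is made up; substitute whatever PHP actually runs in production):
composer config platform.php 7.0.33      # pin dependency resolution to the production PHP version for this project
composer config -g platform.php 7.0.33   # the same setting globally, affecting every project you run Composer on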
ad 3: Manual editing will require you to run composer update on the command line, so there's probably no reason not to use composer require instead.
ad 4: How this could be done is entirely dependent on what environment you have to work with.
ad 5: At this stage you have assembled all the files necessary for a working website. Uploading them to production will always result in a working website unless the upload itself fails somehow. You could also use an "upload to a temporary folder first, then move into place on the server" approach if you fear FTP would be unreliable. Some people take a different approach: they have a git repository on the production server and simply push the version that should go live to that remote repo, and a server-side hook script then runs composer install. This automated approach also works (though not via FTP), but it carries a higher risk of something failing during deployment, and probably offers no easy way back to the previous state.
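For reference, a sketch of that push-to-deploy setup, assuming a bare repository on the server and a web root at /var/www/site (both paths invented for illustration); this would go in the bare repository's hooks/post-receive script:
#!/bin/sh
# runs on the server after each push to the bare repository
GIT_WORK_TREE=/var/www/site git checkout -f master   # copy the pushed files into the web root
cd /var/www/site
composer install --no-dev                            # install exactly the dependencies listed in composer.lock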
So in the end I'd say that uploading the whole folder structure via FTP (well, that protocol is insecure in itself; better to replace it with FTPS (FTP over SSL), SFTP or SCP) is preferable to running Composer on the production server.
Your specific question regarding which folders to upload: all of them. Especially upload the whole vendor folder. It contains the current autoloader and all the dependency packages the software needs. If you worked correctly, you downloaded the existing composer.json and composer.lock files together with everything else and added the new dependency to them. This changed both of these files, added the new package to the vendor folder and registered its classes with the autoloader.
Don't fiddle with uploading only parts of the vendor folder, or manually editing a component of the autoloading. You will only create surprises for the developer coming after you if you do some aspect incorrectly, and it also takes more time. Composer is a very good tool to manage dependencies - use it!
You could copy the composer.json from the server to your local machine, add the requirements and run composer update.
After that you can copy all the files (composer.json, composer.lock and the vendor folder) to your server...
Or you can copy your local vendor/phpmailer into the server's vendor folder, look up the PHPMailer entry in your local vendor/composer/autoload_psr4.php and add it to the server's vendor/composer/autoload_psr4.php.
If you use this method, you also need to add PHPMailer's own dependencies the same way. You can list what it requires with:
composer show phpmailer/phpmailer
As you can see here in Specifying Dependencies for Google App Engine: "Composer runs automatically when you deploy a new version of your application..."
I would prefer it didn't. I have scripts that run prior to deployment to strip out everything in the Composer dependencies that is not needed in a deployed application (documentation, tests, etc.).
My .gcloudignore file does not list the /vendor directory, and this is confirmed by the file count on upload. When I checked the deployed source, however, the Composer dependencies did not match my cleaned local dependencies, because App Engine is overriding them.
I'm not sure why this is a feature in the first place; it seems it would be better to leave this under the developers' control.
I do have billing enabled and the full feature set available. Thanks!
EDIT
I added the composer.json and composer.lock files to the ignore file to prevent the update, and then the vendor directory was missing completely from the deployed source. The vendor directory is not in the ignore file and has been uploaded. If I add a dependency, for example, the upload file count reflects the count of the added dependency's files.
It makes me wonder if something runs after the upload, during deployment, to remove the vendor directory. They do list it as ignored by default in the ignore file; I assumed, though, that by taking it out I would allow the vendor directory to be uploaded and deployed.
I've been working on reproducing your issue and trying to achieve exactly the behaviour you described, and I've found some interesting things.
I wanted to see the exact behaviour of a vendor/ folder when you deploy it to GAE and understand exactly what causes your issues, so I followed these steps:
Cloned a simple PHP project just to save time.
Installed Composer locally so I could require the dependencies.
I required a simple dependency (so it creates vendor/, composer.json and composer.lock):
composer require phpunit/php-timer
This gave me the dependencies locally, so the deployment uploads them directly. Can you try doing that with your composer file before attempting to deploy?
I did gcloud app deploy and it uploaded everything; I still have my doubts as to whether this is my local folder or one generated at deploy time.
Then I made a simple .gcloudignore as a test, added test* into it, did a gcloud app deploy, and it uploaded everything except the test folders.
In it you can specify everything that you want to avoid uploading; I think this would be a much easier solution for all your troubles. These are the .gcloudignore formats you can use.
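For illustration, a .gcloudignore along those lines might look like this (test* is the pattern from my experiment; the other entries are just the usual defaults, not requirements):
# .gcloudignore uses gitignore-style syntax
.gcloudignore
.git
.gitignore
test*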
It worked for me like this, hope it can help you out.
Let me know.
I've been running a project written in Laravel which has been fun to use.
The setup I use is the Homestead Vagrant box, so I do the majority of my work on my local machine and push up to the development server once it's ready to go. During the installation I had to push up the logs and vendor folders for it to work properly, but now I'm at a stage where every commit I make via the command line includes storage/logs/laravel.log, and when I then pull down on the server it asks me to stash/commit because the files differ.
I've added it to the .gitignore file in the root directory of the project and it looks like this:
/node_modules
/public/storage
/.idea
Homestead.json
Homestead.yaml
/storage/logs/
/vendor/
The vendor folder doesn't cause me any problems unless I make changes to it, so it's not much of a bother; it's just the logs that will not stop going up. If I use a GUI tool, I can manually tell it not to push them, but I want to get to the point where I can push from the terminal and not worry about logs needing stashing on the server.
I believe this is the same for .env, so I imagine a solution will work for both. I have also noticed that PhpStorm says they're ignored but tracked with Git, if that helps.
If you take a look at the Laravel repo on GitHub, you'll find the following .gitignore file in the storage directory:
https://github.com/laravel/laravel/blob/master/storage/logs/.gitignore
This comes with the default installation to mark the logs directory as ignored. If you've deleted this by mistake, you should be able to reinstate it and resolve the issue you're having.
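One thing worth checking: .gitignore only affects files Git is not already tracking. Since PhpStorm reports the file as ignored but tracked, it was probably committed at some point, in which case you would also need to remove it from the index, along these lines:
git rm --cached storage/logs/laravel.log   # stop tracking the file but keep it on disk
git commit -m "Stop tracking log file"
# the same applies to .env if it was ever committed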
Just as importantly though, your workflow isn't following best practice. With respect to "Vendor doesn't cause me any problems unless I make changes to it" - you should never make changes to your vendor directory. This folder is home to third-party packages and plugins, and modifying them directly causes multiple issues, chief amongst them:
You can no longer update a modified package without breaking your application.
Other developers won't be able to replicate your installation easily.
In fact, the vendor directory shouldn't be versioned at all. The best way to handle the files within it is to use a package manager, like Composer, to do it all for you. This means you can easily switch between different versions of your packages and, by versioning only the composer files, other developers can run composer install or composer update to synchronise their development environment to yours quickly and accurately.
I am sorry if this has been asked before but after searching for a while I couldn't really find answers to my dilemma.
I am part of a team that is working on a PHP project and we use GitHub for our version control. We would like to implement a PSR-4 autoloader, and since every single guide uses Composer, we would use it as well. Now, while searching, I learned that the vendor folder should not be committed to GitHub, only the composer.json, and that every developer needs to install Composer on their own computer.
Does that mean the autoloader has to be created again on every developer's computer?
And finally, when the project is done, we would like to upload it to our website, but the only way we can do that is through FTP.
Which files should be uploaded to the live website, and what would happen to the autoloader?
You need to commit the composer.lock file as well. That's super-important - it means that whenever someone else checks out the code they get the same set of dependencies (including their exact versions) installed to their /vendor directory.
That's why you don't need the /vendor directory to be committed - the lock file takes care of ensuring the dependencies are fixed.
The composer.json defines numerous potential versions of your dependencies that meet your requirements. Running composer update essentially checks to see if a more recent version is available that meets those requirements. That's the difference between install and update - install goes off the lock file and knows exactly what to look for - update goes off the json file and could return different results at different points in time.
In your composer.json you can define the autoloader by telling it where your root namespace lives.
"autoload": {
"psr-4": {
"RootNamespace\\": "library/src"
}
},
When your colleagues have run composer install it will create the autoloader for them in a consistent path.
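If you later change that autoload section, the autoloader can be regenerated without reinstalling anything, for example:
composer dump-autoload      # regenerate vendor/autoload.php after editing the autoload section
composer dump-autoload -o   # optionally build an optimized class map, often used for production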
You have options for deployment:
You can either upload the composer.lock file and run a composer install on production, or do it ahead of time and upload your vendor directory as part of the build.
I do the latter, as I would prefer, if there were an issue at this point, to know about it before any files are changed on the production server. The alternative could leave a botched upgrade on production with missing dependencies. It's safer to install those dependencies first and transfer everything in one go.
As an aside, I also like to install a fresh release to a separate folder on production named after the git commit and then symlink it as part of the deployment step. This ensures you don't have a half-updated application whilst you wait for the rest of the files to be uploaded. This approach would also eliminate the issue mentioned before, meaning you could do your composer install from production.
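A rough sketch of that release-and-symlink approach, with all paths and the commit hash invented for illustration:
RELEASE=/var/www/releases/abc1234        # new folder named after the git commit being deployed
mkdir -p "$RELEASE"
# upload the fully built project (including vendor/) into $RELEASE, e.g. over SFTP
ln -sfn "$RELEASE" /var/www/current      # switch the web root symlink to the new release in one step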
I'm already half way done with a project in Symfony2.
I need to install a couple of new vendor bundles using composer.
I already have everything (minus logs, cache and parameters.yml) in version control (including the vendor folder).
The problem is that when using composer update, it deletes the .svn folders in the vendor folders that were updated. So it's basically impossible to commit now (it gives me a "not a working copy" error).
Additional information: I'm working locally and committing to a dev server and then, once approved, to an application server. Therefore it has to be perfect (I cannot just run composer install or composer update on the dev/application server after committing).
I also tried exporting everything and copying and pasting it back into the repo, but that also didn't work (the index page broke locally).
Regarding vendor versioning, the best approach is not to version vendors at all.
The only things you need to version are composer.json and composer.lock. This may cause a problem with vendors that don't have stable versions, or with those for which you need an unstable one (e.g. master at a particular commit).
As a solution you should create your own (private) vendor repository (your own Packagist, so to speak). Composer has a tool for that, called Satis.
https://github.com/composer/satis
So my suggestion would be:
Create a private repository with Satis. You place every package you need in satis.json, and whenever you need to update a vendor's version or add a new one, you only modify satis.json and rebuild the repository.
In your project's composer.json, set your new private repository as the only repository and set the packagist option to false (see the sketch below).
Now, every time you run composer install it will use only your private repository, so it's fast and you can always be sure that every environment has the same versions.
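That second step can be done from the command line, something like this (the Satis URL is a placeholder for your own instance):
composer config repositories.satis composer https://satis.example.com   # add your Satis instance as a Composer repository
composer config repo.packagist false                                    # disable the default packagist.org repository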
I was in a similar situation two years ago.
The hard lesson I learned was never to edit files within vendor. At first I totally rejected using composer and manually cloned everything I needed. Later on, I decided to fork projects I needed to edit and referenced my forks instead.
Composer supports private GitHub repos - you don't need to register a package on Packagist for it to work.
You should not keep your vendor directory in your version control. This is how it is done in the Symfony Standard Edition and you should follow suit. Running the composer install command should be part of your deployment process.
Including vendor packages in your codebase is not recommended, so if you need to keep the same package versions you use on your local machine, the best way is to keep composer.lock in the VCS and run only composer install on the other environments.
Additionally, if you want the prod deployment to be instant, without depending on the composer process, you could run composer install on the dev server, and once it's validated you can make your prod deployment script copy the vendor folder from the dev env.
I'm new to Composer (getcomposer.org) and wasn't sure how it works if I install a package locally using Composer and then push my codebase to my production server using Git. Do I have to run Composer again on the production server?
cheers,
J
When you set up your project, you add your dependencies to the composer.json file in your local project directory.
Once you have done this, you will need to run composer update. You can also run composer install; however, without a composer.lock file, composer install actually runs composer update.
Composer update goes out and resolves all the dependencies of all the libraries you are using, downloads them to the /vendor directory, creates an autoloader script and generates the composer.lock file.
For your project what you want to do is version your composer.json AND your composer.lock file.
On your production server, you will always run composer install, which ensures that the libraries on your production server are exactly the same ones you used in your development process.
composer install is also a lot faster, as it does not have to do all the dependency-resolution work and can almost always just pull a specific commit; it doesn't have to look at version strings. Thus it is usually very fast, once a server has already gone through it once.
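On production that typically looks something like this (the extra flags are optional but common choices, not requirements):
composer install --no-dev -o   # install exactly what composer.lock specifies, skip dev packages, optimize the autoloader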
In development, the only time you should run composer update is when you introduce a new library, or when you have an issue where an underlying library has changed and you know you need Composer to go out and recalculate the dependencies. composer update always recalculates and downloads the latest revisions of any libraries available, even if the version constraints did not change. This means there is a potential for something to have broken, warranting as full a set of regression tests as you have available. In short, something that has nothing to do with what you're actually changing could break, so you only want to introduce the potential for change when you are forced to.
Of course, if you did introduce a new library, you have no choice but to run composer update.
Once you run composer update, your composer.lock file will be updated (as expected) and the production server will pick this up when you run composer install on it.
As others stated, put the vendor directory in your .gitignore. The point is that these are external libraries that you depend on, but that do not belong in your project, and they should not be versioned. In the old days some people used git submodules, and that's a big PITA you really want to avoid, not to mention that submodules don't address the dependencies of the libraries you included.
It depends on how you are working. If, as getcomposer.org recommends, you are ignoring the "vendor" folder, then you need to run it again. If you are versioning the "vendor" folder, then you don't need to run it again.
Bear in mind that Composer takes charge of managing your dependencies' versions, so there is no need to put the dependency files under version control. If you put these files under Git you will only make your repository bigger.
Read https://getcomposer.org/doc/01-basic-usage.md#installing-dependencies.
For clarification: when you ignore the "vendor" folder, Git doesn't track the files under it, so if you clone the repo it will be as if Composer had never been executed.