I have started to use Composer for a new PHP application (it uses a few frameworks and APIs such as Laravel, Smarty, etc.) and all is good with it in development.
However, I am not too sure about how to go about deploying this on a live production server. The sub-directories of the respective modules under the /vendor directory seem to include a lot of stuff which I would not normally include with an app (such as demo files, installation readmes, documentation, etc.). Is this normal, or did the creators of those packages get the wrong idea of how to create a Composer package?
Is there a standard approach to creating a clean app deployment that only includes the necessary distribution files, leaving out everything that is unrelated and, even for security reasons, should not be there?
I am asking about the most common workflow used, or maybe a specific command or configuration in composer.json I should be looking at to achieve this.
There are two recommendations I'd make which cover most of your worries.
1)
Use composer install --no-dev (or composer update --no-dev) so that dependencies that are for development only are left out of the install entirely. Or tweak your requirements in composer.json to achieve the same.
In theory the package developers should keep the production version cleaner (e.g. not shipping all the PHPUnit tests). However, yes, it's pretty normal to have a lot of 'cruft' in the vendor library, mostly because the concept of a build is relatively uncommon in PHP, therefore "sources" and "targets" are intermingled.
Demos and readmes are still a necessary part of the 'distribution' version of a component, so they will still be there if you specify --no-dev, which is just to say "I'm not developing this package, I'm just consuming it".
It does feel like there's a missing Composer feature one level above this: a super-clean, minified deployment package.
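In composer.json terms, the split for point 1 looks like this (the package names and version constraints below are just an example); running composer install --no-dev on the production server then skips everything under require-dev:

```json
{
    "require": {
        "laravel/framework": "~4.2",
        "smarty/smarty": "~3.1"
    },
    "require-dev": {
        "phpunit/phpunit": "~4.0"
    }
}
```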
2)
Keep your vendor library above the web root.
This prevents any unwanted browsing into the vendor library, and removes any security issues of site visitors exploring your libraries (if that's what you were rightfully worried about).
e.g. I typically use
domain
    /api
    /etc
    /vendor
    /www
        /js
        /css
        /images
        /index.php
        /foo
            /bar.php
Where www is the virtual-host web root.
All entry-level scripts are in your web root as normal and reference the autoloader at ../vendor/autoload.php
Or of course www/ could be the Laravel root folder if you're using Laravel for your website as well as the API.
api can host a separate vhost for your Laravel APIs if they're built separately from a 'flat' website.
(I keep other folders above the web root too: build, docs, src for SASS, JS, Grunt and so on. The etc folder can securely store any config, passwords, keys, etc.)
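With a layout like that, an entry script is a one-liner away from the autoloader; a minimal sketch of www/index.php (the bootstrap line is just a placeholder):

```php
<?php
// www/index.php - the autoloader lives one level up, outside the web root,
// so site visitors can never browse into /vendor.
require __DIR__ . '/../vendor/autoload.php';

// ... bootstrap and run the application here.
```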
3)
If you're still not happy with the baggage, then as the other commenter suggests, you'd want to look at a build process which cleans things up. This can get complex, though, and difficult to maintain!
e.g. you could build to a local www deployment folder (i.e. composer update, plus any Grunt tasks, Bower installs, Laravel/Artisan publishing, etc.), then prune it back (with custom scripts you'd have to engineer), and commit that to a separate repository representing a published, flattened deployment target. That is what you would deploy to your website.
This would have to be automated or you'd stop doing it after about the third time :)
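As a rough illustration of the 'prune it back' part, such a custom script might delete development-only folders and files from the vendor tree. The patterns below are examples to adapt to your packages, demonstrated here on a throwaway tree:

```shell
# Sketch of a vendor-pruning step (folder/file patterns are illustrative).
set -e
# Build a throwaway vendor tree to demonstrate on:
mkdir -p demo_vendor/acme/lib/tests demo_vendor/acme/lib/src
touch demo_vendor/acme/lib/README.md demo_vendor/acme/lib/src/Lib.php
# Remove development-only folders...
find demo_vendor -type d \( -name tests -o -name docs -o -name examples \) \
    -prune -exec rm -rf {} +
# ...and files:
find demo_vendor -type f \( -name 'README*' -o -name 'CHANGELOG*' \) -delete
# List what survives:
find demo_vendor -type f
```

Only the actual source file is left; everything else has been stripped before the result is committed to the deployment repository.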
But... you have to ask why else you'd want to prune back the vendor libs. Disk space? They're not that big. General tidiness? Treat the folders as black boxes and just read the API docs, i.e. don't look :)
Related
Thanks for your attention. This is a question of organization; I work with PHP and Git for version control. I use the NetBeans IDE, with Git integrated (although I am still a rookie).
Normally I follow the approach that Symfony2 specifies for organizing the project files, but I use my own framework for my projects.
The (main) question is: must any component or piece of code which has its own version control be located under the /vendor directory?
For example:
I have my project files in src\Acme\ProjectX\, as well as the utility package which all my projects use, src\Acme\Util\, and it is under version control too (Git).
and now let's remember the basic skeleton of a project based on Symfony or similar:
/app (application related elements)
/src (source code of the project)
/vendor (third party libraries)
/web (front end controller, the web directory, assets resources etc...)
So, must Acme\Util be included in the vendor directory? And is it necessary to use Composer to declare the dependencies?
In addition, the Utility package has a lot of classes, but only a few are used in each project. Must I remove those that are not used by the project?
Summarizing, it would be nice if someone could contribute his knowledge to help me work out a scenario like this.
I hope I explained myself...
Thanks in advance!
Vendor directory
It's good practice to separate external dependencies from the application code. If you are using Composer, the directory does not have to be called vendor; you can change it with the vendor-dir option in the config section of composer.json.
Unused classes
Unused classes shouldn't matter if they aren't being loaded. They'll just take up a bit of extra disk space.
It might be a good idea to separate the Utility package into multiple packages if you find yourself frequently using only a small part of it.
Dependency managers
It isn't necessary to use a dependency manager, but it sure does help. Having to install, configure and maintain everything manually (especially with many dependencies and sub-dependencies) would be a horror.
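On the Composer question above: you don't have to use it, but declaring Acme\Util as a dependency of each project is exactly what Composer is for. Assuming the package lives in its own Git repository, the consuming project's composer.json could look like this (the repository URL and package name are made up for the example):

```json
{
    "repositories": [
        {
            "type": "vcs",
            "url": "https://example.com/acme/util.git"
        }
    ],
    "require": {
        "acme/util": "dev-master"
    }
}
```

Composer then clones the package into vendor/ and wires it into the autoloader, so the project repository only tracks the dependency declaration, not the utility code itself.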
I currently deploy a PHP app's static assets using s3cmd.
If an asset for Amazon S3 has changed (like a JavaScript file), I'll rename it manually so that the new version is synced and served by Amazon CloudFront.
Are there documented practices for deploying a PHP app's static assets to Amazon S3? I've seen that sites use a hash to refer to a certain deployment version of their assets. I'm wondering how to get that hash (the git commit SHA?) referenced by the web application.
The approach I could see working is writing the current SHA to a stand-alone config file, and reading from it at deployment.
Update 1 with current process
I'm looking to make the deployment of assets more automated:
Make a change to the app's JavaScript
Rename the script from app.23.js to app.24.js
Edit the site's header HTML template to refer to app.24.js
Commit all changes to git
Merge the develop branch into master
Tag the commit with a new version number
Push code to BitBucket
Use s3cmd to synchronise the new script to Amazon S3 (and therefore the CloudFront CDN)
SSH into server, and pull the latest git tree. The production server serves the master branch.
I would like to think there is no specific answer to this question but here are a few things.
If you only want to automate the manual work you are doing, then it might be worthwhile to look into a few deployment tools. Capistrano and Magallanes are two names that come to mind, but you can google for this and I am sure you will find a lot of options.
The Rails framework was built on the philosophy that there is a best way to do things. It also uses hashes to version static assets, and does this out of the box. You can look into implementing hashing in your case.
Grunt is another automation tool that you can look into. I found this module that might come in handy: https://github.com/theasta/grunt-assets-versioning
I would say that for me, steps 2, 3 and 4 are the problem areas in your workflow. Renaming manually and updating code every time does not sound too nice. As you have pointed out, git hashes are unique, so perhaps append the git hash to your assets during deployment and sync them to S3/CloudFront?
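To make the hashing idea concrete, here is a minimal shell sketch that versions an asset by its content hash and records the resulting file name in a small JSON config the app can read at render time. The file names and the JSON shape are my own invention; a content hash is used instead of the git SHA so unchanged assets keep their name:

```shell
# Sketch: content-hash an asset and record the versioned name in a config file.
set -e
mkdir -p build
printf 'console.log("hi");\n' > app.js        # stand-in for the real script
HASH=$(sha1sum app.js | cut -c1-8)            # short content hash
cp app.js "build/app.${HASH}.js"              # versioned copy for s3cmd sync
printf '{"app_js": "app.%s.js"}\n' "$HASH" > build/assets.json
cat build/assets.json
```

The header template would then read assets.json instead of hard-coding app.24.js, which removes the manual rename and template edit from the workflow.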
I have been approached to develop some front-end components (CSS/JS & templates) for an existing Typo3 website. The client already has a development server set up, with Typo3 & extensions installed, and a copy of their database. I have SSH & FTP access to this development server. I am looking for advice on the fastest & most practical way to begin work on the templates.
My level of experience is that I have done front-end work with Typo3 before, but always in a team with an existing version control, build & deployment workflow. In this case, I would be responsible for setting up my own workflow and tools.
I have spent some time reading through version control, build & deployment-related questions on Stack Overflow, but many of them assume:
A team with multiple developers
A long-running project with major changes to functionality, in which it makes sense to invest considerable time up-front to the build process
It is 2010.
An existing workflow (e.g. existing development environments) into which an additional tool such as git will be added.
In my case, I will be the only developer working on this, and I have been hired only to make some layout updates. I have no existing development environment for Typo3, and would need to set this up. My work on this project is not intended to run for more than a couple of weeks. I would like to find a solution somewhere in between "edit files directly on the development server" (very bad practice) and "set up a fully-featured PHP application deployment service using Magallanes" (probably good practice, but a steep learning curve for me and a large investment of time).
Ideally, I want to wind up in a situation where I have:
A local development environment on my Mac with Typo3 installed where I can preview & test code changes
Version control such as git on my local system
A way to push my changes to the development site
Can anyone share with me tools or workflow suggestions for how to get to that point as quickly as possible?
My environment is similar to yours, and this is my typical setup:
Version control
The website is version controlled with git. There may or may not be a repository to regularly push to, but my local version is always included in the backups. The following things are excluded from version control:
Files in typo3temp, but not the directories (Put the .gitignore at the end of the post into a directory to keep it, but not the files in it)
The server specific configuration
deprecation logs
IDE files
...
To include the TYPO3 core, there are three variants:
Just add it to the repository
Add the core git repository as git submodule
Add symlinks to the core into the repository.
The latter may be problematic if the directory structure is different in both environments, and the first one clutters the repository, so when possible, go with the second one.
TYPO3 configuration
The TYPO3 configuration is split into three files:
LocalConfiguration.php - version controlled, not manually edited (it is constantly overwritten by TYPO3).
AdditionalConfiguration.php - version controlled, manually edited.
ServerspecificConfiguration.php - not version controlled, manually edited. Contains database credentials, colorspaces for imagemagick when different on localhost and remote host, defines caching backends to use and similar stuff.
The additional configuration file includes the server specific file. Both use the \TYPO3\CMS\Core\Configuration\ConfigurationManager::setLocalConfigurationValueByPath('DB/host', 'localhost'); syntax for settings to make this work.
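A minimal sketch of how that wiring can look (file names as above; the file_exists guard is my own addition so a fresh checkout without the server-specific file still boots):

```php
<?php
// In AdditionalConfiguration.php (version controlled):
// pull in the non-versioned, per-server settings file if it exists.
$serverConfig = __DIR__ . '/ServerspecificConfiguration.php';
if (file_exists($serverConfig)) {
    require $serverConfig;
}

// In ServerspecificConfiguration.php (NOT version controlled), settings
// are applied with the syntax mentioned above, e.g.:
// \TYPO3\CMS\Core\Configuration\ConfigurationManager
//     ::setLocalConfigurationValueByPath('DB/host', 'localhost');
```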
Deployment
To deploy the site, I have used two things:
git-ftp - useful if there is only FTP access.
rsync - This is the better option.
To automate the deployment (and other stuff like building CSS from LESS or SASS), a task runner like grunt.js or ant is useful. Make sure you exclude the server specific configuration and typo3temp from synchronization.
Building such a setup from scratch does not take that much time: maybe 1 or 2 hours, certainly less than a day. This may of course differ depending on your experience.
Here is the .gitignore mentioned above:
*
!.gitignore
!*/
Tomorrow I am hiring a new developer; until now I have worked alone. I now need to set up a development environment and a staging -> production workflow.
What are the leading tools (even paid ones) for doing that?
So far I have seen WebEnabled...
You'll need some sort of version control system (VCS) for your project code. Since Drupal.org now uses Git, which is pretty good and awesome, you should too. There are several hosting solutions for Git; the most popular seems to be GitHub.
In your code repository, I recommend not putting the whole site directory, but only your own custom code. Regardless of the VCS used, here is what I put in my code repository:
A .make file used to download Drupal core, contrib modules and contrib themes and apply patches (if required)
a modules folder with only the custom modules
a themes folder with only the custom themes
A build script to
run drush make on the .make file to download Drupal core and contribs to a (VCS ignored) dist folder
copy the modules folder to dist/sites/all/modules/custom
copy the themes folder to dist/sites/all/themes/custom
This is to:
properly track changes to your project custom code
properly track used core and contribs versions (in the .make file)
prevent core or contrib hacks but allow patching when required (Drush Make requires the applied patches to be available at a publicly accessible HTTP address)
For the build script, I use Phing, but any scripting language (Ant, Bash, PHP, Ruby, etc.) could be used. With some additional work, the build script can also be used to run automated tests (SimpleTest) and code validation (php -l and Coder Review). In the end, the build script produces an updated dist folder ready for deployment.
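As an illustration, a minimal .make file along those lines might look like this (the core and project versions here are invented for the example; check the release pages for real ones):

```ini
; Sketch of a project .make file (Drush Make 2 syntax).
core = 7.x
api = 2

; Drupal core
projects[drupal][version] = 7.28

; Contrib modules, placed under sites/all/modules/contrib
projects[views][subdir] = contrib
projects[views][version] = 3.8
projects[features][subdir] = contrib
projects[features][version] = 2.2
```

Running drush make project.make dist then downloads core and the pinned contribs into the (VCS-ignored) dist folder, after which the build script copies the custom modules and themes in.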
For multi-developer projects, I try to have as much configuration as possible exported into code instead of stored at the database level. Mainly by using exportables through the Features module, and by having a project-specific profile to define and update non-exportable configurations through its hook_install and hook_update_N implementations. See The Development -> Staging -> Production Workflow Problem in Drupal and the Code driven development: using Features effectively in Drupal 6 and 7 presentation.
There are a few options for this. There is the Deployment module, which is in alpha but apparently works well. Then there is plain old SVN (or even rsync). That gets the job done pretty fast and gives you the added bonus of source code management, but you need to transfer databases manually.
Last but not least, and the most powerful method of the three mentioned, is drush.
Whichever you choose depends on the time you are willing to invest: short-term, they all involve a little more time than just copying a site into another folder, but the task only has to be automated once, so long-term you can easily repeat the deployment, and that is where these tools will save you time.
Good-luck!
What is the benefit of having a "build/" folder where all the sources are placed and "built"?
Maybe it's a silly question, but I'm trying to understand continuous integration with PHP. Every example build.xml for Phing uses such a build/ folder, but what is the point of it for PHP, where a checked-out project doesn't require compilation, only basic configuration? Copying everything into build/ just seems to complicate things: you end up with duplicated files and one more folder in the web root path (if you'd like to have a web UI to run Selenium tests on).
Particularly I need phing for two cases:
1) let a new user set up his first installation (or update an old one), right on a working copy
2) run unit/functional tests, phpcc, phpcs, phpdoc, etc. (all of that usually on a CI server)
Should I have "build/" for the second task? What is the best practice for PHP?
There are several good reasons to have a build directory (e.g. deployment to multiple environments, performing some text replacement, minifying and combining CSS and JS, optimizing images, handling of config files, etc.).
However, these may not apply in your use cases. There is no rule saying you need this directory; depending on your thinking on testing in production, keeping a build directory may or may not be worth it.
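If you do decide a build/ folder earns its keep, a minimal Phing build file is enough to start with; this sketch just copies the sources (the project and target names are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sketch of a Phing build.xml: stage sources into build/ -->
<project name="myapp" default="build" basedir=".">
    <target name="prepare">
        <mkdir dir="build"/>
    </target>
    <target name="build" depends="prepare">
        <!-- Copy the sources; tests, CI config etc. could be excluded here -->
        <copy todir="build">
            <fileset dir="src"/>
        </copy>
    </target>
</project>
```

The value is that the copy step gives you a place to exclude files, rewrite config, or minify assets later without touching the working copy, which is exactly what the "several good reasons" above amount to.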