I've been running a project written in Laravel which has been fun to use.
My setup uses the Homestead Vagrant box, so I do the majority of my work on my local machine and push up to the development server once it's ready to go. During the installation I had to push up the logs and vendor folders for it to work properly, but now I'm at a stage where every commit I make via the command line includes storage/logs/laravel.log, which, when I then pull down, makes the server ask me to stash/commit because the files differ.
I've added it to the .gitignore file in the root directory of the project and it looks like this:
/node_modules
/public/storage
/.idea
Homestead.json
Homestead.yaml
/storage/logs/
/vendor/
Vendor doesn't cause me any problems unless I make changes to it, so it's not much of a bother; it's just the logs that will not stop going up. If I use a GUI tool, I can manually tell it not to push them, but I want to get to the point where I can push from the terminal and not worry about logs needing stashing on the server.
I believe the same applies to the .env file, so I imagine a solution will work for both. I have also noticed that PhpStorm says these files are ignored but still tracked by Git, if that helps.
If you take a look at the Laravel repo on GitHub, you'll find the following .gitignore file in the storage/logs directory:
https://github.com/laravel/laravel/blob/master/storage/logs/.gitignore
This comes with the default installation to mark the logs directory as ignored. If you've deleted this by mistake, you should be able to reinstate it and resolve the issue you're having.
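Note also that .gitignore only stops untracked files from being added; files that are already tracked keep being committed regardless, which matches what PhpStorm is telling you. A minimal fix, assuming the log file is the only tracked offender, is to untrack it once and commit that change:

git rm --cached storage/logs/laravel.log
git commit -m "Stop tracking the Laravel log file"

The same two commands work for .env if it has ever been committed.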
Just as importantly, though, your workflow isn't following best practice. With respect to "Vendor doesn't cause me any problems unless I make changes to it": you should never make changes to your vendor directory. This folder is home to third-party packages and plugins; modifying them directly causes multiple issues, chief amongst them:
You can no longer update a modified package without breaking your application.
Other developers won't be able to replicate your installation easily.
In fact, the vendor directory shouldn't be versioned at all. The best way to handle the files within it is using a package manager, like Composer, to do it all for you. This means you can easily switch between different versions of your packages and, by versioning only the composer files, other developers can run composer install or composer update to synchronise their development environment to yours quickly and accurately.
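In practice that means ignoring the directory and committing only the manifests. A minimal sketch, run from the project root:

echo "/vendor/" >> .gitignore
git rm -r --cached vendor
git add composer.json composer.lock .gitignore
git commit -m "Track Composer manifests only"

Anyone else can then run composer install to get the exact versions pinned in composer.lock.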
As you can see in the Specifying Dependencies documentation for Google App Engine: "Composer runs automatically when you deploy a new version of your application..."
I would prefer it didn't. I have scripts that run prior to deployment to strip everything from the Composer dependencies that is not needed in a deployed application (documentation, testing, etc.).
My .gcloudignore file does not include the /vendor directory, and this is confirmed by the file count on upload. When I checked the deployed source, however, the Composer dependencies did not match my cleaned local dependencies, because App Engine is overriding them.
I'm not sure why this is a feature in the first place; it seems it would be better to leave this under the developers' control.
I do have billing enabled and am full featured. Thanks!
EDIT
I added the composer.json and composer.lock files to the ignore file to prevent the update, and then the vendor directory was missing completely from the deployed source. The vendor directory is not in the ignore file and has been uploaded. If I add a dependency, for example, the upload file count reflects the added dependency's files.
It makes me wonder if something runs after the upload, during the deployment, to remove the vendor directory. They do have it removed by default in the ignore file; I assumed that by taking it out I would allow the vendor directory to be uploaded and deployed.
I've been working on reproducing your issue and trying to achieve exactly what you described, and I've found some interesting things.
I wanted to see the exact behavior of a vendor/ folder when you deploy it to GAE and understand exactly what causes your issues, so I followed these steps:
Cloned a simple PHP project just to save time.
Installed Composer locally so I could require the dependencies.
Required a simple dependency (so it creates the vendor/ folder, composer.json and composer.lock):
composer require phpunit/php-timer
This gave me the dependencies locally so they are uploaded directly. Can you try doing that with your composer file before attempting to deploy?
I ran gcloud app deploy and it uploaded everything, though I still have my doubts as to whether this is my local folder or one generated at deploy time.
Then I made a simple .gcloudignore as a test, added test* to it, ran gcloud app deploy again, and it uploaded everything except the test folders.
Here you can specify everything that you want to avoid uploading; I think this would be a much easier solution for all your troubles. These are the .gcloudignore formats you can use.
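For reference, a minimal .gcloudignore along these lines (entries are illustrative) might look like:

.gcloudignore
.git
.gitignore
test*

Note that vendor/ is deliberately not listed, so the locally installed dependencies are included in the upload.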
It worked for me like this, hope it can help you out.
Let me know.
It's a bit of a strange setup, but I've come across a project where Composer was used in a local environment to get the project started. The original developer did not have SSH access to the production server, so he used Composer locally and uploaded the vendor directory from his desktop to the server using FTP.
I now need to add the PHPMailer package so have done the following locally on my Mac:
cd Desktop/
composer require phpmailer/phpmailer
This has created the following structure on my desktop:
Desktop/composer.json
Desktop/composer.lock
Desktop/vendor/autoload.php
Desktop/vendor/composer/*
Desktop/vendor/phpmailer/*
Which of these do I need to upload through FTP? I realise vendor/phpmailer/* is the package I want, so it will need uploading.
What about the others? I already have an autoloader configured, so I'm guessing vendor/autoload.php is not required here?
For composer.json, I could add the package to what's already there, e.g.:
"require": {
*other packages here*
"phpmailer/phpmailer": "^5.2"
}
But I wasn't sure if that's necessary, because I'm not going to be using SSH/Composer on the server to run any updates.
The usual workflow would be:
Check out the current version from version control.
Add dependencies via the command line: composer require new/package.
This will download the new package and update the autoloading.
Test the result locally or on a test website environment.
If happy with the result, upload the whole folder to the production server.
There may be several exceptions from this general workflow:
ad 1: If there is no version control, you'd probably be better off starting a local Git repo right now and downloading the current production state into it as the first commit. Not having version control will make things harder, especially going back to known working versions. And because the files on the production server are probably unmanaged, you'd also check the vendor folder into your newly created version control, just to avoid undoing any changes that may have been made to those files.
ad 2: Manually editing the composer.json file is sometimes a faster way to get what you want if you know what you are doing, but you'd have to edit the JSON correctly. For me it's usually too much hassle if I already have a command line ready. The command will also select a matching version that fits the already installed dependencies; manual editing may lead to version conflicts that you'd have to untangle. Remember to only install dependencies that work with the PHP version in production. You should probably run composer config platform.php X.Y.Z to add the production version of PHP to the composer.json file, which prevents Composer from installing dependency versions based on your development PHP. Adding the -g switch will add this setting to your global (user) configuration instead, which will affect all Composer operations you start, including those for other projects.
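For example, if production runs PHP 7.4 (the version number here is illustrative):

composer config platform.php 7.4.33

which adds the following to composer.json:

"config": {
    "platform": {
        "php": "7.4.33"
    }
}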
ad 3: Manual editing will require you to run composer update on the command line, so there's probably no reason to not do composer require instead.
ad 4: How this could be done is entirely dependent on what environment you have to work with.
ad 5: At this stage you have assembled all the files necessary for a working website. Uploading them to production will always result in a working website unless the upload itself fails. You could also use an "upload to a temporary folder first, then move into place on the server" approach if you fear FTP would be unreliable. Some people take a different approach: they have a Git repository on the production server and simply push the version that should go live to that remote repo; a post-receive script then runs composer install. This automated approach will also work (but not over FTP), but it carries a higher risk of something failing during deployment and probably has no easy way back to the previous situation.
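A sketch of such a hook, assuming a bare repository on the server and an illustrative deploy path:

#!/bin/sh
# hooks/post-receive in the bare repository on the production server
GIT_WORK_TREE=/var/www/example git checkout -f master
cd /var/www/example && composer install --no-dev --optimize-autoloader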
So in the end I'd say that uploading the whole folder structure via FTP (well, that protocol is insecure in itself; better to replace it with FTPS (FTP with SSL), SFTP or SCP) is better than running Composer on the production server.
Your specific question regarding which folders to upload: All of them. Especially upload the whole vendor folder. It contains the current autoloader and all dependency packages the software needs. If you worked correctly, you downloaded the existing composer.json and composer.lock file together with everything else and added the new dependency to it. This has changed both these files, added the new package to the vendor folder and the classes to the autoloader.
Don't fiddle with uploading only parts of the vendor folder, or manually editing a component of the autoloading. You will only create surprises for the developer coming after you if you do some aspect incorrectly, and it also takes more time. Composer is a very good tool to manage dependencies - use it!
You could copy the composer.json from the server to your local machine, add the requirements and run composer update.
After that you can copy all the files (composer.json, composer.lock and the vendor folder) to your server...
Or you can copy the local vendor/phpmailer into the server's vendor folder, find the entry for PHPMailer in your local vendor/composer/autoload_psr4.php, and add it to the server's vendor/composer/autoload_psr4.php.
With this method you must also add PHPMailer's dependencies the same way; you can list them with:
composer depends phpmailer/phpmailer
More and more projects are starting to pile up, and I want some workflow and also version control over the different projects. I'm trying to set up the "right" workflow with XAMPP, Git, GitDesktop and PhpStorm on a Windows 2012 R2 machine.
Xampp base: d:\xampp
http://localhost = d:\xampp\htdocs
Dev repositories: d:\xampp\htdocs\repositories\dev\GitDemo
Live repositories: d:\xampp\htdocs\repositories\live\GitDemo
Live folder: d:\xampp\htdocs\GitDemo
Live URL: http://servername/GitDemo (intranet use only)
Right now I have my repositories folder inside the htdocs folder; otherwise I would need another alias/copy action to be able to see what I'm developing. But at the same time the repositories folder is exposed. How do I hide this? .htaccess?
I've run git init --bare inside the live folder for this project. With GitDesktop I've created the repository for GitDemo inside d:\xampp\htdocs\repositories\dev.
Using PhpStorm I've created a project based upon local files and pointed it towards d:\xampp\htdocs\repositories\dev\GitDemo. I'm able to see the changes made using git status, add them using git add . and commit them successfully with git commit -m "my commit..".
I've added a live remote with git remote add live pointing at the bare repository, and created a post-receive hook to check out the files inside d:\xampp\htdocs\repositories\live\GitDemo to d:\xampp\htdocs\GitDemo.
This all feels like a "ton" of work to set up initially and somewhat redundant (having the same files in 3 locations).
Is this the ideal way to set this up, or do you suggest an alternative approach? Thanks for sharing your thoughts!
I've been thinking about the most logical solution (in my opinion, at this moment). Here's how I resolved the unclear items described above:
I've optimized it by:
Bringing the repositories directory outside htdocs, to d:\repositories\dev and d:\repositories\live.
I've set up a symlink so that http://localhost/dev/GitDemo serves d:\repositories\dev\GitDemo. This way I don't need to place the repositories folder inside the htdocs folder, while still benefiting from Apache being able to serve the content, which actually resides outside the htdocs folder (see the sketch after this list).
The live version is now served at http://localhost/GitDemo; with a post-receive hook it gets deployed from d:\repositories\live\GitDemo.git to d:\xampp\htdocs\GitDemo.
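On Windows, the symlink can be created with mklink, and the hook is an ordinary shell script run by Git for Windows. Both of the following are sketches with illustrative paths:

mklink /D d:\xampp\htdocs\dev\GitDemo d:\repositories\dev\GitDemo

#!/bin/sh
# hooks/post-receive in d:\repositories\live\GitDemo.git
GIT_WORK_TREE=/d/xampp/htdocs/GitDemo git checkout -f master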
If you think I've made a mistake or misunderstood how it's supposed to work, please correct me; I'm still not sure this is the correct way, but at least it seems like it to me.
I'm trying to establish a development process with Symfony where developers can clone the Symfony repository to their localhost and simply test their code on their own computers. However, I'm running into issues: Git is not adding all the necessary files (a .gitignore issue), and Symfony doesn't know which directory it is installed in when clearing the cache, so it tries to clear a cache stored in the wrong directory.
It seems that simply copying and pasting the Symfony project and pointing the Apache server at the app_dev.php file doesn't work well. What would be a better approach for working with Symfony with multiple people?
You should never pass your Symfony application around as a copy when working with multiple people ... and even worse, including the cache ...
Use a VCS like Git (ignoring the cache folder) and provide DataFixtures and a parameters.yml.dist in order to provide your co-workers with the latest development status and commented changes/commits.
The easiest way for your colleagues to get a running copy quickly would be providing a Vagrantfile including some provisioning.
This way you can make sure that all your people have not only the same code base but the same web server and PHP version/settings as well.
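A minimal sketch of such a Vagrantfile (box name and packages are illustrative, assuming an Apache/PHP stack):

Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/focal64"
  config.vm.network "forwarded_port", guest: 80, host: 8080
  # provision a basic LAMP-ish stack for the project
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    apt-get install -y apache2 php libapache2-mod-php composer
  SHELL
end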
A Network Architect and I manage a bunch of web servers (FreeBSD). He's responsible for all server/network related stuff (IPs, firewalls, users/groups, etc.) and I'm responsible for all web-related stuff (Apache, PHP, MySQL). Sometimes the responsibilities overlap.
It happened a few times that changes were made to the config files which more or less affected the server, and we were not able to figure out which of us made the changes and why.
I, being a web developer, think it'd be good practice to put the files under version control (we currently use Subversion), so that whenever we change anything we have to commit and comment the changes. It would solve all the problems of wondering who did what and why.
The particular config files I was thinking of were:
firewall config
apache config (with extras)
php config (php.ini)
MySQL config (my.cnf)
I already know that the idea of version-controlling server config files is sound, based on other questions asked here. My only worry is how to do it properly on the web server side, since the files are in different locations. Putting the whole of /usr/local/etc under version control seems pointless, as it contains more than just the config files.
I was wondering whether to create a new folder, say /config, which would be under version control and would contain all the config files we need, and then replace the original files with symlinks to the ones in the /config folder, e.g.:
/usr/local/etc/apache22/httpd.conf -> /config/apache22/httpd.conf
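Creating such a symlink would look something like this (one-off, per file):

mkdir -p /config/apache22
mv /usr/local/etc/apache22/httpd.conf /config/apache22/httpd.conf
ln -s /config/apache22/httpd.conf /usr/local/etc/apache22/httpd.conf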
So the question is: Is this a good idea and if not, what is a better solution?
If you use Git, then putting the whole of /usr/local/etc under version control is not pointless at all:
You can track only a handful of files if you so choose.
The working directory is hardly any bigger with all the config files tracked.
Just install Git, then go to /usr/local/etc and run git init. This will create the .git folder in your current location (basically making this folder a repository).
Then add the config files you want to track:
git add firewall/firewall_config.conf apache2/httpd.conf etc
and commit: git commit -m "Initial Configuration"
Your config files are now being tracked.
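If you want to make it explicit that everything else stays untracked, a whitelist-style .gitignore in /usr/local/etc is one option (paths are illustrative):

*
!.gitignore
!apache22/
!apache22/httpd.conf

Git then tracks only the whitelisted files while leaving the rest of the directory untouched.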
Since you are versioning sensitive configuration files, I would recommend setting up an internal Git server such as GitLab. Create a Git repository for each server, or per server template/image/etc. Go to the / directory and run git init.
You can be selective about what you put under version control by only using git add /path/to/file for the files that you will be customizing.
Then git commit -m 'commit comment' and git push -u origin master.
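Put together, the sequence might look like this (file paths and the remote URL are hypothetical):

cd /
git init
git add /usr/local/etc/apache22/httpd.conf /usr/local/etc/php.ini
git commit -m 'Initial server configuration'
git remote add origin git@gitlab.example.com:ops/webserver1.git
git push -u origin master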
Have you looked at any of the management tools made to do just this sort of thing? I recommend Puppet or Chef, having used them in previous jobs to do this sort of thing.