Allow Laravel app to create/modify project files?

I have plans on making a generator GUI, accessible on a route while in development mode. When taking some action in the GUI, it will create or update the corresponding files in the project folder. This means the web server user needs permission to handle those files.
What is a good way to accomplish that? I have thought of a few approaches, but I'm not sure about them:
1) After you pull in my package, let the user chmod everything themselves. This might be an issue, as those changes will be committed to source control.
2) Add some kind of installer script that you can run with sudo to do 1) for you (see the sketch after this question).
3) Add a temporary work folder for the generator where it has full access, then set up symlinks to the actual project folder so it stays in sync. That way you avoid the Git issues.
I am excited to hear what you think about this.
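For illustration, a minimal sketch of what the installer script from 2) might look like, assuming an Apache-style www-data user and a dedicated app/generated target directory (both names are placeholders):

```bash
# Hypothetical installer for option 2); user/group and paths are placeholders.
# Gives the web server group write access to the generator's target directory
# without loosening permissions on the rest of the project.
sudo chgrp -R www-data app/generated
sudo chmod -R g+rwX app/generated   # X: execute bit for directories (and already-executable files)
sudo chmod g+s app/generated        # new files inherit the www-data group

# Alternative on hosts with ACL support, leaving ownership untouched:
# sudo setfacl -R -m u:www-data:rwX app/generated
```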

Related

GIT & Laravel stopping certain files

I've been running a project written in Laravel which has been fun to use.
The setup I use is the Vagrant box Homestead configuration, so I do the majority of my work on my local machine and push up to the development server once it's ready to go. During the installation I had to push up the logs & vendor folders for it to work properly, but now I'm at a stage where every commit I do via the command line includes storage/logs/laravel.log, which, when I then pull down, asks me to stash/commit on the server because the files differ.
I've added it to the .gitignore file in the root directory of the project and it looks like this:
/node_modules
/public/storage
/.idea
Homestead.json
Homestead.yaml
/storage/logs/
/vendor/
Vendor doesn't cause me any problems unless I make changes to it, so it's not much of a bother; it's just the logs which will not stop going up. If I use a GUI tool I can manually tell it not to push them, but I want to get to the point where I can use the terminal to push and not worry about logs needing stashing on the server.
I believe this is the same for the .env file, so I imagine a solution will work for both. I have also noticed that PhpStorm says they're ignored but tracked with git, if that helps.
If you take a look at the Laravel repo on GitHub, you'll find the following .gitignore file in the storage directory:
https://github.com/laravel/laravel/blob/master/storage/logs/.gitignore
This comes with the default installation to mark the logs directory as ignored. If you've deleted this by mistake, you should be able to reinstate it and resolve the issue you're having.
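Note that a .gitignore rule only affects untracked files, which squares with PhpStorm reporting the log as "ignored but tracked": if laravel.log was committed at some point, git keeps pushing it regardless of the ignore rule. A minimal sketch of the one-time fix (the file stays on disk):

```bash
# Stop tracking the already-committed log; it remains on disk locally
git rm --cached storage/logs/laravel.log
git commit -m "Stop tracking laravel.log"
# The /storage/logs/ rule in .gitignore now takes effect
```

The same trick works for .env if it was ever committed.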
Just as importantly, though, your workflow isn't following best practice. With respect to "Vendor doesn't cause me any problems unless I make changes to it": you should never make changes to your vendor directory. This folder is home to third-party packages and plugins; modifying them directly causes multiple issues, chief amongst them:
You can no longer update a modified package without breaking your application.
Other developers won't be able to replicate your installation easily.
In fact, the vendor directory shouldn't be versioned at all. The best way to handle the files within it is using a package manager, like Composer, to do it all for you. This means you can easily switch between different versions of your packages and, by versioning only the composer files, other developers can run composer install or composer update to synchronise their development environment to yours quickly and accurately.
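As a concrete sketch, assuming vendor/ was committed earlier (the commit message is illustrative):

```bash
# One-time cleanup: drop vendor/ from the index but keep it on disk
git rm -r --cached vendor
git commit -m "Version only the Composer manifests"

# Other developers then reproduce the exact dependency set
composer install   # installs the versions pinned in composer.lock
composer update    # only when you deliberately upgrade packages
```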

Jenkins build strategy with a PHP project from Git

I have Jenkins set up on a remote server which hosts my PHP project. Jenkins creates a build from a git repository, and I would then like to move the files to the server's document root.
However the problem is all the files have the Jenkins user as their owner. So if I moved the workspace to my document root I wouldn't have permission to access them.
I'm wondering what people do to automate this? Maybe execute shell commands after the build is completed? I'm very new to Jenkins so I may be on the completely wrong track.
The short answer is yes: you need a post-build step that runs the chown (change ownership) and/or chmod (change permissions) Linux commands. This will allow you to switch the owning user (if the jenkins user has enough rights for it) and set read/write permissions on files, up to 777 (free-for-all).
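A minimal sketch of such a post-build shell step; the target path, the www-data user, and the passwordless-sudo rule for the jenkins user are assumptions, not something Jenkins provides out of the box:

```bash
# Jenkins "Execute shell" post-build step (hypothetical paths and users).
# Requires a sudoers entry allowing the jenkins user to run these commands.
sudo rsync -a --delete "$WORKSPACE/" /var/www/myproject/
sudo chown -R www-data:www-data /var/www/myproject
sudo chmod -R u+rwX,g+rX,o+rX /var/www/myproject
```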
However, there's more to it than that. You've stumbled upon the deployment phase of the project, and it's much more complex. Usually it's handled by external tools called via bash (I'm not very familiar with those, but I do know Capistrano is an industry standard in Ruby), and those tools usually (every project may require a special scenario) update project files, update dependencies, run database migrations, manage permissions, clean caches, etc.

Multiple server update architecture

I'm using AWS for my application. The current configuration is:
Load balancer --> Multiple EC2 instances (auto scaled) --> all mount a NFS drive with SVN sync
Every time we want to update the application, we log in to the NFS server (another EC2 instance) and execute svn update in the application folder.
I wanted to know if there is a better way (architecture) to work, since I sometimes get permission changes after an SVN update and the servers take a while to update. (I've thought about mounting S3 as a drive.)
My app is PHP + the Yii framework + a MySQL DB.
Thanks for the help,
Danny
You could use a slightly more sophisticated solution:
1. Move all your dynamic directories (protected/runtime/, assets/, etc.) outside the SVN directory (use svn:ignore) and symlink them into your app directory. This may require some configuration change in your webserver to follow symlinks, though:
/var/www/myapp/config
/var/www/myapp/runtime
/var/www/myapp/assets
/var/www/myapp/htdocs/protected/config -> ../../config
/var/www/myapp/htdocs/protected/runtime -> ../../runtime
/var/www/myapp/htdocs/assets -> ../assets
2. On deployment, start with a fresh SVN copy in htdocs.new, where you create the same symlinks and can fix permissions:
/var/www/myapp/htdocs.new/protected/config -> ../../config
/var/www/myapp/htdocs.new/protected/runtime -> ../../runtime
/var/www/myapp/htdocs.new/assets -> ../assets
3. Move htdocs to htdocs.old and htdocs.new to htdocs. You may also have to HUP the webserver.
This way you can completely avoid the NFS mount, as you have time to prepare steps 1 and 2. The only challenge is synchronizing step 3 across all machines. You could, for example, use at to schedule the update on all machines at the same time.
As a bonus you can always undo a deployment if something goes wrong.
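A minimal sketch of steps 2 and 3 as a deploy script, assuming the layout above, an Apache webserver, and a placeholder repository URL:

```bash
#!/bin/sh
# Step 2: fresh copy next to the live one (URL, paths, and user are placeholders)
cd /var/www/myapp
svn export https://svn.example.com/myapp/trunk htdocs.new
ln -s ../../config  htdocs.new/protected/config
ln -s ../../runtime htdocs.new/protected/runtime
ln -s ../assets     htdocs.new/assets
chown -R www-data:www-data runtime assets   # fix permissions once, outside htdocs

# Step 3: near-atomic swap, then reload the webserver
mv htdocs htdocs.old && mv htdocs.new htdocs
apachectl graceful
```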
Situation might be more complex if you have to run migrations, though. If the migrations are not backwards compatible with your app, you probably can't avoid some downtime.

Symfony 2 without SSH access

I have developed a small web app in Symfony 2 and Doctrine 2.
Can I deploy it to a web host that doesn't give SSH access?
I ask this because I see there are a lot of tasks that must be done from the terminal, like updating the database schema, creating symlinks for the assets, clearing the cache, etc.
Should not be a problem:
Create a copy of the system somewhere, ideally with DB connection params identical to the production system.
Run all the necessary tasks with the --env=prod parameter, if your DB settings allow it.
Clone the resulting database to the production system (with phpMyAdmin). Alternatively, clone the schema from the production database, run app/console doctrine:schema:update --dump-sql against it locally, and then run the generated SQL on the production server.
Copy all the files, excluding the dirs in app/cache and app/logs.
I have done this many times with SF 1.4, and it should be just as easy with SF 2.
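A minimal sketch of the local preparation, assuming a standard Symfony 2 layout (the upload itself happens in whatever FTP tool the host offers):

```bash
# Run locally before uploading
php app/console cache:clear --env=prod
php app/console assets:install web   # copies bundle assets; no symlinks needed on the host
php app/console doctrine:schema:update --dump-sql --env=prod > schema.sql
# Upload everything except app/cache/* and app/logs/*, then run schema.sql
# on the production database (e.g. via phpMyAdmin).
```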
Some low-end hosts have restrictions that will cause issues for Symfony, so it's important to run the Symfony compatibility checker script (you can upload it and then enter its URL in your browser to get the output). Once that's done, follow these simple steps:
Copy over all the files for the project. I usually zip/tar the project folder, upload it, and unpack.
Export the database from your development environment and upload it to your new server.
Edit the config and update your database settings. If you have hardcoded paths somewhere in your code, now is the time to fix those as well.
Make sure that the user for Apache (or whatever server software your host uses) has full access to the cache and log directories. This can be tricky on some hosts; I have had to contact support in the past to have someone log in and change permissions.
In your web hosts configuration tool, set the webroot for your site to the web folder in your project.
Maybe there is a way (with sftp for example), but it would be like trying to ride a bike with square wheels ;)

php, user-uploaded files, version control, and website deployment

I have a website whose code I update regularly, and I keep it in version control. When I want to deploy a new version of the site, I do an export and then symlink the served directory name to the directory of the deployment.
There is a place where users can upload files, and I noticed once that, after I had deployed a new version, the user files were gone! Of course, I hadn't added them to the repository, and since the served site was from an export, they weren't uploaded into a version-controlled directory anyways.
PHP doesn't yet have integrated svn functionality, so I couldn't do much programmatically with user-uploaded files. My solution was to create an additional website, files.website.com, which sits in a directory parallel to the served website and is served out of a directory that is under version control. That way the files don't get obliterated when I upgrade the website. From time to time I manually add uploaded files to the svn project, delete user-deleted ones, and commit the new version. I'm working on a shell script to run from cron to do this, but it isn't my forte, so it's on the back burner as it's not a pressing need.
Is there a better way to do this?
I usually don't keep user-generated data/files in svn, only the code and the DB schema/test data. What I usually do to deploy is an rsync from an up-to-date working copy, excluding the upload dir and the .svn dirs. IMO content should be handled by more traditional filesystem/DB backup mechanisms, not version control.
EDIT:
Just to be clear, your symlinking strategy seems like good practice; you're just missing the backup part, I think. I'd probably just tar | gzip the uploaded stuff in the cron job instead of interacting with SVN, and then have a separate job use mysqldump to dump the DB and gzip that as well.
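A minimal sketch of both parts, with placeholder paths and database names (mysqldump credentials assumed to live in ~/.my.cnf):

```bash
# Deploy: rsync an up-to-date working copy, leaving uploads and .svn behind
rsync -a --delete --exclude='.svn' --exclude='uploads/' \
    ~/checkouts/website/ /var/www/website/

# Nightly cron: archive the uploads and the database
tar czf /backups/uploads-$(date +%F).tar.gz /var/www/files.website.com/uploads
mysqldump --single-transaction mydb | gzip > /backups/mydb-$(date +%F).sql.gz
```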
I would continue with the practice of exporting the site as upgrades are needed but have a symbolic link to a directory outside of the version controlled directory with the user uploaded content. This way when you do an export you only need to recreate the symlink if it gets blown away. You should then of course be backing up that user content as needed.
Rather than manually doing the export and managing symlinks, you could make your deployment directory a subversion checkout (from a production branch). That way, deploying is as simple as checking in your updates to the production branch.
This works as long as you have sufficient control of your subversion server and hosting setup, and your subversion repository is "ready to run." In this situation, your user directory could be an empty placeholder in subversion and would be left alone by the update process that runs on commit (business as usual for svn update). I'd still recommend (as mentioned by @Flash84x and @prodigitalson) a separate process to back up the user content.
There's an Ars Technica article with a description of how to set this up.
Update: If you follow this approach, make sure that your web server does not allow access to the .svn files in the deployment checkout.
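A minimal sketch of the hook that drives such a setup, with placeholder paths (blocking access to the .svn files, as mentioned above, is done separately in the web server configuration):

```bash
#!/bin/sh
# hooks/post-commit on the subversion server (placeholder path).
# Brings the production checkout up to date after every commit.
/usr/bin/svn update --quiet /var/www/site
```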
