How to handle user-editable templates in a stateless cloud hosting environment? - php

For a few years now I've been using a home-made PHP CMS/ERP based on Symfony components for my clients. Every site using this framework is hosted on a "classic" Linux host, and I use git to deploy updates. Every site has a folder that stores HTML/CSS templates, images, and downloadable files that the site admin can access to customize their site. This folder is git-ignored so that my framework updates via git won't overwrite the site admin's changes.
I'm thinking about moving those sites to a cloud-based scalable environment like Heroku. As I understand it, this git-ignored folder would be reset every time I push to Heroku, and for every new dyno created. How can I handle this without including the custom folder in the git repo?
I've thought about storing the HTML/CSS templates in the database and copying them to disk each time they're updated, after every git deployment, and whenever a new instance is created. But that wouldn't solve the problem for images or downloadable files.

Instead of storing user-uploaded files locally, Heroku recommends putting them on an external service like Amazon S3.
You may want to use an existing library for this, e.g. KnpGaufretteBundle:
Easily use Gaufrette in your Symfony projects.
Gaufrette itself is "a PHP5 library that provides a filesystem abstraction layer". Amazon S3 is one of its supported backends.
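For a rough idea of what that looks like in practice (the bucket name, region, and keys below are placeholders, not from the original answer), the admin's customizations can be written to S3 instead of the dyno's local disk:

```php
<?php
// Sketch: persist user-edited templates in S3 via Gaufrette, so that every
// dyno sees the same files regardless of deploys or restarts.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Gaufrette\Adapter\AwsS3;
use Gaufrette\Filesystem;

// Credentials are picked up from the environment (e.g. Heroku config vars).
$s3 = new S3Client([
    'region'  => 'eu-west-1',      // placeholder
    'version' => 'latest',
]);

$filesystem = new Filesystem(new AwsS3($s3, 'my-site-templates'));

// When the site admin saves a template, write it to the bucket...
$filesystem->write('templates/homepage.html.twig', $editedTemplate, true);

// ...and read it back at render time, on any dyno.
$template = $filesystem->read('templates/homepage.html.twig');
```

The same approach covers images and downloadable files, since they are just other keys in the bucket.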

Related

Laravel - One CMS / multiple sites

Laravel Folder Structure
I developed a CMS with Laravel that I want to use for multiple websites. My goal is to have a single place to develop my CMS further, without leaving old "CMS code" behind in older websites.
The problem is that I don't know how to structure my Laravel folders.
This is what I have in mind:
--cms
--website-1
--website-2
--website-3
Each website must have its own:
- public folder
- .env file (or config for the database and other website-specific settings)
- routes
- resources (views, assets)
- controllers
- lang files
- etc.
Development Environment
In development I want to use the website's domain (website-1.test) as an indicator of which website folder should be used.
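A sketch of that idea (the paths and fallback are made up, and real Laravel bootstrapping needs more than this):

```php
<?php
// public/index.php (simplified): map website-1.test -> websites/website-1
$host = $_SERVER['HTTP_HOST'] ?? 'website-1.test';
$site = explode('.', $host)[0];                 // "website-1"
$siteDir = dirname(__DIR__) . '/websites/' . $site;

if (!is_dir($siteDir)) {
    http_response_code(404);
    exit('Unknown site');
}

// Point the framework at the site's own .env, routes, and resources
// before booting the CMS kernel.
putenv('SITE_DIR=' . $siteDir);
```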
Production Environment & Git
I use Forge to deploy my websites. My idea is to have one git repository for the CMS and one for each of the websites. In the website repositories I only push the website folder; in the CMS repository the whole project is saved, but all of the website folders are .gitignored.
So for the production server I had this folder structure in mind:
CMS (git repo)
website (git repo)
When updating the front end of a website I push the website repo; if I want to update the CMS I push the CMS repo.
Conclusion
Bear in mind that these are my first thoughts, and I'm aware that my ideas might not be possible at all. I'm looking for advice and suggestions for my situation; if it can be achieved another way, or a better way, I'd be glad to hear about that too.
Thanks in advance!
OK. It is not possible in the same project, but you can create subdirectories for your domain names, so each site becomes website-one.yoursitename.com, and each subdirectory has its own files. That's all you need to accomplish what you want.
Regards.
I have two initial ideas for how what you are asking could potentially be achieved:
API
Make one repo for the CMS backend, in which you can create multiple APIs for the various sites. Then make a front-end repo for each site that calls the corresponding API.
https://laravel.com/docs/5.8/eloquent-resources
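For illustration (a made-up resource, not from the answer), the CMS backend could expose its pages like this for each site's front end to consume:

```php
<?php

namespace App\Http\Resources;

use Illuminate\Http\Resources\Json\JsonResource;

// Transforms a CMS "page" model into the JSON each front-end site fetches.
class PageResource extends JsonResource
{
    public function toArray($request)
    {
        return [
            'id'      => $this->id,
            'title'   => $this->title,
            'body'    => $this->body,
            'updated' => $this->updated_at->toIso8601String(),
        ];
    }
}
```

A route returning PageResource::make($page) would then serve it to the front-end repos.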
Package
You could make a Composer package of your CMS code and then pull that package into each of your website projects.
https://blog.jgrossi.com/2013/creating-your-first-composer-packagist-package/
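As a sketch (the vendor and package names are placeholders), the CMS package's composer.json could look like this; each website project would then composer require it, e.g. from a private VCS repository:

```json
{
    "name": "your-vendor/cms",
    "description": "Shared CMS core, pulled into each website project",
    "type": "library",
    "require": {
        "php": ">=7.1",
        "laravel/framework": "5.8.*"
    },
    "autoload": {
        "psr-4": { "YourVendor\\Cms\\": "src/" }
    }
}
```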

WordPress: Syncing media library between different servers

We have three servers with the same WordPress site - a Live version, a Beta version, and a Kiosk version (a limited version of the site used for an iPad kiosk). The Beta site is deployed from a beta branch on GitHub, while the Live and Kiosk sites are both deployed from the master branch. Any differences in functionality between the sites are controlled by environment variables, beyond that they share the same code.
The wp-content/uploads folder is shared between all three versions of the site, using a mounted Amazon EFS filesystem (couldn't use S3 due to plugin conflicts). Here's where the problem comes in.
The Live and Kiosk servers share the same RDS database, so their media libraries are always in sync. However, the Beta server uses its own separate database for testing purposes - and since WP media library objects are stored in the DB, this leads to the following issues:
When a file is uploaded to the Beta site, it isn't automatically added to the media library on the Live/Kiosk sites (and vice-versa)
When a file is removed from the shared upload folder, it isn't automatically removed from the media library, leading to broken/empty media library entries.
There's nothing necessarily wrong with this; it's just how WordPress works. I've been working around the issue with the "Add From Server" plugin, which lets me keep the media libraries in sync manually - but it would be nice to automate this.
So I think either of the following solutions would work:
A method to keep the "media" table of the two databases in sync, while keeping the rest of the tables separate
A method to automatically update the media library on all three servers according to the contents of the wp-content/uploads folder, one smart enough to ignore thumbnail versions - the "Add From Server" plugin unfortunately doesn't do this, so when I tried it I ended up with countless "thumbnails of thumbnails", if that makes sense (a sketch of this idea follows below)
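A minimal sketch of that second idea, assuming it runs inside WordPress (the function name is invented, and the regex assumes WordPress's default -{width}x{height} thumbnail suffix):

```php
<?php
// Register any file in wp-content/uploads that has no attachment post yet,
// skipping the generated thumbnail variants.
function sync_uploads_to_media_library() {
    $uploads = wp_upload_dir();
    $files   = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($uploads['basedir'], FilesystemIterator::SKIP_DOTS)
    );

    require_once ABSPATH . 'wp-admin/includes/image.php';

    foreach ($files as $file) {
        $path = $file->getPathname();

        // Ignore thumbnail variants such as photo-150x150.jpg.
        if (preg_match('/-\d+x\d+\.\w+$/', $path)) {
            continue;
        }

        // Skip files that are already in the media library.
        $url = $uploads['baseurl'] . str_replace($uploads['basedir'], '', $path);
        if (attachment_url_to_postid($url)) {
            continue;
        }

        $id = wp_insert_attachment([
            'post_mime_type' => wp_check_filetype($path)['type'],
            'post_title'     => sanitize_file_name(basename($path)),
            'post_status'    => 'inherit',
        ], $path);

        // Build metadata (this also regenerates the thumbnail sizes).
        wp_update_attachment_metadata($id, wp_generate_attachment_metadata($id, $path));
    }
}
```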
Has anyone worked with a similar environment before? Any ideas regarding either of my proposed solutions, or possibly a better solution?
As for the dev/staging environments, I usually use rsync in combination with crontab or Jenkins, and symlink the copied media folder to wp-content/uploads every time I rebuild staging with a dump of the production db.
This will also give you the perk of automatically having backups of your db and media folder on a different server.
I'd generally advise against updating the live db by altering a beta/dev environment, since this usually defeats the purpose of that kind of environment.
Regarding keeping the kiosk and production sites in sync, the WPSiteSync plugin looks promising. I haven't tried it in production yet, but it should fit your requirements.

Strategy to deploy versioned assets (for PHP app's assets to Amazon S3)

I currently deploy a PHP app's static assets using s3cmd.
If an asset for Amazon S3 has changed (like a JavaScript file), I rename it manually so that the new version is synced and served by Amazon CloudFront.
Are there documented practices for deploying a PHP app's static assets to Amazon S3? I've seen sites use a hash to refer to a particular deployed version of their assets. I'm wondering how to get that hash (the git commit SHA, perhaps?) referenced by the web application.
The approach I could see working is writing the current SHA to a stand-alone config file at deploy time, and reading from it in the application.
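That could look something like this (the file names and CDN host are illustrative):

```php
<?php
// deploy.php - run at deploy time, before s3cmd syncs the assets.
$sha = trim(shell_exec('git rev-parse --short HEAD'));
file_put_contents(__DIR__ . '/config/asset_version.php', "<?php return '$sha';\n");

// helpers.php - used by the templates when building asset URLs.
function asset_url(string $path): string {
    static $version;
    $version = $version ?? require __DIR__ . '/config/asset_version.php';
    // e.g. https://cdn.example.com/a1b2c3d/js/app.js
    return 'https://cdn.example.com/' . $version . '/' . ltrim($path, '/');
}
```

Uploading the assets under the SHA-prefixed path means nothing needs renaming by hand, and CloudFront treats each deploy's files as new objects.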
Update 1 with current process
I'm looking to make the deployment of assets more automated:
Make a change to the app's JavaScript
Rename the script from app.23.js to app.24.js
Edit the site's header HTML template to refer to app.24.js
Commit all change to git
Merge the develop branch into master
Tag the commit with a new version number
Push code to BitBucket
Use s3cmd to synchronise the new script to Amazon S3 (and therefore the CloudFront CDN)
SSH into server, and pull the latest git tree. The production server serves the master branch.
I would like to think there is no single right answer to this question, but here are a few things to consider.
If you only want to automate the manual work you are doing, then it might be worthwhile to look into deployment tools. Capistrano and Magallanes are two names that come to mind, but you can google for this and I am sure you will find a lot of options.
The Rails framework was built on the philosophy that there is a best way to do things. It also hashes static assets out of the box, on the fly, to version them. You could look into implementing hashing the same way in your case.
Grunt is another automation tool you can look into. I found a module that might come in handy: https://github.com/theasta/grunt-assets-versioning
I would say that, for me, steps 2, 3, and 4 are the problem areas in your workflow. Renaming manually and updating code every time does not sound too nice. As you have pointed out, git hashes are unique, so perhaps append the git hash to your assets during deployment and sync them to S3/CloudFront?

Using GitHub for an existing website to track my own and vendor code changes

I'm going to implement version control for a project via GitHub. It's a website using mostly PHP and JavaScript, and we're switching from one script to another. I'm using cPanel/FTP to access the files, and I would prefer not to mess around with Apache config files.
/home/website/ - home directory
/home/website/public_html/ - document root for website.com - has existing php script files which will be overwritten once the beta site is tested to be working properly
/home/website/public_html/beta.website.com/ - document root for beta.website.com - has existing php script files which have been minimally modified from the original vendor source. ie header / footer / config files.
/home/website/public_html/orig.website.com/ - a possible directory where I could update the original vendor files?
/home/website/public_html/addon-website.com/ - document root for another addon domain - completely separate website has nothing to do with the project
This script is to be used on many different websites. The differences would be in:
config files
template files
website-specific changes
I have the original vendor source files on my local computer, but no local web server. I need help with the initial setup. How should I set things up so I can track all of the following:
track changes to the original vendor files done via patches and updates
track changes to one or more versions of the original script installed on different websites
track changes to a beta which has my own modifications to the vendor files
track website specific changes
push changes from vendor files to where I do my own modifications
push changes from my modifications to all different websites / website specific
I had a look at "How do I maintain divergent code on GitHub", but I need more specific instructions than are provided there. Should I even be trying to do all of this in one repo with many branches/forks/whatever?
It would be beneficial to have an integration branch, with each party working on features that are merged there. This is the workflow that I developed with my colleagues, and it has worked out very well:
http://dymitruk.com/blog/2012/02/05/branch-per-feature/
For an introduction of Git itself, this is a very good resource:
http://git-scm.com/docs
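Applied to the vendor-tracking part of the question, an integration branch could sit between a pristine vendor branch and per-site branches (a sketch; the branch names are illustrative, not from the linked post):

```
# Keep untouched vendor releases on their own branch.
git checkout --orphan vendor
# ...unpack vendor release 1.0 into the working tree...
git add -A && git commit -m "Vendor release 1.0"

# Integration branch: vendor code plus your own modifications.
git checkout -b integration
# ...commit your header/footer/config changes here as features...

# One branch per website for site-specific config and templates.
git checkout -b website-1 integration

# When the vendor ships an update, commit it on vendor and merge downstream.
git checkout vendor
# ...unpack the new release...
git add -A && git commit -m "Vendor release 1.1"
git checkout integration && git merge vendor
git checkout website-1 && git merge integration
```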

How can I set static asset cache-control headers while using PHP on Heroku?

Is this worth trying to accomplish or should I move the assets to CloudFront (or some other CDN)?
Based on past experience, I'm hesitant to go down the CDN route with this project. I tried using a Rake task to minify and concatenate my assets, then upload them to CloudFront. However, when I deployed the project (now containing the Rake task), Heroku decided that my app was Ruby on Rails rather than PHP, and the app was toast.
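To answer the title question directly: Heroku's PHP buildpack runs Apache with .htaccess support by default, so a file like this in the web root should set the headers (a hedged sketch; the extensions and max-age value are arbitrary):

```apache
# .htaccess - long-lived Cache-Control for static assets.
<IfModule mod_headers.c>
    <FilesMatch "\.(js|css|png|jpe?g|gif|svg|ico|woff2?)$">
        Header set Cache-Control "public, max-age=31536000"
    </FilesMatch>
</IfModule>
```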
