Setting .env.php file on heroku app - php

I have a big old project that has an .env.php file with configs for many clients.
Each client (in my case, a client is a repository) has an env file with its specific configurations.
I have a centralized repository that pushes to all the other client projects. I have a git remote called "main" and many other remotes named after clients.
So I cannot put the env file inside the centralized repository; I need one env file for each client (each git remote).
On Heroku, each client has an app, and of course the git remote address of that Heroku app is the client's address.
Today I am migrating to Heroku, and I ran into this question about the many .env.php files that I have to put in each client.
What would be the best way to send these files to Heroku?

A typical approach is to write a script that pulls from the repos you need: the master one and a client-specific one.
The script copies all the files into a directory, initializes a throwaway git repository on your local file system, and adds all the combined files.
Finally, the script pushes the throwaway repo to Heroku.
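A hypothetical sketch of such a script, written as a shell function. The repo URLs are placeholders, and it assumes the client repo holds only the per-client overrides such as .env.php:

```shell
# Hypothetical sketch -- repo layouts and URLs are assumptions. Combines the
# shared code with one client's files (including .env.php) in a throwaway
# repo and force-pushes the result to that client's Heroku app.
deploy_client() {
  main_repo=$1; client_repo=$2; heroku_url=$3
  work=$(mktemp -d)
  git clone -q "$main_repo" "$work/main"
  git clone -q "$client_repo" "$work/client"
  # Overlay the client's files on top of the shared code.
  cp -R "$work/client/." "$work/main/"
  rm -rf "$work/main/.git"
  # Fresh single-commit repo, so no history leaks between clients.
  ( cd "$work/main" &&
    git init -q &&
    git add -A &&
    git -c user.name=deploy -c user.email=deploy@example.com \
        commit -qm "deploy: main + client overrides" &&
    git push -q --force "$heroku_url" HEAD:master )
  rm -rf "$work"
}
# e.g. deploy_client git@github.com:you/main.git \
#                    git@github.com:you/client-a.git \
#                    https://git.heroku.com/client-a.git
```

The force-push is deliberate: the throwaway repo shares no history with what is already on Heroku, so a plain push would be rejected.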

Related

How to host Codeigniter app on Azure?

I wanted to ask for your help (a tutorial link or something): what I want to do is upload a PHP CodeIgniter application to Azure.
My application also has an upload feature, so I need to make that work too (I don't know if it's more complex because of that).
I was using WAMP during development; now I need to push it to a server.
Generally, deploying a CI application to Azure Web Apps is simple: create an Azure Web App in the Azure portal, then open its management page and click All settings => Deployment source => Choose source => Local Git Repository to configure git deployment for your Azure Web Apps service.
Then you can find the Git clone URL under the Essentials tab.
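With that Git clone URL in hand, deployment is an ordinary remote-add and push. A sketch, wrapped in a hypothetical helper function (the example URL is a placeholder):

```shell
# Hypothetical helper: add the Web App's Local Git endpoint as a remote and
# push the current branch to it.
deploy_to_azure() {
  repo=$1; remote_url=$2
  ( cd "$repo" &&
    git remote add azure "$remote_url" &&
    git push -q azure HEAD:master )   # Azure deploys the master branch by default
}
# e.g. deploy_to_azure . https://<user>@<app>.scm.azurewebsites.net/<app>.git
```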
Additionally, Azure App Service can run php composer.phar install when you run git push, but it's not enabled by default. To enable it, you need to install the Composer extension for your web app.
Here is a CI template in the Azure samples; you can refer to https://github.com/Azure-Samples/app-service-web-php-get-started for hints.
And here is a video you can refer to: https://youtu.be/bBb_Hi2Odqc. It demonstrates the classic portal, but the approach still works today.
Deploying CodeIgniter is very easy; all you need to make sure of is placing the project directory correctly.
Just upload the project folder to the hosting either by SSH or FTP.
Modify the .htaccess file accordingly.
Once all of the above is set up, including uploading all the files to the host, don't forget to update config.php, database.php, and routes.php with the new hosting parameters.
To make your upload feature work correctly, you just need to give the required write access to the application's upload folder.
And that's all you need to do while deploying Codeigniter application.
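For the write-access step, a hypothetical sketch (the folder name "uploads" and mode 775 are assumptions; on shared hosts where PHP runs as a different user, 777 may be needed):

```shell
# Hypothetical helper: ensure the app's upload directory exists and is
# writable. Folder name and permission mode are assumptions for your app.
make_uploads_writable() {
  mkdir -p "$1/uploads"
  chmod 775 "$1/uploads"   # loosen to 777 only as a last resort on shared hosts
}
# e.g. make_uploads_writable /var/www/ci-app
```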

Using git with a website that has user uploads

Okay, I'll explain my issue briefly:
I have a website, let's say a board where users can chat and upload avatars, images, files, etc.
I want to use git with this website.
So what I did:
created a repo on Bitbucket
cloned the repo locally
added a .gitignore file (see below)
copied my website into my local repo
made a first commit
pushed everything to Bitbucket
Bitbucket now has my website, but without my ignored files and folders.
My .gitignore looks like this (e.g.):
uploades/user/avatars
cache/
logs/
Okay, next step:
I log in to my webserver over SSH
clone the Bitbucket repo
Now my website is online without any user files (uploads, images, files, etc.)
So if a user now uploads something into the git-ignored folders, I want those files to remain.
But when I make local PHP changes, push them to Bitbucket, and then do a "git pull" on my webserver, all user files are deleted.
What am I missing? What is the best procedure?
SSH to the server and manually create the uploades folder. You don't want the uploades folder to be in git; I assume it doesn't contain any code, only user content, so just leave it on the server.
This is how I push my local repo to a remote live server: I create a bare repo on the server and push the local repo to it (locally, I add it as the origin remote). Once the remote repo (which is outside my public directory) is up to date, I go to my public directory (public_html) and git clone from the bare repo. Works like a charm.
First time setup: send local repo to remote server
On remote server create bare repo:
git init --bare myproject.git
On my local machine:
git remote add origin ssh://user@ip:sshport/git_path_of_bare_repo/myproject.git
Send local changes to remote bare repo
git push origin master
On the remote server, login into your public directory (public_html, or /var/www/, or whatever your document root is):
git clone /path/to/your/bare/repo/above/myproject.git
Inside your public directory, create whatever folders you need (uploads, logs, tmp, whatever). Further git pulls will not delete these folders.
That's it!
Sending further updates from local to remote
From your local machine just type:
git push origin master
Now your local changes are on your remote repo. On your server, go to your public directory (step 4 above) and pull the updates:
git pull
Works for me perfectly.
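The whole first-time setup can be sketched end to end as one function, with local directories standing in for the remote server (all paths and the repo name are placeholders):

```shell
# End-to-end sketch of the bare-repo workflow above; local directories stand
# in for the remote server, so real use happens over SSH instead.
setup_bare_deploy() {
  src=$1; server=$2                        # working repo; "server" root dir
  git init -q --bare "$server/myproject.git"
  ( cd "$src" &&
    git remote add origin "$server/myproject.git" &&
    git push -q origin HEAD:master )
  git clone -q -b master "$server/myproject.git" "$server/public_html"
  mkdir -p "$server/public_html/uploads"   # untracked: later pulls keep it
}
```

The key property is the last line: directories created inside the clone but never tracked by git are left untouched by future `git pull`s.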

Azure Web Apps: FTP-uploaded file permissions

I'm in the process of getting a Laravel 5 app working on Azure Web Apps and am encountering an issue via Laravel's temporary storage.
Any time a template renders, Laravel attempts to cache it to the local filesystem. Unfortunately, for some reason Laravel doesn't have permission to write to its storage directory.
I am deploying my application from my build server via FTP
I am running on the free-tier shared infrastructure (just while I'm getting set up)
My deployment server is running Linux
In this circumstance, it's obvious what the problem is. Unfortunately, what I don't understand is why my web server doesn't have access to write to the directories my FTP user uploads to.
Ideally any solution offered will be one that I can automate as part of my deploy process.
According to http://clivern.com/working-with-laravel-caching/, you can change the directory of the cache files via the cache.php configuration file. I'd suggest using $_SERVER['DOCUMENT_ROOT'] to obtain the root folder of your web app and then constructing a path for the cache files from it.

Same-origin policy bypass in local environment

I'm developing a SPA that runs on Backbone.js locally, with a server set up through Grunt for livereload. I built a REST API with PHP for my app, which I also run locally. Now I have a problem with the cross-domain policy, since my servers are on different ports. I tried to combine the two servers on one port, both from Apache and from Grunt, but I'm not sure it's possible at all. How should I deal with this problem? I'd like to develop my app locally and still use Grunt's livereload features.
I propose installing nginx to act as a reverse proxy. It can serve static files from one directory (the frontend) and server-side generated scripts (the backend) from another server.
It serves the backend if the request does not correspond to a file existing in the frontend directory.
Here is an example config: https://github.com/vodolaz095/hunt/blob/master/examples/serverConfigsExamples/nginx.conf
It serves static html, css, and js files from the directory /home/nap/static and the backend from localhost:3000, and both of them are accessible on localhost:80 as one server.
I hope this is what you need.
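A minimal nginx sketch of that setup (the directory and port are taken from the description above; adjust them to your layout):

```nginx
# Sketch: serve static frontend files, fall back to the backend API.
server {
    listen 80;
    root /home/nap/static;             # frontend directory (assumption)

    location / {
        try_files $uri $uri/ @backend; # serve the file if it exists on disk
    }

    location @backend {
        proxy_pass http://localhost:3000;  # otherwise hand off to the API
    }
}
```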
So I ended up using grunt-connect-proxy, which did just what I needed.

unable to deploy symfony project to a shared hosting web server

I am using Symfony 1.4 to create my project. I have tried to deploy the project using the commands and methods given in chapter 16 of the Symfony book, but I am not able to deploy my Symfony project from my local computer to the web server. Can I directly copy-paste my Symfony project to my web server? Will that work? Or is there another method?
NOTE: I am using Propel as the ORM.
Your best option is:
1. gzip up the entire project
2. export all the data from the database using something like mysqldump
3. upload the files to the server
4. import the database dump into your server database
5. unzip your project
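The numbered steps above can be sketched as follows (the project path, database credentials, and server paths are all placeholders):

```shell
# Hypothetical sketch of the local packaging steps; names are placeholders.
package_project() {
  proj=$1; out=$2
  tar czf "$out/project.tar.gz" -C "$proj" .   # 1. gzip the entire project
  # 2. dump the database (needs real credentials):
  # mysqldump -u dbuser -p dbname > "$out/dump.sql"
}
# After uploading both files to the server (step 3):
# mysql -u dbuser -p dbname < dump.sql          # 4. import the dump
# tar xzf project.tar.gz -C /path/to/project    # 5. unzip the project
```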
A couple notes:
1. Due to the shared hosting environment, you may have security issues due to other hosts on the server gaining access to your PHP files. See here.
2. Hopefully your host allows SSH access to your server account; administering a Symfony project is MUCH easier when this is the case.
3. If you don't have SSH access to your account, you will have to upload the files to the server without gzipping them, and this takes a LONG time due to the number of files Symfony creates.
4. If possible place all your symfony files except for the web folder outside the public_html folder.
When I have to deploy something, I do this:
Check that the requirements (PHP 5, MySQL 4, etc.) on the production server match my application.
Change the configuration files to reflect the production server.
Upload everything to the server using FTP.
Your question is cryptic; you should really specify what kind of error you are getting.
