How could I simulate an offline directory with Heroku? - php

I used to have an offline directory on my server with Perl scripts to dynamically create files.
This directory was kept offline (outside the web root) for security reasons (/server/back/scripts), and I used to access it with exec('/server/back/scripts/auto.pl '.$arguments)
Contents of auto.pl:
system('cp /server/back/includes/default /server/front/ann/'.$enc.'.php');
system('chmod 555 /server/front/ann/'.$enc.'.php');
system("perl -pi -e 's/string/".$key."/g' /server/front/ann/".$enc.".php");
This script copies a default file containing placeholder values to a public directory, replaces the placeholders with something else, and sets the permissions we want.
How can I reproduce this on Heroku? If it's not possible, is there any way to at least reproduce the behavior of this script?

It looks like the goal of this script was to inject keys/credentials into your PHP application by searching/replacing.
Heroku encourages configuration via environment variables, especially keys/credentials.
You should add your keys via the Heroku command line tool:
heroku config:set MY_API_KEY=super-secret-hex-goes-here
... and then pull in the values from the environment on your dyno in your PHP code:
$api_key = getenv('MY_API_KEY');
This will allow you to provision keys/secrets for each running application on Heroku without having to store anything in source code.
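For example, instead of generating a one-off PHP file with the secret baked into it, a single script can read the value at request time. A minimal sketch (MY_API_KEY is the example variable from above):
<?php
// Read the secret from the dyno's environment instead of from a generated file.
$apiKey = getenv('MY_API_KEY');
if ($apiKey === false) {
    // getenv() returns false when the variable is unset; fail loudly.
    http_response_code(500);
    exit('MY_API_KEY is not configured');
}
// ... use $apiKey wherever the search/replaced value used to go ...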

Heroku does not give you a persistent file system on the running web dyno: anything you write at runtime is ephemeral, lost on restart and not shared across dynos, so changing files on the fly won't have a lasting impact on the running server.
Files can be changed if the file is part of the code base itself, so you could commit the change and deploy to Heroku.
If the variables you are talking about have to be used application-wide, then the best way is to use environment variables, as mentioned in Winfield's answer.
You can set and unset environment variables using the commands:
heroku config:set VAR_NAME=value [VAR_NAME_2=value.........]
heroku config:unset VAR_NAME [VAR_NAME2.......]
If you need to maintain a file with the current values of the environment variables, you can use this command (it comes from the heroku-config plugin, not the core CLI):
heroku config:pull --overwrite
This will fetch all the environment variables currently set on the server and store them in a .env file.
Then you can update the values in the .env file locally and push them back to the server with:
heroku config:push
This will replace all current values on the server with the values in the .env file.
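For illustration, the .env file is just KEY=value lines, one per variable (names made up):
MY_API_KEY=super-secret-hex-goes-here
QUEUE_BACKEND=redis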
Read more about environment variables here: https://devcenter.heroku.com/articles/config-vars

Related

Move files from a docker container to an outside folder on the same server using php

I am fairly new to docker, please bear with me if it's a basic question. I have a Laravel project on a server; the project is dockerized. What I want to do is move a file from my project to another location on the same server that is not dockerized.
So, my project is set up in the /var/www/my-project directory and I want to copy a file from my-project/storage/app/public/file.csv to /var/{destination_folder}. How can I do that in Laravel? I think my issue is not related to Laravel; it is related to docker, which is not allowing files to be moved out of it. Please don't add Laravel or PHP file-copy code snippets, I have tried plenty.
What I've tried:
1- I have tried copying the file using:
Storage::disk('local')->put('/var/{destination_folder}', 'my-project/storage/app/public/file.csv')
but it does not copy the file.
2- I have also tried moving the file using a bash script which I execute from my Laravel controller via shell_exec or Process, but it is also not working.
cp "/var/www/my-project/storage/app/public/file.csv" "/var/destination_folder"
What's happening in this solution is that the command works when I run it from the terminal, but it's not working when I call it from my controller, and it gives me
cp: cannot create regular file '/var/destination_folder/file.csv': No such file or directory
After googling the above error it seemed to be a permission issue, so I changed the permissions of the destination folder to 775, and I also checked which user the Laravel app runs as; whoami from the app gave me root.
Let me know how this could be achieved, thank you!
The entire point of docker is that it is isolated from the base host. You cannot simply copy the file out, as the container does not have access to any disk that is not mounted into it.
The easiest option is to create a destination directory and create a bind mount as per https://docs.docker.com/storage/bind-mounts/
You would then use the following argument for your docker run:
--mount type=bind,source=/var/destination_folder,target=/some_directory_inside_your_docker
Copy the file to some_directory_inside_your_docker and it will appear in /var/destination_folder on the parent host.
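With that mount in place, an ordinary PHP copy inside the container is enough and the file shows up on the host. A minimal sketch using the paths from the question (the mount target name is the placeholder from above):
<?php
$source = '/var/www/my-project/storage/app/public/file.csv';
// This directory is bind-mounted to /var/destination_folder on the host.
$target = '/some_directory_inside_your_docker/file.csv';

// copy() returns false on failure, so check it instead of assuming success.
if (!copy($source, $target)) {
    throw new \RuntimeException("Could not copy {$source} to {$target}");
}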
Another option is to create a user account on the parent host, LOCK IT DOWN HARD for security reasons, and then keep a private key inside your container that allows it to SSH to the parent host (note: this won't work with every network configuration). I don't think it's a good idea when you can do bind mounts, but it would work.

Symfony command php bin/console can't read Apache SetEnv?

With Symfony 5.2, when I execute this command
APP_ENV=prod php bin/console d:m:m
I get this message:
WARNING! You are about to execute a migration in database "db_name" that could ...
However, in my Apache environment variables, I customized the database name:
SetEnv DATABASE_URL "mysql://website:password@localhost:3306/website_prod"
I am sure that this configuration works (when I access the site, I am in the prod environment, even though I left dev in the .env generated by Symfony).
Why is the wrong database shown for the APP_ENV=prod php bin/console d:m:m command? I think the Apache variables are not taken into account by php bin/console ... commands, and that I need to create a specific .env.prod.local.
Can you confirm? If so, I don't see why Symfony mentions this in its documentation (https://symfony.com/doc/current/setup/web_server_configuration.html#apache-with-mod-php-php-cgi)
Console commands do not run under the web server, so they do not have access to whatever configuration you have in the Apache vhost or anything like that.
The best way to deal with this kind of configuration is to store these values in environment variables or .env files, or to use Symfony's secrets management feature.
That way the configuration will be available to the application both when it is accessed via the web server and when it runs from a command-line script.
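For the case above that could mean an .env.prod.local next to the other .env files; Symfony's Dotenv component loads it for console runs with APP_ENV=prod, independently of Apache (DSN copied from the question, credentials are placeholders):
DATABASE_URL="mysql://website:password@localhost:3306/website_prod"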

.env file won't load variables into config.php - student setting up first php dev environment

Afternoon everyone,
I'm a student studying Computer Science and I'm trying to recreate the environment my friend is using to host their PHP-based web app. They're on a Mac using heroku local (a Procfile calling heroku-php-apache2) to set up their environment. I'm on a Windows 10 PC, and from what research I've done, heroku local is not supported there in any way. So I enabled WSL, installed Ubuntu 18.04 and Apache2 and, as far as I can tell, downloaded and installed all of the other components necessary to make it run (composer, mod_rewrite, mod_env, etc.). phpinfo(), Heroku's sample project, and any simple PHP pages I make display properly. My friend's app, on the other hand, is still giving me trouble.
They're using a .env file to declare project-specific environment variables that are further defined in a config.php. The app is deployed and works on Heroku and on their machine, but when I try to load the app locally on my machine I get an exception saying the environment variables aren't being loaded. If I add "local: php -S localhost:80" to the Procfile and run heroku local on Ubuntu, it sees the .env file and says it's loaded, only to kick back the same errors my apache2 instance is throwing.
What could be causing this? I've edited php.ini to include an "E" (in variables_order), enabled mod_rewrite and mod_env, and made sure my .env file was encoded in UTF-8. I've searched far and wide for a reason this might be happening but I keep coming to a dead end. Is there something about the "heroku local" command and instance that I'm missing? I'm still new to PHP, web servers, and programming in general, so any relevant information on why my .env file isn't working, or any possible way to get heroku local to work on Ubuntu under WSL, would be massively appreciated.
I'm using WSL and am similarly having problems with my .env not working properly.
Although I'm still looking for a more elegant fix, the hack I'm relying on now is:
Copy the contents of your .env to a temporary text file.
In that text file, do a regex find-and-replace searching for "\n" and replacing with "\nexport " (with a trailing space).
Move the final line's "export " to the first line.
Copy the contents of this temporary text file.
Paste into your terminal to run these commands.
Now you should be able to run your app.
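An equivalent shortcut, assuming a plain KEY=value .env with no quoting or multiline edge cases, is to let the shell export everything it sources:
set -a; source .env; set +a    # set -a marks every variable defined while sourcing for export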
Straightforward Method
You can do this very simply by adding the environment variable to your .bashrc file, assuming that your Windows environment variables are set correctly and WSL is installed. I'm going to use Java as an example, but any environment variable will work.
Use a text editor in WSL, or type code .bashrc from your WSL home to open it in the WSL VSCode editor, and add the environment variable to the file:
# Shared environment variables
export JAVA_HOME=/mnt/d/Java/jdk11.0.4_10
Just ensure that you change the directory to point to your directory. In my case, it's in D:\Java\jdk11.0.4_10 which in WSL is /mnt/d/Java/jdk11.0.4_10
Also, since you're using Windows binaries, you must include the file extension when running them from a WSL bash shell:
Example
Windows:
javac MyClass.java
java MyClass
WSL:
javac.exe MyClass.java
java.exe MyClass
Note that the .exe file extension is being used, since we're calling the Windows binary. If it were a native Linux distribution of the JDK, you could simply use the java command.
This holds true for any Windows binary that is being run through WSL.

Laravel Artisan - CLI doesn't read environment variables

I made a custom artisan command. In Homestead everything works, but when I execute the command on the server it can't read the environment variables. When I ssh into the server and try php artisan external:import, the connection times out because of null env vars.
In artisan tinker the env() function returns null.
How can I read env vars on the server?
IMPORTANT: This server is part of AWS, so it does not have a .env file.
Did you clear the config cache already?
php artisan config:clear
I think the env() function does not work once you have cached the config.
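That is by design: once the config is cached, the .env file is no longer loaded, so env() calls outside the config files come back empty. The usual fix is to route the value through a config file and read it with config(). A minimal sketch (the file and key names are made up for illustration):
<?php
// config/external.php - env() is still evaluated here, at config:cache time
return [
    'import_url' => env('EXTERNAL_IMPORT_URL'),
];

// In the artisan command, read the cached config instead of calling env():
$url = config('external.import_url');
Remember to run php artisan config:cache again after changing config files.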
I know it's a bit late to answer this question, however, I'll share my experience in case it can help someone.
The environment variables will be available depending on how they have been configured. The different ways to configure these variables are:
Setting them for the current user in ~/.bash_profile, ~/.bash_login or ~/.profile
Setting them globally in /etc/profile or in a file under the /etc/profile.d/ directory
Setting them at the web-server level, for example with SetEnv in Apache's httpd.conf
If you have set the environment variables through the web-server software, they will be available only to scripts that handle HTTP requests. This leaves out the artisan commands.
My recommendation is to add a step to your deployment process that sets the environment variables globally by creating a new file in /etc/profile.d/, for example /etc/profile.d/webapp.sh. So webapp.sh would have lines like
export MY_VAR=my-value
I hope this helps you.

Laravel 4 - cloning the local project on the VPS

I use Laravel 4 to develop my projects.
I develop on my Mac, commit to Git, then clone it on the server (linode 1G VPS).
Since "vendor" folder is by default "GIT-ignored", I usually do "composer install" after cloning the project on the server.
After that, any other packages I install locally, I do "composer update" on the server.
Yesterday, I reported this problem - PHP Composer update "cannot allocate memory" error (using Laravel 4)
So far, I have not found a solution. I even tried to do a "fresh" cloning and "composer install", it's giving me the memory error. This is extremely frustrating.
My question is then, is it ok to just upload my entire project to the server? Since "vendor" folder is only thing that is "git-ignored", if I just copy everything there, would it work? (I haven't tried it since my server is alive at the moment and I don't want to damage anything).
What is the actual role of "compiled.php" file? Is it platform dependent? Can I copy that file too?
I've seen this memory issue quite a few times now and read other people reporting the similar issue. I hope I can just upload the entire project folder and cross my fingers that it will work.
Thanks for your help!
I don't have a VPS, or even shell access to my custom/shared hosting from my provider, but I can run git and composer commands without that.
Use sshfs: http://osxfuse.github.io/
sshfs makes an SFTP connection to your server and mounts the server into a local directory.
This way, you can run git and composer commands locally; you do not depend on your VPS/hosting server. sshfs sends the files to the remote server in the background.
To mount the VPS to a local dir, run this:
sshfs user@serverip:. /path/to/existing/local/dir    # mount the root dir
cd !$    # get into the mounted dir
or
sshfs user@serverip:foldername /path/to/existing/local/dir    # mount a specific dir
cd !$    # get into the mounted dir
Now you can do whatever you want.
A good thing to know: it is possible to set up the Laravel config in such a way that the same app (the very same copy of the code) acts differently on different servers (environments).
I am writing that because if you sync your remote server with a local copy of the code, sooner or later you will stumble upon issues like having to change the db credentials or app setup after every sync - which of course doesn't make sense :)
Check out the Laravel 4 docs on environment configuration to read more about that, or follow this tutorial by Andrew Elkins - How to set Laravel 4 Environments
The environment is based on URL matches.
You’ll find that configuration in /bootstrap/start.php
$env = $app->detectEnvironment(array(
    'local' => array('your-machine-name'),
));
Now say you are developing locally and use the prefix/postfix "local", e.g. my-new-site.local or local.my-new-site:
$env = $app->detectEnvironment(array(
    'local' => array('local.*', '*.local'),
));
That sets the environment; now to use it you’ll need to create a local folder in /app/config/
mkdir app/config/local
Say you want a different database configuration for local: just copy the database config file into the local directory and modify it.
cp app/config/database.php app/config/local/database.php
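Laravel 4 merges the environment folder's config over the base config, so the local copy only needs the options you actually change. A sketch (database name and credentials are made up):
<?php
// app/config/local/database.php - overrides app/config/database.php when env is 'local'
return array(
    'connections' => array(
        'mysql' => array(
            'database' => 'my_local_db',
            'username' => 'root',
            'password' => 'secret',
        ),
    ),
);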
To sum up and answer your question:
1) I guess it's OK to copy the whole project dir to the remote server (although, if you're copying vendor, it might take a while - it usually contains a large number of files)
2) if you do so, remember to have the updated composer.json on the remote server (to reflect all the necessary requirements)
3) if you are using different database servers locally and remotely, you obviously have to run migrations and seeders on the remote server (this also concerns package migrations/seeds)
4) after you migrate all your files, do
composer dump-autoload
php artisan optimize --force --env=YourProductionEnvironmentName
which should rebuild the autoloaders and regenerate compiled.php (it's just a cache of compiled classes, so it is best rebuilt on the server rather than copied over)
5) if you are using the Laravel Environments setup mentioned above, remember to have your remote server seen as production (if your local is testing/staging).
