I'm facing a problem with my Laravel application. It is deployed on the server and uses nginx as the main web server.
I am unable to read two files from the path I set on the deployment server, but in the local environment it works fine. These two files are confidential: one is a .crt file and the other is a .pem file. I placed both of them in the laravel-app/app/Certificates directory.
laravel-app/app/Certificates/test2-2023_cert.pem
laravel-app/app/Certificates/curl-bundle.crt
The file path in .env is set like this:
PEM=/Certificates/test2-2023_cert.pem
CRT=/Certificates/curl-bundle.crt
My config/services.php file:
<?php

return [
    'pay' => [
        'pem' => app_path() . env('PEM', ''),
        'crt' => app_path() . env('CRT', ''),
    ],
];
My controller's constructor:
public function __construct()
{
    $this->pem = config('services.pay.pem');
    $this->crt = config('services.pay.crt');
}
It returns the following paths on the server:
/var/www/laravel-app/app/Certificates/test2-2023_cert.pem
/var/www/laravel-app/app/Certificates/curl-bundle.crt
but the server does not load the files and gives me an error response in the API call. What am I doing wrong?
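One quick check that narrows this down is to test whether the resolved paths are actually readable by the user the web server / PHP-FPM runs as, not just present on disk. A minimal diagnostic sketch, using the same config keys as above (the logger() call is only illustrative):
// Run this on the server, e.g. from a temporary route or `php artisan tinker`.
$paths = [
    'pem' => config('services.pay.pem'),
    'crt' => config('services.pay.crt'),
];

foreach ($paths as $name => $path) {
    logger()->info(sprintf(
        '%s: path=%s exists=%s readable=%s',
        $name,
        $path,
        var_export(file_exists($path), true),
        var_export(is_readable($path), true)
    ));
}
If exists is true but readable is false, the problem is on the filesystem side (ownership or permissions for the PHP-FPM user, or SELinux on some distributions) rather than in the Laravel configuration.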
I am starting out with Laravel and I have my server access credentials in my ".env" file:
USER_SERVER=user
PASSWORD_SERVER=123456789
I have this function in my controller:
public function connectServer()
{
    $connection = ssh2_connect('172.17.2.33', 22);
    ssh2_auth_password($connection, USER_SERVER, PASSWORD_SERVER);
    $stream = ssh2_exec($connection, "df -h");
    $errorStream = ssh2_fetch_stream($stream, SSH2_STREAM_STDERR);
}
The function above has problems. What is the correct way to read the user and password from my ".env" file in my function?
You should use the configuration system for these values. You can add to an existing configuration file in the config folder or create your own that returns an associative array. We will use the env() helper to pull the values you set in the .env file:
<?php

return [
    // ...
    'server' => [
        'user' => env('USER_SERVER', 'some default value if you would like'),
        'password' => env('PASSWORD_SERVER', ...),
    ],
    // ...
];
If you added this key to the services.php file in the config folder you would then access this configuration value via the Config facade or the config() helper function:
// if we want a value from the services.php file
Config::get('services.server.user');
config('services.server.user');
// if you want all of them for that server array
$config = config('services.server');
echo $config['user'];
Using the configuration system this way allows you to use the config caching that Laravel provides, as you will not have any env() calls in your application except in the config files, which can be cached. (When configuration caching is in place, the .env file is not loaded.)
"If you execute the config:cache command during your deployment process, you should be sure that you are only calling the env function from within your configuration files. Once the configuration has been cached, the .env file will not be loaded and all calls to the env function will return null." - Laravel 7.x Docs - Configuration - Configuration Caching
Laravel 7.x Docs - Configuration - Accessing Configuration Values config()
Laravel 7.x Docs - Configuration - Retrieving Environment Configuration env()
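Applied to the connectServer() method from the question, the credentials then come from the cached configuration instead of bare constants. A minimal sketch assuming the services.server key shown above (the host is hard-coded here only because it was in the question; it could be moved to config the same way):
public function connectServer()
{
    // Resolved from config/services.php, which reads .env; this stays valid
    // after `php artisan config:cache` because env() is only used in config files.
    $user = config('services.server.user');
    $password = config('services.server.password');

    $connection = ssh2_connect('172.17.2.33', 22);
    ssh2_auth_password($connection, $user, $password);

    $stream = ssh2_exec($connection, "df -h");
    $errorStream = ssh2_fetch_stream($stream, SSH2_STREAM_STDERR);
}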
I want to store an Excel file to a network shared folder using Laravel 5.8. I have read the Laravel documentation on using the filesystem to do this. I created a symbolic link using mklink to the directory C:\Users\Administrator\Documents\Generate, and I have set read and write permissions on the Generate folder.
This is my filesystem disk configuration:
'trading' => [
    'driver' => 'local',
    'root' => 'C:\Users\Administrator\Documents\Generate\tradingid',
],
This is my function:
public function ($id)
{
    $data = Excel::store(new ExportData($id), 'ExcelData.xlsx', 'trading');
    if ($data) {
        echo 'success';
    } else {
        echo 'error';
    }
}
To store the Excel file I'm using the Maatwebsite library.
When the function is executed, I get an error response like this:
"The root path C:\\Users\\Administrator\\Documents\\Generate\\tradingid is not readable.".
How to fix this issue?
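A quick way to see whether this is a permissions problem or a symlink problem is to ask PHP itself, from the same process that runs the Laravel app, whether it can see the configured root. A minimal diagnostic sketch (path and disk name taken from the question):
// e.g. in a temporary route or tinker session
$root = config('filesystems.disks.trading.root');

var_dump([
    'root'        => $root,
    'exists'      => file_exists($root),
    'is_dir'      => is_dir($root),
    'is_readable' => is_readable($root),
    'is_writable' => is_writable($root),
]);
If PHP reports the directory as unreadable even though it opens fine in Explorer, the account the web server or php artisan serve process runs under typically lacks rights to the symlink target, since NTFS permissions are checked against that account rather than the Administrator who created the link.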
I have to deploy several instances of a Laravel app to a single server. Each instance requires a different database configuration. The default Laravel environment configuration based on hostnames doesn't work for me because all the apps are going to be on the same server, so there's no way to tell which config file to use. Here is my bootstrap/start.php file:
$env = $app->detectEnvironment(array(
    'development' => array('Ariels-MacBook-Pro.local'),
    'server' => array('srv-hostname'),
));
It would be great if I could define the environment based on the domain (because my apps are going to be on different domains); that way I can define a different config for each domain hosted on the same server.
Any ideas?
Laravel's detectEnvironment method has a nifty feature where you can programmatically determine the current environment with a closure. For example, this would configure Laravel to always use the local environment.
$env = $app->detectEnvironment(function()
{
    return 'local';
});
The domain name should be somewhere in $_SERVER, so something like this untested pseudo-code should get you what you want.
$env = $app->detectEnvironment(function()
{
    switch ($_SERVER['HTTP_HOST'])
    {
        case 'example.com':
            return 'production';
        case 'another.example.com':
            return 'production';
        default:
            return 'local'; // default to local
    }
});
My option: in the file app/config/database.php, add this line at the end
'environment' => 'local',
// 'environment' => 'development', // this is for dev
// 'environment' => 'production', // this is for production
and then access it from your controller like this
$env = Config::get('database.environment');
echo $env;
The database.php file is different in each environment: you have one for your local machine, another for the development server, and another for production, because that is where the database connection lives, so I use it to implicitly record the environment.
Have fun.
I just can't get this code to work. I'm getting an image from the URL and storing it in a temporary folder so I can upload it to a bucket on Google Cloud Storage. This is a CodeIgniter project. This function is within a controller and is able to get the image and store it in the project root's 'tmp/entries' folder.
Am I missing something? The file just doesn't upload to Google Cloud Storage. I went to the Blobstore Viewer in my local App Engine dev server and noticed that there is a file, but it's empty. I want to be able to upload this file to Google's servers from my dev server as well. The dev server seems to override this option and save all files locally. Please help.
public function get()
{
    $filenameIn = 'http://upload.wikimedia.org/wikipedia/commons/1/16/HDRI_Sample_Scene_Balls_(JPEG-HDR).jpg';
    $filenameOut = FCPATH . '/tmp/entries/1.jpg';
    $contentOrFalseOnFailure = file_get_contents($filenameIn);
    $byteCountOrFalseOnFailure = file_put_contents($filenameOut, $contentOrFalseOnFailure);

    $options = ["gs" => ["Content-Type" => "image/jpeg"]];
    $ctx = stream_context_create($options);
    file_put_contents("gs://my-storage/entries/2.jpg", $file, 0, $ctx);

    echo 'Saved the Image';
}
As you noticed, the dev app server emulates Cloud Storage locally. So this is the intended behaviour; it lets you test without modifying your production storage.
If you run the deployed app you should see the writes actually going to your GCS bucket.
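For reference, here is a minimal sketch of the write step once the app is deployed, with the bucket name and paths kept as in the question. Note that the original snippet passes an undefined $file variable to the final file_put_contents() call, which by itself would produce an empty object, so the downloaded contents are reused here:
// Download the source image and keep the bytes in memory.
$contents = file_get_contents($filenameIn);

// Write the same bytes to Cloud Storage through the gs:// stream wrapper,
// passing the Content-Type via the stream context as in the question.
$options = ['gs' => ['Content-Type' => 'image/jpeg']];
$ctx = stream_context_create($options);
file_put_contents('gs://my-storage/entries/2.jpg', $contents, 0, $ctx);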
I am using PHP 5.4 RC5, and starting the server via terminal
php -S localhost:8000
Currently I'm using Aura.Router, and at the root I have an index.php file with this code:
<?php
$map = require '/Aura.Router/scripts/instance.php';
$map->add('home', '/');
$map->add(null, '/{:controller}/{:action}/{:id}');
$map->add('read', '/blog/read/{:id}{:format}', [
    'params' => [
        'id' => '(\d+)',
        'format' => '(\.json|\.html)?',
    ],
    'values' => [
        'controller' => 'blog',
        'action' => 'read',
        'format' => '.html',
    ],
]);
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$route = $map->match($path, $_SERVER);
if (! $route) {
    // no route object was returned
    echo "No application route was found for that URI path.";
    exit;
}
echo " Controller : " . $route->values['controller'];
echo " Action : " . $route->values['action'];
echo " Format : " . $route->values['format'];
A request for http://localhost:8000/blog/read/1 works as expected.
But when a .json or .html request comes in, like http://localhost:8000/blog/read/1.json or http://localhost:8000/blog/read/1.html, PHP throws:
Not Found
The requested resource /blog/read/1.json was not found on this server.
Since I am running the site with the built-in PHP server, where can I fix this so it stops throwing the file-not-found error for the .html and .json URLs?
Or do I need to go and install Apache and enable mod_rewrite and so on?
You are trying to make use of a router script for the PHP built-in webserver without specifying it:
php -S localhost:8000
Instead add your router script:
php -S localhost:8000 router.php
A router script should either handle the request if it matches, or return FALSE if it wants the standard routing to apply. Compare Built-in web server Docs.
I have no clue if Aura.Router offers support for the built-in web server out-of-the-box or if it requires you to write an adapter for it. Like you would need to configure your webserver for that router library, you need to configure the built-in web server, too. That's the router.php script in the example above.
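A router script for this setup can be very small: serve real files directly and hand everything else to the existing index.php so Aura.Router can match URIs like /blog/read/1.json. A minimal sketch (untested; the file name router.php matches the command above):
<?php
// router.php, used as: php -S localhost:8000 router.php

// If the request maps to an existing file (CSS, images, ...), return false
// so the built-in server serves it as usual.
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
if ($path !== '/' && is_file(__DIR__ . $path)) {
    return false;
}

// Otherwise route everything, including /blog/read/1.json, through the
// front controller.
require __DIR__ . '/index.php';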
You can specify the document root by passing the -t option like so:
php -S localhost:8888 -t c:\xampp\htdocs
Sorry, I'm not familiar with Aura.Router. However, I would use whatever is going to run on the production server. You might run into unexpected errors when you go live with the project if you do not keep the same program versions on your test and production servers.