Laravel not creating log file - php

I'm learning Laravel 4.0 to develop a web server.
I'm using a LAMP stack (Apache 2, PHP 5.5).
I can't find the log file that Log::error() calls write to.
As far as I know it's supposed to be app/storage/logs/log-cli-.txt, but there are no files there.
Here is the code:
app/commands/MsgCommand.php
public function fire()
{
    Log::error('messages - log');
}
It's called from artisan:
app/start/artisan.php
Artisan::add(new MsgCommand());
Am I looking in the right place?
How can I check that this is indeed the right folder (i.e. where is it configured), so I can rule out a faulty installation or setup?
Thanks to marcanuy, I am now sure it is writing to app/storage/logs.
I also found out that it writes fine if I call the command through artisan. When running under Apache 2, though, nothing happens. I'm starting to think I set up the command wrong.

By default, app/storage is the location for log files; the storage folder is configured in bootstrap/paths.php:
'storage' => __DIR__.'/../app/storage',
Also be sure that this folder is writable by the web server.
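To confirm which directory the framework actually resolved, and whether it is writable, here is a quick sketch you can run in php artisan tinker (using Laravel 4's storage_path() helper):
$logDir = storage_path().'/logs'; // resolved via bootstrap/paths.php
var_dump($logDir, is_writable($logDir));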

The problem was permissions.
User permissions for the www-data user on ../app/storage.
And MySQL settings: creating a user corresponding to what is set in app/config/database.php:
'mysql' => array(
    'driver'   => 'mysql',
    'host'     => 'your host',
    'database' => 'your db',
    'username' => 'your user',
    'password' => 'your pass',
),
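On the storage permissions side, the typical fix for Apache on Ubuntu looks like this (a sketch, assuming the www-data user and the paths from the question):
sudo chown -R www-data:www-data app/storage
sudo chmod -R 775 app/storage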


Cannot read credentials from /.aws/credentials - PHP script call AWS-SDK

I've looked at every answer on here and it seems my problem is a little different or there hasn't been a proper solution. I'm doing the following in my PHP file:
use Aws\Route53\Route53Client;
$client = Route53Client::factory(array(
    'profile' => 'default',
    'region'  => 'us-east-1',
    'version' => '2013-04-01'
));
Getting this error:
Fatal error: Uncaught Aws\Exception\CredentialsException: Cannot read credentials from /.aws/credentials
Seems like the easy fix would be to ensure that the HOME directory is the right one. Indeed, it already is. The files are readable and my ec2-user is already the owner. The key and secret are already installed in the 'credentials' file. The profile name is already set to 'default'. I tried copying /.aws to other directories such as the root, /home, etc., and changed permissions (chmod), all of the above. Still nothing.
Then I tried to hard-code the credentials (I know -- not recommended) just to give it a little kick, and it completely ignores that I did this:
$client = Route53Client::factory(array(
    'profile' => 'default',
    'region'  => 'us-east-1',
    'version' => '2013-04-01',
    'credentials' => [
        'key'    => $key,
        'secret' => $secret,
    ]
));
As a last resort, I even tried including the CredentialProvider class, and passing this into my array -- still nothing:
'credentials' => CredentialProvider::ini('default', '/home/ec2-user/.aws/credentials'),
What on earth am I doing wrong?
Just remove 'profile' => 'default', and it should work fine. When 'profile' is set, the SDK reads the shared credentials file for that profile and ignores the hard-coded 'credentials' entry, which is why your second attempt was ignored.
$client = Route53Client::factory(array(
    'region'  => 'us-east-1',
    'version' => 'latest',
    'credentials' => [
        'key'    => $key,
        'secret' => $secret,
    ]
));
Running on AWS CentOS 7, I tried everything (chmod/chown, /root, /home/user, env, bashrc, etc.) to get /.aws/credentials to work outside the Apache /var/www directory. The SDK reported that it could not read the credentials file.
I looked at PHP to see if I could set/override the HOME variable, and it still did not read the credentials file until I placed the .aws folder in /var/www and set the HOME variable in my PHP file like so:
<?php
putenv('HOME=/var/www'); // point the SDK at /var/www/.aws/credentials
// ZIP-file SDK install requires aws-autoloader
require 'aws-autoloader.php';
// Your PHP code below
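To see which HOME the SDK will actually use under Apache, a quick check (a sketch):
var_dump(getenv('HOME')); // often empty or '/' under Apache, which is why the SDK ends up at /.aws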
Facing this issue, here was my exact approach:
PHP version: 7.2.24, AWS PHP SDK version: 3.180.4
First, copy your existing .aws folder to the filesystem root:
sudo cp -r ~/.aws /
Then your code should look like:
$client = Route53Client::factory(array(
    'profile' => 'default',
    'region'  => 'us-east-1',
    'version' => '2013-04-01'
));
In my case, it was interesting to realize that the PHP SDK looks for the credentials file relative to the root folder and not the current user's home directory; under Apache the HOME environment variable is often empty, so the path resolves to /.aws/credentials. That is the most plausible reason my approach worked.
However, you may want to keep your local configs in a more general place and use the following approach to load them:
$path = '/my/config/folder/.aws/credentials';
$provider = CredentialProvider::ini('default', $path);
$provider = CredentialProvider::memoize($provider);
$client = Route53Client::factory(array(
    'region'      => 'us-east-1',
    'version'     => '2013-04-01',
    'credentials' => $provider
));
Hopefully this sheds more light on the issue for the AWS PHP community. It's really important to get this configuration right in order to build secure PHP applications.
Here is what I ended up doing for the purposes of this question, although EJ's answer above is actually the right one. Hopefully this helps someone get their credentials file read:
use Aws\Credentials\CredentialProvider;
use Aws\Route53\Route53Client;
$profile = 'default';
$path = '/var/www/html/.aws/credentials';
$provider = CredentialProvider::ini($profile, $path);
$provider = CredentialProvider::memoize($provider);
$client = Route53Client::factory(array(
    'region'      => 'us-east-1',
    'version'     => '2013-04-01',
    'credentials' => $provider
));
Not sure what you are doing wrong, but I'd suggest bypassing the problem altogether by assigning an EC2 instance role to the VM in question; then you won't have to worry about it at all. It's a better, more secure solution:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html
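With an instance role attached, you can drop both 'profile' and 'credentials' entirely; the SDK's default provider chain then falls back to the instance metadata service. A minimal sketch, mirroring the factory style used above:
use Aws\Route53\Route53Client;

// No 'profile' or 'credentials' key: the default chain tries environment
// variables, the shared credentials file, and finally the EC2 instance
// metadata service (i.e. the attached role).
$client = Route53Client::factory(array(
    'region'  => 'us-east-1',
    'version' => '2013-04-01',
));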
I think the AWS manual is a bit confusing.
I created the .aws directory at the filesystem root (/), not in the /root or /home dir, and everything worked.
/.aws/credentials
I upgraded from PHP 8.0 to PHP 8.1 and PHP suddenly complained that it couldn't find the credentials file. The xdebug error trace showed me the expected location, which was one level below my public html directory. I don't know why it changed, but I simply ran this command in that directory:
ln -s /.aws/ .aws
The symlink I created works fine to provide the credentials. I'm back up and running.
Check the permissions of the .aws/* files using ls -l.
Change the permissions to grant read access (or, less safely, grant all permissions with sudo chmod 777 .aws/*).
Rerun the code.

Laravel Dusk: how to use in-memory DB for testing

What I've been trying to do is use an in-memory database while testing with Laravel Dusk.
Here we have a file, .env.dusk.local, with the following values.
DB_CONNECTION=sqlite
DB_DATABASE=:memory:
Here is a snippet of a browser testing file.
class ViewOrderTest extends DuskTestCase
{
    use DatabaseMigrations;

    /** @test */
    public function user_can_view_their_order()
    {
        $order = factory(Order::class)->create();

        $this->browse(function (Browser $browser) use ($order) {
            $browser->visit('/orders/' . $order->id);
            $browser->assertSee('Order ABC'); // Order name
        });
    }
}
When php artisan dusk is executed, Dusk starts browser testing.
However, Dusk seems to be accessing my local DB: the testing browser shows an order name that only exists in my local DB, whereas 'Order ABC' is what should be displayed.
According to the doc, Laravel Dusk allows us to set the environmental variables.
To force Dusk to use its own environment file when running tests, create a .env.dusk.{environment} file in the root of your project. For example, if you will be initiating the dusk command from your local environment, you should create a .env.dusk.local file.
I don't feel that Dusk is accessing the separate DB.
Any advice will be appreciated.
You can't use a :memory: database with Laravel Dusk browser testing. Your development server and the Dusk tests run as separate processes, so a Dusk test cannot access the memory of the process running the development server.
The best solution is to create a file-based SQLite database for testing:
'sqlite_testing' => [
    'driver'   => 'sqlite',
    'database' => database_path('sqlite.testing.database'),
    'prefix'   => '',
],
Create the sqlite.testing.database file inside the database folder.
Make sure to start the development server before running the tests:
php artisan serve --env dusk.local
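Your .env.dusk.local then points at that connection (a sketch, assuming the connection name above):
DB_CONNECTION=sqlite_testing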
You need a connection in config/database.php
'sqlite_testing' => [
    'driver'   => 'sqlite',
    'database' => ':memory:',
    'prefix'   => '',
],
Then in your phpunit.xml file use:
<env name="DB_CONNECTION" value="sqlite_testing" />
or in your tests use:
putenv('DB_CONNECTION=sqlite_testing');
Don't forget to use the DatabaseMigrations trait to reset the database before each test (Dusk can't use the RefreshDatabase trait, because its transactions are not visible to the separate server process).

laravel | How to confirm which caching driver is being used?

I am trying to use Redis in my application, but I am not sure whether my app is using the redis or the file driver, as I can't create tags even though I can create normal keys fine.
I have set CACHE_DRIVER=redis and also in my cache.php I have:
'default' => env('CACHE_DRIVER', 'redis'),
also in my database.php there is:
'redis' => [
    'client' => 'predis',
    'default' => [
        'host'     => env('REDIS_HOST', '127.0.0.1'),
        'password' => env('REDIS_PASSWORD', null),
        'port'     => env('REDIS_PORT', 6379),
        'database' => 0,
    ],
],
The reasons for my suspicion: I cannot create tags, and running redis-cli flushall under Homestead (ssh) does not seem to get rid of the cache; I had to use Cache::flush() in Laravel instead.
So how can I effectively find out which cache driver my application is using?
It's pretty simple: you can use the redis-cli monitor command to check whether the get/set operations are happening:
redis-cli monitor
Then try to use the application; if you can see the keys, the Redis cache is running.
You can also check the Redis keys with the following command:
redis-cli
Then enter:
keys *
I hope this is useful.
You should simply query your Redis DB like so:
redis-cli
Then, being on redis console:
SELECT <DB-NUMBER>
KEYS *
If you see some keys like
1) PREFIX:tag:TAG:key
2) PREFIX:tag:DIFFERENT-TAG:key
it is quite a hint that Laravel is using Redis as its cache backend. Otherwise, take a look at
<YOUR-APP-DIR>/storage/framework/cache
If you find some files/sub-folders in there, it is quite a hint that Laravel is using file-based caching.
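You can also ask Laravel directly which store it resolved, for example in php artisan tinker (a sketch using the standard cache manager methods):
Cache::getDefaultDriver();      // e.g. "redis" or "file"
get_class(Cache::getStore());   // e.g. Illuminate\Cache\RedisStore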

Laravel, 'env' in database config

I was configuring my SQLite connection in the Laravel framework. While using the php artisan migrate command, I got a message that there was no connection.
I changed these two lines of my code
'default' => env('DB_CONNECTION', 'sqlite'),
'database' => env('DB_DATABASE', database_path('database.sqlite'))
to
'default' => 'sqlite',
'database' => database_path('database.sqlite'),
Now everything works fine, but my question is: what does the env function do?
Am I right to delete it?
In Laravel, env() is a helper function that gets the value of an environment variable or returns a default value:
Example:
$env = env('DB_CONNECTION');
// Return a default value if the variable doesn't exist...
$env = env('DB_CONNECTION', 'sqlite');
To give your application a speed boost, you should cache all of your configuration files into a single file using php artisan config:cache. This caches the env values as well, so for changed values to take effect you must clear the cached configuration with php artisan config:clear (or re-run config:cache).
The env function gets the value of an environment variable from your .env file, or returns a default value, which is the second argument.
For more information read: Documentation
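One caveat worth adding: once the configuration is cached, the .env file is no longer loaded and env() calls outside the config files return null, so application code should read such values through config(). A minimal sketch:
// config/database.php (stock Laravel):
'default' => env('DB_CONNECTION', 'sqlite'),

// anywhere else in the application:
$connection = config('database.default'); // works with and without the config cache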

Updating remote site with drush and ssh

I'm very new to drush. We have a git repo of a Drupal site that I would like to push to the remote server using drush. I can easily scp the Drupal files or set up a cron job on the remote that runs git pull, but I would still like to learn how to push code and sync a remote Drupal site with my local one.
Currently, I have Drupal running locally and I use git to update the repo. SSH is already configured and I can ssh to the remote Drupal server using keys. I have also created the .drush/aliases.drushrc.php file and tested it by running drush @dev status. It worked well.
<?php
$aliases['dev'] = array(
    'root'        => '/var/www/html',
    'uri'         => 'dev.example.com',
    'remote-host' => '192.168.1.50',
);
?>
Now, I would like my local Drupal site to be synchronized with our server at 192.168.1.50. The local Drupal files are in /home/ubuntu/drupal_site.
I have a few questions:
What are the drush command/parameters to update the remote Drupal server?
What will the drush command/parameters be if the remote server doesn't have the Drupal files yet?
Back up before synchronizing, with drush ard, drush @dev ard, or the suited alias. You can set the backup path in the alias settings.
I assume you named your remote server dev; I keep that name in the following and use the alias @local for the local Drupal site.
Add the alias for your local Drupal site (a sketch is shown a few lines below). Then you can use the following command to synchronize the files:
drush rsync @local @dev
Here @local is the source and @dev the target. More details on how to use the rsync command can be displayed with:
drush help rsync
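For reference, the @local alias used above can be defined next to @dev in .drush/aliases.drushrc.php. A minimal sketch, assuming the local docroot from the question (the 'uri' value is a placeholder):
$aliases['local'] = array(
    'root' => '/home/ubuntu/drupal_site',
    'uri'  => 'http://localhost', // placeholder: set to your local site URL
);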
You also need to synchronize the database to get the remote site running. For this, add the database account data to the alias data for @local and @dev. It will look something like this:
'databases' => array(
    'default' => array(
        'default' => array(
            'driver'   => 'mysql',
            'username' => 'USERNAME',
            'password' => 'PASSWORD',
            'port'     => '',
            'host'     => 'localhost',
            'database' => 'DATABASE',
        ),
    ),
),
Replace the placeholders with your data. Then the databases can be synchronized with:
drush sql-sync @local @dev
Here @local is the source and @dev the target.
Initially the synchronization will happen in one direction. After that, it is good practice to synchronize files from the development or test site to the production site, and the database the other way around, from the production site to the development or test site; see the commands sketched below.
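With the aliases above (treating @dev as the production side here), that ongoing workflow would look like:
drush rsync @local @dev
drush sql-sync @dev @local
The first command pushes files from development to production; the second pulls the database from production back to development.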
Drush and Git workflows differ somewhat, since Drush can pull packages separately; you could probably use Git to push to the server. Be sure to check the /files directory, which is usually in the .gitignore file. A possible approach would be to mirror the files directory directly from the live site.
A common approach to update and check two or more sites at the same time (local and remote) would be to use Drush aliases for your sites in a script on your machine.
Articles like this one are a good starting point.
