Searching using the TNTSearch driver works in my Homestead environment; however, on production it returns the error below:

Symfony\Component\Debug\Exception\FatalThrowableError: Class 'AlgoliaSearch\Version' not found in vendor/laravel/scout/src/EngineManager.php:31
However, my .env has SCOUT_DRIVER=tntsearch, and the scout.php config file has:
'driver' => env('SCOUT_DRIVER', 'tntsearch'),

'tntsearch' => [
    'storage' => storage_path(),
    'fuzziness' => env('TNTSEARCH_FUZZINESS', false),
    'fuzzy' => [
        'prefix_length' => 2,
        'max_expansions' => 50,
        'distance' => 2,
    ],
    'asYouType' => false,
    'searchBoolean' => env('TNTSEARCH_BOOLEAN', false),
],
The problem is that I am not using Algolia at all; my composer.json includes Scout and the TNTSearch driver. The search works in my local Homestead environment, just not on the production server.
Confirm that SCOUT_DRIVER=tntsearch has been added to your .env file.
For me personally, I had added SCOUT_DRIVER=tntsearch to my local .env file, but not to the .env file of the environment with the issue. Don't forget to run php artisan config:clear after adding the env var.
Thanks to #m33bo for pointing me in the right direction!
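If you want to double-check which driver production actually resolves, one quick way (a sketch, assuming you have console access to the server) is to inspect the configuration through tinker:

php artisan tinker
>>> config('scout.driver')   // should return "tntsearch", not "algolia"

If this still prints "algolia" after editing .env, the config cache is stale and needs to be cleared or rebuilt.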
I worked it out. I had uploaded my project, but for some reason the required .index file synced yet did not work. If this happens to you on a live server, make sure you commit the index via Git or SVN (or whatever you use), or regenerate it by running php artisan scout:import App\\Your\\Model
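For reference, a minimal re-index sequence on the production box might look like the following sketch; "App\Post" is a placeholder for your own searchable model:

php artisan scout:flush "App\Post"     # optional: drop stale records from the index first
php artisan scout:import "App\Post"    # rebuild the TNTSearch index file under storage_path()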
I'm trying to convert a variety of audio files into a specific audio type of my choosing in my Laravel project.
In my search I landed on a package called Laravel FFMpeg. I started with a fresh installation of Laravel 6.2 and did the installation as shown on GitHub. As soon as I started using it, I got this error:
FFMpeg\Exception\ExecutableNotFoundException: Unable to load FFMpeg in file C:\Users\Hatim\Desktop\Projects\Audio\vendor\php-ffmpeg\php-ffmpeg\src\FFMpeg\Driver\FFMpegDriver.php on line 55
#0 C:\Users\Hatim\Desktop\Projects\Audio\vendor\pbmedia\laravel-ffmpeg\src\Support\ServiceProvider.php(61): FFMpeg\Driver\FFMpegDriver::create(Object(Illuminate\Log\LogManager), Object(Alchemy\BinaryDriver\Configuration))
I'm currently just working with a simple call:
FFMpeg::fromDisk('s3');
And yes, I have imported it in my controller: use ProtoneMedia\LaravelFFMpeg\Support\FFMpeg;
I have also tried installing the PHP-FFMpeg package directly, and I tried installing FFmpeg on my machine and linking it via the path in my config/laravel-ffmpeg.php.
This is how it looks now, without any modification. My .env file has nothing regarding FFmpeg, so the defaults are used; that's why I tried installing FFmpeg on my machine and replacing the default values with its path.
return [
    'ffmpeg' => [
        'binaries' => env('FFMPEG_BINARIES', 'ffmpeg'),
        'threads' => 12,
    ],
    'ffprobe' => [
        'binaries' => env('FFPROBE_BINARIES', 'ffprobe'),
    ],
    'timeout' => 3600,
    'enable_logging' => true,
];
I'm currently working in my local environment, but this has to work on a server as well.
I have solved the problem; this answer is for future developers having trouble with the package.
First of all, you have to install FFmpeg on your machine: [click me][1]
Choose your OS and download it, unzip it, rename the folder to FFmpeg, and then copy the folder and place it, for example, in your C:\ drive.
After that, just go to the config/laravel-ffmpeg.php file in your project and change the defaults to the exact path inside that folder:
'ffmpeg' => [
    'binaries' => env('FFMPEG_BINARIES', 'C:\ffmpeg\bin\ffmpeg.exe'),
    'threads' => 12,
],

'ffprobe' => [
    'binaries' => env('FFPROBE_BINARIES', 'C:\ffmpeg\bin\ffprobe.exe'),
],
As a suggestion, set this in your local .env file instead:
FFMPEG_BINARIES=C:\bin\ffmpeg.exe
FFPROBE_BINARIES=C:\bin\ffprobe.exe
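Once the binaries resolve correctly, a minimal conversion sketch with this package (assuming the ProtoneMedia\LaravelFFMpeg\Support\FFMpeg facade from the question and an s3 disk; the file paths and target format are placeholders) looks roughly like this:

use ProtoneMedia\LaravelFFMpeg\Support\FFMpeg;
use FFMpeg\Format\Audio\Mp3;

// Open the source file from the s3 disk, transcode it to MP3,
// and write the result back to the same disk.
FFMpeg::fromDisk('s3')
    ->open('uploads/original.wav')      // placeholder source path
    ->export()
    ->toDisk('s3')
    ->inFormat(new Mp3)
    ->save('converted/original.mp3');   // placeholder target path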
I use a cron job to do some CRUD operations using Laravel Task Scheduling. On localhost and on my shared-hosting server it worked fine for months, but recently I keep getting this error when the cron job runs on my shared-hosting server. I did not make any changes to the code on the shared-hosting server.
[2017-07-14 09:16:02] production.ERROR: exception 'Symfony\Component\Process\Exception\RuntimeException' with message 'The Process class relies on proc_open, which is not available on your PHP installation.' in /home/xxx/xx/vendor/symfony/process/Process.php:144
Stack trace:
But on localhost it works fine. Based on my findings online, I have tried the following:
Contacted my hosting company to remove proc_open from the disabled PHP functions.
The hosting company provided a custom php.ini file; I removed all disable_functions entries.
The shared-hosting server was restarted and the cache was cleared.
None of this fixed the issue. I am not sure what to try next, because the same project works fine on a different shared-hosting server.
After many weeks of trying to resolve this error, the following fixes worked:
Upgrade the project from Laravel 5.2 to 5.4
On cPanel, use "Select PHP Version" to set the PHP version to 7
Or, on cPanel, use "MultiPHP Manager" to set the PHP version to ea-php70
Now the cron job runs smoothly. I hope this helps someone.
Laravel 6 and higher (proc_open Error)
It is because the Flare error reporting service is enabled in debug mode.
There is a workaround for this.
Publish the Flare config file:
php artisan vendor:publish --tag=flare-config
and in config/flare.php set 'collect_git_information' => false:
'reporting' => [
    'anonymize_ips' => true,
    'collect_git_information' => false,
    'report_queries' => true,
    'maximum_number_of_collected_queries' => 200,
    'report_query_bindings' => true,
    'report_view_data' => true,
],
You can use this at your own risk:
/usr/local/bin/php -d "disable_functions=" /home/didappir/public_html/api/artisan schedule:run > /dev/null 2>&1
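Before relying on the override above, it can help to confirm what the host actually disables for the CLI binary you are calling (which may use a different php.ini than the web server); a quick check, assuming shell access:

php -r "var_dump(function_exists('proc_open'));"   # false usually means it is listed in disable_functions
php -i | grep disable_functions                    # shows the full list of disabled functions for this binary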
When the Flare error reporting service is enabled in debug mode, you'll see this error.
The solution is:
Publish the Flare config file:
php artisan vendor:publish --tag=flare-config
In config/flare.php, set:
'reporting' => [
    'anonymize_ips' => true,
    'collect_git_information' => false,
    'report_queries' => true,
    'maximum_number_of_collected_queries' => 200,
    'report_query_bindings' => true,
    'report_view_data' => true,
],

'send_logs_as_events' => false,
For me, removing the cached config.php file solved the problem (Laravel 6):
go to bootstrap/cache/config.php and remove the file.
Also, don't forget to change APP_URL to your domain address. The PHP version should match what your Laravel version requires.
On a shared host, if you can't change php.ini, you should use Laravel 5.8.
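For reference, clearing the cache through artisan amounts to the same thing as deleting the file by hand; a short sketch, run from the project root:

php artisan config:clear   # removes bootstrap/cache/config.php
php artisan config:cache   # optional: rebuild the cache once .env and the config files are correct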
I'm using spatie/laravel-backup in a WAMP environment on localhost.
It works fine when I type the command manually in the Windows cmd:
php artisan backup:run
But when I try to run the backup using the laravel Artisan class:
Artisan::call('backup:run');
it throws an error:
'mysqldump' not recognized ...
In the Laravel MySQL config I've also specified the path to the dumper:
'mysql' => [
    'driver' => 'mysql',
    // ...
    'dump' => [
        'dump_binary_path' => 'E:/wamp/wamp64/bin/mysql/mysql5.7.9/bin',
    ],
],
How can I fix that?
EDIT
It's probably just a Windows support "bug" (found out thanks to Loek's answer), as the author says. So can I safely run a backup from a controller, without the command, maybe with something like:
use Spatie\Backup\Tasks\Backup\BackupJobFactory;
BackupJobFactory::createFromArray(config('laravel-backup'))->run();
the same way the command itself does it?
I believe it's the forward slashes. Try this:
'mysql' => [
    'driver' => 'mysql',
    // ...
    'dump' => [
        'dump_binary_path' => 'E:\\wamp\\wamp64\\bin\\mysql\\mysql5.7.9\\bin',
    ],
],
EDIT
Support for Windows is wonky at best, with multiple "This package doesn't support Windows" comments from the creators on GitHub issues. This one is the latest: https://github.com/spatie/laravel-backup/issues/311
It could also be a permission problem. Executing from the command line probably happens as a different user than executing from the web server, so Windows may be denying the web server's user access to mysqldump.
2nd edit
As long as you make sure the controller only gets called when it needs to be, I don't see why this wouldn't work!
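If you do go the controller route, one option (a sketch; the class name and the --only-db flag are just examples, and the Artisan facade call runs the same pipeline as the console command) is:

use Illuminate\Routing\Controller;
use Illuminate\Support\Facades\Artisan;

class BackupController extends Controller
{
    public function run()
    {
        // Runs the same code path as "php artisan backup:run".
        // Drop '--only-db' to back up files as well as the database.
        Artisan::call('backup:run', ['--only-db' => true]);

        return response()->json(['output' => Artisan::output()]);
    }
}

Keep in mind that a backup can take a while, so for anything beyond a quick test it is usually better to dispatch it to a queued job than to run it inside a web request.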
I'm using Laravel 5, but I'm not able to migrate my database tables. I have a MacBook Pro and I'm using Terminal. I'm running the artisan command:
php artisan migrate
When I execute this command, I get the following error message:
[PDOException] SQLSTATE[HY000] [2002] Connection refused
I have configured my database.php following the official tutorial videos on laracasts.com. My database.php file looks like the following:
...
'fetch' => PDO::FETCH_CLASS,
...
'default' => env('DB_CONNECTION', 'sqlite'),
...
'connections' => [
    'sqlite' => [
        'driver' => 'sqlite',
        'database' => database_path('database.sqlite'),
        'prefix' => '',
    ],
    ...
I have read many comments on stackoverflow.com about this issue. Most of them talk about modifying the ".env" file. The thing is, I can't find this file, which makes me wonder whether my installation of Laravel is complete. I read that my ".env" file might be overriding my "database.php" file, but I can't find the ".env" file!
The .env file is most likely hidden.
Run defaults write com.apple.finder AppleShowAllFiles YES and killall Finder in Terminal.
Open Finder and you should be able to see the .env file.
Edit your .env file.
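If the file genuinely does not exist (which can happen when the project was cloned rather than created with composer create-project), a common way to recreate it, sketched below, is to copy the example file and then fill in the database block; the values shown are placeholders, so use your own credentials:

cp .env.example .env
php artisan key:generate

# then, inside .env:
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=homestead
DB_USERNAME=homestead
DB_PASSWORD=secret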