How to manage big file uploads with the Laravel 5 Storage class using the Flysystem package?

In my Laravel 5 setup, I included the Flysystem package and configured an FTP disk in config/filesystems.php:
'ftp' => [
    'driver'   => 'ftp',
    'host'     => env('FTP_HOST'),
    'username' => env('FTP_USER'),
    'password' => env('FTP_PASS'),
    'passive'  => true,
],
Then I'm able to perform FTP uploads and downloads with the following calls:
Storage::disk('ftp')->put($filePath, $contents);
Storage::disk('ftp')->get($filePath);
Up to this point everything is fine.
Problems start when I upload big files, above 200 MB: the PHP memory limit is reached and execution stops with a fatal error. In fact, when Storage::put is called, my machine's memory usage increases dramatically.
I've read somewhere that a solution might be to use streams to read from and write to my "virtual" disk, but I'm still missing how to implement this in a way that optimizes memory usage during these operations.
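From what I've gathered, the streamed approach would look something like this (a sketch, assuming Flysystem v1's putStream()/readStream() reachable through Laravel's getDriver(); the variable names are placeholders):

// Upload: pass a stream so the data is copied in chunks instead of
// being loaded into memory as one big string.
$stream = fopen($localFilePath, 'r');
Storage::disk('ftp')->getDriver()->putStream($filePath, $stream);
if (is_resource($stream)) {
    fclose($stream);
}

// Download: read a stream back and copy it into a local file handle.
$stream = Storage::disk('ftp')->getDriver()->readStream($filePath);
$target = fopen($localFilePath, 'w');
stream_copy_to_stream($stream, $target);
fclose($target);
fclose($stream);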


How to upload an image to Scaleway storage via Laravel or PHP methods?
Laravel uses Flysystem under the hood to abstract file storage. It provides several drivers out of the box, including S3, Rackspace, FTP, etc.
If you want to support Scaleway, you would need to write a custom driver, which you can read more about here.
Edit: It seems from the Scaleway documentation that it supports AWS CLI clients, which means it should be quite easy to add support for in Flysystem. I tried the following and it worked.
I added a new driver in config/filesystems.php as follows:
'scaleway' => [
    'driver'   => 's3',
    'key'      => '####',
    'secret'   => '#####',
    'region'   => 'nl-ams',
    'bucket'   => 'test-bucket-name',
    'endpoint' => 'https://s3.nl-ams.scw.cloud',
],
and then, to use the disk, I did the following:
\Storage::disk('scaleway')->put('file.txt', 'Contents');
My file was uploaded.
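As a quick sanity check (a hypothetical snippet, not required for the upload itself), you can confirm the object landed in the bucket through the same disk:

// exists() returns true once the object is visible in the bucket.
$uploaded = \Storage::disk('scaleway')->exists('file.txt');
var_dump($uploaded);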
EDIT: I also made a PR to get Scaleway accepted into the list of adapters for League's Flysystem. It got merged; you can see it live here.

Laravel | How to confirm which caching driver is being used?

I am trying to use Redis in my application, but I am not sure whether my app is using Redis or the file driver: I can't create tags, but I can create normal keys fine.
I have set CACHE_DRIVER=redis, and in my cache.php I have:
'default' => env('CACHE_DRIVER', 'redis'),
Also, in my database.php there is:
'redis' => [
    'client' => 'predis',
    'default' => [
        'host'     => env('REDIS_HOST', '127.0.0.1'),
        'password' => env('REDIS_PASSWORD', null),
        'port'     => env('REDIS_PORT', 6379),
        'database' => 0,
    ],
],
The reasons for my suspicion: I cannot create tags, and running redis-cli flushall under Homestead (SSH) does not seem to get rid of the cache; I had to use Cache::flush() in Laravel instead.
So how can I effectively find out which cache driver my application is using?
It's pretty simple: you can use the Redis CLI monitor command to watch whether the GETs/SETs are happening:
redis-cli monitor
Then exercise the application; if you see keys being read and written, the Redis cache is running.
You can also inspect the stored keys with the following:
redis-cli
Then enter:
keys *
I hope this is useful.
You should simply query your Redis DB like so:
redis-cli
Then, on the Redis console:
SELECT <DB-NUMBER>
KEYS *
If you see some keys like
1) PREFIX:tag:TAG:key
2) PREFIX:tag:DIFFERENT-TAG:key
it is quite a hint that Laravel is using Redis as its cache backend. Otherwise, take a look at
<YOUR-APP-DIR>/storage/framework/cache
If you find some files/sub-folders in there, it is quite a hint that Laravel is using file-based caching.
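You can also check from inside the application itself. A minimal sketch using standard Laravel helpers (run it in php artisan tinker, for example):

// The configured default driver name, e.g. "redis" or "file".
var_dump(config('cache.default'));

// The concrete store class actually in use, e.g.
// Illuminate\Cache\RedisStore or Illuminate\Cache\FileStore.
var_dump(get_class(Cache::getStore()));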

PHP + S3: Permission denied while deleting a file using unlink()

I have been trying to solve an extremely trivial issue for a long time, but with no luck.
I want to delete a file immediately after uploading it to AWS S3 from a PHP web server. The steps are as follows:
// Upload file to S3 using the PHP SDK's S3Client::putObject method:
$result = $s3_client->putObject(array(
    'Bucket'     => AWS_BUCKET_NAME,
    'Key'        => $file_name,
    'SourceFile' => $file_path,
    'Metadata'   => array(
        'metadata_field' => 'metadata_value',
    ),
));

// Poll the object until it is accessible:
$s3_client->waitUntil('ObjectExists', array(
    'Bucket' => AWS_BUCKET_NAME,
    'Key'    => $file_name,
));

// Delete the file:
unlink($file_path);
These steps work perfectly when I upload a small file (~500 KB).
However, if I upload a larger file (5 MB-10 MB), I get the following error:
Warning: unlink(<Complete Path to File>): Permission denied in <Complete path to uploader.php> on line N
I am working on Windows and have tried elevating user permissions for the directory and file (using PHP's chmod and chown functions, and making sure the directory is writable and accessible).
It seems that the AWS S3 putObject method is not releasing the file handle (for large files only). I have also tried adding sleep(), but no luck!
Moreover, if I skip uploading the file to S3 (just to test my delete workflow), the file gets deleted without any issue.
Please help!
This issue was raised at https://github.com/aws/aws-sdk-php/issues/841.
Try using the gc_collect_cycles() function; it solved the problem for me. See the page above for further reference.
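In context, the fix looks roughly like this (a sketch based on the snippet from the question):

// Force PHP's garbage collector to run so the SDK releases its
// internal handle on the file before we try to delete it.
gc_collect_cycles();
unlink($file_path);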
Maybe you need to set the values of upload_max_filesize and post_max_size in your php.ini:
; Maximum allowed size for uploaded files.
upload_max_filesize = 40M
; Must be greater than or equal to upload_max_filesize
post_max_size = 40M
After modifying the php.ini file(s), you need to restart your HTTP server to pick up the new configuration.
In case anybody else is stuck on this: I moved my nginx server deployment to CentOS and the issue was not observed.
The waitUntil('ObjectExists') call has a timeout/maximum number of attempts by default. You can change them using:
$s3Client->waitUntil('ObjectExists', array(
    'Bucket' => AWS_BUCKET_NAME,
    'Key'    => $file_name,
    'waiter.interval'     => 10,
    'waiter.max_attempts' => 6,
));

Laravel not creating log file

I'm learning Laravel 4.0 to develop a web server.
I'm using a LAMP stack (Apache 2, PHP 5.5).
I can't find the log file that Log::error() calls write to.
As far as I know, it's supposed to be in app/storage/logs/log-cli-.txt, but there are no files there.
Here is the code:
app/commands/MsgCommand.php
public function fire()
{
    Log::error('messages - log');
}
It's called from Artisan:
app/start/artisan.php
Artisan::add(new MsgCommand());
Am I looking in the right place?
How can I check that this is indeed the right folder (i.e. where is it configured)? I want to rule out a faulty installation or setup.
Thanks to marcanuy, I am now sure it is writing to app/storage/logs.
I also found out it writes fine if I call the command through Artisan. Running under Apache 2, however, nothing happens. I'm starting to think I set up the command wrong.
By default, app/storage is the location for log files; the storage folder is configured in bootstrap/paths.php:
'storage' => __DIR__.'/../app/storage',
Also be sure that this folder is writable by the web server.
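A quick way to verify this from code (a hypothetical diagnostic using Laravel 4's storage_path() helper):

// Dump whether the log directory exists and is writable by the user
// PHP runs as (e.g. www-data under Apache).
$logDir = storage_path() . '/logs';
var_dump(is_dir($logDir), is_writable($logDir));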
The problem was permissions:
User permissions for the www-data user in ../app/storage.
And MySQL settings: creating a user corresponding to what is set in app/config/database.php:
'mysql' => array(
    'driver'   => 'mysql',
    'host'     => 'your host',
    'database' => 'your db',
    'username' => 'your user',
    'password' => 'your pass',
),

How to implement Memcache in CakePHP?

I am a newbie to PHP and CakePHP. I was recently assigned the job of implementing Memcache in my app so that its performance can be improved. Can anyone suggest some documentation on this topic?
Thanks.
This might be a bit late, but the Cake core has support for Memcache built in (at least in the latest versions, 2.0.x and 2.1).
Have a look at Config/core.php in your app, and you should see these lines (commented out):
Cache::config('default', array(
    'engine' => 'Memcache',      // [required]
    'duration' => 3600,          // [optional]
    'probability' => 100,        // [optional]
    'prefix' => Inflector::slug(APP_DIR) . '_', // [optional] prefix every cache file with this string
    'servers' => array(
        '127.0.0.1:11211'        // localhost, default port 11211
    ),                           // [optional]
    'persistent' => true,        // [optional] set this to false for non-persistent connections
    'compress' => false,         // [optional] compress data in Memcache (slower, but uses less memory)
));
You can uncomment these lines and test it out with a Memcached install. Make sure you have Memcached installed somewhere (localhost or elsewhere) and that you are pointing to it.
Memcache is one of the Cache engines supported by the built-in Cache class. The Cache class is a wrapper for interacting with your cache, and you can read everything about it here: http://book.cakephp.org/2.0/en/core-libraries/caching.html
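Once an engine is configured, usage is driver-agnostic. A minimal sketch with hypothetical key and variable names:

// Write to and read from the 'default' cache config shown above.
Cache::write('expensive_result', $data, 'default');
$cached = Cache::read('expensive_result', 'default');
if ($cached === false) {
    // Cache miss: recompute $data and write it again.
}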
Here is a more specific write-up on Memcache and CakePHP that may help with your bottlenecks:
Send your database on vacation by using CakePHP + Memcached
http://nuts-and-bolts-of-cakephp.com/2009/06/17/send-your-database-on-vacation-by-using-cakephp-memcached/
