Problem with uploading big files with Laravel to S3 - php

I've been stuck on this specific problem for two days and can't find a solution.
I have a Laravel 7.0 project hosted on AWS Elastic Beanstalk, which is running fine. I also have an S3 bucket, used for saving videos that users upload through a form.
The problem is that smaller files (< 10 MB) are uploaded without a problem, but once it comes to bigger files, Storage::disk('s3')->put('videos/lorem.mp4', fopen($request->file('file'), 'r+')); returns false and the file is not uploaded to S3. If I use the 'public' disk instead of the 's3' disk, the file is uploaded without a problem.
I also tried uploading a file manually via the AWS CLI with the same IAM user, and it went through without a problem.
PHP and nginx are correctly configured to accept big files.
I know this is a very specific question, but if anyone has a hint or a solution, please do share.

This is likely a timeout on the upload. Increasing PHP's max_execution_time might help; I'd try that first.
Otherwise, you should look into uploading to the local disk and having an entirely separate process pick the files up from disk and upload them to S3. I was amazed at how much throughput you can gain from this approach. Laravel's job queues feature is worth checking out for exactly this; a sketch follows.
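Below is a minimal sketch of that two-step approach for Laravel 7, assuming a queue driver is already configured; the job class name UploadVideoToS3 and the paths are made up for illustration.

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;

class UploadVideoToS3 implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Queued jobs have their own timeout, so the web request is never
    // held open for the slow S3 transfer; tune this to your file sizes.
    public $timeout = 900;

    private $path;

    public function __construct(string $path)
    {
        // Path relative to the 'local' disk, e.g. 'videos/lorem.mp4'.
        $this->path = $path;
    }

    public function handle()
    {
        // Stream from the local disk to S3 so the whole file is never
        // loaded into memory at once.
        $stream = Storage::disk('local')->readStream($this->path);
        Storage::disk('s3')->put($this->path, $stream);

        if (is_resource($stream)) {
            fclose($stream);
        }

        Storage::disk('local')->delete($this->path);
    }
}

The controller then only stores the upload locally and queues the job, returning as soon as the job is dispatched:

$path = $request->file('file')->store('videos', 'local');
\App\Jobs\UploadVideoToS3::dispatch($path);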

Related

Upload large file from my computer to S3 server without editing php.ini

I want to upload a large file from my computer to an S3 server without editing php.ini. First, I choose a file with the browse button, hit the upload button, and then upload to the S3 server. But I can't post the form's file data when I upload a large file, and I don't want to edit php.ini. Is there any way to upload a large local file to the S3 server?
I've done this by implementing Fine Uploader's PHP integration for S3. As of recently it is available under an MIT license. It's an easy way to upload huge files to S3 without changing your php.ini at all.
It's not the worst thing in the world to set up. You'll need to set some environment variables for the public/secret keys, set up CORS settings on the bucket, and write a PHP page, based on one of the examples, that calls a PHP endpoint handling the signing.
One thing that was not made obvious to me: when setting the environment variables, they expect you to create two separate AWS users with different privileges, for security reasons.
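For reference, a minimal sketch of what such a signing endpoint can look like, assuming the older AWS V2 policy-signing scheme that Fine Uploader's classic S3 examples used (the exact request/response contract varies by Fine Uploader version, so treat this as illustrative only):

<?php
// Secret key of the limited 'signing' IAM user (never the admin user).
$clientSecret = getenv('AWS_CLIENT_SECRET_KEY');

// Fine Uploader posts the policy document as JSON in the request body.
$policy = file_get_contents('php://input');
$encodedPolicy = base64_encode($policy);

// Sign the base64-encoded policy with HMAC-SHA1 (AWS V2 scheme).
$signature = base64_encode(hash_hmac('sha1', $encodedPolicy, $clientSecret, true));

header('Content-Type: application/json');
echo json_encode(['policy' => $encodedPolicy, 'signature' => $signature]);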
Try this:
ini_set("upload_max_filesize", "300M");
Note, however, that upload_max_filesize is a PHP_INI_PERDIR setting, so it cannot actually be changed with ini_set() at runtime; it has to be raised in php.ini, .htaccess, or a .user.ini file (along with post_max_size).

S3 Bucket Amazon issue

1) I have an upload form.
2) It uploads the file to my local storage with move_uploaded_file.
3) It uses Zend's putObject function to move the file to an S3 object.
Everything works fine up to file sizes of around 30 to 40 MB. The problem is that when I try uploading larger files, like 80 MB or 100 MB, moving the file to S3 takes ages to complete. My code is something like this:
$orginalPath = APPLICATION_PATH . "/../storage/" . $fileName;
move_uploaded_file($data['files']['tmp_name'], $orginalPath);
$s3 = new Zend_Service_Amazon_S3($accessKey, $secretKey);
$s3->putObject($path, file_get_contents($orginalPath),
    array(Zend_Service_Amazon_S3::S3_ACL_HEADER => Zend_Service_Amazon_S3::S3_ACL_PUBLIC_READ));
Can you suggest how to move large files quickly? I tried using the stream wrapper like this:
$s3->registerStreamWrapper("s3");
file_put_contents("s3://my-bucket-name/orginal/$fileName", file_get_contents($orginalPath));
But no luck; it takes just as long to move the file.
So, is there an efficient way to move a file quickly to the S3 bucket?
The answer is a worker process. You can start a PHP worker script via the PHP CLI on server boot, for example with the GearmanClient PHP extension and a Gearman server running on your box. You then queue a background job to upload the file to S3: your main site's PHP code returns success as soon as the job is issued, and the file happily uploads in the background while your foreground site continues on its merry way. Another way of doing this is to make another server handle the whole task, keeping this load off your main site entirely. I am doing this now. It works well.
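A rough sketch of that pattern, assuming the pecl gearman extension and a gearmand server on localhost (the function name and payload shape are made up for illustration):

<?php
// Foreground (web request): queue the upload and return immediately.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('upload_to_s3', json_encode([
    'local_path' => '/var/app/storage/' . $fileName,
    'key'        => 'orginal/' . $fileName,
]));

// Worker script, started once from the CLI (e.g. on server boot):
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('upload_to_s3', function (GearmanJob $job) {
    $payload = json_decode($job->workload(), true);
    // ...push the file to S3 here, e.g. with putObject() as above...
});
while ($worker->work());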
You could also consider using the more direct POST-to-S3 feature, so the file never passes through your server at all. The AWS SDK for PHP has a class to help generate the data for the form.
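In version 3 of the AWS SDK for PHP, that helper is Aws\S3\PostObjectV4 (older SDK versions had a similar PostObject class). A minimal sketch, where the bucket name, key prefix, and expiry are examples:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\PostObjectV4;

$client = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

// Fields embedded in the form and the matching policy conditions.
$formInputs = ['acl' => 'private', 'key' => 'orginal/${filename}'];
$options = [
    ['acl' => 'private'],
    ['bucket' => 'my-bucket-name'],
    ['starts-with', '$key', 'orginal/'],
];

$postObject = new PostObjectV4($client, 'my-bucket-name', $formInputs, $options, '+1 hours');

$attributes = $postObject->getFormAttributes(); // action URL, method, enctype
$inputs     = $postObject->getFormInputs();     // hidden fields, incl. policy and signature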

php temp file upload directory - off local server

When uploading an image, PHP stores the temp file in a local directory on the server.
Is it possible to change this temp location so it's off the local server?
Reason: I'm using load balancing without sticky sessions, and I don't want files to be uploaded to one server and then be unavailable on another. Note: I don't necessarily complete the file upload and work on the file in one go.
My preferred temp location would be AWS S3; I'm also just interested to know whether this is possible at all.
If it's not possible, I could make the file upload a single complete process that also puts the finished file in its final location.
So, just to repeat: can the PHP temp image/file location be off the local server?
Thank you.
You can mount the S3 bucket with s3fs on your instances behind the ELB, so that all your uploads are shared between application servers. As for /tmp, don't touch it: since the destination is S3 and it is shared, you don't have to worry.
If you have a lot of uploads, S3 might become a bottleneck. In that case, I suggest setting up a NAS. Personally, I use GlusterFS because it scales well and is very easy to set up. It has replication issues, but if you avoid replicated volumes entirely you'll be fine.
Other alternatives are Ceph, Sector/Sphere, XtreemFS, Tahoe-LAFS, POHMELFS, and many others.
You can upload a file directly from the client to S3 with some newer technologies, as detailed in this post:
http://www.ioncannon.net/programming/1539/direct-browser-uploading-amazon-s3-cors-fileapi-xhr2-and-signed-puts/
Otherwise, I would personally suggest using each server's tmp folder for exactly that: temporary storage. After the file is on your server, you can always upload it to S3, where it will then be accessible across all of your load-balanced servers.
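On the first suggestion: the signed PUT URLs that the linked post builds by hand can nowadays be generated with the AWS SDK for PHP v3. A minimal sketch, where the bucket, key, and expiry are examples:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

$cmd = $client->getCommand('PutObject', [
    'Bucket' => 'my-bucket-name',
    'Key'    => 'uploads/example.jpg',
]);

// The URL is valid for 20 minutes; the browser PUTs the file body
// directly to it, so the upload never touches your web servers.
$presignedUrl = (string) $client->createPresignedRequest($cmd, '+20 minutes')->getUri();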

Looking for a flash uploader to upload large files

I need a Flash uploader to use in my CMS project.
I need something like this, but with a greater max upload size (it doesn't allow uploading files larger than ini_get('upload_max_filesize')).
My server doesn't allow me to override ini settings, so I'm looking for an uploader that can upload large files independently of the ini settings.
If you want to get around the ini limit, one option is to switch to an FTP uploader.
I used net2ftp once, and it was easy enough to install. I haven't used it since (almost a year and a half), but I see from their page that the project is still updated and not dead, so you might give it a try.
You just download the package, place it in your web app, customize it, and you're set.
You'll want to create a dedicated FTP user with appropriate permissions, of course, and not use the root one.
You won't be able to post more data to the server than upload_max_filesize allows.
As a workaround, you can upload the data directly to Amazon S3 and sync it back via s3sync.
We have a plupload setup in place for one of our clients and are able to upload up to 2 GB per file (that's a client-side restriction; I don't know about S3's limits).
Mind you, S3 costs some money.

Filesize with SWFUpload and Amazon S3

I'm currently using SWFUpload to upload files to my S3 bucket, and it's working great.
I'm using the script from this article: http://www.anedix.com/news/article/50
Again, the upload to S3 works fine; however, I've been running into an issue when attempting to upload larger files. It seems that I cannot upload anything over 50 MB. I have tried this both from my web host and locally, using my local testing environment.
My question is this: when uploading with SWFUpload, the file should be going straight to Amazon S3, correct? If so, then PHP settings such as upload_max_filesize should not affect it? (Even so, in my local environment I've set it to 1024 MB.)
Essentially, the script shows that it's uploading the file (taking the appropriate amount of time), redirects to the success page, and does not throw any errors.
Any ideas on why this would be happening, or how I can troubleshoot it?
Thanks!
file_size_limit is also a setting in SWFUpload ... have you checked it?
Also allow for roughly 30% more, since the posted content gets encoded.
