S3 video uploader, file not transcoding - php

I've recently pushed a video uploader for a website I've been working on for months. We have had two separate instances where the video a user uploaded wasn't completely transferred, resulting in a broken file that won't play. We are not sure why this is occurring, but we have a couple of hunches: the files being uploaded are too big to handle, or the user's upload speed is too poor. What can I do to handle situations like these, where the upload conditions are less than optimal?

I first upload to the tmp directory on my Linux server and then move the file to S3. Our system is for end users to upload, and we limit files to 40 GB.
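One cheap way to catch the "broken file that won't play" case is to have the client declare the file's size up front so the server can refuse to forward a truncated upload to S3. A minimal sketch, assuming a hypothetical /upload endpoint and X-Expected-Size header (neither is part of the original setup):

```javascript
// Sketch: catch truncated uploads by telling the server how many bytes
// to expect, so a partially transferred file can be rejected instead of
// landing on S3 broken. The header name and /upload endpoint are
// assumptions for illustration.
function isComplete(expectedSize, receivedSize) {
  // Server-side check (shown here for illustration): accept the upload
  // only when every declared byte actually arrived.
  return receivedSize === expectedSize;
}

async function uploadWithSizeCheck(file) {
  return fetch('/upload', {                       // hypothetical endpoint
    method: 'POST',
    headers: { 'X-Expected-Size': String(file.size) },
    body: file,
  });
}
```

On the server, compare the declared size against the bytes received before moving the file out of tmp; a mismatch means the connection dropped mid-transfer.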

Related

PHP Uploading Files Until Server Full

I have a problem. People can upload files as attachments on my website to contact us/mail us (a list is displayed once uploaded, etc.). There is an upload limit per session, but if another session is created (it's more like a random id) you can start uploading to the server all over again. So basically you can upload a thousand files until the server is full.
Is there a way to prevent this? You can limit it per IP or cookie, but I can get another one, so this is still a problem.
Thank you

Large file upload through Browser (100 GB)

Is there any way to upload large files (more than 80 GB) through a web browser? Previously I have been uploading files (img, png, jpg) using plupload, but it doesn't seem to work for larger files. I would also like to know how to implement a web page where users can upload the way Mega.co.nz or Drive.google.com do.
If it is impossible to do with web development tools, can anyone guide me on how to divide a file and upload it in segments?
Thanks.
You can use the JavaScript Blob object to slice large files into smaller chunks and transfer these to the server to be merged together. This has the added benefit of being able to pause/resume uploads and indicate progress.
If you don't fancy doing it yourself there are existing solutions that use this approach. One example is HTML5 Uploader by Filkor.
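The Blob-slicing approach described above can be sketched as follows. The /upload-chunk endpoint, the header names, and the server-side merge step are assumptions, not an existing API:

```javascript
// Sketch of the Blob-slicing approach: split the file into fixed-size
// chunks and POST them one by one, so a failed or paused transfer can
// resume at the last unsent chunk instead of starting over.
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB per chunk

function sliceIntoChunks(blob, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let start = 0; start < blob.size; start += chunkSize) {
    // Blob.slice copies no data; each chunk is a view into the file.
    chunks.push(blob.slice(start, start + chunkSize));
  }
  return chunks;
}

async function uploadInChunks(file, onProgress) {
  const chunks = sliceIntoChunks(file);
  for (let i = 0; i < chunks.length; i++) {
    await fetch('/upload-chunk', {               // hypothetical endpoint
      method: 'POST',
      headers: {
        'X-Chunk-Index': String(i),
        'X-Chunk-Total': String(chunks.length),
      },
      body: chunks[i],
    });
    if (onProgress) onProgress((i + 1) / chunks.length); // progress indication
  }
}
```

Because each chunk is a separate request, a network failure only costs you one chunk, and recording the last acknowledged index is enough to implement resume.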
If I were you I would use something like FTP to accomplish this. If you can use ASP.NET, there are already good libraries for file transfer.
Here is a post that shows an example of uploading a file: Upload file to ftp using c#
The catch is you will need a server. I suggest FileZilla. https://filezilla-project.org/

where to save images generated by php before moving to cloud

I have a Laravel PHP app where a user uploads an image. The image is converted into a number of different sizes required around the application, and each version is then uploaded to AWS S3.
When the user uploads the image, PHP places it in /tmp until the request has completed, if it hasn't been renamed. I am planning to push the job of converting and uploading the versions to a queue. What is the best way to ensure that the image stays in /tmp long enough to be converted and then uploaded to S3?
Secondly, where should I save the different versions so that I can access them to upload them to S3 and then remove them from the server (preferably automatically)?
I would create a new directory and work in it. The tmp folder is flushed every now and then, depending on your system.
As for the different sizes, I would create separate buckets for each size, which you can access with whatever constant you use to store the image (e.g. email, user id, etc.).

s3 direct upload restricting file size and type

A newbie question, but I have googled a bit and can't seem to find any solution.
I want to allow users to upload files directly to S3, not via my server first. In doing so, is there any way the files can be checked for size limit and permitted types before actually uploading to S3? Preferably with JavaScript, not Flash.
If you are talking about the security problem (people uploading huge files to your bucket): yes, you can restrict file size with browser-based uploads to S3.
Here is an example of the "policy" variable, where "content-length-range" is the key point.
{
    "expiration": "'.date('Y-m-d\TG:i:s\Z', time()+10).'",
    "conditions": [
        {"bucket": "xxx"},
        {"acl": "public-read"},
        ["starts-with", "xxx", ""],
        {"success_action_redirect": "xxx"},
        ["starts-with", "$Content-Type", "image/jpeg"],
        ["content-length-range", 0, 10485760]
    ]
}
In this case, if the uploaded file is larger than 10 MB (10485760 bytes), Amazon will reject the upload request.
Of course, before starting the upload, you should also use JavaScript to check the file size and alert the user if it exceeds the limit:
getting file size in javascript
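That client-side pre-check can be sketched as below. The 10 MB limit mirrors the policy's content-length-range; the allowed-types list is an assumption for illustration, and the policy remains the real enforcement since JavaScript checks can be bypassed:

```javascript
// Sketch: reject files that are too big or of the wrong type before the
// S3 POST even starts. The limit mirrors the policy's
// content-length-range; the type list is an illustrative assumption.
const MAX_BYTES = 10485760;              // 10 MB, matching the policy
const ALLOWED_TYPES = ['image/jpeg'];

function validateFile(size, type) {
  if (size > MAX_BYTES) return 'File is too large (max 10 MB).';
  if (!ALLOWED_TYPES.includes(type)) return 'File type not allowed.';
  return null; // null means OK to upload
}
```

Typical usage: `const err = validateFile(input.files[0].size, input.files[0].type); if (err) alert(err);` — a convenience for the user, while the signed policy rejects anything that slips past it.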
AWS wrote a tutorial explaining how to create HTML POST forms that allow your web site visitors to upload files into your S3 account using a standard web browser. It uses S3 pre-signed URLs to prevent tampering and you can restrict access by file size.
To do what you want, you will need to upload through your own web service. This is probably best anyway, as granting your end users global write access to your S3 bucket is a security nightmare, not to mention there would be nothing stopping them from uploading huge files and running up your charges.

Looking for a flash uploader to upload large files

I need a Flash uploader to use in my CMS project.
I need something like this, but with a greater max upload size (it doesn't allow uploading files larger than ini_get('upload_max_filesize')).
My server doesn't allow me to override ini settings, so I'm looking for an uploader that can upload large files independently of the ini settings.
If you want to get around the ini limit, one option would be to use an FTP uploader instead.
I used net2ftp once and its installation was easy enough; I haven't used it since (almost a year and a half), but I see from their page that the project is updated and not dead, so you might give it a try.
You just download the package, place it in your webapp, customize it, and you're set.
You might want to create a dedicated FTP user with appropriate permissions, and not use the root account, of course.
You won't be able to POST more data to the server than upload_max_filesize allows.
As a workaround you can upload the data to Amazon S3 and sync it back via s3sync.
We have a setup with plupload in place for one of our clients and can upload up to 2 GB per file (that's a client restriction; I don't know about S3 restrictions).
Mind you that S3 costs some money.
