Where to save images generated by PHP before moving to the cloud - php

I have a Laravel PHP app where a user is going to upload an image. This image is going to be converted into a number of different sizes as required around the application, and then each version is going to be uploaded to AWS S3.
When the user uploads the image, PHP places it in /tmp and removes it once the request has completed, unless it has been moved/renamed. I am planning on pushing the job of converting and uploading the versions to a queue. What is the best way to ensure that the image stays around long enough to be converted and then uploaded to S3?
Secondly, where should I save the different versions so that I can access them to upload them to S3 and then remove them from the server (preferably automatically)?
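Something along these lines is what I have in mind for the controller side (a sketch only; ProcessImageVersions and the pending-images folder are made-up names): move the upload off PHP's temp file inside the request, then hand the new path to the queued job.

    <?php

    namespace App\Http\Controllers;

    use App\Jobs\ProcessImageVersions; // hypothetical queued job
    use Illuminate\Http\Request;

    class ImageController extends Controller
    {
        public function store(Request $request)
        {
            // store() moves the upload off PHP's /tmp before the request ends,
            // so the temp file's lifetime no longer matters.
            $path = $request->file('image')->store('pending-images', 'local');

            ProcessImageVersions::dispatch($path);

            return response()->json(['status' => 'queued']);
        }
    }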

I would create a new directory and work in that. The /tmp folder is flushed every now and then, depending on your system.
As for the different sizes, I would create separate buckets for each size, which you can access with whatever constant you use to store the image (e.g. email, user id, etc.).
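Putting that together with the queue idea from the question, the job could look roughly like this (a sketch, not a drop-in: Intervention Image, an `s3` disk in config/filesystems.php and the width list are assumptions, not part of the answer). It reads the upload from the working directory, writes each size straight to S3, then deletes the local copy.

    <?php

    namespace App\Jobs;

    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\SerializesModels;
    use Illuminate\Support\Facades\Storage;
    use Intervention\Image\ImageManagerStatic as Image; // assumed image library

    class ProcessImageVersions implements ShouldQueue
    {
        use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

        private const WIDTHS = [1600, 800, 200]; // example sizes

        public $path;

        public function __construct(string $path)
        {
            $this->path = $path;
        }

        public function handle()
        {
            $original = Storage::disk('local')->path($this->path);
            $name     = basename($this->path);

            // Keep the original as-is.
            Storage::disk('s3')->put("images/original/{$name}", file_get_contents($original));

            foreach (self::WIDTHS as $width) {
                $resized = Image::make($original)
                    ->resize($width, null, function ($c) { $c->aspectRatio(); })
                    ->encode('jpg');

                Storage::disk('s3')->put("images/{$width}/{$name}", (string) $resized);
            }

            // Clean up the local working copy once everything is on S3.
            Storage::disk('local')->delete($this->path);
        }
    }

Because the resized versions are streamed to S3 from memory, there is no second local folder of versions to clean up; only the single working file has to be deleted at the end.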

Related

Zipping 100s of files stored on S3

We use S3 to store various media uploaded via our application, such as images, documents etc. We work in the property software industry, and as a means of exchanging data stored in our system with property portals, a common exchange format between portals is the Rightmove BLM data feed specification. This is essentially a zip file containing a delimited text file and any associated media, which is sent via FTP to each portal.
However, a bottleneck in the process is downloading the media from S3 for zipping. For example, one single account on our system could have in the region of 1000 images/documents to be downloaded and zipped in preparation for transfer (each file has to be named in a particular format for that particular portal: unique number, sequence numbers etc.). Downloading 1000 images/documents from S3 to an EC2 server in the same region via the PHP SDK takes some time (60+ seconds), and doing this for multiple accounts at the same time puts considerable load on the server.
Is there a better/faster way to download files from S3 so they can be prepped and zipped on the EC2 instance?
Thanks.
One option would be to aggregate the zip as the files are added. Meaning, instead of zipping the files all at once, use a Lambda function to add them to a zip file as they're added to or updated on the S3 bucket. Then, the zip would be available more-or-less on demand.
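If the files do have to come down to the EC2 instance, another thing worth trying on the PHP side is issuing the GetObject calls concurrently instead of one at a time. With the AWS SDK for PHP v3, something like the sketch below should work (bucket name, key list, staging folder and concurrency value are placeholders, not values from the question).

    <?php

    require 'vendor/autoload.php';

    use Aws\CommandPool;
    use Aws\S3\S3Client;

    $s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);

    // $keys is the list of S3 object keys for the account being exported.
    $keys = [/* ... */];

    @mkdir('/tmp/blm-staging', 0700, true);

    $commands = [];
    foreach ($keys as $i => $key) {
        $commands[] = $s3->getCommand('GetObject', [
            'Bucket' => 'my-media-bucket',                                // placeholder bucket
            'Key'    => $key,
            'SaveAs' => '/tmp/blm-staging/' . $i . '_' . basename($key),  // portal naming goes here
        ]);
    }

    // Run up to 25 downloads in parallel instead of sequentially.
    CommandPool::batch($s3, $commands, ['concurrency' => 25]);

Then zip /tmp/blm-staging as usual. The real speed-up depends on the instance's network throughput, so the concurrency number needs tuning.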

S3 video uploader, file not transcoding

I've recently pushed a video uploader for this website I've been working on for months. Recently, we have had two separate instances where the video the user uploaded wasn't completely uploaded, resulting in a broken file that won't play. Currently, we are not sure why something like this is occurring, but we have a couple of hunches, namely that the files being uploaded are too big to handle and that the user's upload speed is horrible. I was wondering what I can do to handle situations such as these, where the upload conditions are less than optimal.
I upload first to the tmp directory on my Linux server and then move the file to S3; our system is for end users to upload, and we limit files to 40GB.
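For the server-to-S3 leg, one option for files this large is the SDK's multipart uploader, since a failed transfer can be resumed from the parts that already made it rather than restarted from zero. A rough sketch, assuming the AWS SDK for PHP v3 (bucket, key and temp path are placeholders):

    <?php

    require 'vendor/autoload.php';

    use Aws\Exception\MultipartUploadException;
    use Aws\S3\MultipartUploader;
    use Aws\S3\S3Client;

    $s3     = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
    $source = '/tmp/phpAbC123.mp4';                 // placeholder temp file

    $uploader = new MultipartUploader($s3, $source, [
        'bucket' => 'my-video-bucket',              // placeholder bucket
        'key'    => 'videos/original.mp4',
    ]);

    do {
        try {
            $result = $uploader->upload();
        } catch (MultipartUploadException $e) {
            // Resume from the parts that already succeeded instead of starting over.
            $uploader = new MultipartUploader($s3, $source, [
                'state' => $e->getState(),
            ]);
        }
    } while (!isset($result));

This only covers the server-to-S3 leg; it won't fix a file that already arrived broken from the user's browser.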

Integrate Cloudfront with dynamic image resize and S3 storage

I've been reading a lot about dynamic image manipulation, storage and content delivery. The company I'm working for already uses AWS for some of their services.
The application I'm working on stores document images in an S3 bucket (but is not limited to that), and I need to display them on demand.
The first version of this application stored the images locally and performed the image manipulation on demand on the same server.
Now the document storage has increased and a lot of images are being stored, all via the web application. This means that one user may upload, say, 100+ images and the server needs to process them as fast as it can.
That's why the images are uploaded to an EC2 instance and streamed to an S3 bucket internally; that's how we save the original image in the first place, with no thumbnails at this point, to speed up the uploading process.
Then a different user may want to preview these images or see them in their original size, which is why I need to dynamically resize them. I will implement CloudFront for caching the images after they are resized, and here comes the issue.
The workflow is like this:
1. User requests an image from the CDN.
2.a CloudFront serves the cached image.
2.b CloudFront requests the image from a custom origin if it is not cached.
3. The origin server queries S3 for the image.
4.a If the requested image size exists on S3:
5. Return the image to CloudFront, cache it, and return it to the user.
4.b If the requested image size does not exist on S3:
5. Generate the image size from the original S3 image.
6. Save the new size to S3.
7. Return the new size to CloudFront, cache it, and return it to the user.
The custom origin is responsible for creating the missing image size and saving it to S3; CloudFront can then use the cached image or request this new image size from S3, as it now exists.
I think this is possible, as I've already read a lot of documentation about it, but I still haven't found documentation from someone who has done this before.
Does this look like a good way of handling the image manipulation? Has anyone seen any documentation on how to do this?
I'm a PHP developer, but I might be able to implement a non-PHP solution in favor of performance on the image server.
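For reference, the custom-origin piece I have in mind would look roughly like this in PHP (a sketch only, assuming the AWS SDK for PHP v3 and Intervention Image; the bucket, the resized/{width}/{key} naming scheme and the query parameters are placeholders, and request validation is omitted):

    <?php

    require 'vendor/autoload.php';

    use Aws\S3\S3Client;
    use Intervention\Image\ImageManagerStatic as Image;

    $bucket   = 'my-image-bucket';                  // placeholder
    $width    = (int) $_GET['w'];                   // e.g. /resize.php?w=800&key=docs/123.jpg
    $key      = $_GET['key'];
    $sizedKey = "resized/{$width}/{$key}";

    $s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

    if (!$s3->doesObjectExist($bucket, $sizedKey)) {
        // Steps 4.b-6: fetch the original, generate the missing size, save it back to S3.
        $original = $s3->getObject(['Bucket' => $bucket, 'Key' => $key]);

        $resized = Image::make((string) $original['Body'])
            ->resize($width, null, function ($c) { $c->aspectRatio(); })
            ->encode('jpg');

        $s3->putObject([
            'Bucket'      => $bucket,
            'Key'         => $sizedKey,
            'Body'        => (string) $resized,
            'ContentType' => 'image/jpeg',
        ]);
    }

    // Step 7: stream the sized image back so CloudFront can cache it.
    header('Content-Type: image/jpeg');
    echo (string) $s3->getObject(['Bucket' => $bucket, 'Key' => $sizedKey])['Body'];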
If you are open to non-PHP based solutions, https://github.com/bcoe/thumbd is a good option since it already integrates with S3, etc. However, you will need to know the sizes you need ahead of time. I would recommend such an approach rather than generating sizes on the fly, since it means faster response times for your users: they will not have to wait while the new size is being generated. Storage on S3 is incredibly cheap, so you will not be wasting any $$ by creating multiple sizes either.

S3 direct upload: restricting file size and type

A newbie question, but I have googled a bit and can't seem to find any solution.
I want to allow users to upload files directly to S3, not via my server first. By doing so, is there any way the files can be checked for a size limit and for permitted types before actually uploading to S3? Preferably with JavaScript rather than Flash.
If you are talking about the security problem (people uploading huge files to your bucket): yes, you CAN restrict file size with browser-based uploads to S3.
Here is an example of the "policy" variable, where "content-length-range" is the key point.
"expiration": "'.date('Y-m-d\TG:i:s\Z', time()+10).'",
"conditions": [
{"bucket": "xxx"},
{"acl": "public-read"},
["starts-with","xxx",""],
{"success_action_redirect": "xxx"},
["starts-with", "$Content-Type", "image/jpeg"],
["content-length-range", 0, 10485760]
]
In this case, if the uploaded file is larger than 10 MB (10485760 bytes), the upload request will be rejected by Amazon.
Of course, before starting the upload process, you should use JavaScript to check the file size and show an alert if it exceeds the limit.
getting file size in javascript
AWS wrote a tutorial explaining how to create HTML POST forms that allow your web site visitors to upload files into your S3 account using a standard web browser. It uses S3 pre-signed URLs to prevent tampering and you can restrict access by file size.
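If the form is generated from PHP, the SDK can also build the signed policy for you. A minimal sketch, assuming the AWS SDK for PHP v3's Aws\S3\PostObjectV4 helper (bucket name, key prefix and limits are placeholders):

    <?php

    require 'vendor/autoload.php';

    use Aws\S3\PostObjectV4;
    use Aws\S3\S3Client;

    $s3      = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
    $bucket  = 'my-upload-bucket';                       // placeholder bucket
    $options = [
        ['acl' => 'public-read'],
        ['bucket' => $bucket],
        ['starts-with', '$key', 'uploads/'],
        ['starts-with', '$Content-Type', 'image/'],
        ['content-length-range', 0, 10485760],           // reject anything over 10 MB
    ];

    $postObject = new PostObjectV4(
        $s3,
        $bucket,
        ['acl' => 'public-read', 'key' => 'uploads/${filename}'],
        $options,
        '+10 minutes'                                    // policy expiry
    );

    $attributes = $postObject->getFormAttributes();      // action, method, enctype for the <form>
    $inputs     = $postObject->getFormInputs();          // hidden fields incl. policy and signature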
To do what you are wanting to do, you will need to upload through your own web service. This is probably best anyway, as providing global write access to your S3 bucket for your end users is a security nightmare, not to mention there would be nothing stopping them from uploading huge files and jacking up your charges.
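If you do route uploads through your own service, the size and type checks then live in plain PHP before anything touches S3. Something like this sketch (assuming the AWS SDK for PHP v3; bucket name and limits are placeholders):

    <?php

    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $maxBytes = 10 * 1024 * 1024;                   // placeholder limit
    $allowed  = ['image/jpeg', 'image/png'];        // placeholder whitelist

    if (($_FILES['upload']['error'] ?? UPLOAD_ERR_NO_FILE) !== UPLOAD_ERR_OK
        || $_FILES['upload']['size'] > $maxBytes) {
        http_response_code(422);
        exit('File missing or too large');
    }

    // Check the real MIME type on the server, not the one the browser sent.
    $mime = (new finfo(FILEINFO_MIME_TYPE))->file($_FILES['upload']['tmp_name']);
    if (!in_array($mime, $allowed, true)) {
        http_response_code(422);
        exit('Unsupported file type');
    }

    $s3  = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
    $ext = pathinfo($_FILES['upload']['name'], PATHINFO_EXTENSION);

    $s3->putObject([
        'Bucket'      => 'my-upload-bucket',        // placeholder bucket
        'Key'         => 'uploads/' . bin2hex(random_bytes(8)) . '.' . $ext,
        'SourceFile'  => $_FILES['upload']['tmp_name'],
        'ContentType' => $mime,
    ]);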

Looking for a flash uploader to upload large files

I need a flash uploader to use in my CMS project.
I need something like this, but with a greater max upload size (it doesn't allow uploading files larger than ini_get('upload_max_filesize')).
My server doesn't allow me to override ini settings, so I'm looking for an uploader which can upload large files independently of the ini settings.
If you want to get around the ini limit, one option would be to switch to an FTP uploader.
I used net2ftp once and it was easy enough to install; I haven't used it since (almost a year and a half), but I see from their page that the project is updated and not dead, so you might give it a try.
You just download the package, place it in your webapp, customize it, and you're set.
You might want to create a dedicated FTP user with appropriate permissions, and not use the root one, of course.
You won't be able to post more data to the server than the upload_max_filesize/post_max_size ini settings allow.
As a workaround you can upload the data to Amazon S3 and sync it back via s3sync.
We have a setup with plupload in place for one of our clients and are able to upload up to 2GB per file (that's a client restriction; I don't know about S3 restrictions).
Mind you, S3 costs some money.
