Uploading multiple files to Amazon S3 - php

I need help with uploading multiple files to Amazon S3. Basically, I have three file input fields: two of them take 10-20 pictures each, and the last one takes a single image. All of them should be uploaded to Amazon S3 when the form is submitted.
The form that I'm using for uploading images:
I already have a bucket and everything set up; what I need is some kind of solution for uploading multiple images to Amazon S3.
I'm using PHP on the backend, and for now the images are stored on my hosting when the form is submitted. But I will have more than 150 GB of images uploaded every month, so I need S3 to host those images.
When I connect the form to Amazon S3 and try to upload more than one image, I get the error "POST requires exactly one file upload per request.".

Here is Node.js code that should give you an idea of how to upload all the files and then send a response back to the UI when the upload is complete.
I am using promises and Promise.all(), which resolves once every individual upload promise has resolved.
I am also using Multer for Node.js, which handles the files received from the UI.
// Setup assumed by the snippet: Express, Multer with in-memory storage, and the AWS SDK
var express = require('express');
var multer = require('multer');
var AWS = require('aws-sdk');

var app = express();
var upload = multer(); // memory storage, so each file arrives as a Buffer
var s3 = new AWS.S3();

app.post('/uploadMultipleFiles', upload.array('file', 10), function (req, res) {
    var promises = [];
    for (var i = 0; i < req.files.length; i++) {
        var file = req.files[i];
        promises.push(uploadLoadToS3(file));
    }
    // Answer the UI only after every upload has finished
    Promise.all(promises).then(function (data) {
        res.send('Uploaded');
    }).catch(function (err) {
        res.status(500).send(err.stack);
    });
});

function uploadLoadToS3(ObjFile) {
    var params = {
        ACL: 'public-read',
        Body: ObjFile.buffer, // Multer's memory storage already gives us a Buffer
        Bucket: 'ascendon1',
        ContentType: ObjFile.mimetype,
        Key: ObjFile.originalname
    };
    return s3.upload(params).promise();
}

S3 is highly scalable and distributed storage.
If you have those images locally on your machine, you can simply use
aws s3 sync local_folder s3://bucket_name/
https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
The CLI takes care of syncing the data.
You can also configure how much parallelism you want in the CLI through its configuration settings.
https://docs.aws.amazon.com/cli/latest/topic/s3-config.html
You can also do this programmatically if it is going to be a continuous data movement.
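A minimal sketch of the programmatic route with the AWS SDK for PHP v3 (installed via Composer); the bucket name, region, and local folder below are placeholders, not values from the question:

<?php
// Sketch only: recursively upload a local folder to S3, similar to "aws s3 sync".
// Bucket name, region, and folder path are assumptions for illustration.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Transfer;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// Transfer mirrors the local directory to the given S3 prefix,
// uploading the files concurrently.
$transfer = new Transfer($s3, './images', 's3://my-image-bucket/images');
$transfer->transfer();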
EDIT1:
Only one file can be uploaded from the UI to S3 per request (which is what the error message is telling you).
You can sequence the uploads via JavaScript and send them one at a time.
If you want to handle it on the backend instead, you can do so:
https://w3lessons.info/2013/09/06/jquery-multiple-file-upload-to-amazon-s3-using-php/
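A rough sketch of that backend approach in PHP, assuming the files arrive in a multipart field named images[] and the AWS SDK for PHP v3 is available; the bucket name is a placeholder:

<?php
// Sketch only: loop over the uploaded files and push each one to S3.
// The field name "images" and bucket "my-image-bucket" are assumptions.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

foreach ($_FILES['images']['tmp_name'] as $i => $tmpPath) {
    if ($_FILES['images']['error'][$i] !== UPLOAD_ERR_OK) {
        continue; // skip files that failed to reach the web server
    }
    $s3->putObject([
        'Bucket'      => 'my-image-bucket',
        'Key'         => basename($_FILES['images']['name'][$i]),
        'SourceFile'  => $tmpPath,
        'ContentType' => $_FILES['images']['type'][$i],
        'ACL'         => 'public-read',
    ]);
}
echo 'Uploaded';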
Hope it helps.

Related

how to upload multiple files like a bulk insert on google cloud storage

I have the following situation: I'm using Laravel and Google Cloud Storage, so basically I have my own custom filesystem disk attached to Google Cloud Storage. My code looks like this:
try {
    $disk = Storage::disk('gcs');
    $disk->put('/service_provider/transports/tech_passports', $file1);
    $disk->put('/service_provider/transports/tech_passports', $file2);
    $disk->put('/service_provider/transports/pictures', $file3);
    $disk->put('/service_provider/transports/pictures', $file4);
    $disk->put('/service_provider/transports/pictures', $file5);
    $disk->put('/service_provider/transports/pictures', $file6);
} catch (\Exception $e) {
}
Now I'm worried: what if the first three files get uploaded and the fourth one throws an error? Execution will jump to the catch block and, in the end, only three files will have been uploaded, which is not what I need.
What I need is for either all of the files to be uploaded or none of them, just like atomicity. How do I achieve that?
To be sure that all your files get uploaded, try using resumable uploads on each of your individual files, with the following steps:
Initiate upload > Process response > Upload the file > Check upload status > Process response > Resume the upload
Retry the last three steps to resume the upload as many times as necessary, or implement another solution (for example, delete all files already uploaded) if that is what your application needs.
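If you go the delete-on-failure route, here is a minimal Laravel sketch; it simplifies the question's two directories down to one, uses putFile so the stored path is returned, and the rollback logic is an assumption rather than an official atomic API:

$disk = Storage::disk('gcs');
$uploaded = [];

try {
    // Remember each stored path so the uploads can be undone on failure.
    foreach ([$file1, $file2, $file3, $file4, $file5, $file6] as $file) {
        $uploaded[] = $disk->putFile('/service_provider/transports/pictures', $file);
    }
} catch (\Exception $e) {
    // Roll back: delete everything that made it up before the error.
    $disk->delete($uploaded);
    throw $e;
}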
You can also check the Google Cloud Storage API PHP resumable behavior in the example from another question on Stack Overflow.
Please note that the Cloud SDK utility gsutil also has resumable uploads implemented; it can be used to synchronize the content of two buckets/directories.

API-centric web application file upload

I am creating an API-centric web application using PHP. I have read a lot of articles on API-centric architecture and design, but I have a problem related to file uploads.
Suppose I have an image and I want to upload it to the API server. How should I upload that image, and how do I then receive the link to it?
Here is how I want to create it now:
Select an image using <input type="file"> on the client www.domain.com
Upload it to www.domain.com using POST with multipart/form-data
Send this image with a PUT/POST API call to api.domain.com
api.domain.com will save this image to another server like static.domain.com and will store the image's id in the database
Then, when I need this image, I can use a GET API call to api.domain.com and receive the image's URL (something like static.domain.com/image.jpg)
Additional questions:
Is this approach the right one, or am I doing it completely wrong?
Will I need an additional server to store uploaded files if my application is small, or can I store files right on the API server?
If I store images on the same server as the API server, won't it be strange if image URLs look like api.domain.com/image.jpg?
P.S.: We can skip a lot of API-related things, as I only need an idea of how to deal with file uploads.
You haven't really said what kind of API you are going to be implementing here, so I assume it is just a RESTful API.
Is this approach the right one, or am I doing it completely wrong?
No, I wouldn't say you're doing it wrong. You would essentially send the file using POST.
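A minimal sketch of that step in PHP, assuming the frontend on www.domain.com receives the multipart upload and forwards it with cURL to a hypothetical api.domain.com/images endpoint (the endpoint and field names are illustrative):

<?php
// Sketch only: forward a file uploaded to www.domain.com on to the API server.
// The endpoint URL and the "image" field name are assumptions for illustration.
$tmpPath  = $_FILES['image']['tmp_name'];
$mimeType = $_FILES['image']['type'];
$fileName = $_FILES['image']['name'];

$ch = curl_init('https://api.domain.com/images');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => [
        'image' => new CURLFile($tmpPath, $mimeType, $fileName),
    ],
]);

$response = curl_exec($ch); // e.g. JSON containing the stored image's id and URL
curl_close($ch);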
Will I need an additional server to store uploaded files if my application is small, or can I store files right on the API server?
Yes, you can store the files on the same server, I don't see why not. I doubt that you will use a lot of storage if the application is small.
If I store images on the same server as the API server, won't it be strange if image URLs look like api.domain.com/image.jpg?
api.domain.com/image.jpg is technically just the URL that you connect to the API with to GET/POST data; it does not mean the file itself has to live at that URL. The API could return something like:
{
    "type": "IMG",
    "id": "1",
    "url": "example.com/uploads/image.jpg"
}
I hope this helps, even a little!

retrieve image in post with codeigniter

I am developing the server-side PHP code of a mobile application using CodeIgniter. I would like to allow users to upload images to Amazon S3 and then let them retrieve the Amazon CloudFront URL of the image they uploaded. However, I am stuck on the image-uploading part of the code. I currently get POST input parameters the following way:
$userName = $this->input->post('userName');
if (!empty($userName))
{
    ...
}
When I search on Google, all I find are scripts for uploading images from multipart forms, or forms in general: like in this link, for example.
We do not use any form at this point in our app. Is there any other way to do this in PHP + CodeIgniter? Is there a similar CI way to get images from POST?
EDIT: We are also thinking of uploading the images directly from our application to Amazon S3 and then storing the Amazon S3 URL in our database, which would also reduce traffic to our server.
Any thoughts?
Check out CodeIgniter's file upload class.
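A minimal sketch of that class in a controller, assuming the mobile app sends the image as a multipart field named image (no HTML form is required, a multipart POST body works the same way); the upload path and allowed types are illustrative:

<?php
// Sketch only: receive a multipart "image" field with CodeIgniter's Upload library.
// The field name, upload path, and allowed types are assumptions for illustration.
class Images extends CI_Controller {

    public function upload()
    {
        $config = [
            'upload_path'   => './tmp/',
            'allowed_types' => 'jpg|jpeg|png|gif',
            'encrypt_name'  => TRUE,
        ];
        $this->load->library('upload', $config);

        if (!$this->upload->do_upload('image')) {
            echo json_encode(['error' => $this->upload->display_errors('', '')]);
            return;
        }

        $data = $this->upload->data();
        // $data['full_path'] now points at the file on disk; from here you could
        // push it to S3 and store the resulting URL in your database.
        echo json_encode(['file' => $data['file_name']]);
    }
}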

Display images stored in AWS S3 on website with php

I'm building a MySQL database-driven website on an AWS EC2 instance. Users can submit their data and we will automatically create a web page for them. The web page will display their submitted data, including a photo. Upon the original submission of the user's data, we store the photo in S3 using AWSSDKforPHP. When their information is approved by an administrator, a web page is created via a PHP script.
I've tried the method of generating a signed URL. This, however, requires an expiration time. Is there a way around this? It also includes your access and secret key in the request, and I'm not sure that is the most secure way to do things. I'd like the method to be as secure as possible, so I don't want to make the bucket public, unless there is a way to make the stored images available for viewing only.
What's the best method to use in a situation like this? Thanks for your help!!
Basically, URLs to Amazon S3 objects look like s3.amazonaws.com/bucket_name/key_name. All you need to do is make sure the content in those buckets is publicly readable and that the MIME type for those keys is indeed an image type (image/jpeg or image/png).
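A small sketch of that upload with the AWS SDK for PHP (the newer v3 S3Client here, which differs from the AWSSDKforPHP API mentioned in the question); the bucket name and key are placeholders:

<?php
// Sketch only: store a photo as a publicly readable object with an image MIME type,
// then build the plain https URL for the generated web page. Names are assumptions.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

$bucket = 'my-photo-bucket';
$key    = 'users/123/photo.jpg';

$s3->putObject([
    'Bucket'      => $bucket,
    'Key'         => $key,
    'SourceFile'  => '/tmp/photo.jpg',
    'ContentType' => 'image/jpeg',
    'ACL'         => 'public-read', // object is viewable without a signed URL
]);

// No expiration needed for a public-read object:
$url = "https://{$bucket}.s3.amazonaws.com/{$key}";
echo "<img src=\"{$url}\" alt=\"user photo\">";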
I used this web tool to manage my site. I opened my bucket just to its IP, and now I can control the restrictions from the tool; for the basic viewer I gave read access only.

1 simultaneous download + force waiting time + Amazon S3

I must say this is the first time I've asked anything here, and I'm not a developer, so please be patient with my lack of knowledge. This requirement is for a website I am creating with some friends, so it's not that I'm making money with this.
This is the problem: I want to implement some kind of restriction to downloads, very much in the same way Rapidshare or any other file sharing service does:
The user should be able to download only 1 file simultaneously
The user should wait before being able to download another file, let's say 2 hours.
However, I am not trying to create a file sharing website. I am going to upload all the files to Amazon S3, and the only thing I need is to be able to restrict the downloads. I will create the links to the files. I don't care if users are registered or not, they should be able to download anyway.
The website is built in Joomla!, which uses Apache + MySQL. The files would be located at Amazon's servers.
My question is the following. Is there any way to implement this in a not-so-extremely-complicated way? Do you know some script or web-service that could help me get this done?
I have looked around, but the only thing I've found are Payment gateways, and we don't plan to charge for downloads.
Any help will be much appreciated.
Thanks!
UPDATE: I solved this problem using this script: http://www.vibralogix.com/linklokurl/features.php
As far as I know, there is no way to check on the current status of a download from S3. Having said that, S3 really does have plenty of bandwidth available, so I wouldn't worry too much about overloading their servers :) Just last week, Amazon announced that S3 is now serving an average of 650,000 objects / second.
If you want to implement something like @Pushpesh's solution in PHP, one approach would be to use the AWS SDK for PHP and do something like this:
<?php
# Generate a pre-signed S3 URL to download an S3 object from
# Include the AWS SDK for PHP (v1.x) and create the S3 client
require_once("./aws-sdk/sdk.class.php");
$s3 = new AmazonS3();

# Let S3 know which file we want to be downloading
$s3_bucket_name  = "yours3bucketname";
$s3_object_path  = "folder1/object1.zip";
$s3_url_lifetime = "10 minutes";
$filename        = "download.zip";

# Check whether the user has already downloaded a file in the last two hours
# (placeholder: plug your own database/session check in here)
$user_can_download = true;

if ($user_can_download) {
    $s3_url = $s3->get_object_url($s3_bucket_name, $s3_object_path, $s3_url_lifetime, array(
        'response' => array(
            'content-type'        => 'application/force-download',
            // Double quotes so {$filename} is actually interpolated
            'content-disposition' => "attachment; filename={$filename}"
        )
    ));
    header("Location: {$s3_url}");
}
else {
    echo "Sorry, you need to wait a bit longer before you can download a file again...";
}
?>
This uses the get_object_url function, which generates pre-signed URLs that allow you to let others download files you've set to private in S3 without making these files publicly available.
As you can see, the link this generates will only be valid for 10 minutes, and it's a unique link. So you can safely let people download from this link without having to worry about people spreading the link: the link will have expired. The only way people can get a new, valid link is to go through your download script, which will refuse to generate a new link if the IP/user that is trying to initiate a download has already exceeded their usage limit. It's important that you set these files to private in S3, though: if you make them publicly available, this won't do much good. You probably also want to take a look at the docs for the S3 API that generates these pre-signed URLs.
Only two ways come to my mind: you either copy the file under a unique hash and let Apache serve it, but then you don't have any control over when the user actually starts or finishes the download (useful for big files), or you pass it through PHP. Still, you would need to kill the download session in case the user stops the download.
If there is no plugin for that, you won't be able to do it the easy way by adding a script or copy-and-pasting "some" code.
So either hire somebody, or you'll need to learn what it takes to approach the task on your own, in other words learn how to program. Your question already contains the steps you need to implement: record who is downloading what and when, and keep track of the status of the download.
I have not tried to track a download status before, and I'm not sure if it is possible to get the status directly from the server that is sending the file. But you can get it from the client: Download Status with PHP and JavaScript.
I'm furthermore not sure if this will work properly in your scenario, because the file will come from S3.
S3 itself has a so-called "query string authentication" feature:
With query string authentication, you have the ability to share Amazon S3 objects through URLs that are valid for a predefined expiration time.
So you need to look up the S3 API to figure out how to implement this.
What you could try is to send an AJAX request to your server when the user clicks the download link, send back the Amazon S3 link your server generated as the response, and have the client-side JavaScript trigger the file download from there.
You can monitor user downloads by their IP address: store it in a database along with the time at which the user downloaded and the session ID (with hashes, of course), and check this before each download request. If the current time is less than 2 hours after the last download within the same session, block the request; otherwise, allow the download (a sketch of this check follows the table below).
Table Structure:
Id  Ip_Addr        Session_Hash  Timestamp
1   10.123.13.67   sdhiehh_09#   1978478656
2   78.86.56.11    mkdhfBH&^#    1282973867
3   54.112.192.10  _)*knbh       1445465565
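A minimal sketch of that check in PHP with PDO, assuming a downloads table shaped like the one above (the table, column, and credential names are illustrative) and that the pre-signed S3 URL is generated elsewhere, e.g. as in the earlier answer:

<?php
// Sketch only: allow a new download only if this IP/session has not downloaded
// anything in the last two hours. Table and column names are assumptions.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'db_user', 'db_pass');

$ip          = $_SERVER['REMOTE_ADDR'];
$sessionHash = hash('sha256', session_id());

$stmt = $pdo->prepare(
    'SELECT MAX(Timestamp) FROM downloads WHERE Ip_Addr = ? OR Session_Hash = ?'
);
$stmt->execute([$ip, $sessionHash]);
$lastDownload = (int) $stmt->fetchColumn();

if (time() - $lastDownload < 2 * 60 * 60) {
    http_response_code(429);
    exit('Sorry, you need to wait a bit longer before you can download a file again...');
}

// Record this download, then redirect the user to the pre-signed S3 URL.
$pdo->prepare('INSERT INTO downloads (Ip_Addr, Session_Hash, Timestamp) VALUES (?, ?, ?)')
    ->execute([$ip, $sessionHash, time()]);

header("Location: {$s3_url}"); // $s3_url generated as in the answer above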
This is a very basic implementation. I'm sure more robust methods exist. Hope this helps.
