I tried to upload a large file to Amazon S3 using PHP. I found nice solutions on various forums, but those solutions are for SDK version 1.
http://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFilePHP.html
Of course, I have found examples in the Amazon API documentation, but that example expects the file to be on the local disk and cannot handle an input stream.
I couldn't find similar examples for the SDK for PHP v2 like the one shown in the first link.
Has anyone solved a similar problem successfully?
I recently prepared a code sample for this. In this example I am using a file, but you can use a stream as well.
use Aws\S3\S3Client;
use Aws\Common\Enum\Size;

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key'    => '*** your-aws-access-key-id ***',
    'secret' => '*** your-aws-secret-key ***',
));

$file = fopen($filename, 'r');

// 1. Create a new multipart upload and get the upload ID.
$result = $s3->createMultipartUpload(array(
    'Bucket' => $bucket,
    'Key'    => $keyname,
));
$uploadId = $result['UploadId'];

// 2. Upload the data in parts.
$parts = array();
$partNumber = 1;
while (!feof($file)) {
    $result = $s3->uploadPart(array(
        'Bucket'     => $bucket,
        'Key'        => $keyname,
        'UploadId'   => $uploadId,
        'PartNumber' => $partNumber,
        'Body'       => fread($file, 5 * Size::MB),
    ));
    $parts[] = array(
        'PartNumber' => $partNumber++,
        'ETag'       => $result['ETag'],
    );
}

// 3. Complete the multipart upload.
$result = $s3->completeMultipartUpload(array(
    'Bucket'   => $bucket,
    'Key'      => $keyname,
    'UploadId' => $uploadId,
    'Parts'    => $parts,
));
$url = $result['Location'];

fclose($file);
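If your data arrives as an input stream rather than a file, the same three steps work with any readable stream resource. A minimal sketch, assuming the client sends the file as the raw request body (note that php://input is not seekable, so a failed part cannot be retried by rewinding):

// Hypothetical variant: read the upload straight from the request body
// instead of a local file, then run steps 1-3 above unchanged.
$file = fopen('php://input', 'r');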
I have this code below that should upload an image to my S3 bucket, but I don't know how to tell it to upload to a specific folder (named videouploads) in my bucket. Can anyone help?
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Instantiate an Amazon S3 client.
$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'grabage',
    'credentials' => [
        'key'    => 'garbage',
        'secret' => 'grabage'
    ]
]);

$bucketName = 'cgrabage';
$file_Path = __DIR__ . '/my-image.png';
$key = basename($file_Path);

// Upload a publicly accessible file. The file size and type are determined by the SDK.
try {
    $result = $s3->putObject([
        'Bucket' => $bucketName,
        'Key'    => $key,
        'Body'   => fopen($file_Path, 'r'),
        'ACL'    => 'public-read',
    ]);
    echo $result->get('ObjectURL');
} catch (Aws\S3\Exception\S3Exception $e) {
    echo "There was an error uploading the file.\n";
    echo $e->getMessage();
}
You need to put the directory information in the object key. S3 has no real folders; the 'videouploads/' prefix simply becomes part of the key.
$result = $s3->putObject([
    'Bucket' => $bucketName,
    'Key'    => 'videouploads/' . $key,
    'Body'   => fopen($file_Path, 'r'),
    'ACL'    => 'public-read',
]);
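As a quick sanity check (a sketch reusing $s3 and $bucketName from the question), you can list what landed under the prefix:

// List objects under the "videouploads/" prefix to confirm the upload.
$objects = $s3->listObjectsV2([
    'Bucket' => $bucketName,
    'Prefix' => 'videouploads/',
]);
foreach ($objects['Contents'] ?? [] as $object) {
    echo $object['Key'], "\n";
}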
I am successfully uploading objects to S3 using the following code snippet:
// Send a PutObject request and get the result object.
$result = $this->s3EncryptionClient->putObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions'     => $this->cipherOptions,
    'Bucket'             => $this->s3BucketName,
    'Key'                => $key,
    'ContentType'        => $mimeType,
    'ContentLength'      => filesize($filePath),
    'ContentDisposition' => "attachment; filename='" . $fileName . "'",
    'Body'               => fopen($filePath, 'r')
]);
And I can successfully download the object using the following snippet:
// Download the contents of the object.
$result = $this->s3EncryptionClient->getObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions'     => $this->cipherOptions,
    'Bucket'             => $this->s3BucketName,
    'Key'                => $key
]);
The problem comes in when I try to copy the object using the following code snippet:
$result = $this->s3Client->copyObject([
    'Bucket'     => $this->s3BucketName,
    'CopySource' => $CopySource,
    'Key'        => $dstKey,
]);
The object seems to be copied incorrectly, and when I try to download the new object using the download code I pasted earlier in this post, the object is not found and an AwsException is thrown:
resulted in a 404 Not Found response
Any idea how I can resolve the issue? Please assist.
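One thing worth checking, since the snippet doesn't show how $CopySource is built: CopySource must name the source object as "source-bucket/source-key" (URL-encoded if the key contains special characters), or the copy lands under an unexpected key and later downloads 404. A minimal sketch, where $srcKey is assumed to be the key used in the earlier putObject:

// Hypothetical sketch: build CopySource from the source bucket and key.
$result = $this->s3Client->copyObject([
    'Bucket'     => $this->s3BucketName,
    'Key'        => $dstKey,
    'CopySource' => $this->s3BucketName . '/' . $srcKey,
]);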
Here is my code, which works for form uploads (via $_FILES); I'm omitting that part of the code because it is irrelevant:
$file = "http://i.imgur.com/QLQjDpT.jpg";
$s3 = S3Client::factory(array(
'region' => $region,
'version' => $version
));
try {
$content_type = "image/" . $ext;
$to_send = array();
$to_send["SourceFile"] = $file;
$to_send["Bucket"] = $bucket;
$to_send["Key"] = $file_path;
$to_send["ACL"] = 'public-read';
$to_send["ContentType"] = $content_type;
// Upload a file.
$result = $s3->putObject($to_send);
As I said, this works if $file is a $_FILES["files"]["tmp_name"], but fails if $file is a valid image URL with:
Uncaught exception 'Aws\Exception\CouldNotCreateChecksumException' with message 'A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.'
Does anyone know why this happens? What might be off? Thanks very much for your help!
For anyone looking for option #3 (CachingStream), you can pass the PutObject command a Body stream instead of a source file.
use GuzzleHttp\Psr7\Stream;
use GuzzleHttp\Psr7\CachingStream;

...

$s3->putObject([
    'Bucket' => $bucket,
    'Key'    => $file_path,
    'Body'   => new CachingStream(
        new Stream(fopen($file, 'r'))
    ),
    'ACL'         => 'public-read',
    'ContentType' => $content_type,
]);
Alternatively, you can just request the file using Guzzle:
$client = new GuzzleHttp\Client();
$response = $client->get($file);

$s3->putObject([
    'Bucket'      => $bucket,
    'Key'         => $file_path,
    'Body'        => $response->getBody(),
    'ACL'         => 'public-read',
    'ContentType' => $content_type,
]);
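This works because Guzzle buffers the response body into a PHP temp stream by default, which is seekable, so the SDK can calculate the sha256 checksum without raising CouldNotCreateChecksumException.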
You have to download the file to the server where PHP is running first. S3 uploads via SourceFile are only for local files - which is why $_FILES["files"]["tmp_name"] works - it's a file that's local to the PHP server.
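A minimal sketch of that approach, assuming allow_url_fopen is enabled so copy() can fetch the remote URL:

// Download the remote image to a local temp file, then upload the temp file.
$tmp = tempnam(sys_get_temp_dir(), 's3-');
copy($file, $tmp);

$to_send["SourceFile"] = $tmp;
$result = $s3->putObject($to_send);

unlink($tmp); // clean up the temp file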
I'm having problems when I want to upload an image to my Amazon S3 bucket.
I'm trying to upload a jpg image with a size of 238 KB. I've put a try/catch in my code to check what the error was. I always get this error:
Your proposed upload is smaller than the minimum allowed size
I've also tried this with images of 1 MB and 2 MB; same error.
Here's my code:
<?php
// Include the SDK using the Composer autoloader
require 'AWSSDKforPHP/aws.phar';

use Aws\S3\S3Client;
use Aws\Common\Enum\Size;

$bucket = 'mybucketname';
$keyname = 'images';
$filename = 'thelinktomyimage';

// Instantiate the S3 client with your AWS credentials and desired AWS region
$client = S3Client::factory(array(
    'key'    => 'key',
    'secret' => 'secretkey',
));

// Create a new multipart upload and get the upload ID.
$response = $client->createMultipartUpload(array(
    'Bucket' => $bucket,
    'Key'    => $keyname
));
$uploadId = $response['UploadId'];

// 3. Upload the file in parts.
$file = fopen($filename, 'r');
$parts = array();
$partNumber = 1;
while (!feof($file)) {
    $result = $client->uploadPart(array(
        'Bucket'     => $bucket,
        'Key'        => $keyname,
        'UploadId'   => $uploadId,
        'PartNumber' => $partNumber,
        'Body'       => fread($file, 5 * 1024 * 1024),
    ));
    $parts[] = array(
        'PartNumber' => $partNumber++,
        'ETag'       => $result['ETag'],
    );
}

// Complete multipart upload.
try {
    $result = $client->completeMultipartUpload(array(
        'Bucket'   => $bucket,
        'Key'      => $keyname,
        'UploadId' => $uploadId,
        'Parts'    => $parts,
    ));
    $url = $result['Location'];
    fclose($file);
}
catch (Exception $e) {
    var_dump($e->getMessage());
}
(I've changed the bucket, keys, and image link.) Has anyone had this before? Searching on the internet didn't help me much, and searching for a way to change the minimum upload size didn't help either.
UPDATE:
When I tried this with a local image (changed the filename), it worked! How can I make this work with an image that's online? Right now I save it to my temp folder and then upload it from there. But isn't there a way to store it directly without saving it locally?
The minimum part size for a multipart upload is 5 MB (1). Even if your total size is 100 MB, each individual part (other than the last one) can't be smaller than 5 MB. For a file this small you probably want a "normal" upload, not a multipart upload.
(1) http://docs.aws.amazon.com/AmazonS3/latest/API/mpUploadUploadPart.html
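A minimal sketch of the "normal" upload, reusing $client, $bucket, $keyname, and $filename from the question (assuming $filename points to a local file):

// Single-request upload; the 5 MB multipart minimum does not apply here.
$result = $client->putObject(array(
    'Bucket'     => $bucket,
    'Key'        => $keyname,
    'SourceFile' => $filename,
));
$url = $result['ObjectURL'];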
This error occurs when one of the parts is less than 5MB in size and isn't the last part (the last part can be any size). fread() can return a string shorter than the specified size, so you need to keep calling fread() until you have at least 5MB of data (or you have reached the end of the file) before uploading that part.
So your 3rd step becomes:
// 3. Upload the file in parts.
$file = fopen($filename, 'r');
$parts = array();
$partNumber = 1;
while (!feof($file)) {
    // Buffer at least 5MB of data (or reach end-of-file)
    $data = '';
    $minSize = 5 * 1024 * 1024;
    while (!feof($file) && strlen($data) < $minSize) {
        $data .= fread($file, $minSize - strlen($data));
    }
    $result = $client->uploadPart(array(
        'Bucket'     => $bucket,
        'Key'        => $keyname,
        'UploadId'   => $uploadId,
        'PartNumber' => $partNumber,
        'Body'       => $data, // <= send our buffered part
    ));
    $parts[] = array(
        'PartNumber' => $partNumber++,
        'ETag'       => $result['ETag'],
    );
}
I have decided to use Amazon's new server-side encryption with S3; however, I have run into a problem which I am unable to resolve.
I am using the s3 PHP class found here : https://github.com/tpyo/amazon-s3-php-class
I had been using this code to put objects originally (and it was working):
S3::putObjectFile($file, $s3_bucket_name, $file_path, S3::ACL_PRIVATE,
    array(),
    array(
        "Content-Disposition" => "attachment; filename=$filename",
        "Content-Type" => "application/octet-stream"
    )
);
I then did as instructed here: http://docs.amazonwebservices.com/AmazonS3/latest/API/index.html?RESTObjectPUT.html and added the 'x-amz-server-side-encryption' request header, but now when I try to put an object it fails without an error.
My new code is :
S3::putObjectFile($file, $s3_bucket_name, $file_path, S3::ACL_PRIVATE,
    array(),
    array(
        "Content-Disposition" => "attachment; filename=$filename",
        "Content-Type" => "application/octet-stream",
        "x-amz-server-side-encryption" => "AES256"
    )
);
Has anybody experimented with this new feature, or can anyone see an error in the code?
Cheers.
That header should be part of the $metaHeaders array and not $requestHeaders array.
S3::putObjectFile($file, $s3_bucket_name, $file_path, S3::ACL_PRIVATE,
    array(
        "x-amz-server-side-encryption" => "AES256"
    ),
    array(
        "Content-Disposition" => "attachment; filename=$filename",
        "Content-Type" => "application/octet-stream"
    )
);
Here's the method definition from the docs:
putObject (mixed $input,
           string $bucket,
           string $uri,
           [constant $acl = S3::ACL_PRIVATE],
           [array $metaHeaders = array()],
           [array $requestHeaders = array()])
You might also consider using the official AWS SDK for PHP.
With version 1 of the SDK, we can upload files with encryption using the following code:
$s3->create_object($bucket_name, $destination, array(
    'acl'        => AmazonS3::ACL_PUBLIC,
    'fileUpload' => $file_local,
    'encryption' => "AES256"
));
And you can download the latest SDK from here.
With the official SDK:
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';
// $filepath should be an absolute path to a file on disk
$filepath = '*** Your File Path ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Upload a file with server-side encryption.
$result = $s3->putObject(array(
    'Bucket'               => $bucket,
    'Key'                  => $keyname,
    'SourceFile'           => $filepath,
    'ServerSideEncryption' => 'AES256',
));
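To verify that the encryption was applied, you can head the object and inspect the ServerSideEncryption field of the response (a quick sketch reusing the same client):

// Fetch the object's metadata; ServerSideEncryption should come back as "AES256".
$head = $s3->headObject(array(
    'Bucket' => $bucket,
    'Key'    => $keyname,
));
echo $head['ServerSideEncryption'];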
Changing Server-Side Encryption of an Existing Object (Copy Operation)
use Aws\S3\S3Client;

$sourceBucket = '*** Your Source Bucket Name ***';
$sourceKeyname = '*** Your Source Object Key ***';
$targetBucket = '*** Your Target Bucket Name ***';
$targetKeyname = '*** Your Target Object Key ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Copy an object and add server-side encryption.
$result = $s3->copyObject(array(
    'Bucket'               => $targetBucket,
    'Key'                  => $targetKeyname,
    'CopySource'           => "{$sourceBucket}/{$sourceKeyname}",
    'ServerSideEncryption' => 'AES256',
));
Source: http://docs.aws.amazon.com/AmazonS3/latest/dev/SSEUsingPHPSDK.html
With Laravel 5+, this can be done easily through the filesystems.php config; you don't need to get the driver or work with the low-level client object.
's3' => [
    'driver' => 's3',
    'key'    => "Your Key",
    'secret' => "Your Secret",
    'region' => "Bucket Region",
    'bucket' => "Bucket Name",
    'options' => [
        'ServerSideEncryption' => 'AES256',
    ],
],

// Code
$disk = Storage::disk('s3'); // via the Illuminate\Support\Facades\Storage facade
$disk->put("filename", "content", "public"); // the stored object will have AES256 applied