Here is my code:
$result = $s3->putObject(array(
    'Bucket' => $bucket,
    'Key' => 'image.jpg',
    'SourceFile' => $_FILES['image'],
    'ContentType' => 'image/jpg',
    'ACL' => 'public-read',
    'StorageClass' => 'REDUCED_REDUNDANCY'
));
As you can see, I want to pass $_FILES['image'] as the SourceFile, because that is the file I want to upload to AWS S3. How can I do this? The error I get is:
Fatal error: Uncaught RuntimeException: Unable to open Array using mode r: fopen() expects parameter 1 to be a valid path, array given in
$_FILES['image'] is an array that contains information about the file upload. You can see all the keys here: http://php.net/manual/en/features.file-upload.post-method.php
Your file is actually at $_FILES['image']['tmp_name'].
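For example, a minimal sketch based on the original call (note that the browser-supplied type in $_FILES['image']['type'] should be validated before trusting it):

// Upload the temporary file created by the form upload.
$result = $s3->putObject(array(
    'Bucket' => $bucket,
    'Key' => 'image.jpg',
    'SourceFile' => $_FILES['image']['tmp_name'],
    'ContentType' => 'image/jpeg',
    'ACL' => 'public-read',
    'StorageClass' => 'REDUCED_REDUNDANCY'
));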
I am successfully uploading objects to S3 using the following code snippet:
// Send a PutObject request and get the result object.
$result = $this->s3EncryptionClient->putObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions' => $this->cipherOptions,
    'Bucket' => $this->s3BucketName,
    'Key' => $key,
    'ContentType' => $mimeType,
    'ContentLength' => filesize($filePath),
    'ContentDisposition' => "attachment; filename='" . $fileName . "'",
    'Body' => fopen($filePath, 'r')
]);
And I can successfully download the object using the following snippet:
// Download the contents of the object.
$result = $this->s3EncryptionClient->getObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions' => $this->cipherOptions,
    'Bucket' => $this->s3BucketName,
    'Key' => $key
]);
The problem comes in when I try to copy the object using the following code snippet:
$result = $this->s3Client->copyObject([
    'Bucket' => $this->s3BucketName,
    'CopySource' => $CopySource,
    'Key' => $dstKey,
]);
The object seems to be copied incorrectly, and when I try to download the new object using the download code I pasted earlier in this post, the object is not found and an AwsException is thrown:
resulted in a 404 Not Found response
Any idea how I can resolve the issue?
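One thing worth checking (an assumption on my part, since the value of $CopySource isn't shown): CopySource must be the source bucket and the source key joined by a slash, and a 404 on the later download usually means the copy was written under a different key than the one being requested. A minimal sketch, where $srcKey is a hypothetical name for the key used in the original upload:

$result = $this->s3Client->copyObject([
    'Bucket' => $this->s3BucketName,                     // destination bucket
    'CopySource' => $this->s3BucketName . '/' . $srcKey, // "source-bucket/source-key"; URL-encode the key if it contains special characters
    'Key' => $dstKey,                                    // destination key to download later
]);

Also note that an object written through the S3EncryptionClient carries its encryption metadata with it, so the copied object must still be read back through the encryption client, as in the download snippet above.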
I am creating a gzip string and uploading it as an object to S3. However, when I download the same file from S3 and decompress it locally with gunzip, I get this error:
gunzip: 111.gz: not in gzip format
When I look at the mime_content_type of the file downloaded from S3, it is reported as application/zlib.
Here is the code I am running to generate the gzip file and push it to S3:
$content = '';
for ($i = 0; $i <= 100; $i++) {
    $content .= $i . "\n";
}
$result = $this->s3->putObject(array(
    'Bucket' => 'my-bucket-name',
    'Key' => '111.gz',
    'Body' => gzcompress($content),
    'ACL' => 'authenticated-read',
    'Metadata' => [
        'ContentType' => 'text/plain',
        'ContentEncoding' => 'gzip'
    ]
));
The strange thing is that if I view the gzip content locally before I send it to S3, I am able to decompress it and see the original string. So I must be uploading the file incorrectly. Any thoughts?
According to http://docs.aws.amazon.com/aws-sdk-php/v3/api/api-s3-2006-03-01.html#putobject, the ContentType and ContentEncoding parameters belong at the top level, not under Metadata. So your call should look like:
$result = $this->s3->putObject(array(
    'Bucket' => 'my-bucket-name',
    'Key' => '111.gz',
    'Body' => gzencode($content),
    'ACL' => 'authenticated-read',
    'ContentType' => 'text/plain',
    'ContentEncoding' => 'gzip'
));
Also, it's possible that with ContentType set to text/plain your file might be truncated wherever a null byte occurs. I would try with 'application/gzip' if you still have problems unzipping the file.
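The underlying difference between the two functions (a quick illustration, not part of the original answer): gzcompress() produces a zlib stream, while gzencode() produces a real gzip stream starting with the 1f 8b magic bytes that gunzip and mime_content_type look for:

// gzcompress() emits a zlib stream; gzencode() emits a gzip stream.
$zlib = gzcompress("hello");
$gzip = gzencode("hello");

echo bin2hex(substr($zlib, 0, 2)) . "\n"; // "789c" - a zlib header, detected as application/zlib
echo bin2hex(substr($gzip, 0, 2)) . "\n"; // "1f8b" - the gzip magic bytes that gunzip expects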
I had a very similar issue, and the only way to make it work for our file was with code like this (slightly adjusted to match your example):
$this->s3->putObject(array(
    'Bucket' => 'my-bucket-name',
    'Key' => '111.gz',
    'Body' => gzcompress($content, 9, ZLIB_ENCODING_GZIP),
    'ACL' => 'public-read',
    'ContentType' => 'text/javascript',
    'ContentEncoding' => 'gzip'
));
The relevant part is gzcompress($content, 9, ZLIB_ENCODING_GZIP): without the final ZLIB_ENCODING_GZIP parameter, AWS S3 wouldn't recognize the file or serve it in the right format.
Here is my code, which works for form uploads (via $_FILES; I'm omitting that part of the code because it is irrelevant):
$file = "http://i.imgur.com/QLQjDpT.jpg";
$s3 = S3Client::factory(array(
'region' => $region,
'version' => $version
));
try {
$content_type = "image/" . $ext;
$to_send = array();
$to_send["SourceFile"] = $file;
$to_send["Bucket"] = $bucket;
$to_send["Key"] = $file_path;
$to_send["ACL"] = 'public-read';
$to_send["ContentType"] = $content_type;
// Upload a file.
$result = $s3->putObject($to_send);
As I said, this works if $file is a $_FILES["files"]["tmp_name"], but it fails if $file is a valid image URL, with:
Uncaught exception 'Aws\Exception\CouldNotCreateChecksumException' with message 'A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.'
Does anyone know why this happens? What might be off? Thanks very much for your help!
For anyone looking for option #3 (CachingStream), you can pass the PutObject command a Body stream instead of a source file.
use GuzzleHttp\Psr7\Stream;
use GuzzleHttp\Psr7\CachingStream;

// ...

$s3->putObject([
    'Bucket' => $bucket,
    'Key' => $file_path,
    'Body' => new CachingStream(
        new Stream(fopen($file, 'r'))
    ),
    'ACL' => 'public-read',
    'ContentType' => $content_type,
]);
Alternatively, you can just request the file using Guzzle.
$client = new GuzzleHttp\Client();
$response = $client->get($file);

$s3->putObject([
    'Bucket' => $bucket,
    'Key' => $file_path,
    'Body' => $response->getBody(),
    'ACL' => 'public-read',
    'ContentType' => $content_type,
]);
You have to download the file to the server where PHP is running first. The SourceFile parameter only works with local files, which is why $_FILES["files"]["tmp_name"] works: it's a file that's local to the PHP server.
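A minimal sketch of that approach (the temporary-file handling is my own addition):

// Download the remote file to a local temporary file first.
$tmpFile = tempnam(sys_get_temp_dir(), 's3upload');
file_put_contents($tmpFile, file_get_contents($file));

$s3->putObject([
    'Bucket' => $bucket,
    'Key' => $file_path,
    'SourceFile' => $tmpFile, // now a local, seekable file
    'ACL' => 'public-read',
    'ContentType' => $content_type,
]);

unlink($tmpFile); // clean up the temporary copy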
I'm trying to upload a file to my bucket. I am able to upload with Body but not SourceFile. Here's my method:
$pathToFile = '/explicit/path/to/file.jpg';
// Upload an object by streaming the contents of a file
$result = $s3Client->putObject(array(
    'Bucket' => $bucket,
    'Key' => 'test.jpg',
    'SourceFile' => $pathToFile,
    'ACL' => 'public-read',
    'ContentType' => 'image/jpeg'
));
but I get this error:
You must specify a non-null value for the Body or SourceFile parameters.
I've tried different types of files and keep getting this error.
The issue had to do with not giving a valid path. It looks like file_exists() only checks the local filesystem, meaning the file had to live inside the local web root.
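A quick guard like this (my own addition) makes that failure mode obvious before calling putObject:

// Fail fast if the path doesn't resolve to a readable local file.
if (!is_readable($pathToFile)) {
    throw new RuntimeException('Cannot read file: ' . $pathToFile);
}

$result = $s3Client->putObject(array(
    'Bucket' => $bucket,
    'Key' => 'test.jpg',
    'SourceFile' => realpath($pathToFile),
    'ACL' => 'public-read',
    'ContentType' => 'image/jpeg'
));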
Change
'SourceFile' => $pathToFile,
to
'Body' => fopen($pathToFile, 'r'),
Note that Body expects the file contents or a stream, not a path string; passing the path itself would upload the literal string.
I tried the following two calls, and neither of them works. They upload the file to S3, but if you visit the uploaded file in a browser you can see it's treated as application/octet-stream, which is not right...
$s3->upload('mybucket',    // bucket
    $filename,             // key
    $imagebinarydata,      // body
    'public-read',         // acl
    array('contentType' => 'image/jpeg'));  // options
And
$s3->putObject(array(
    'Bucket' => 'mybucket',
    'Key' => $filename,
    'ACL' => 'public-read',
    'contentType' => 'image/jpeg',
    'Body' => $imagebinarydata));
I'm using the latest aws.phar.
You need to use uppercase ContentType with putObject as described in the docs:
'ContentType' => 'image/jpeg'
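Applied to the putObject call above, the corrected call would look like this:

$s3->putObject(array(
    'Bucket' => 'mybucket',
    'Key' => $filename,
    'ACL' => 'public-read',
    'ContentType' => 'image/jpeg', // capitalized, matching the documented parameter name
    'Body' => $imagebinarydata));

If you use the upload() helper instead, note that (in SDK v2, as far as I can tell) extra request parameters go under the 'params' option key, e.g. array('params' => array('ContentType' => 'image/jpeg')), rather than directly in the options array.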