Upload ZipArchive file to AWS S3 without Saving to Directory - php

My script takes the files passed into a function and combines them into a compressed file with ZipArchive, saving the archive to a directory on my server.
Then I upload the zipped file to an AWS S3 bucket and delete it from my server.
However, is there a way to hold the ZipArchive in a variable or temporary file and upload it directly to AWS, without first saving it to and then deleting it from my server?
$files = $_GET['json'];
$zipFolder = new ZipArchive;
$zipPath = "folder/compressedfile.zip";
if ($zipFolder->open($zipPath, ZipArchive::CREATE) === TRUE) {
    foreach ($files as $file) {
        $textString = $file['text'];
        $zipFolder->addFromString($file['name'] . '.txt', $textString);
    }
    // Finalize and write the archive to disk (only if it was opened successfully).
    $zipFolder->close();
}
$s3Client = new S3Client([
    'region' => '--region--',
    'version' => 'latest',
    'credentials' => [
        'key' => '--key--',
        'secret' => '--secret--',
    ],
]);
$bucketName = '--bucket--';
$result = $s3Client->putObject([
    'Bucket' => $bucketName,
    'Key' => 'compressedfile.zip',
    'SourceFile' => $zipPath,
]);
unlink($zipPath);

Yes, you can stream data from memory to an S3 object. You don't have to upload a file from disk.
Take a look at the S3 Stream Wrapper as one option.
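One way to sketch that, assuming the AWS SDK for PHP v3 (which provides registerStreamWrapper()) and reusing $files and the placeholder region/credential/bucket values from the question. Note that ZipArchive itself needs a real, seekable file, so this sketch still builds the archive briefly in the OS temp directory (not in your application's folders) and then streams it to the bucket through the s3:// wrapper:
use Aws\S3\S3Client;

$s3Client = new S3Client([
    'region' => '--region--',
    'version' => 'latest',
    'credentials' => [
        'key' => '--key--',
        'secret' => '--secret--',
    ],
]);

// Register the s3:// wrapper so ordinary PHP file functions can write to the bucket.
$s3Client->registerStreamWrapper();

// ZipArchive cannot write to a PHP stream, so build the archive in the OS temp dir.
$tmpPath = tempnam(sys_get_temp_dir(), 'zip');
$zipFolder = new ZipArchive;
if ($zipFolder->open($tmpPath, ZipArchive::OVERWRITE) === TRUE) {
    foreach ($files as $file) {
        $zipFolder->addFromString($file['name'] . '.txt', $file['text']);
    }
    $zipFolder->close();
}

// Stream the finished archive straight to the bucket, then discard the temp file.
copy($tmpPath, 's3://--bucket--/compressedfile.zip');
unlink($tmpPath);
This avoids writing and cleaning up files in your own directories; the only local footprint is the short-lived temp file that ZipArchive requires.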


Uploading file to specific folder in s3 bucket, sdk php [duplicate]

I have the code below that uploads an image to my S3 bucket, but I don't know how to tell it to upload into a specific folder (named videouploads) inside the bucket. Can anyone help?
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Instantiate an Amazon S3 client.
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'grabage',
    'credentials' => [
        'key' => 'garbage',
        'secret' => 'grabage'
    ]
]);

$bucketName = 'cgrabage';
$file_Path = __DIR__ . '/my-image.png';
$key = basename($file_Path);

// Upload a publicly accessible file. The file size and type are determined by the SDK.
try {
    $result = $s3->putObject([
        'Bucket' => $bucketName,
        'Key' => $key,
        'Body' => fopen($file_Path, 'r'),
        'ACL' => 'public-read',
    ]);
    echo $result->get('ObjectURL');
} catch (Aws\S3\Exception\S3Exception $e) {
    echo "There was an error uploading the file.\n";
    echo $e->getMessage();
}
You need to put the directory information in the key. S3 keys are flat strings, so the videouploads/ prefix is simply part of the key; it is what the console displays as a folder.
$result = $s3->putObject([
    'Bucket' => $bucketName,
    'Key' => 'videouploads/' . $key,
    'Body' => fopen($file_Path, 'r'),
    'ACL' => 'public-read',
]);

How to download an S3 object and store it on the server

I'm trying to download a private S3 object and store it on my website's server.
Here is what I'm trying:
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'ap-south-1',
    'credentials' => array(
        'key' => '*****',
        'secret' => '*******'
    )
]);

$command = $s3->getCommand('GetObject', array(
    'Bucket' => 'bucket_name',
    'Key' => 'object_name_in_s3',
    'ResponseContentDisposition' => 'attachment; filename="'.$my_file_name.'"'
));

$signedUrl = $command->createPresignedUrl('+15 minutes');
echo $signedUrl;
How can I save these files on my server?
From Get an Object Using the AWS SDK for PHP:
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';
$filepath = '*** Your File Path ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Save object to a file.
$result = $s3->getObject(array(
    'Bucket' => $bucket,
    'Key' => $keyname,
    'SaveAs' => $filepath
));
If you just want to download a file from the command line (instead of from an app), you can use the AWS Command-Line Interface (CLI); its aws s3 cp command copies objects down, e.g. aws s3 cp s3://bucket_name/object_name_in_s3 /local/path.
The pre-signed URL in your code is for granting time-limited access to a private object stored in an Amazon S3 bucket. Typically, your application generates the URL and includes it in a web page for users to click and download the object. There is no need for one on the server side, because the server already has credentials that are authorized to access the content in Amazon S3.
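As an aside, the createPresignedUrl() call in the question appears to come from an older SDK version; a minimal sketch of generating the same kind of pre-signed download URL with SDK v3, reusing the question's placeholder bucket, key, and filename, might look like this:
use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'ap-south-1',
]);

$command = $s3->getCommand('GetObject', [
    'Bucket' => 'bucket_name',
    'Key' => 'object_name_in_s3',
    'ResponseContentDisposition' => 'attachment; filename="' . $my_file_name . '"',
]);

// Time-limited URL intended for the browser, not for server-side downloads.
$request = $s3->createPresignedRequest($command, '+15 minutes');
echo (string) $request->getUri();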

Can't upload image to S3 bucket using direct URL of image

Here is my code, which works for form uploads (via $_FILES); I'm omitting that part of the code because it is irrelevant:
$file = "http://i.imgur.com/QLQjDpT.jpg";
$s3 = S3Client::factory(array(
'region' => $region,
'version' => $version
));
try {
$content_type = "image/" . $ext;
$to_send = array();
$to_send["SourceFile"] = $file;
$to_send["Bucket"] = $bucket;
$to_send["Key"] = $file_path;
$to_send["ACL"] = 'public-read';
$to_send["ContentType"] = $content_type;
// Upload a file.
$result = $s3->putObject($to_send);
As I said, this works if $file is a $_FILES["files"]["tmp_name"], but if $file is a valid image URL it fails with:
Uncaught exception 'Aws\Exception\CouldNotCreateChecksumException' with message 'A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.'
Does anyone know why this happens? What might be off? Thanks very much for your help!
For anyone looking for option #3 (CachingStream), you can pass the PutObject command a Body stream instead of a source file.
use GuzzleHttp\Psr7\Stream;
use GuzzleHttp\Psr7\CachingStream;

// ...

$s3->putObject([
    'Bucket' => $bucket,
    'Key' => $file_path,
    'Body' => new CachingStream(
        new Stream(fopen($file, 'r'))
    ),
    'ACL' => 'public-read',
    'ContentType' => $content_type,
]);
Alternatively, you can just request the file with Guzzle and pass the response body through:
$client = new GuzzleHttp\Client();
$response = $client->get($file);

$s3->putObject([
    'Bucket' => $bucket,
    'Key' => $file_path,
    'Body' => $response->getBody(),
    'ACL' => 'public-read',
    'ContentType' => $content_type,
]);
You have to download the file to the server where PHP is running first. The SourceFile parameter only accepts local files, which is why $_FILES["files"]["tmp_name"] works: it's a file that is local to the PHP server.

The last step of an AWS EC2 to S3 file upload

I have this code:
require '/home/ubuntu/vendor/autoload.php';

$sharedConfig = [
    'region' => 'us-west-2',
    'version' => 'latest'
];

$sdk = new Aws\Sdk($sharedConfig);
$s3Client = $sdk->createS3();

$result = $s3Client->putObject([
    'Bucket' => 'my-bucket',
    'Key' => $_FILES["fileToUpload"]["name"],
    'Body' => $_FILES["fileToUpload"]["tmp_name"]
]);
It basically works: it sends a file to S3. But the file apparently arrives damaged, since it always shows up as corrupted. Can anyone tell me what I am doing wrong?
To be specific, the file I am uploading is a JPG image, and when I try to view it from S3 I am told that it "cannot be displayed because it contains errors".
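A likely culprit (an assumption on my part, not something stated in the thread) is that Body is being given the temporary file's path as a plain string, so S3 stores the literal path text rather than the image bytes. A minimal sketch of the usual correction, reusing the question's form field name:
// Likely fix (assumption): upload the file's contents, not its temp path as a string.
$result = $s3Client->putObject([
    'Bucket' => 'my-bucket',
    'Key' => $_FILES["fileToUpload"]["name"],
    // Let the SDK read the uploaded file from disk...
    'SourceFile' => $_FILES["fileToUpload"]["tmp_name"],
    // ...or, equivalently, stream it yourself:
    // 'Body' => fopen($_FILES["fileToUpload"]["tmp_name"], 'r'),
]);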

Copying remote file to S3

I'm trying to transfer files from Shopify to S3, and I'm getting this error:
"You must specify a non-null value for the Body or SourceFile parameters." I believe it's because I'm transferring files from a remote location but I don't know how to get around this problem. I have validated that the remote file does exist, and that my code works fine if I'm uploading using a form.
My code:
require dirname(__FILE__).'/../../vendor/autoload.php';

use Aws\S3\S3Client;

$s3Client = null;
$bucket = "mybucketname";
$myfile = "myfilename.jpg";
$mypath = "https://shopifyfilepath/".$myfile;

function SendFileToS3($filename, $filepath, $bucket)
{
    global $s3Client;

    if (!$s3Client)
    {
        $s3Client = S3Client::factory(array(
            'key' => $_SERVER["AWS_ACCESS_KEY_ID"],
            'secret' => $_SERVER["AWS_SECRET_KEY"]
        ));
    }

    $result = $s3Client->putObject(array(
        'Bucket' => $bucket,
        'Key' => $filename,
        'SourceFile' => $filepath,
        'ACL' => 'public-read'
    ));

    return $result;
}

SendFileToS3($myfile, $mypath, $bucket);
You can't hand SourceFile an HTTP path; it only accepts local files. You'll need to download the remote file to a local file (or into a variable) first, then upload that.
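A minimal sketch of that "download first" approach, reusing the thread's variable names inside SendFileToS3() and assuming allow_url_fopen is enabled so file_get_contents() can fetch the remote URL:
// Pull the remote file into a variable, then upload those bytes as the Body.
$imageData = file_get_contents($filepath);

$result = $s3Client->putObject(array(
    'Bucket' => $bucket,
    'Key' => $filename,
    'Body' => $imageData,
    'ACL' => 'public-read'
));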
