I'm trying to transfer files from Shopify to S3, and I'm getting this error:
"You must specify a non-null value for the Body or SourceFile parameters." I believe it's because I'm transferring files from a remote location but I don't know how to get around this problem. I have validated that the remote file does exist, and that my code works fine if I'm uploading using a form.
My code:
<?php
require dirname(__FILE__).'/../../vendor/autoload.php';

use Aws\S3\S3Client;

$s3Client = null;
$bucket = "mybucketname";
$myfile = "myfilename.jpg";
$mypath = "https://shopifyfilepath/".$myfile;

function SendFileToS3($filename, $filepath, $bucket)
{
    global $s3Client;
    if (!$s3Client) {
        $s3Client = S3Client::factory(array(
            'key'    => $_SERVER["AWS_ACCESS_KEY_ID"],
            'secret' => $_SERVER["AWS_SECRET_KEY"]
        ));
    }
    $result = $s3Client->putObject(array(
        'Bucket'     => $bucket,
        'Key'        => $filename,
        'SourceFile' => $filepath,
        'ACL'        => 'public-read'
    ));
    return $result;
}
SendFileToS3($myfile, $mypath, $bucket);
You can't transfer a file directly from an HTTP path to S3. You'll need to download it to a local file (or variable) first, then transfer that.
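A minimal sketch of that approach, reusing the SDK v2-style client from the question (the bucket, file name, and URL are the question's placeholders): download the remote file into a variable, then pass the bytes via Body instead of SourceFile.

```php
<?php
require dirname(__FILE__).'/../../vendor/autoload.php';

use Aws\S3\S3Client;

// Placeholder names from the question; substitute your own.
$bucket = 'mybucketname';
$myfile = 'myfilename.jpg';
$mypath = 'https://shopifyfilepath/'.$myfile;

$s3Client = S3Client::factory(array(
    'key'    => $_SERVER['AWS_ACCESS_KEY_ID'],
    'secret' => $_SERVER['AWS_SECRET_KEY'],
));

// Download the remote file into memory first...
$contents = file_get_contents($mypath);
if ($contents === false) {
    die('Could not download '.$mypath);
}

// ...then upload the bytes with Body (SourceFile only accepts local paths).
$s3Client->putObject(array(
    'Bucket' => $bucket,
    'Key'    => $myfile,
    'Body'   => $contents,
    'ACL'    => 'public-read',
));
```

For large files you'd want to avoid holding the whole body in memory — write it to a temp file with copy() and use SourceFile instead.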
Here is my code, which transfers my CSV to S3 using PHP. It works if I run it locally, but not if I run it on my server.
<?php
// Include the AWS SDK using the Composer autoloader.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-2'
]);

$bucket = 'edtopia';
$keyname = 'sample_1.csv';

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key'    => $keyname,
        'Body'   => fopen('numberclub.org/edtopiadb/df_corr.csv', 'r'),
        'ACL'    => 'public-read'
    ));
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
EDIT: This is now working for me on the server. My remaining challenge is to take a file from my local system and send it to S3 using this PHP script, which runs on the server — that is, the Body would have to come from a file path on my local machine. What is an effective way to achieve this?
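Since the script runs on the server but the file lives on the client machine, the usual pattern is a browser upload: an HTML form posts the file, PHP receives it as a server-side temp file in $_FILES, and that temp path is a valid local path for the SDK. A sketch, reusing the client setup above (the field name "csvfile" is a placeholder):

```php
<?php
// Companion form (served as a separate page):
// <form action="upload.php" method="post" enctype="multipart/form-data">
//   <input type="file" name="csvfile">
//   <input type="submit" value="Upload">
// </form>

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-2'
]);

if (isset($_FILES['csvfile']) && $_FILES['csvfile']['error'] === UPLOAD_ERR_OK) {
    try {
        // PHP has already copied the browser upload to a temp file on the
        // server, so tmp_name is a local path that fopen() can read.
        $result = $s3->putObject([
            'Bucket' => 'edtopia',
            'Key'    => basename($_FILES['csvfile']['name']),
            'Body'   => fopen($_FILES['csvfile']['tmp_name'], 'r'),
            'ACL'    => 'public-read',
        ]);
        echo $result['ObjectURL'] . "\n";
    } catch (S3Exception $e) {
        echo $e->getMessage() . "\n";
    }
}
```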
Here is my code, which works for form uploads (via $_FILES). I'm omitting that part of the code because it is irrelevant:
$file = "http://i.imgur.com/QLQjDpT.jpg";

$s3 = S3Client::factory(array(
    'region'  => $region,
    'version' => $version
));

try {
    $content_type = "image/" . $ext;
    $to_send = array();
    $to_send["SourceFile"]  = $file;
    $to_send["Bucket"]      = $bucket;
    $to_send["Key"]         = $file_path;
    $to_send["ACL"]         = 'public-read';
    $to_send["ContentType"] = $content_type;
    // Upload a file.
    $result = $s3->putObject($to_send);
As I said, this works if $file is a $_FILES["files"]["tmp_name"], but if $file is a valid image URL it fails with:

Uncaught exception 'Aws\Exception\CouldNotCreateChecksumException' with message 'A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.'

Does anyone know why this happens? What might be off? Thanks very much for your help!
For anyone looking for option #3 (CachingStream), you can pass the PutObject command a Body stream instead of a source file.
use GuzzleHttp\Psr7\Stream;
use GuzzleHttp\Psr7\CachingStream;
...
$s3->putObject([
    'Bucket' => $bucket,
    'Key'    => $file_path,
    'Body'   => new CachingStream(
        new Stream(fopen($file, 'r'))
    ),
    'ACL'         => 'public-read',
    'ContentType' => $content_type,
]);
Alternatively, you can just request the file using guzzle.
$client = new GuzzleHttp\Client();
$response = $client->get($file);

$s3->putObject([
    'Bucket'      => $bucket,
    'Key'         => $file_path,
    'Body'        => $response->getBody(),
    'ACL'         => 'public-read',
    'ContentType' => $content_type,
]);
You have to download the file to the server where PHP is running first. S3 uploads are only for local files - which is why $_FILES["files"]["tmp_name"] works - it's a file that's local to the PHP server.
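A sketch of that download-first approach, reusing the variables from the question's snippet ($region, $version, $bucket, $file_path, $content_type are assumed to be defined as above): fetch the remote URL into a server-side temp file, then hand that local path to SourceFile.

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$file = "http://i.imgur.com/QLQjDpT.jpg"; // remote URL from the question

// Download to a local temp file first; SourceFile needs a local path.
$tmp = tempnam(sys_get_temp_dir(), 's3upload');
$contents = file_get_contents($file);
if ($contents === false || file_put_contents($tmp, $contents) === false) {
    die('Download failed');
}

$s3 = S3Client::factory(array(
    'region'  => $region,
    'version' => $version
));

$s3->putObject(array(
    'Bucket'      => $bucket,
    'Key'         => $file_path,
    'SourceFile'  => $tmp, // now a seekable local file, so no checksum error
    'ACL'         => 'public-read',
    'ContentType' => $content_type,
));

unlink($tmp); // clean up the temp copy
```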
I'm trying to upload an image to an AWS S3 bucket that I have created, so I copied and pasted the standard code from the AWS documentation, but I get a completely blank page. Even if I put an echo 'hi' before any statements, it doesn't show up. Can anyone please help shed light on this?
<?php
// this is the path to my S3 resource
use PHP\resources\aws\Aws\S3\S3Client;

$bucket = 'i put my bucket name here';
$keyname = 'yoyo';
// $filepath should be absolute path to a file on disk
$filepath = 'images/verified.png';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'profile' => 'default',
));

// the contents of my credentials.ini file are
// [default]
// aws_access_key_id = myaccessidhere
// aws_secret_access_key = mysecretkeyhere

// Upload a file.
$result = $s3->putObject(array(
    'Bucket'       => $bucket,
    'Key'          => $keyname,
    'SourceFile'   => $filepath,
    'ACL'          => 'public-read',
    'StorageClass' => 'REDUCED_REDUNDANCY',
));

echo $result['ObjectURL'];
echo 'done';
?>
Does anyone know why this code doesn't work at all? I would really appreciate any help offered, thank you!
I'm trying to upload a file to my bucket. I am able to upload with Body but not SourceFile. Here's my method:
$pathToFile = '/explicit/path/to/file.jpg';
// Upload an object by streaming the contents of a file
$result = $s3Client->putObject(array(
    'Bucket'      => $bucket,
    'Key'         => 'test.jpg',
    'SourceFile'  => $pathToFile,
    'ACL'         => 'public-read',
    'ContentType' => 'image/jpeg'
));
but I get this error:
You must specify a non-null value for the Body or SourceFile parameters.
I've tried different types of files and keep getting this error.
The issue had to do with not giving a valid path. It looks like file_exists() only checks the local filesystem, meaning the file had to be within localhost's web root.
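Along those lines, it can help to verify the path locally before calling putObject at all, so a bad path fails with a clear message instead of an SDK error (isUploadableLocalFile is a hypothetical helper name, not part of the SDK):

```php
<?php
// Hypothetical helper: returns true only when $path is an existing,
// readable regular file on the local filesystem. URLs, directories,
// and non-string values all return false.
function isUploadableLocalFile($path)
{
    return is_string($path) && is_file($path) && is_readable($path);
}
```

Used as a guard before the upload, e.g. `if (!isUploadableLocalFile($pathToFile)) { die("Not a readable local file: $pathToFile"); }`.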
Change
'SourceFile' => $pathToFile,
to
'Body' => fopen($pathToFile, 'r'),
I've been trying to figure out how to push an image to an S3 bucket using the new PHP 2.0 SDK. All I've found are tutorials for how to upload an image from your web server rather than from a local computer.
Here is the code that I'm using
$result = $s3->putObject(array(
    'Bucket'       => $bucketname,
    'Key'          => $filename,
    'SourceFile'   => $path,
    'ACL'          => 'public-read',
    'ContentType'  => 'image/jpeg',
    'StorageClass' => 'STANDARD'
));
$filename is just the file name that I want to appear in the bucket, and $path is the local path to the file on my computer. This puts a file onto the bucket, but when I try to open the image, it just shows up as an empty screen with the no-image thumbnail. I checked, and it seems to be uploading only about 30 bytes. Can someone please point me in the right direction?
In order to upload, you need to specify the Body as well. If you're uploading from your computer, this would be the code:
$s3 = $aws->get('s3');

$filename  = $_FILES['file']['name'];
$tmpFile   = $_FILES['file']['tmp_name'];
$imageType = $_FILES['file']['type'];

// Upload a publicly accessible file.
// The file size, file type, and MD5 hash are automatically calculated by the SDK.
try {
    $s3->putObject(array(
        'Bucket'       => $bucketname,
        'Key'          => $filename,
        'Body'         => fopen($tmpFile, 'r'),
        'ACL'          => 'public-read',
        'ContentType'  => $imageType,
        'StorageClass' => 'STANDARD'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
where $tmpFile is the path to your file on your computer. I'm using an upload mechanism, hence the temp file, but you can just use a static path.
I had the same problem, and Shahin's answer is correct: you can set
'Body' => fopen($tmpFile, 'r'),
However when I did this on my WAMP localhost, I got the error
Warning: fopen(C:\Windows\Temp\phpBDF3.tmp): failed to open stream: No such file or directory
This seems to be a Windows permissions issue, and you can resolve it by copying the Windows temp file to another temp file in a directory where there are no permission problems (e.g. the web root):
// $tmpFile was 'C:\Windows\Temp\php8A16.tmp';
// copy the Windows temp file to a new file in the web root
$new_temp_location = basename($tmpFile);
copy($tmpFile, $new_temp_location);

// now put the file in the bucket
try {
    $s3->putObject(array(
        'Bucket'       => $bucketname,
        'Key'          => $filename,
        'Body'         => fopen($new_temp_location, 'r'),
        'ACL'          => 'public-read',
        'ContentType'  => $imageType,
        'StorageClass' => 'STANDARD'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
It worked for me - hope it saves someone else some time...
EDIT:
Or slightly simpler, you can use
'SourceFile' => $new_temp_location,
instead of
'Body' => fopen($new_temp_location, 'r'),