Uploading image to S3 bucket using AWS SDK for PHP - php

I am trying to move a file from my PHP code to an S3 bucket using this code:
$bucket = "bucket-name";
$s3 = S3Client::factory(array(
'key' => 'xxxxxx',
'secret' => 'xxxxxxxxxxxxxxxxxxxxxxxxxxxx'
));
$baseImageUrl = "http://www.xxxxx.com/banner_mobile.jpg";
$keyname = "31.jpg";
try {
// Upload data.
$result = $s3->putObject(array(
'Bucket' => $bucket,
'Key' => $keyname,
'Body' => $baseImageUrl,
'ContentType' => 'image/jpg',
'ACL' => 'public-read'
));
// Print the URL to the object.
echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
echo $e->getMessage() . "\n";
} catch(Exception $ex){
echo $ex->getMessage()."\n";
}
An empty (or corrupted) file gets uploaded to the S3 bucket. The actual size of the image is 37KB, but the uploaded file is only 40B. What is the reason for this?

It seems you're passing the URL string as the Body of the object, so what ends up in S3 is a small text file containing that URL (hence the 40B). If you want to upload the actual image, you need to give putObject either the path to a local file (SourceFile) or the file's contents (Body).
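A minimal sketch of that second option, assuming allow_url_fopen is enabled on the server and reusing the $s3 client, $bucket, $keyname, and $baseImageUrl from the question:
// Download the remote image into memory first, then upload the raw bytes as the object body.
$imageData = file_get_contents($baseImageUrl);
if ($imageData === false) {
    die("Could not download {$baseImageUrl}\n");
}
$result = $s3->putObject(array(
    'Bucket'      => $bucket,
    'Key'         => $keyname,
    'Body'        => $imageData,   // actual image bytes, not the URL string
    'ContentType' => 'image/jpeg',
    'ACL'         => 'public-read'
));
echo $result['ObjectURL'] . "\n";
With that change the uploaded object should match the 37KB source image instead of the 40B URL string.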

Related

AWS downloading images/documents using PHP

I hope you are doing well.
My issue with AWS S3 is that when I download images (png, jpg, jpeg) and documents (txt, doc, docx) from the bucket, the files are corrupted. I can't open them, even though I can see them in the S3 console and they open fine when I download them manually from there.
The only type I can upload and download correctly is PDF. The other types upload correctly, but the issue happens when I download them.
Here is my code for download:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// AWS Info
$bucketName = 'some text';
$IAM_KEY = 'some text';
$IAM_SECRET = 'some text';

try {
    // You may need to change the region. It is shown in the URL when the bucket is open
    // and on creation.
    $s3 = S3Client::factory(
        array(
            'credentials' => array(
                'key'    => $IAM_KEY,
                'secret' => $IAM_SECRET
            ),
            'version'   => '2006-03-01',
            'region'    => 'us-east-2',
            'signature' => 'v4'
        )
    );
} catch (Exception $e) {
    // We use a die, so if this fails it stops here. Typically this is a REST call, so this would
    // return a JSON object.
    die("Error: " . $e->getMessage());
}

$keyName = 'filepath';

// Fetch it from S3
try {
    // Download the object:
    $result = $s3->getObject(
        array(
            'Bucket' => $bucketName,
            'Key'    => $keyName
        )
    );
    header("Content-Type: {$result['ContentType']}");
    header('Content-Disposition: filename="' . basename($keyName) . '"');
    echo $result['Body'];
} catch (Exception $e) {
    die("Error: " . $e->getMessage());
}

AWS uploading image results in uploading empty file

I am trying to upload an image to AWS after processing it with PHP. When I upload an image that is already in a folder on my server, it uploads with no issue. However, when I add the same upload code to the script that processes the uploaded image (which works fine saving images locally, creating cropped versions, and saving their links to the local DB), resizes it, and rotates it correctly after reading the EXIF data, I only upload an empty file (correctly named) that does not look like an image (downloading it from AWS I get an error message that the image is broken), and on inspection it is usually only around 76B. What gives? Does that mean the script needs some time to finish the resizing/rotating, and therefore it uploads a half-ready image? I do not get any error messages or warnings back.
My script for the local file upload:
try {
    $result = $s3->putObject([
        'Bucket' => $config['s3']['bucket'],
        'Key'    => 'uploads/test.jpg',
        'Body'   => WWW_ROOT . '/penguin/xcropped/test.jpg',
        'ACL'    => 'public-read',
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
$image_path = $result['ObjectURL'];
The same upload inside the resizing and cropping script:
require_once('inc/aws_connect.inc.php');

try {
    $result = $s3->putObject([
        'Bucket' => $config['s3']['bucket'],
        'Key'    => 'uploads/test2.jpg',
        'Body'   => WWW_ROOT . $thumb_path,
        'ACL'    => 'public-read',
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
$aws_result = $result['ObjectURL'];
echo $aws_result;
Does anyone know why this is happening and what the issue might be? The images are around 1.7MB in size. Could that be a problem?
In order for it to work correctly you need to have Guzzle and stream the file over to AWS. This worked for me:
use GuzzleHttp\Psr7\Stream;
use GuzzleHttp\Psr7\CachingStream;

try {
    $aws_result = $s3->putObject([
        'Bucket' => $config['s3']['bucket'],
        'Key'    => 'thumbs/thumb_' . $random_string,
        'Body'   => new CachingStream(
            new Stream(fopen(WWW_ROOT . $thumb_path, 'r'))
        ),
        'ACL'         => 'public-read',
        'ContentType' => 'image/jpg',
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
$aws_result = $aws_result['ObjectURL'];

Upload resized image on amazon S3 with PHP SDK

I have a script which resizes and crops an image, and I would like to upload the image to my Amazon S3 on the fly.
The problem is that I get an error message when I run my script, because I guess the source file is not recognized as a direct path on the disk ($filepath). Do you have any idea how to get through this situation?
Fatal error: Uncaught exception 'Aws\Common\Exception\InvalidArgumentException' with message 'You must specify a non-null value for the Body or SourceFile parameters.' in phar:///var/www/submit/aws.phar/Aws/Common/Client/UploadBodyListener.php:...
$myResizedImage = imagecreatetruecolor($width, $height);
imagecopyresampled($myResizedImage, $myImage, 0, 0, 0, 0, $width, $height, $origineWidth, $origineHeight);

$myImageCrop = imagecreatetruecolor(612, 612);
imagecopy($myImageCrop, $myResizedImage, 0, 0, $posX, $posY, 612, 612);

// Save image on Amazon S3
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'yol';
$keyname = 'image_resized';
$filepath = $myImageCrop;

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key'    => 'private-key',
    'secret' => 'secrete-key',
    'region' => 'eu-west-1'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket'      => $bucket,
        'Key'         => $keyname,
        'SourceFile'  => $filepath,
        'ACL'         => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
You need to convert your image resource to an actual string containing image data. You can use this function to achieve this:
function image_data($gdimage)
{
    ob_start();
    imagejpeg($gdimage);
    return ob_get_clean();
}
You are setting the SourceFile of the upload to $filepath, which is assigned from $myImageCrop = imagecreatetruecolor(...). As such, it's not actually a path at all — it's a GD image resource. You can't upload these to S3 directly.
You'll need to either write that image out to a file (using e.g. imagejpeg() + file_put_contents()), or run the upload using data in memory (again, from imagejpeg() or similar).
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'kkkk';
$keyname = 'test';

// $filepath should be the absolute path to a file on disk
$newFileName = tempnam(null, null); // take a look at tempnam() and adjust the parameters if needed
imagejpeg($myImageCrop, $newFileName, 100); // write the GD resource to disk, then use $newFileName in putObject()
$filepath = $newFileName;

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key'    => 'jjj',
    'secret' => 'kkk',
    'region' => 'eu-west-1'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket'      => $bucket,
        'Key'         => $keyname,
        'SourceFile'  => $filepath,
        'ACL'         => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}

Upload file on amazon S3 with PHP SDK

I'm trying to upload a picture to my Amazon S3 via their PHP SDK, so I made a little script to do so. However, my script doesn't work and my exception doesn't send me back any error message.
I'm new to AWS, thank you for your help.
Here is the code :
Config.php
<?php
return array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key'    => 'PUBLICKEY',
                'secret' => 'PRIVATEKEY',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Index.php
<?php
// Installing AWS SDK via phar
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infact';
$keyname = 'myImage';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory('config.php');

// Upload a file.
try {
    $result = $s3->putObject(array(
        'Bucket'       => $bucket,
        'Key'          => $keyname,
        'SourceFile'   => $filePath,
        'ContentType'  => 'text/plain',
        'ACL'          => 'public-read',
        'StorageClass' => 'REDUCED_REDUNDANCY'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
EDIT: I'm now using this code but it's still not working. I don't even get an error or exception message.
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infactr';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key'    => 'key',
    'secret' => 'privatekey',
    'region' => 'eu-west-1'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket'      => $bucket,
        'Key'         => $keyname,
        'SourceFile'  => $filePath,
        'ACL'         => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
Try something like this (from the AWS docs):
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = '<your bucket name>';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = '/path/to/image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key'    => 'your AWS access key',
    'secret' => 'your AWS secret access key'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket'     => $bucket,
        'Key'        => $keyname,
        'SourceFile' => $filepath,
        'ACL'        => 'public-read'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
It works fine for me as long as you have the right credentials. Keep in mind that the key name is the name of your file in S3, so if you want the key to have the same name as your file you have to do something like $keyname = 'image.jpg';. Also, a JPG is generally not a text/plain file type; you can omit the ContentType field or simply specify image/jpeg.
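Put together, a minimal sketch of that suggestion, assuming $filepath points at a JPEG on disk and $bucket and $s3 are set up as in the snippet above:
$keyname = basename($filepath); // e.g. 'image.jpg', so the S3 key matches the file name
$result = $s3->putObject(array(
    'Bucket'      => $bucket,
    'Key'         => $keyname,
    'SourceFile'  => $filepath,
    'ACL'         => 'public-read',
    'ContentType' => 'image/jpeg' // or omit this field entirely
));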
$s3 = S3Client::factory('config.php');
should be
$s3 = S3Client::factory(include 'config.php');
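This works because including a PHP file that ends in a return statement evaluates to the returned value, so factory() receives the configuration array instead of the literal string 'config.php'. A minimal illustration of the mechanism, using a hypothetical settings.php:
// settings.php contains:  <?php return array('key' => '...', 'secret' => '...', 'region' => 'eu-west-1');
$config = include 'settings.php'; // $config now holds that array
$s3 = S3Client::factory($config);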
For those looking for an up-to-date working version, this is what I am using:
// Instantiate the client.
$s3 = S3Client::factory(array(
    'credentials' => [
        'key'    => $s3Key,
        'secret' => $s3Secret,
    ],
    'region'  => 'us-west-2',
    'version' => "2006-03-01"
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket'     => $s3Bucket,
        'Key'        => $fileId,
        'SourceFile' => $filepath . "/" . $fileName
    ));
    return $result['ObjectURL'];
} catch (S3Exception $e) {
    return false;
}
An alternative way to explain it is by showing the curl command and how to build it in PHP, the pragmatic approach.
Please don't stone me for ugly code; I just thought this example is easy to follow for uploading to Azure from PHP, or from another language.
$azure1 = 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/';
$azure3 = '?sr=c&si=bucketnameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D';

// Read the file size from the `ls -la` output.
$shellCmd = 'ls -la ' . $outFileName;
$lsOutput = shell_exec($shellCmd);
#print_r($lsOutput);
$exploded = explode(' ', $lsOutput);
#print_r($exploded);
$fileLength = $exploded[7];

// Build the curl PUT request with the file and its Content-Length header, then run it.
$curlAzure1 = "curl -v -X PUT -T '" . $outFileName . "' -H 'Content-Length: " . $fileLength . "' ";
$buildedCurlForUploading = $curlAzure1 . "'" . $azure1 . $outFileName . $azure3 . "'";
var_dump($buildedCurlForUploading);
shell_exec($buildedCurlForUploading);
This is the actual curl command:
shell_exec("curl -v -X PUT -T 'fileName' -H 'Content-Length: fileSize' 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/fileName?sr=c&si=bucketNameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D'")
Below is the code for uploading an image/file to an Amazon S3 bucket.
function upload_agreement_data($target_path, $source_path, $file_name, $content_type)
{
    $fileup_flag = false;

    /*------------- call global settings helper function starts ----------------*/
    $bucketName = "pilateslogic";
    //$global_setting_option = '__cloud_front_bucket__';
    //$bucketName = get_global_settings($global_setting_option);
    /*------------- call global settings helper function ends ----------------*/

    if (!$bucketName) {
        die("ERROR: Template bucket name not found!");
    }

    // Amazon profile_template template js upload URL
    $target_profile_template_js_url = "/" . $bucketName . "/" . $target_path;
    // Caching profile_template template js upload URL
    //$source_profile_template_js_url = dirname(dirname(dirname(__FILE__))) . $source_path . "/" . $file_name;
    // file name
    $template_js_file = $file_name;

    $this->s3->setEndpoint("s3-ap-southeast-2.amazonaws.com");

    if ($this->s3->putObjectFile($source_path, $target_profile_template_js_url, $template_js_file, S3::ACL_PUBLIC_READ, array(), array("Content-Type" => $content_type))) {
        $fileup_flag = true;
    }

    return $fileup_flag;
}
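A hypothetical call to the helper above, assuming $this->s3 is an instance of the standalone S3 class and the paths are placeholders:
$ok = $this->upload_agreement_data(
    'agreements/agreement_123.pdf', // target path inside the bucket (placeholder)
    '/tmp/agreement_123.pdf',       // local source file (placeholder)
    'agreement_123.pdf',            // file name
    'application/pdf'               // content type
);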

phpqrcode + save cached file to Amazon instead of folder on server

What I do now is create a QR code image (PNG) and save a copy of it in a folder in the root of my server. Now I would like to save this image not on my server but in my Amazon bucket.
I know how to save a file on Amazon, but I can't figure out how to make these work together.
This is my original code for saving it in a root folder on my server:
$fileName = $quiz_url . '.png';
$pngAbsoluteFilePath = APPLICATION_PATH . '/../public/qrcodecaches/' . $fileName;
$urlRelativeFilePath = '/qrcodecaches/' . $fileName;

// generating
if (!file_exists($pngAbsoluteFilePath)) {
    QRcode::png('http://mysitelink.com/s/' . $quiz_url, $pngAbsoluteFilePath, 'L', 4, 2);
}
And this is how I save a file on Amazon:
$bucket = 'mybucket';
$map = 'qrcodecaches';

$client = S3Client::factory(array(
    'key'    => 'mykey',
    'secret' => 'mysecret',
));

$fileName = $quiz_url . '.png';
$keyname = $map . '/' . $fileName;

try {
    $client->putObject(array(
        'Bucket' => $bucket,
        'Key'    => $keyname,
        'Body'   => fopen('/path/to/file', 'r'),
        'ACL'    => 'public-read',
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
But what do I need to pass as the 'Body' parameter? Can somebody help me with this?
You can create the image file by passing a temporary file name, like this:
// tempnam() already creates the (empty) file, so write the QR code straight into it.
$pngAbsoluteFilePath = tempnam(sys_get_temp_dir(), 'qr_code_');
QRcode::png('http://mysitelink.com/s/' . $quiz_url, $pngAbsoluteFilePath, 'L', 4, 2);
then all you need to do is get that file's contents and pass them in as the Body of the object in the bucket:
try {
    $client->putObject(array(
        'Bucket' => $bucket,
        'Key'    => $keyname,
        'Body'   => file_get_contents($pngAbsoluteFilePath), // like this
        'ACL'    => 'public-read'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
After you're done, in case of a successful upload, you can delete the temporary file. It is not strictly necessary, but it is recommended, because otherwise /tmp will keep filling with QR images until your server is restarted:
if (empty($e)) {
    unlink($pngAbsoluteFilePath);
}
This worked well for me.
The Amazon S3Client documentation suggests that you can poll to check that the upload was successful:
$client->waitUntilObjectExists(array(
    'Bucket' => $bucket,
    'Key'    => $keyname
));
Alternatively, if you're already writing the file to disk on your server, you can supply S3Client with the path to that file and let it handle the file stream itself.
try {
    $client->putObject(array(
        'Bucket'     => $bucket,
        'Key'        => $keyname,
        'SourceFile' => '/path/to/file',
        'ACL'        => 'public-read'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
Source: http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html#_putObject
