I'm having some issues uploading an image to AWS S3. The file seems to upload correctly, but whenever I try to download or preview it, it can't be opened. This is the upload code I'm currently using:
<?php
require_once 'classes/amazon.php';
require_once 'includes/aws/aws-autoloader.php';
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
$putdata = file_get_contents("php://input");
$request = json_decode($putdata);
$image_parts = explode(";base64,", $request->image);
$image_type_aux = explode("image/", $image_parts[0]);
$image_type = $image_type_aux[1];
$image_base64 = $image_parts[1];
$dateTime = new DateTime();
$fileName = $dateTime->getTimestamp() . "." . $image_type;
$s3Client = S3Client::factory(array(
    'region' => 'eu-west-1',
    'version' => '2006-03-01',
    'credentials' => array(
        'key' => Amazon::getAccessKey(),
        'secret' => Amazon::getSecretKey(),
    )
));
try {
    $result = $s3Client->putObject(array(
        'Bucket' => Amazon::getBucket(),
        'Key' => 'banners/' . $fileName,
        'Body' => $image_base64,
        'ContentType' => 'image/' . $image_type,
        'ACL' => 'public-read'
    ));
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
So, when I check the console after uploading the image file, it has the expected size, permissions and headers but, as I said, whenever I try to open the file, it fails.
What could be the problem here? Thanks in advance.
The issue here is that you are uploading the base64-encoded version of the image, not its raw bytes. Take $image_base64 and decode it into raw bytes first with base64_decode(): http://php.net/manual/en/function.base64-decode.php . I am sure that if you opened one of those "images" in a text editor, you would see base64 text rather than binary image data.
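As a sketch of that fix (the decode_data_uri() helper is illustrative, not part of the SDK), the data-URI payload can be split and decoded before it is handed to putObject():

```php
<?php
// Hypothetical helper: split a data URI ("data:image/png;base64,....")
// into its image subtype and the *decoded* raw bytes.
function decode_data_uri(string $dataUri): array
{
    [$meta, $encoded] = explode(";base64,", $dataUri, 2);
    $type = explode("image/", $meta)[1];
    // base64_decode() turns the text back into the original binary bytes.
    return [$type, base64_decode($encoded, true)];
}

// In the question's upload, the decoded bytes would then be the body:
//   'Body'        => $bytes,           // raw image bytes, not base64 text
//   'ContentType' => 'image/' . $type,
```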
You can also upload "on the fly" by using $s3Client->upload(), as in the following example:
<?php
$bucket = 'bucket-name';
$filename = 'image-path.extension';
// end() needs a variable, so split the explode() out first
$parts = explode(",", $base64);
$imageData = base64_decode(end($parts));
$upload = $s3Client->upload($bucket, $filename, $imageData, 'public-read');
echo $upload->get('ObjectURL');
I use Silex 2.0, and the following code worked for me:
$s3 = $app['aws']->createS3();
$data = json_decode($request->getContent(), true);
$data = (object) $data;
$image_parts = explode(";base64,", $data->image);
$image_type_aux = explode("image/", $image_parts[0]);
$image_type = $image_type_aux[1];
$image_base64 = base64_decode($image_parts[1]);
$result = $s3->putObject([
    'ACL' => 'public-read',
    'Body' => $image_base64,
    'Bucket' => 'name-bucket',
    'Key' => 'test_img.jpeg',
    'ContentType' => 'image/' . $image_type,
]);
var_dump($result['ObjectURL']);
I have uploaded my CodeIgniter project to AWS Elastic Beanstalk, and I am trying to upload an image file and then an audio file once the image is done uploading.
The image upload is successful; below is the code:
$path = $_FILES['thumbnail']['name'];
$file_temp = $_FILES['thumbnail']['tmp_name'];
$ext = pathinfo($path, PATHINFO_EXTENSION);
$new_name = time() . "." . $ext;
$file_type = $_FILES['thumbnail']['type'];
$upload_path = 'uploads/thumbnails/';
$s3->putObject(
    array(
        'Bucket' => 'bucket-name',
        'Key' => $upload_path . $new_name,
        'SourceFile' => $file_temp,
        'ContentType' => $file_type,
        'ACL' => 'public-read',
        'StorageClass' => 'STANDARD'
    )
);
And the audio file upload code is as follows:
$path = $_FILES['audio']['name'];
$file_name = "audio_" . time() . "." . pathinfo($path, PATHINFO_EXTENSION);
$file_temp = $_FILES['audio']['tmp_name'];
$file_type = $_FILES['audio']['type'];
$upload_path = 'uploads/audios/';
$s3->putObject(
    array(
        'Bucket' => 'bucket-name',
        'Key' => $upload_path . $file_name,
        'SourceFile' => $file_temp,
        'ContentType' => $file_type,
        'ACL' => 'public-read',
        'StorageClass' => 'STANDARD'
    )
);
And here is the snippet that executes the two upload methods:
//upload image file
$thumb_upload = $this->upload_thumbnail();
//upload audio file
$audio_upload = $this->upload_audio();
But I am trying to find out what the error is: if the image upload succeeds, why does the audio upload give an error of
Error:Unable to open using mode r: fopen(): Filename cannot be empty
Thanks in advance
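For what it's worth, that fopen() message usually means $_FILES['audio']['tmp_name'] was empty by the time putObject ran: typical causes are the audio file exceeding upload_max_filesize or post_max_size, or the form field not arriving at all. A defensive check before the upload might look like this (the helper name is illustrative, not part of any framework):

```php
<?php
// Illustrative guard: return the temp path only when the upload actually
// succeeded; otherwise fail loudly instead of passing S3 an empty filename.
function uploaded_tmp_path(array $files, string $field): string
{
    if (!isset($files[$field]) || $files[$field]['error'] !== UPLOAD_ERR_OK) {
        $code = $files[$field]['error'] ?? 'field missing';
        throw new RuntimeException("Upload failed for '$field' (error: $code)");
    }
    return $files[$field]['tmp_name'];
}

// e.g. $file_temp = uploaded_tmp_path($_FILES, 'audio');
```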
I have managed to upload large files to S3 using multipart upload, but I can't download them again using the getObject function. Is there another way I can achieve this?
Here is my code:
$keyname= 'key';
$bucket = 'bucketname';
$fileName = 'filename.txt';
$result = $s3->getObject([
    'Bucket' => $bucket,
    'Key' => $keyname
]);
var_dump($fileName);
$result['ContentDisposition'] = 'attachment; filename="'.$fileName.'"';
$result['fileName'] = $result['ContentDisposition'];
header("Content-Type: {$result['ContentType']}");
header("Content-Disposition: {$result['ContentDisposition']}");
header("Content-Length: {$result['ContentLength']}");
echo $result['Body'];
Thanks for the help. This is my solution:
$keyname= 'key';
$bucket = 'bucketname';
$fileName = 'filename.txt';
# Create the S3 client
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'eu-central-1',
    'credentials' => [
    ]
]);
$cmd = $s3->getCommand('GetObject', [
    'Bucket' => $bucket,
    'Key' => $keyname,
    'ResponseContentDisposition' => 'attachment; filename="'.$fileName.'"'
]);
$request = $s3->createPresignedRequest($cmd, '+15 min');
$presignedUrl = (string)$request->getUri();
echo $presignedUrl;
After this, I download it in my frontend with an <a> tag via JS.
You can create a presigned request with S3 like this:
$keyname= 'key';
$bucket = 'bucketname';
$fileName = 'filename.txt';
$command = $s3->getCommand('GetObject', array(
    'Bucket' => $bucket,
    'Key' => $keyname,
    'ResponseContentDisposition' => 'attachment; filename="'.$fileName.'"'
));
$signedUrl = $command->createPresignedUrl('+15 minutes');
header('Location: '.$signedUrl);
I created a web service to upload files, and it works fine, but I would also like to use gzip compression. That part is not working. How can I upload my files with gzip?
$client = new Google_Client();
$storageService = new Google_Service_Storage($client );
$bucket = $this->bucketName;
$file_name = md5(uniqid(rand(), true)) . '.' . $ext;
$file_content = urldecode($myFile);
try {
    $postbody = array(
        'name' => $file_name,
        'data' => file_get_contents($file_content),
        'uploadType' => "resumable",
        'predefinedAcl' => 'publicRead',
    );
    $options = array('Content-Encoding' => 'gzip');
    $gsso = new Google_Service_Storage_StorageObject();
    $gsso->setName($file_name);
    $gsso->setContentEncoding("gzip");
    $storageService->objects->insert($bucket, $gsso, $postbody, $options);
} catch (Exception $e) {
    $result['error'] = json_decode($e->getMessage());
}
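One likely cause, offered as a guess: setting Content-Encoding: gzip only labels the object, so the bytes themselves have to be gzip-compressed before upload. A minimal sketch with core PHP's gzencode() (the sample string stands in for file_get_contents($file_content)):

```php
<?php
// Compress the payload first; the "gzip" Content-Encoding then matches
// the stored bytes, and clients decompress transparently on download.
$raw = str_repeat('example payload ', 64);  // stand-in for the real file contents
$gz  = gzencode($raw, 9);                   // gzip-compressed bytes (ext-zlib)

// The compressed string would then go into $postbody as the 'data' field,
// keeping $gsso->setContentEncoding("gzip") as in the question.
```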
I have a script which resizes and crops an image, and I would like to upload the result to my Amazon S3 on the fly.
The problem is that I get an error when I run the script, because I guess the source file is not recognized as a direct path on disk ($filepath). Do you have any idea how to get through this situation?
Fatal error: Uncaught exception 'Aws\Common\Exception\InvalidArgumentException' with message 'You must specify a non-null value for the Body or SourceFile parameters.' in phar:///var/www/submit/aws.phar/Aws/Common/Client/UploadBodyListener.php:...
$myResizedImage = imagecreatetruecolor($width,$height);
imagecopyresampled($myResizedImage,$myImage,0,0,0,0, $width, $height, $origineWidth, $origineHeight);
$myImageCrop = imagecreatetruecolor(612,612);
imagecopy( $myImageCrop, $myResizedImage, 0,0, $posX, $posY, 612, 612);
//Save image on Amazon S3
require 'aws.phar';
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
$bucket = 'yol';
$keyname = 'image_resized';
$filepath = $myImageCrop;
// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'private-key',
    'secret' => 'secret-key',
    'region' => 'eu-west-1'
));
try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filepath,
        'ACL' => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
You need to convert your image resource to an actual string containing image data. You can use this function to achieve this:
function image_data($gdimage)
{
    ob_start();
    imagejpeg($gdimage);
    return ob_get_clean();
}
You are setting the SourceFile of the upload to $filepath, which is assigned from $myImageCrop = imagecreatetruecolor(...). As such, it's not actually a path at all — it's a GD image resource. You can't upload these to S3 directly.
You'll need to either write that image out to a file (using e.g. imagejpeg() + file_put_contents()), or run the upload using data in memory (again, from imagejpeg() or similar).
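Putting those two pieces together, a sketch of the in-memory route (reusing the image_data() helper from above; the putObject() parameters otherwise match the question):

```php
<?php
// Render a GD image to a JPEG byte string entirely in memory.
function image_data($gdimage): string
{
    ob_start();
    imagejpeg($gdimage);        // writes JPEG bytes to the output buffer
    return ob_get_clean();      // capture them as a string
}

$img = imagecreatetruecolor(612, 612);  // stand-in for $myImageCrop
$jpegBytes = image_data($img);

// Then, instead of 'SourceFile' => $filepath, pass the bytes directly:
//   'Body'        => $jpegBytes,
//   'ContentType' => 'image/jpeg',
```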
require 'aws.phar';
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
$bucket = 'kkkk';
$keyname = 'test';
// Write the GD image to a temporary file so SourceFile gets a real path on disk
$newFileName = tempnam(sys_get_temp_dir(), 'img'); // take a look at tempnam() and adjust its parameters if needed
imagejpeg($myImageCrop, $newFileName, 100); // use $newFileName as the SourceFile below
$filepath = $newFileName;
// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'jjj',
    'secret' => 'kkk',
    'region' => 'eu-west-1'
));
try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filepath,
        'ACL' => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
I'm trying to upload a picture to my Amazon S3 via their PHP SDK, so I made a little script to do so. However, my script doesn't work, and my exception doesn't give me back any error message.
I'm new to AWS; thank you for your help.
Here is the code :
Config.php
<?php
return array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key' => 'PUBLICKEY',
                'secret' => 'PRIVATEKEY',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Index.php
<?php
//Installing AWS SDK via phar
require 'aws.phar';
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
$bucket = 'infact';
$keyname = 'myImage';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';
// Instantiate the client.
$s3 = S3Client::factory('config.php');
// Upload a file.
try {
$result = $s3->putObject(array(
'Bucket' => $bucket,
'Key' => $keyname,
'SourceFile' => $filePath,
'ContentType' => 'text/plain',
'ACL' => 'public-read',
'StorageClass' => 'REDUCED_REDUNDANCY'
));
// Print the URL to the object.
echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
echo $e->getMessage() . "\n";
}
?>
EDIT: I'm now using this code, but it's still not working. I don't even get an error or exception message.
<?php
require 'aws.phar';
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
$bucket = 'infactr';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';
// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'key',
    'secret' => 'privatekey',
    'region' => 'eu-west-1'
));
try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filePath,
        'ACL' => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
Try something like this (from the AWS docs):
<?php
require 'aws.phar';
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
$bucket = '<your bucket name>';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = '/path/to/image.jpg';
// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'your AWS access key',
    'secret' => 'your AWS secret access key'
));
try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filepath,
        'ACL' => 'public-read'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
It works fine for me, as long as you have the right credentials. Keep in mind that the key name is the name of your file in S3, so if you want your key to have the same name as your file, you have to do something like $keyname = 'image.jpg';. Also, a JPEG is generally not a text/plain file type; you can omit the ContentType field entirely, or simply specify image/jpeg.
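For instance, to keep the key equal to the file's own name rather than a fixed string (the path below is a placeholder, as in the answer above):

```php
<?php
// Derive the S3 key from the local file name, so the object in S3
// keeps the same name as the uploaded file.
$filepath = '/path/to/image.jpg';   // placeholder path
$keyname  = basename($filepath);    // 'image.jpg'

// And rather than hard-coding the type, the real MIME type could be
// sniffed with core PHP's finfo (needs an existing file on disk):
//   $contentType = (new finfo(FILEINFO_MIME_TYPE))->file($filepath);
```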
$s3 = S3Client::factory('config.php');
should be
$s3 = S3Client::factory(include 'config.php');
For those looking for an up-to-date working version, this is what I am using:
// Instantiate the client.
$s3 = S3Client::factory(array(
    'credentials' => [
        'key' => $s3Key,
        'secret' => $s3Secret,
    ],
    'region' => 'us-west-2',
    'version' => "2006-03-01"
));
try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $s3Bucket,
        'Key' => $fileId,
        'SourceFile' => $filepath . "/" . $fileName
    ));
    return $result['ObjectURL'];
} catch (S3Exception $e) {
    return false;
}
An alternative way to explain it is to show the curl command and how to build it in PHP: the pragmatic approach.
Please don't stone me for the ugly code; I just thought this example would be easy to follow for uploading to Azure from PHP or another language.
$azure1 ='https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/';
$azure3 ='?sr=c&si=bucketnameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D';
$shellCmd='ls -la '.$outFileName;
$lsOutput=shell_exec($shellCmd);
#print_r($lsOutput);
$exploded=explode(' ', $lsOutput);
#print_r($exploded);
$fileLength=$exploded[7];
$curlAzure1="curl -v -X PUT -T '" . $outFileName . "' -H 'Content-Length: " . $fileLength . "' ";
$buildedCurlForUploading=$curlAzure1."'".$azure1.$outFileName.$azure3."'";
var_dump($buildedCurlForUploading);
shell_exec($buildedCurlForUploading);
This is the actual curl
shell_exec("curl -v -X PUT -T 'fileName' -H 'Content-Length: fileSize' 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/fileName?sr=c&si=bucketNameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D'")
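For comparison, a sketch of the same PUT using PHP's curl extension instead of shelling out; the container URL and SAS query string are the placeholders from above, filesize() replaces the ls parsing, and the azure_put_url() helper is illustrative:

```php
<?php
// Build the blob URL: container base + encoded file name + SAS token.
function azure_put_url(string $base, string $file, string $sas): string
{
    return $base . rawurlencode($file) . $sas;
}

$url = azure_put_url(
    'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/',
    $outFileName ?? 'example.bin',
    '?sr=c&si=bucketnameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D'
);

// The upload itself (commented out: needs a real file and a valid SAS token):
// $ch = curl_init($url);
// curl_setopt($ch, CURLOPT_PUT, true);
// curl_setopt($ch, CURLOPT_INFILE, fopen($outFileName, 'rb'));
// curl_setopt($ch, CURLOPT_INFILESIZE, filesize($outFileName)); // no ls parsing needed
// curl_exec($ch);
// curl_close($ch);
```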
Below is the code to upload an image/file to an Amazon S3 bucket.
function upload_agreement_data($target_path, $source_path, $file_name, $content_type)
{
    $fileup_flag = false;
    /*------------- call global settings helper function starts ----------------*/
    $bucketName = "pilateslogic";
    //$global_setting_option = '__cloud_front_bucket__';
    //$bucketName = get_global_settings($global_setting_option);
    /*------------- call global settings helper function ends ----------------*/
    if (!$bucketName) {
        die("ERROR: Template bucket name not found!");
    }
    // Amazon profile_template template js upload URL
    $target_profile_template_js_url = "/" . $bucketName . "/" . $target_path;
    // Caching profile_template template js upload URL
    //$source_profile_template_js_url = dirname(dirname(dirname(__FILE__))).$source_path."/".$file_name;
    // file name
    $template_js_file = $file_name;
    $this->s3->setEndpoint("s3-ap-southeast-2.amazonaws.com");
    if ($this->s3->putObjectFile($source_path, $target_profile_template_js_url, $template_js_file, S3::ACL_PUBLIC_READ, array(), array("Content-Type" => $content_type))) {
        $fileup_flag = true;
    }
    return $fileup_flag;
}