I am trying to upload an image to AWS after processing it with PHP. When I upload an image that is already in a folder on my server, it uploads with no issue. However, when I add the same code to the script that processes an uploaded image (which works fine on its own: it saves images locally, creates cropped versions, and writes links to them into the local DB), resizes it, and rotates it correctly after reading the EXIF data, I end up uploading an empty, correctly named file that does not look like an image (downloading it from AWS I get an error message that the image is broken), and on inspection it is usually only around 76 bytes. What gives? Does the script need some time to process the resizing/rotation, so that it uploads a half-ready image? I do not get any error messages or warnings back.
My script for the local file upload:
try {
    $result = $s3->putObject([
        'Bucket' => $config['s3']['bucket'],
        'Key'    => 'uploads/test.jpg',
        'Body'   => WWW_ROOT . '/penguin/xcropped/test.jpg',
        'ACL'    => 'public-read',
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
$image_path = $result['ObjectURL'];
My script in the resizing and cropping script:
require_once('inc/aws_connect.inc.php');
try {
    $result = $s3->putObject([
        'Bucket' => $config['s3']['bucket'],
        'Key'    => 'uploads/test2.jpg',
        'Body'   => WWW_ROOT . $thumb_path,
        'ACL'    => 'public-read',
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
$aws_result = $result['ObjectURL'];
echo $aws_result;
Does anyone know why this is happening and what the issue might be? The images are around 1.7 MB in size. Could that be the problem?
For it to work correctly, 'Body' needs the file's contents rather than its path; I used Guzzle to stream the file over to AWS. This worked for me:
use GuzzleHttp\Psr7\Stream;
use GuzzleHttp\Psr7\CachingStream;
try {
    $aws_result = $s3->putObject([
        'Bucket' => $config['s3']['bucket'],
        'Key'    => 'thumbs/thumb_' . $random_string,
        'Body'   => new CachingStream(
            new Stream(fopen(WWW_ROOT . $thumb_path, 'r'))
        ),
        'ACL'         => 'public-read',
        'ContentType' => 'image/jpeg', // 'image/jpeg' is the registered MIME type; 'image/jpg' is nonstandard
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
$aws_result = $aws_result['ObjectURL'];
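A quick local illustration of what went wrong in the question (no AWS call needed, and the 1.7 MB image is stood in for by a much smaller dummy file): when 'Body' is handed a path string, the SDK uploads the string itself, so the stored object's size matches the length of the path, not the image.

```php
<?php
// Demonstrates the bug locally: 'Body' => $path uploads the *string*,
// so the stored object is only strlen($path) bytes (the ~76 B seen above).
$path = sys_get_temp_dir() . '/test.jpg';
file_put_contents($path, str_repeat('x', 1700)); // stand-in for the real image bytes

$bodyAsPath     = $path;                    // what the broken script effectively sent
$bodyAsContents = file_get_contents($path); // what should be sent (or an fopen() stream)

echo strlen($bodyAsPath) . ' vs ' . strlen($bodyAsContents) . "\n";
unlink($path);
```

This also explains why the same code "worked" for the first script: it produced an equally tiny object there too, just one that had not been inspected yet.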
I hope you are doing well.
My issue using AWS S3 is that when I download images (png, jpg, jpeg) and documents (txt, doc, docx) from the bucket, the files are corrupted and I can't open them, even though I can see them in the AWS S3 console, and when I download them manually from there they open fine.
The only type I can upload and download correctly is PDF. The other types upload correctly, but the issue happens when I download them.
Here is my code for download:
require 'vendor/autoload.php';
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
// AWS Info
$bucketName = 'some text';
$IAM_KEY = 'some text';
$IAM_SECRET = 'some text';
try {
    // You may need to change the region. It is shown in the URL when the bucket
    // is open, and on creation.
    $s3 = S3Client::factory(array(
        'credentials' => array(
            'key'    => $IAM_KEY,
            'secret' => $IAM_SECRET
        ),
        'version'   => '2006-03-01',
        'region'    => 'us-east-2',
        'signature' => 'v4'
    ));
} catch (Exception $e) {
    // We use die() so that if this fails, it stops here. Typically this is a
    // REST call, so it would return a JSON object.
    die("Error: " . $e->getMessage());
}
$keyName = 'filepath';
// Get it from S3
try {
    // Download:
    $result = $s3->getObject(array(
        'Bucket' => $bucketName,
        'Key'    => $keyName
    ));
    header("Content-Type: {$result['ContentType']}");
    header('Content-Disposition: filename="' . basename($keyName) . '"');
    echo $result['Body'];
} catch (Exception $e) {
    die("Error: " . $e->getMessage());
}
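One common cause of exactly this symptom, offered as an assumption rather than a confirmed diagnosis: stray bytes emitted before `echo $result['Body']` (a BOM, or whitespace outside the PHP tags of an included file) become part of the download and corrupt strict binary formats, while more tolerant readers may still open some file types. A local illustration:

```php
<?php
// Any output sent before the file body becomes part of the download.
$fileBytes   = "\x89PNG\r\n\x1a\n...image data..."; // a PNG must start with this signature
$strayOutput = "\n"; // e.g. a newline after a closing ?> in an included file

$download = $strayOutput . $fileBytes; // what the browser actually receives
var_dump(substr($download, 0, 1) === "\x89"); // false: the PNG signature is no longer first
```

If this is the cause, calling ob_clean() before the headers, or removing trailing whitespace from included files, would fix it.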
I am trying to move a file from my PHP code to an S3 bucket using this code:
$bucket = "bucket-name";
$s3 = S3Client::factory(array(
'key' => 'xxxxxx',
'secret' => 'xxxxxxxxxxxxxxxxxxxxxxxxxxxx'
));
$baseImageUrl = "http://www.xxxxx.com/banner_mobile.jpg";
$keyname = "31.jpg";
try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket'      => $bucket,
        'Key'         => $keyname,
        'Body'        => $baseImageUrl,
        'ContentType' => 'image/jpg',
        'ACL'         => 'public-read'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
} catch (Exception $ex) {
    echo $ex->getMessage() . "\n";
}
An empty (or corrupted) file gets uploaded to the S3 bucket. The actual size of the image is 37 KB, but the uploaded file was only 40 B. What is the reason for this?
It seems you're passing the URL as the body of the file, so you're uploading an "image" whose contents are just that text. To upload the actual file, you need to pass the file's contents, or a path to a local copy of it.
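A local sketch of why the object came out at around 40 B (the URL below is a hypothetical stand-in, since the real one is redacted): what S3 stores is the URL string itself.

```php
<?php
// The stored object is the URL text, not the 37 KB image it points to.
$baseImageUrl = 'http://www.example.com/banner_mobile.jpg'; // hypothetical stand-in URL
echo strlen($baseImageUrl) . " bytes\n"; // roughly the size of the corrupted object

// To upload the real image, fetch its bytes first, e.g.:
//   'Body' => file_get_contents($baseImageUrl),
// or, for a file already on disk, pass 'SourceFile' => '/path/to/file' instead.
```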
I'm uploading some images to S3. How can I check whether the transfer was successful?
Here is my code. I broke the access key on purpose so the files fail to upload; how can I catch this error and act upon it?
The code below does not catch anything, even though the images fail to upload.
$this->commands[] = $this->s3->getCommand('PutObject', [
    'Bucket'      => env('AWS_BUCKET'),
    'Key'         => $name,
    'Body'        => $img,
    'ContentType' => $mime,
    'ACL'         => 'public-read'
]);
$pool = new CommandPool($this->s3, $this->commands);
$promise = $pool->promise();
try {
    $result = $promise->wait();
} catch (AwsException $e) {
    var_dump($e);
}
I'm using PHP SDK 3.0.
I am also using PHP SDK 3.0 and was able to get the result of the upload using the following:
$result = $s3Client->putObject(array(
    'Bucket'     => $bucketName,
    'Key'        => $backupFileName,
    'SourceFile' => $backupFile,
));
// Response metadata lives under the '@metadata' key of the result.
$code = $result['@metadata']['statusCode'];
$uri  = $result['@metadata']['effectiveUri'];
if ($code === 200) {
    // Success code here
}
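The check can be exercised locally by mocking the result shape (this mock is an assumption about the array layout; in SDK v3 the result object supports array access and keeps response metadata under '@metadata'):

```php
<?php
// Mocked shape of a putObject result (no AWS call); note the '@' in '@metadata'.
$result = [
    '@metadata' => [
        'statusCode'   => 200,
        'effectiveUri' => 'https://examplebucket.s3.amazonaws.com/backup.sql', // hypothetical
    ],
];

$code = $result['@metadata']['statusCode'];
if ($code === 200) {
    echo "upload succeeded\n";
}
```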
What I do now is create a QR code image (PNG) and save a copy of it in a folder in the root of my server. I would like to save this image not on my server but in my Amazon bucket.
I know how to save a file on Amazon, but I can't figure out how to make the two work together.
This is my original code for saving it in a root folder on my server:
$fileName = $quiz_url . '.png';
$pngAbsoluteFilePath = APPLICATION_PATH . '/../public/qrcodecaches/' . $fileName;
$urlRelativeFilePath = '/qrcodecaches/' . $fileName;
// generating
if (!file_exists($pngAbsoluteFilePath)) {
    QRcode::png('http://mysitelink.com/s/' . $quiz_url, $pngAbsoluteFilePath, 'L', 4, 2);
}
And this is how I save a file on Amazon:
$bucket = 'mybucket';
$map = 'qrcodecaches';
$client = S3Client::factory(array(
    'key'    => 'mykey',
    'secret' => 'mysecret',
));
$fileName = $quiz_url . '.png';
$keyname = $map . '/' . $fileName;
try {
    $client->putObject(array(
        'Bucket' => $bucket,
        'Key'    => $keyname,
        'Body'   => fopen('/path/to/file', 'r'),
        'ACL'    => 'public-read',
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
But what do I need to put in 'Body'? Can somebody help me with this?
You can create the image file by passing a temporary file name, like this:
$pngAbsoluteFilePath = tempnam(sys_get_temp_dir(), 'qr_code_');
// tempnam() already creates an (empty) file, so a !file_exists() guard here
// would never pass; just generate the QR code straight into it.
QRcode::png('http://mysitelink.com/s/' . $quiz_url, $pngAbsoluteFilePath, 'L', 4, 2);
then all you need to do is get that file's contents and pass them in as the Body of the object in the bucket:
try {
    $client->putObject(array(
        'Bucket' => $bucket,
        'Key'    => $keyname,
        'Body'   => file_get_contents($pngAbsoluteFilePath), // like this
        'ACL'    => 'public-read'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
After you're done, in case of a successful upload, you can delete the temporary file. It is not strictly necessary, but recommended: otherwise the QR images will pile up in /tmp until your server is restarted.
if (empty($e)) {
    unlink($pngAbsoluteFilePath);
}
This worked well for me.
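The whole lifecycle can be sketched locally; QRcode::png is replaced by a plain write here, since the QR library isn't assumed:

```php
<?php
// tempnam() creates the file immediately and returns its path.
$pngAbsoluteFilePath = tempnam(sys_get_temp_dir(), 'qr_code_');
file_put_contents($pngAbsoluteFilePath, 'fake-png-bytes'); // QRcode::png(...) in the real code

// Read the bytes for the upload 'Body', then clean up.
$body = file_get_contents($pngAbsoluteFilePath);
unlink($pngAbsoluteFilePath);

var_dump($body === 'fake-png-bytes');        // bool(true)
var_dump(file_exists($pngAbsoluteFilePath)); // bool(false)
```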
The Amazon S3Client documentation suggests that you can poll to check that the upload was successful:
$client->waitUntilObjectExists(array(
    'Bucket' => $bucket,
    'Key'    => $keyname
));
Alternatively, if you're already writing the file to disk on your server, you can supply S3Client with the path to that file and let it handle the file stream itself:
try {
    $client->putObject(array(
        'Bucket'     => $bucket,
        'Key'        => $keyname,
        'SourceFile' => '/path/to/file',
        'ACL'        => 'public-read'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
Source: http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html#_putObject
I've been trying to figure out how to push an image to an S3 bucket using the new PHP SDK 2.0. All I've found are tutorials on uploading an image from your web server rather than from a local computer.
Here is the code that I'm using
$result = $s3->putObject(array(
    'Bucket'       => $bucketname,
    'Key'          => $filename,
    'SourceFile'   => $path,
    'ACL'          => 'public-read',
    'ContentType'  => 'image/jpeg',
    'StorageClass' => 'STANDARD'
));
$filename is just the name I want the file to have in the bucket, and $path is the local path to the file on my computer. This puts a file in the bucket, but when I try to open the image it just shows an empty screen with the broken-image thumbnail. I checked, and it seems to upload only about 30 bytes. Can someone please point me in the right direction?
In order to upload, you need to specify the Body as well. If you're uploading from your computer, this would be the code:
$s3 = $aws->get('s3');
$filename = $_FILES['file']['name'];
$tmpFile = $_FILES['file']['tmp_name'];
$imageType = $_FILES['file']['type'];
// Upload a publicly accessible file.
// The file size, file type, and MD5 hash are automatically calculated by the SDK
try {
    $s3->putObject(array(
        'Bucket'       => $bucketname,
        'Key'          => $filename,
        'Body'         => fopen($tmpFile, 'r'),
        'ACL'          => 'public-read',
        'ContentType'  => $imageType,
        'StorageClass' => 'STANDARD'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
where $tmpFile is the path to your file on your computer. I'm using an upload mechanism, hence the temp path, but you can just use a static path.
I had the same problem, and Shahin's answer is correct: you can set
'Body' => fopen($tmpFile, 'r'),
However when I did this on my WAMP localhost, I got the error
Warning: fopen(C:\Windows\Temp\phpBDF3.tmp): failed to open stream: No such file or directory
This seems to be a Windows permissions issue; you can resolve it by copying the Windows temp file to another temp file in a directory where there are no permission problems (e.g. the web root):
// $tmpFile was 'C:\Windows\Temp\php8A16.tmp';
// create a new temp file in the web root, and copy the Windows temp file there
// (basename() gives a file name in the current directory, i.e. the web root)
$new_temp_location = basename($tmpFile);
copy($tmpFile, $new_temp_location);
// now put the file in the bucket
try {
    $s3->putObject(array(
        'Bucket'       => $bucketname,
        'Key'          => $filename,
        'Body'         => fopen($new_temp_location, 'r'),
        'ACL'          => 'public-read',
        'ContentType'  => $imageType,
        'StorageClass' => 'STANDARD'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
It worked for me - hope it saves someone else some time...
EDIT:
Or slightly simpler, you can use
'SourceFile' => $new_temp_location,
Instead of
'Body' => fopen($new_temp_location, 'r'),
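The copy-then-stream workaround can be exercised locally like this (the paths are stand-ins for the Windows ones above, and no AWS call is made):

```php
<?php
// Stand-in for $_FILES['file']['tmp_name'] (C:\Windows\Temp\php8A16.tmp above).
$tmpFile = tempnam(sys_get_temp_dir(), 'upl');
file_put_contents($tmpFile, 'image-bytes');

// Copy into the current directory (the web root in the answer) and stream the copy.
$new_temp_location = basename($tmpFile);
copy($tmpFile, $new_temp_location);

$stream = fopen($new_temp_location, 'r'); // what goes into 'Body'
$sent   = stream_get_contents($stream);
fclose($stream);

echo $sent . "\n"; // image-bytes

// Clean up both copies once the upload is done.
unlink($new_temp_location);
unlink($tmpFile);
```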