AWS S3 PNG image not working with FPDF - php

In my PHP FPDF script I have:
<?php
...
$mypdf->Image("http://s3-ap-southeast-1.amazonaws.com/mybucket/path/to/the/image/file.png", null, null, 150, 150);
...
?>
and it causes errors. However, when I try the same thing with an image not hosted on S3, it works.
Why does FPDF fail with images hosted on S3?

I was having the same problem, and after some digging in the FPDF source I figured out the issue was with fopen(). In order to use this method with an S3 image you need to use an S3 stream wrapper. This requires the AWS SDK for PHP, or you could roll your own if you really wanted to.
My code looks like this:
$credentials = new Aws\Credentials\Credentials('KEY', 'SECRET');
$client = new Aws\S3\S3Client([
    'version'     => 'latest',
    'region'      => 'REGION',
    'credentials' => $credentials
]);
$client->registerStreamWrapper();

// Link to the file via the s3:// protocol
$url = 's3://bucket/key';

// Add the background image
$fpdf->Image($url, 0, 0, $fpdf->GetPageWidth(), $fpdf->GetPageHeight());
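One caveat worth noting (this comes from FPDF's documented behaviour, not from the answer above): FPDF infers the image type from the file extension, so if your S3 key has no extension, pass the type explicitly as the sixth argument, e.g. $fpdf->Image($url, 0, 0, $w, $h, 'PNG');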

Related

Uploading to S3 from Laravel: Quality Lost

My application uploads images to S3. I use front-end rendering to get the colors of the image, but because uploading to S3 lowers the quality (JPEG-ish), I get more colors than desired.
$s3 = \Storage::disk('s3');
$s3->put('/images/file.jpg', '/images/file.jpg', 'public');
Is there a way to prevent this quality loss? I noticed that if I upload the file directly using the AWS console website, the quality stays the same, which is ideal.
Thank you!
In the controller action, grab the uploaded file:
public function uploadFileToS3(Request $request)
{
    $image = $request->file('image');
}
Next we need to assign a file name to the uploaded file. You could leave this as the original filename, but in most cases you will want to change it to keep things consistent. Let's change it to a timestamp and append the file extension to it.
$imageFileName = time() . '.' . $image->getClientOriginalExtension();
Now get the image contents and upload them as follows (note the path is built from the new filename):
$s3 = \Storage::disk('s3');
$filePath = '/images/' . $imageFileName;
$s3->put($filePath, file_get_contents($image), 'public');
For further info you can refer to this
I am not familiar with Laravel, but I am familiar with AWS S3, and I'm using aws-sdk-php.
As far as I know, neither AWS S3 nor the PHP SDK does anything implicitly under the hood, so it must be something going wrong elsewhere in your project.
You can try using plain aws-sdk-php:
$s3 = S3Client::factory([
    'region'      => 'us-west-2',
    'credentials' => $credentials,
    'version'     => 'latest',
]);
$s3->putObject([
    'Bucket'     => 'myBucket',
    'Key'        => 'test/img.jpg',
    'SourceFile' => '/tmp/img.jpg',
]);
It works perfectly.

Upload a video file in Akamai netstorage using PHP

I started working on uploading a file to Akamai NetStorage using PHP and referred to a few APIs on GitHub. I couldn't upload a video file, though I can create files and write contents to them.
<?php
require 'Akamai.php';
$service = new Akamai_Netstorage_Service('******.akamaihd.net');
$service->authorize('key','keyname','version');
$service->upload('/dir-name/test/test.txt','sample text');
?>
I referred to this API. I also tried a few others but couldn't find the right way to upload a video/image file. The code I wrote above works perfectly; now I need to upload a video file instead of writing contents to a text file.
There is a more modern library for Akamai's NetStorage, akamai-open/netstorage, which is built as a plugin for Flysystem.
Once you have it installed, you need to set up the authentication and the HTTP client (based on Guzzle):
$signer = new \Akamai\NetStorage\Authentication();
$signer->setKey($key, $keyName);

$handler = new \Akamai\NetStorage\Handler\Authentication();
$handler->setSigner($signer);

$stack = \GuzzleHttp\HandlerStack::create();
$stack->push($handler, 'netstorage-handler');

$client = new \Akamai\Open\EdgeGrid\Client([
    'base_uri' => $host,
    'handler'  => $stack
]);

$adapter = new \Akamai\NetStorage\FileStoreAdapter($client, $cpCode);
And then you can create the filesystem object, and upload the file:
$fs = new \League\Flysystem\Filesystem($adapter);
// Upload a file:
// cpCode, action, content signature, and request signature are added transparently
// Additionally, all required sub-directories are created transparently
$fs->write('/path/to/write/file/to', $fileContents);
However, because you're uploading a video file, I would suggest you use a stream rather than reading the contents into memory. To do this, use writeStream() instead:
$fs = new \League\Flysystem\Filesystem($adapter);
$stream = fopen('/path/to/local/file', 'r+');
$fs->writeStream('/path/to/write/file/to', $stream);

Laravel 5.1 AWS S3 Storage, how to link images?

I am in the process of creating a "Content Management System" for a start-up company. I have a Post.php model in my project; the following code snippet is taken from the Create method:
if (Request::file('display_image') != null) {
    Storage::disk('s3')->put('/app/images/blog/'.$post->slug.'.jpg', file_get_contents(Request::file('display_image')));
    $bucket = Config::get('filesystems.disks.s3.bucket');
    $s3 = Storage::disk('s3');
    $command = $s3->getDriver()->getAdapter()->getClient()->getCommand('GetObject', [
        'Bucket' => Config::get('filesystems.disks.s3.bucket'),
        'Key' => '/app/images/blog/'.$post->slug.'.jpg',
        'ResponseContentDisposition' => 'attachment;'
    ]);
    $request = $s3->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+5 minutes');
    $image_url = (string) $request->getUri();
    $post->display_image = $image_url;
}
The above code checks whether there is a "display_image" file input in the request object.
If it finds a file, it uploads it directly to AWS S3 storage. I want to save the link to the file in the database, so I can use it later in my views.
Hence I use this piece of code:
$request = $s3->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+5 minutes');
$image_url = (string) $request->getUri();
$post->display_image = $image_url;
I get a URL, but the only problem is that whenever I visit the $post->display_image URL I get a 403 Permission Denied. Obviously, no authentication takes place when using the URL of the image.
How do I solve this? I need to be able to link all my images/files from Amazon S3 in the front-end interface of the website.
You could open up those S3 URLs to public viewing, but you probably wouldn't want to. You have to pay for the outgoing bandwidth every time someone views one of those images.
You might want to check out Glide, a pretty simple-to-use image library that supports S3. Make sure to reduce the load requirements on your server and wallet by setting caching headers on the images you serve.
Alternatively, you could use a CloudFront distribution as a caching proxy in front of your S3 bucket.
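If you do decide to open the objects up, a minimal sketch of storing the file with public visibility and saving a permanent, unsigned URL instead of a 5-minute presigned one (this is not from the answer above; it reuses the question's own client accessor chain and assumes the bucket policy actually allows public reads):
// Store with public visibility, then save the plain object URL
$key = '/app/images/blog/'.$post->slug.'.jpg';
Storage::disk('s3')->put($key, file_get_contents(Request::file('display_image')), 'public');

$client = Storage::disk('s3')->getDriver()->getAdapter()->getClient();
// getObjectUrl() builds the https URL without signing it, so it never expires
$post->display_image = $client->getObjectUrl(Config::get('filesystems.disks.s3.bucket'), $key);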

Streaming private videos from Amazon S3

I need to display video/image files uploaded to my Amazon S3 account with ACL: private on my WordPress blog.
I am a newbie to PHP OOP-based coding. Any script help, link references, free plugins, or even a logical algorithm would be a great help :)
Thanks in advance.
This issue can be solved by implementing the following steps:
1. Download the latest stable version of the SDK from here
2. Extract the .zip file and place it in the wamp/www folder
3. Rename the config-sample.inc.php file to config.inc.php
4. Add the access key and secret key (retrieved from your Amazon S3 account) to the above file, then save and exit
5. Create a sample file to display public/private objects from Amazon S3
The content of the file should look as follows:
require('sdk.class.php');
require('services/s3.class.php');
$s3 = new AmazonS3();
$bucket = "bucketname";
$temp_link = $s3->get_object_url($bucket, 'your/folder/path/img.jpg', '5 minute');
echo $temp_link;
In the above code, the URL you receive as output is a signed URL for your private object, so it is valid for only 5 minutes.
You may set the expiry further in the future and allow only authorized users to access your private content or media on Amazon S3.
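For reference, with the current AWS SDK for PHP (v3) the same presigned-URL flow looks roughly like this (a sketch; the region, bucket, and key are placeholders, and credentials are assumed to come from the environment):
$s3 = new \Aws\S3\S3Client([
    'version' => 'latest',
    'region'  => 'ap-southeast-1',
]);
$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'bucketname',
    'Key'    => 'your/folder/path/img.jpg',
]);
// Same 5-minute validity as the example above
$signed = $s3->createPresignedRequest($cmd, '+5 minutes');
echo (string) $signed->getUri();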
This question is a little bit old, but I'm posting my answer anyway. I had a similar issue today and found out there's a simple answer.
The AWS docs explain it clearly and have an example as well:
http://docs.aws.amazon.com/aws-sdk-php-2/guide/latest/service-s3.html#amazon-s3-stream-wrapper
Basically, you need to register AWS's stream wrapper and use the s3:// protocol.
Here's my code sample:
use Aws\Common\Aws;
use Aws\S3\Enum\CannedAcl;
use Aws\S3\Exception\S3Exception;

$s3 = Aws::factory(array(
    'key'    => Config::get('aws.key'),
    'secret' => Config::get('aws.secret'),
    'region' => Config::get('aws.region')
))->get('s3');
$s3->registerStreamWrapper();

// Now read the file from S3 (from the doc):
// Open a stream in read-only mode
if ($stream = fopen('s3://bucket/key', 'r')) {
    // While the stream is still open
    while (!feof($stream)) {
        // Read 1024 bytes from the stream
        echo fread($stream, 1024);
    }
    // Be sure to close the stream resource when you're done with it
    fclose($stream);
}

Creating an image without storing it as a local file

Here's my situation - I want to create a resized JPEG image from a user-uploaded image and then send it to S3 for storage, but I'm looking to avoid writing the resized JPEG to disk and then reloading it for the S3 request.
Is there a way to do this completely in memory, with the JPEG-formatted image data saved in a variable?
Most people using PHP choose either ImageMagick or GD2.
I've never used ImageMagick; the GD2 method:
<?php
// Assuming your uploaded file was 'userFileName';
// validateFilePath() is a user-defined sanitising helper
if ( ! is_uploaded_file(validateFilePath($_FILES[$userFileName]['tmp_name'])) ) {
    trigger_error('not an uploaded file', E_USER_ERROR);
}
$srcImage = imagecreatefromjpeg($_FILES[$userFileName]['tmp_name']);

// Create the destination canvas, then resize (copy from srcImage to dstImage);
// RESIZED_IMAGE_WIDTH/HEIGHT are constants you define
$dstImage = imagecreatetruecolor(RESIZED_IMAGE_WIDTH, RESIZED_IMAGE_HEIGHT);
imagecopyresampled($dstImage, $srcImage, 0, 0, 0, 0,
    RESIZED_IMAGE_WIDTH, RESIZED_IMAGE_HEIGHT,
    imagesx($srcImage), imagesy($srcImage));

// Store your resized image in a variable
ob_start(); // start a new output buffer
imagejpeg($dstImage, NULL, JPEG_QUALITY); // JPEG_QUALITY: your chosen 0-100 constant
$resizedJpegData = ob_get_contents();
ob_end_clean(); // stop this output buffer

// Free up unused memory (if images are expected to be large)
imagedestroy($srcImage);
imagedestroy($dstImage);

// Your resized JPEG data is now in $resizedJpegData.
// Use your Undesigned method calls to store the data.
// (Many people want to send it as a hex stream to the DB:)
$dbHandle->storeResizedImage($resizedJpegData);
?>
Hope this helps.
This can be done using the GD library and output buffering. I don't know how efficient this is compared with other methods, but it doesn't require explicit creation of files.
//$image contains the GD image resource you want to store
ob_start();
imagejpeg($image);
$jpeg_file_contents = ob_get_contents();
ob_end_clean();
//now send $jpeg_file_contents to S3
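For completeness, a sketch of pushing that string to S3 with the AWS SDK for PHP v3 (assuming $s3 is an initialized Aws\S3\S3Client; the bucket and key here are placeholders, not from the answer above):
// Upload the in-memory JPEG string directly as the object body
$s3->putObject([
    'Bucket'      => 'myBucket',
    'Key'         => 'images/resized.jpg',
    'Body'        => $jpeg_file_contents, // raw JPEG bytes from the output buffer
    'ContentType' => 'image/jpeg',
]);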
Once you've got the JPEG in memory (using ImageMagick, GD, or your graphic library of choice), you'll need to upload the object from memory to S3.
Many PHP S3 classes seem to only support file uploads, but the one at Undesigned seems to do what we're after here:
// Manipulate image - assume ImageMagick, so $im is an Imagick object
$im = new Imagick();
// Get the image source data
$im->readimageblob($image_source);
// Upload an object from a resource (requires size):
$s3->putObject($s3->inputResource($im->getimageblob(), $im->getSize()),
    $bucketName, $uploadName, S3::ACL_PUBLIC_READ);
If you're using GD instead, you can use imagecreatefromstring to read an image in from a string, but I'm not sure whether you can get the size of the resulting object, as required by $s3->inputResource above - getimagesize returns the height, width, etc., but not the byte size of the image resource.
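One way around that (a sketch, not from the original answer; $jpegData is a hypothetical variable holding raw image bytes): render the GD image back into a string via output buffering, and the byte size inputResource() needs is then just strlen():
$img = imagecreatefromstring($jpegData); // read the image in from a string
ob_start();
imagejpeg($img); // render it back out into the buffer
$data = ob_get_clean();

$size = strlen($data); // the byte size required by $s3->inputResource()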
Pretty late to the game on this one, but if you are using the S3 library mentioned by ConroyP and Imagick, you should use the putObjectString() method instead of putObject(), due to the fact that getImageBlob() returns a string. An example that finally worked for me:
$headers = array(
    'Content-Type' => 'image/jpeg'
);
$s3->putObjectString($im->getImageBlob(), $bucket, $file_name, S3::ACL_PUBLIC_READ, array(), $headers);
I struggled with this one a bit, hopefully it helps someone else!
Realize this is an old thread, but I spent some time banging my head against the wall on this today, and thought I would capture my solution here for the next guy.
This method uses AWS SDK for PHP 2 and GD for the image resize (Imagick could also be easily used).
require_once('vendor/aws/aws-autoloader.php');

use Aws\Common\Aws;

define('AWS_BUCKET', 'your-bucket-name-here');

// Configure the AWS factory
$aws = Aws::factory(array(
    'key'    => 'your-key-here',
    'secret' => 'your-secret-here',
    'region' => 'your-region-here'
));

// Create a reference to S3 and make sure the bucket exists
$s3 = $aws->get('S3');
$s3->createBucket(array('Bucket' => AWS_BUCKET));
$s3->waitUntilBucketExists(array('Bucket' => AWS_BUCKET));
$s3->registerStreamWrapper();

// Do your GD resizing here (omitted for brevity)

// Capture the image stream in an output buffer
ob_start();
imagejpeg($imageRes);
$imageFileContents = ob_get_contents();
ob_end_clean();

// Send the stream to S3
$context = stream_context_create(array(
    's3' => array(
        'ContentType' => 'image/jpeg'
    )
));
$s3Stream = fopen('s3://'.AWS_BUCKET.'/'.$filename, 'w', false, $context);
fwrite($s3Stream, $imageFileContents);
fclose($s3Stream);

unset($context, $imageFileContents, $s3Stream);
The ImageMagick library will let you do that. There are plenty of PHP wrappers like this one around for it (there's even example code for what you want to do on that page ;) )
I encountered the same problem, using the OpenStack object store and the php-opencloud library.
Here is my solution, which does not use the ob_start and ob_end_clean functions, but instead writes the image to a stream that lives in memory and spills to a temp file past a size threshold. That threshold may be adapted at runtime.
// $image is a resource created by GD2
var_dump($image); // resource(2) of type (gd)

// We create a stream backed by memory plus a temp file
$tmp = fopen('php://temp', 'r+');

// We write the image into our stream
\imagejpeg($image, $tmp);

// The image is now in $tmp; rewind so the uploader reads from the beginning
rewind($tmp);

// You can then upload it as a stream (not tested but mentioned in the doc:
// http://docs.aws.amazon.com/aws-sdk-php/v2/guide/service-s3.html#uploading-from-a-stream)
$s3->putObject(array(
    'Bucket' => $bucket,
    'Key'    => 'data_from_stream.txt',
    'Body'   => $tmp
));

// Or, for those who prefer php-opencloud:
$container->createObject([
    'name'        => 'data_from_stream.txt',
    'stream'      => \GuzzleHttp\Psr7\stream_for($tmp),
    'contentType' => 'image/jpeg'
]);
About php://temp (from the official documentation of php):
php://memory and php://temp are read-write streams that allow temporary data to be stored in a file-like wrapper. The only difference between the two is that php://memory will always store its data in memory, whereas php://temp will use a temporary file once the amount of data stored hits a predefined limit (the default is 2 MB). The location of this temporary file is determined in the same way as the sys_get_temp_dir() function.
The memory limit of php://temp can be controlled by appending /maxmemory:NN, where NN is the maximum amount of data to keep in memory before using a temporary file, in bytes.
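So, for example, raising that threshold is a one-line change (5 MB is shown here as an arbitrary value):
// Keep up to 5 MB in memory before PHP spills the stream to a temp file
$tmp = fopen('php://temp/maxmemory:' . (5 * 1024 * 1024), 'r+');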
Maybe by using the GD library.
There is a function to copy out a part of an image and resize it. Of course, the part could be the whole image; that way you would only resize it.
See imagecopyresampled.
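A minimal sketch of that call (the file name, coordinates, and sizes are made up for illustration):
// Crop a 200x200 region starting at (50, 50) and scale it down to 100x100
$src = imagecreatefromjpeg('input.jpg');
$dst = imagecreatetruecolor(100, 100);
imagecopyresampled($dst, $src, 0, 0, 50, 50, 100, 100, 200, 200);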
