Here's my situation: I want to create a resized JPEG from a user-uploaded image and then send it to S3 for storage, but I'm looking to avoid writing the resized JPEG to disk and then reloading it for the S3 request.
Is there a way to do this completely in memory, with the JPEG-formatted image data saved in a variable?
Most people using PHP choose either ImageMagick or GD.
I've never used ImageMagick; here's the GD method:
<?php
// assuming your uploaded file was 'userFileName'
if ( ! is_uploaded_file($_FILES[$userFileName]['tmp_name']) ) {
trigger_error('not an uploaded file', E_USER_ERROR);
}
$srcImage = imagecreatefromjpeg( $_FILES[$userFileName]['tmp_name'] );
// Resize your image (copy from srcImage to dstImage)
// RESIZED_IMAGE_WIDTH / RESIZED_IMAGE_HEIGHT are constants you define
$dstImage = imagecreatetruecolor(RESIZED_IMAGE_WIDTH, RESIZED_IMAGE_HEIGHT);
imagecopyresampled($dstImage, $srcImage, 0, 0, 0, 0, RESIZED_IMAGE_WIDTH, RESIZED_IMAGE_HEIGHT, imagesx($srcImage), imagesy($srcImage));
// Storing your resized image in a variable
ob_start(); // start a new output buffer
imagejpeg( $dstImage, NULL, JPEG_QUALITY); // JPEG_QUALITY: a 0-100 constant you define
$resizedJpegData = ob_get_contents();
ob_end_clean(); // stop this output buffer
// free up memory (if images are expected to be large)
imagedestroy($srcImage);
imagedestroy($dstImage);
// your resized jpeg data is now in $resizedJpegData
// Use your Undesigned method calls to store the data.
// (Many people want to send it as a Hex stream to the DB:)
$dbHandle->storeResizedImage( $resizedJpegData );
?>
Hope this helps.
This can be done using the GD library and output buffering. I don't know how efficient this is compared with other methods, but it doesn't require explicit creation of files.
//$image contains the GD image resource you want to store
ob_start();
imagejpeg($image);
$jpeg_file_contents = ob_get_contents();
ob_end_clean();
//now send $jpeg_file_contents to S3
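With the Undesigned S3 class discussed in the answers below, that last step could be as simple as this (a sketch, untested; $bucketName and $uploadName are placeholders):
$s3->putObjectString($jpeg_file_contents, $bucketName, $uploadName);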
Once you've got the JPEG in memory (using ImageMagick, GD, or your graphic library of choice), you'll need to upload the object from memory to S3.
Many PHP S3 classes seem to only support file uploads, but the one at Undesigned seems to do what we're after here -
// Manipulate image - assume ImageMagick, so $im is an Imagick object
$im = new Imagick();
// Get image source data
$im->readImageBlob($image_source);
// Upload an object from a resource (requires size):
$s3->putObject($s3->inputResource($im->getImageBlob(), $im->getSize()),
               $bucketName, $uploadName, S3::ACL_PUBLIC_READ);
If you're using GD instead, you can use imagecreatefromstring to read an image in from a string, but I'm not sure whether you can get the byte size of the resulting object, as required by $s3->inputResource above - getimagesize returns the height, width, etc., but not the file size of the image resource.
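For what it's worth, a possible way around that with GD (a sketch, untested against the Undesigned class; $image_source, $bucketName, and $uploadName are placeholders) is to capture the JPEG bytes with output buffering, take strlen() for the byte size, and wrap the string in a memory stream so inputResource gets an actual resource:
$im = imagecreatefromstring($image_source); // $image_source: raw image bytes
ob_start();
imagejpeg($im);
$jpegData = ob_get_clean();
$fp = fopen('php://memory', 'r+');          // wrap the bytes in a stream resource
fwrite($fp, $jpegData);
rewind($fp);
$s3->putObject($s3->inputResource($fp, strlen($jpegData)),
               $bucketName, $uploadName, S3::ACL_PUBLIC_READ);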
Pretty late to the game on this one, but if you are using the S3 library mentioned by ConroyP together with Imagick, you should use the putObjectString() method instead of putObject(), due to the fact that getImageBlob returns a string. An example that finally worked for me:
$headers = array(
'Content-Type' => 'image/jpeg'
);
$s3->putObjectString($im->getImageBlob(), $bucket, $file_name, S3::ACL_PUBLIC_READ, array(), $headers);
I struggled with this one a bit, hopefully it helps someone else!
I realize this is an old thread, but I spent some time banging my head against the wall on this today and thought I would capture my solution here for the next person.
This method uses the AWS SDK for PHP 2 and GD for the image resize (Imagick could also easily be used).
require_once('vendor/aws/aws-autoloader.php');
use Aws\Common\Aws;
define('AWS_BUCKET', 'your-bucket-name-here');
// Configure AWS factory
$aws = Aws::factory(array(
'key' => 'your-key-here',
'secret' => 'your-secret-here',
'region' => 'your-region-here'
));
// Create reference to S3
$s3 = $aws->get('S3');
$s3->createBucket(array('Bucket' => AWS_BUCKET));
$s3->waitUntilBucketExists(array('Bucket' => AWS_BUCKET));
$s3->registerStreamWrapper();
// Do your GD resizing here (omitted for brevity)
// Capture image stream in output buffer
ob_start();
imagejpeg($imageRes);
$imageFileContents = ob_get_contents();
ob_end_clean();
// Send stream to S3
$context = stream_context_create(
array(
's3' => array(
'ContentType'=> 'image/jpeg'
)
)
);
$s3Stream = fopen('s3://'.AWS_BUCKET.'/'.$filename, 'w', false, $context);
fwrite($s3Stream, $imageFileContents);
fclose($s3Stream);
unset($context, $imageFileContents, $s3Stream);
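As an aside, the same SDK will also accept the raw bytes directly, which skips the stream wrapper entirely (a sketch under the same assumptions as the code above):
$s3->putObject(array(
    'Bucket'      => AWS_BUCKET,
    'Key'         => $filename,
    'Body'        => $imageFileContents,
    'ContentType' => 'image/jpeg',
));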
The ImageMagick library will let you do that. There are plenty of PHP wrappers like this one around for it (there's even example code for what you want to do on that page ;) )
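With the stock Imagick extension, the in-memory part might look like this (a minimal sketch; $uploadTmp stands in for the uploaded file's path, and the 200px width is arbitrary):
$im = new Imagick($uploadTmp);
$im->thumbnailImage(200, 0);     // width 200; a height of 0 preserves the aspect ratio
$im->setImageFormat('jpeg');
$jpegData = $im->getImageBlob(); // JPEG bytes in a variable, never written to disk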
I encountered the same problem, using the OpenStack object store and the php-opencloud library.
Here is my solution, which does not use the ob_start and ob_end_clean functions, but instead stores the image in memory and in a temp file. The size of the memory buffer and the temp file may be adapted at runtime.
// $image is a resource created by gd2
var_dump($image); // resource(2) of type (gd)
// we create a resource in memory + temp file
$tmp = fopen('php://temp', 'r+');
// we write the image into our resource
\imagejpeg($image, $tmp);
// rewind the stream so whoever reads it next starts at the beginning
rewind($tmp);
// the image is now in $tmp, and you can handle it as a stream
// you can then upload it as a stream (not tested, but mentioned in the docs: http://docs.aws.amazon.com/aws-sdk-php/v2/guide/service-s3.html#uploading-from-a-stream)
$s3->putObject(array(
'Bucket' => $bucket,
'Key' => 'data_from_stream.txt',
'Body' => $tmp
));
// or, for those who prefer php-opencloud:
$container->createObject([
'name' => 'data_from_stream.txt',
'stream' => \GuzzleHttp\Psr7\stream_for($tmp),
'contentType' => 'image/jpeg'
]);
About php://temp (from the official PHP documentation):
php://memory and php://temp are read-write streams that allow temporary data to be stored in a file-like wrapper. The only difference between the two is that php://memory will always store its data in memory, whereas php://temp will use a temporary file once the amount of data stored hits a predefined limit (the default is 2 MB). The location of this temporary file is determined in the same way as the sys_get_temp_dir() function.
The memory limit of php://temp can be controlled by appending /maxmemory:NN, where NN is the maximum amount of data to keep in memory before using a temporary file, in bytes.
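For example, to keep up to 5 MB in memory before PHP spills the data to a temp file (5 MB being an arbitrary figure):
$tmp = fopen('php://temp/maxmemory:' . (5 * 1024 * 1024), 'r+');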
Maybe by using the GD library.
There is a function to copy out a part of an image and resize it. Of course, the part could be the whole image; that way you would only resize it.
See imagecopyresampled.
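A minimal sketch, assuming $srcImage is an existing GD image and the 200x150 target size is arbitrary:
$dstImage = imagecreatetruecolor(200, 150);
imagecopyresampled($dstImage, $srcImage, 0, 0, 0, 0, 200, 150, imagesx($srcImage), imagesy($srcImage));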
I have a PHP site that currently pulls images off an Azure blob, writes them to disk using file_put_contents, then has Imagick read the file from disk using readImageFile. I would rather this live in memory than be written to disk and then read back. How can I accomplish this? When I try readImageBlob, I get the below error:
Warning: Imagick::readimageblob() expects parameter 1 to be string, resource given in <file> <line>
Below is a snippet of my code (this is just testing code, not production):
// Get Data from Azure Storage Blob
$blob = $blobClient->getBlob($containerName, $documentPath);
// Get TIF file from Blob and convert to PDF
$im = new Imagick();
$im->readImageBlob($blob->getContentStream());
$im->setImageFormat('pdf');
// Echo as PDF
header('Content-Type: application/pdf');
echo $im;
You should be able to use stream_get_contents to read the string from the stream that you have. Example:
$im->readImageBlob(stream_get_contents($blob->getContentStream()));
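If you'd rather not pull the whole blob into a string first, Imagick::readImageFile() also accepts an open file handle, so something along these lines may work as well (untested):
$im->readImageFile($blob->getContentStream());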
I'm creating a PNG and uploading it immediately to S3. I'd like to log how big that file was without having to make a separate call to the same file on S3 to work out the size on disk.
Is this possible?
$tile = imagecreatetruecolor($tileImageSize, $tileImageSize);
imagecopy($tile, $resizedMainImage, 0, 0, $currentCoordsX, $currentCoordsY, $tileSize, $tileSize);
$writeStream = fopen("s3://bucket/file.png", 'w');
imagepng($tile, $writeStream, 9); // need filesize of this action
I've tried wrapping the imagepng in ob_start(); ob_get_length(); etc with no joy.
The thing is that you will need to write the image somewhere in order to get the size. To avoid writing to the filesystem, you can use the memory storage of PHP. I mean something like this:
$stream = fopen('php://memory', 'r+');
imagepng($tile, $stream, 9);
$file_size = ftell($stream); // bytes written so far = the size of the PNG
rewind($stream);
You can do this before uploading the image to S3, so you have the information for your log.
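Tying that back to the question's stream-wrapper code, a sketch (untested) that uploads those same bytes and logs the size:
$writeStream = fopen("s3://bucket/file.png", 'w');
stream_copy_to_stream($stream, $writeStream);
fclose($writeStream);
error_log("tile PNG was {$file_size} bytes");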
So I have a scenario where I have to upload a cropped image to AWS.
First, I have the basic image upload working (AWS putbucket and all) so that's not the issue.
I also have the cropping of the image working (using imgAreaSelect) so that is also not the issue.
On the PHP side I also grab the image from $_FILES['file']['tmp_name'] and create a new cropped image (using code similar to that found at http://www.codeforest.net/how-to-crop-an-image-using-jquery-and-php).
However I need a way to grab the new image created on the last line
imagejpeg($new, $new_filename, 95);
into $_FILES['file']['tmp_name'] in the AWS upload here:
$s3->putObject(array(
'Body' => fopen($_FILES['file']['tmp_name'], 'r'),
));
So, using GD:
ob_start(); // start a new output buffer
imagejpeg( $dstImage, NULL, JPEG_QUALITY);
$resizedJpegData = ob_get_contents();
ob_end_clean(); // stop this output buffer
// free up memory (if images are expected to be large)
imagedestroy($srcImage);
imagedestroy($dstImage);
// your resized jpeg data is now in $resizedJpegData,
// so pass it to S3 directly as the request body:
$s3->putObject(array(
    'Body' => $resizedJpegData,
));
I am using the Amazon SDK for PHP and WideImage. I am resizing an image with WideImage and then trying to upload that resized image to Amazon S3.
$resized = $image->resize($width,$height);
Then I upload:
$response = $s3->create_object($myBucket, $newFilename, array(
'fileUpload' => $resized, //this does not work
));
Does anyone know the proper way to do this?
You can use a stream wrapper and WideImage's saveToFile method. There are many stream wrappers for S3; this is one example: https://github.com/jakajancar/S3StreamWrapper.
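As a sketch (untested), assuming an s3:// wrapper such as S3StreamWrapper has been registered:
$resized = $image->resize($width, $height);
$resized->saveToFile('s3://' . $myBucket . '/' . $newFilename);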
You don't need to save the image and then upload from there.
When you resize the image, you can convert it to a string with the WideImage class.
Example:
$image = WideImage::load($_FILES["file"]['tmp_name']);
$resized = $image->resize(1024);
$data = $resized->asString('jpg');
And then when you're uploading to Amazon, you have to use the param 'body' instead of 'fileUpload'.
Example:
$response = $s3->create_object($myBucket, $newFilename, array(
'body' => $data,
));
I hope that helps.
I would like to point out a few things that might help someone make a choice.
First of all, I think you had better go with what you are trying to do: resize the image on your server first, then move it to Amazon. Even if there were some way to resize and upload the image on the fly, the script would perform slowly, because it would have to resize the image and save it to a far-away destination. That would be minor for a few images, but it becomes a problem with heavy resizing, even on high-speed bandwidth, since PHP cannot release the resources used for image resizing until the target image has been completely saved.
Second, if you are using a CDN (Content Delivery Network), note that CDNs use a PULL SERVER technique: we do not push static content to the CDN server; instead, when a user/client asks for static content, the CDN first checks all of its servers, and if the content is not found, it asks our main server for it.
Amazon S3 is not a true CDN. S3 was designed for content storage. The correct Amazon service to use for content delivery is Amazon CloudFront. When we push files to a storage server or CDN ourselves, that is called a PUSH SERVER.
A thorough article can be read at http://www.binarymoon.co.uk/2010/11/timthumb-cdn-amazon-s3-good/. It is actually about TimThumb, but it is worth a read.
I ended up saving the file to the server and then uploading the file from there. If there is a better way then please let me know.
I'm building a basic analytics service, modeled loosely on how Google Analytics works, but instead of serving an actual image, I'm routing the image request to a script that accepts the data and then outputs an image. Since browsers will be requesting this image on every page load, every millisecond counts.
I'm looking for the most efficient way for a file to output a gif file from a PHP script. So far, I've established 3 main methods.
Is there a more efficient way for me to output a 1x1 GIF file from within a PHP script? If not, which of these is the most efficient and scalable?
Three Identified Methods
PHP image building libraries
$im = imagecreatetruecolor(1, 1);
imagefilledrectangle($im, 0, 0, 0, 0, 0xFb6b6F);
header('Content-Type: image/gif');
imagegif($im);
imagedestroy($im);
file_get_contents the image off of the server and output it
$im = file_get_contents('raw.gif');
header('Content-Type: image/gif');
echo $im;
base64_decode the image
header('Content-Type: image/gif');
echo base64_decode("R0lGODdhAQABAIAAAPxqbAAAACwAAAAAAQABAAACAkQBADs=");
(My gut was that base64 would be fastest, but I have no idea how resource intensive that function is; and that file_get_contents would likely scale less well, since it adds another file-system action.)
For reference, the GIF I'm using is here: http://i.stack.imgur.com/LQ1CR.gif
EDIT
So, the reason I'm serving this image is that my analytics library builds a query string and attaches it to this image request. Rather than parse logs, I'm routing the request to a PHP script which processes the data and responds with an image, so that the end user's browser doesn't hang or throw an error. My question is: how do I best serve that image within the confines of a script?
Maybe:
header('Content-Type: image/gif');
//equivalent to readfile('pixel.gif')
echo "\x47\x49\x46\x38\x37\x61\x1\x0\x1\x0\x80\x0\x0\xfc\x6a\x6c\x0\x0\x0\x2c\x0\x0\x0\x0\x1\x0\x1\x0\x0\x2\x2\x44\x1\x0\x3b";
That will output a binary string identical to the binary file contents of a 1x1 transparent GIF. I'm claiming this is efficient on the grounds that it doesn't do any slow I/O such as reading a file, and it doesn't call any functions.
If you want to make your own version of the above hex string, perhaps so that you can change the color, you can use this to generate the PHP code for the echo statement:
printf('echo "%s";', preg_replace_callback('/./s', function ($matches) {
return '\x' . dechex(ord($matches[0]));
}, file_get_contents('https://upload.wikimedia.org/wikipedia/en/d/d0/Clear.gif')));
header('Content-Type: image/gif');
header("Content-Length: " . filesize("image.gif"));
$f = fopen('image.gif', 'rb');
fpassthru($f);
fclose($f);
That would probably be fastest for an image read from disk, but (especially if you're using bytecode caching) for small images known in advance, I think the base64 way will be the fastest. Sending Content-Length might be a good idea too: for a small image the browser would in most cases not wait for anything after receiving the bytes, so while your server would take just as much time, the user experience will be slightly better.
Another way would be to let Apache/lighttpd/nginx serve the image, log the access, and then parse the logs offline.
With Laravel:
$pixel = "\x47\x49\x46\x38\x39\x61\x1\x0\x1\x0\x80\x0\x0\xff\xff\xff\x0\x0\x0\x21\xf9\x4\x1\x0\x0\x0\x0\x2c\x0\x0\x0\x0\x1\x0\x1\x0\x0\x2\x2\x44\x1\x0\x3b";
return response($pixel,200,[
'Content-Type' => 'image/gif',
'Content-Length' => strlen($pixel),
]);
If anyone wants that for some reason.
Alternatively, if you don't like long(ish) hex strings in your code:
base64_decode('R0lGODlhAQABAIAAAP///wAAACH5BAEAAAAALAAAAAABAAEAAAICRAEAOw')
Instead of dynamically generating/outputting an image, why not just redirect to a static image?
<?php
// process query param stuff
header('Location: pixel.gif');
exit();
?>