I currently have a working upload system that uses AWS S3 to upload images.
Here's the code:
// Upload the image to S3
$s3 = Aws\S3\S3Client::factory(array(
    'key'    => /*mykey*/,
    'secret' => /*myskey*/,
));
try {
    $s3->putObject(array(
        'Bucket' => "bucketname",
        'Key'    => $file_name,
        'Body'   => fopen(/*filelocation*/, 'r+')
    ));
} catch (Exception $e) {
    // Error handling
}
This image can be a jpeg or png, and I want to convert it to a png before uploading. To do this I use:
//This is simplified, please don't warn about transparency, etc.
$image = imagecreatetruecolor($width, $height);
imagecopyresampled($image, $source, 0, 0, 0, 0, etc.);
So I have this $image object in memory.
I want to upload this to S3 without having to save it locally, upload it and then delete it locally; this extra step seems pointless. But I can't work out how to upload this $image object directly.
Any ideas how this would be done? I'd assumed fopen() would create an object of a similar type to imagecreatetruecolor(), but passing the $image object in doesn't work, whereas it does work if I open an image locally with fopen().
You can capture the content of a GD image resource using output buffering:
ob_start();
imagepng($image);
$pngdata = ob_get_clean();
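As a self-contained sketch, the capture can be wrapped in a small helper (the function name is mine, not from the SDK), and the resulting string passed straight to putObject as the Body, so no local file is ever written:

```php
<?php
// Encode a GD image to a PNG byte string entirely in memory.
function gdToPngString($image): string
{
    ob_start();
    imagepng($image);
    return ob_get_clean();
}

// The SDK accepts a plain string as the Body, so the question's upload
// becomes (bucket/key placeholders as in the question):
// $s3->putObject(array(
//     'Bucket'      => 'bucketname',
//     'Key'         => $file_name,
//     'Body'        => gdToPngString($image),
//     'ContentType' => 'image/png',
// ));
```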
I want to set the image resolution to 300 DPI for an uploaded image using Image Intervention. Any alternative solution is also welcome.
What I want:
Upload image -> resize image to (100 x 100) -> set file size to less than 30 KB -> set image resolution to 300 DPI -> auto download
I have done everything except the resolution in my project. Here is the code and a link:
$width = 400;
$height = 200;
$file = $request->file('upload');
$image = Image::make($file)->resize($width, $height)->encode('jpg');

$headers = [
    'Content-Type'        => 'image/jpeg',
    'Content-Disposition' => 'attachment; filename=utipanpsa_'.$request->type.'.jpg',
];

return response()->stream(function () use ($image) {
    echo $image;
}, 200, $headers);
https://www.utipanpsa.com/cropping-tools
According to the documentation, you can use imageresolution() if you have PHP >= 7.2, like this:
// Set the image resolution to 200 DPI
imageresolution($imageResource, 200);
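For what it's worth, imageresolution() also works as a getter, so you can verify the value round-trips; a minimal sketch (assuming the GD extension and PHP >= 7.2):

```php
<?php
// Create a small test image and stamp it with 300 DPI.
$im = imagecreatetruecolor(100, 100);
imageresolution($im, 300);

// Called with no resolution argument, imageresolution() returns [x, y].
[$resX, $resY] = imageresolution($im);
echo $resX . 'x' . $resY . "\n"; // 300x300

// The DPI is written into the file header when encoding, e.g. as JPEG:
ob_start();
imagejpeg($im, null, 90);
$jpegData = ob_get_clean();
```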
In my application there is a call which might request a resized version of the image. In the case when it is resized I want to store the resized version long-term onto gridfs, exactly like the other images.
I can easily resize the image and return it using Intervention (http://image.intervention.io/getting_started/introduction); however, it seems that it's not possible to store the resized image in the DB without saving it to a temporary file.
In particular this fails:
$bucket = DB::getMongoDB()->selectGridFSBucket();
$stream = $bucket->openDownloadStreamByName($name);
$document = $bucket->getFileDocumentForStream($stream);
$metadata = $document['metadata']->getArrayCopy();
$file = stream_get_contents($stream);

$img = Image::cache(function ($image) use ($file) {
    $image->make($file);
}, 60, true);
$img->resize($width, $height, function ($constraint) {
    $constraint->aspectRatio();
    $constraint->upsize();
});

$bucket->uploadFromStream($randomName, $img, ['metadata' => $metadata]);
The call to uploadFromStream fails saying: Expected $source to have type "resource" but found "Intervention\Image\CachedImage".
Trying to change:
$bucket->uploadFromStream($randomName, $img, ['metadata' => $metadata]);
to:
$bucket->uploadFromStream($randomName, $img->stream('png'), ['metadata' => $metadata]);
Leads to the same error, only that the type changes from Intervention\Image\CachedImage to GuzzleHttp\Psr7\Stream.
Now I could do:
$img->save('/tmp/test.png');
$stream = fopen('/tmp/test.png', 'r');
$bucket->uploadFromStream($randomName, $stream, ['metadata' => $metadata]);
but this is abysmally bad because:
it has a race condition: I would need to ensure the filename used is unique, otherwise it might be overwritten
it writes to disk, which means it's going to be way slower than just using an in-memory stream
it writes to disk, which means more disk wear. This write is completely useless and can be triggered quite often, so I'd like to avoid it.
So, is there a way to save the resized image to GridFS without passing through the disk?
I found a solution: you can obtain the binary data as a string using $img->response()->content() and then instead of using uploadFromStream you can use openUploadStream and write the data.
The full working solution would be:
$bucket = DB::getMongoDB()->selectGridFSBucket();
$stream = $bucket->openDownloadStreamByName($name);
$document = $bucket->getFileDocumentForStream($stream);
$metadata = $document['metadata']->getArrayCopy();
$file = stream_get_contents($stream);

$img = Image::cache(function ($image) use ($file) {
    $image->make($file);
}, 60, true);
$img->resize($width, $height, function ($constraint) {
    $constraint->aspectRatio();
    $constraint->upsize();
});

$stream = $bucket->openUploadStream($randomName, ['metadata' => $metadata]);
fwrite($stream, $img->response()->content());
fclose($stream);
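Another option that keeps uploadFromStream() is to wrap the encoded bytes in a php://temp stream, which lives in memory and only spills to disk past a size threshold. A sketch with a hypothetical helper ($bucket, $img, $randomName and $metadata are assumed from the code above):

```php
<?php
// Upload a binary string to GridFS through an in-memory stream.
// $bucket is assumed to be a MongoDB\GridFS\Bucket, as above.
function uploadStringToGridFS($bucket, string $name, string $data, array $metadata)
{
    // php://temp has no filename, so there is no race condition,
    // and it stays in memory for typical image sizes.
    $stream = fopen('php://temp', 'r+');
    fwrite($stream, $data);
    rewind($stream);

    $id = $bucket->uploadFromStream($name, $stream, ['metadata' => $metadata]);
    fclose($stream);
    return $id;
}

// Usage with the Intervention image from above would be:
// uploadStringToGridFS($bucket, $randomName, (string) $img->encode('png'), $metadata);
```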
I'm trying to use Intervention with AWS S3, but the resize method is not working for me.
$img = Image::make($file)->rotate($rotate)->crop($width, $width, $x, $y);
$img->backup();

foreach (Config::get('img.image-sizes') as $_k => $_v) {
    $img->resize($_v['w'], $_v['h']);
    $s3->queue($img, $name);
    $img->reset();
}
The images upload fine to S3, but the resize fails: I get all images at the size of the original image.
If I call save() on the image, the resize works, but I do not wish to save the image as I am uploading via S3, putting the $img as the body:
$this->commands[] = $this->s3->getCommand('PutObject', [
    'Bucket'      => env('AWS_BUCKET'),
    'Key'         => Config::get('img.image-path').$name,
    'Body'        => $img,
    'ContentType' => $img->mime(),
    'ACL'         => 'public-read'
]);
To get this to work will I have to call save on each image first? If so is there a way to get this to play nice with S3, ideally I do not want to save them to my server first before sending them off to S3.
This is the code I used to get my uploaded images resized and saved to Amazon S3 using Laravel 5 and Intervention.
$imageFile = \Image::make($imageUpload)->resize(600, 600)->stream();
$imageFile = $imageFile->__toString();
$filename = 'myFileName.png';
$s3 = \Storage::disk('s3');
$s3_response = $s3->put('/'.$filename, $imageFile, 'public');
I'm not sure what I'm doing wrong, but PNG files uploaded to an S3 Amazon bucket show a white color instead of transparent, even though the local file does show transparency. Here is what I have:
$thumb = \PhpThumbFactory::create(
    $content,
    array('jpegQuality' => 100),
    true
);
$thumb->setFormat('PNG');

ob_start();
$thumb->show(true);
$content = ob_get_clean();

// For testing purposes, I store the file locally to confirm that the transparency is there.
$file = 'picture' . '-' . time() . '.png';
file_put_contents($file, $content); // The created image correctly shows the transparency, as it should.

// The code below should upload the same file, but somehow it replaces transparency with a white color.
$S3 = new Zend_Service_Amazon_S3($myKey, $mySecret);
$response = $S3->putObject(
    $bucket,
    $content,
    array(
        Zend_Service_Amazon_S3::S3_CONTENT_TYPE_HEADER => 'image/png',
        Zend_Service_Amazon_S3::S3_ACL_HEADER          => Zend_Service_Amazon_S3::S3_ACL_PUBLIC_READ
    )
);
Am I missing something when doing the upload? Is there anything that I should configure on the bucket?
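Worth checking on the GD side as well: transparency only survives re-encoding if alpha handling is enabled explicitly. A minimal round-trip sketch in plain GD (not PhpThumb):

```php
<?php
// Build a small fully transparent PNG to round-trip.
$src = imagecreatetruecolor(16, 16);
imagealphablending($src, false);                      // write raw alpha, don't blend
imagesavealpha($src, true);                           // keep alpha when encoding
$clear = imagecolorallocatealpha($src, 0, 0, 0, 127); // 127 = fully transparent
imagefill($src, 0, 0, $clear);

ob_start();
imagepng($src);
$png = ob_get_clean();

// Reload and check the alpha channel survived (0 = opaque, 127 = transparent).
$check = imagecreatefromstring($png);
$alpha = (imagecolorat($check, 8, 8) & 0x7F000000) >> 24;
echo $alpha; // 127
```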
I have multiple images saved as Base64 strings, and now I want to resize them to get thumbnails of them.
Ideally I would resize them with JavaScript (on a Node server), but resizing them with PHP would also be possible.
Thanks in advance
I agree with the method from Jon Hanna: parse the Base64 code, then load it into a GD image before resampling. However, getting it back as data was not as easy as I thought. In PHP on GAE you will need to enable output buffering by setting output_buffering = "On" in the php.ini file.
Here I explain the steps in detail.
This doc is the reference for creating an image resource by parsing the Base64 code: http://php.net/manual/en/function.imagecreatefromstring.php
// Create image resource from Base64code
$data64 = 'iVBORw0KGgoAAAANSUhEUgAAABwAAAASCAMAAAB/2U7WAAAABl'
. 'BMVEUAAAD///+l2Z/dAAAASUlEQVR4XqWQUQoAIAxC2/0vXZDr'
. 'EX4IJTRkb7lobNUStXsB0jIXIAMSsQnWlsV+wULF4Avk9fLq2r'
. '8a5HSE35Q3eO2XP1A1wQkZSgETvDtKdQAAAABJRU5ErkJggg==';
$image = imagecreatefromstring(base64_decode($data64));
The result is an image resource, which can be passed directly to the resample function: http://php.net/manual/en/function.imagecopyresampled.php
// Resample
$image_p = imagecreatetruecolor($new_w, $new_h);
imagecopyresampled($image_p, $image, 0, 0, 0, 0, $new_w, $new_h, $org_w, $org_h);
The result is also an image resource. To get it as data, we need output buffering.
See: how to create a base64-encoded string from an image resource
// Buffering
ob_start();
imagepng($image_p);
$data = ob_get_contents();
ob_end_clean();
Using the doc below, I set a GCS bucket on my project as a website, so I can store & display the image directly:
https://cloud.google.com/storage/docs/website-configuration#tips
// Store & display
$context = stream_context_create([
    'gs' => [
        'acl'                       => 'public-read',
        'Content-Type'              => 'image/jpeg',
        'enable_cache'              => true,
        'enable_optimistic_cache'   => true,
        'read_cache_expiry_seconds' => 300,
    ]
]);
file_put_contents("gs://mybucket/resample/image.jpeg", $data, 0, $context);
header("Location: http://mybucket/resample/image.jpeg");
Your best bet is to use PHPThumb in PHP.
An alternative is to invoke ImageMagick however you prefer:
http://coffeeshopped.com/2009/01/creating-image-thumbnails-using-php-and-imagemagick
http://www.hacksparrow.com/node-js-image-processing-and-manipulation.html
No idea how to do that (or, well, anything) in Node.js, but the PHP bit of your question is certainly possible. After parsing the Base64, load it into a GD image and then resample it.
http://php.net/manual/en/function.imagecopyresampled.php
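A sketch of that approach as a single helper (the function name is mine, for illustration): decode the Base64, load it with GD, resample, then re-encode back to Base64.

```php
<?php
// Hypothetical helper: Base64 image in, Base64 PNG thumbnail out.
function base64Thumbnail(string $data64, int $newW, int $newH): string
{
    // Parse the Base64 and load it into a GD image.
    $src = imagecreatefromstring(base64_decode($data64));
    $dst = imagecreatetruecolor($newW, $newH);

    // Resample into the thumbnail-sized canvas.
    imagecopyresampled($dst, $src, 0, 0, 0, 0,
                       $newW, $newH, imagesx($src), imagesy($src));

    // Re-encode as PNG and back to Base64.
    ob_start();
    imagepng($dst);
    return base64_encode(ob_get_clean());
}
```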
Maybe you can just use a lib to handle that. Try WideImage. I have used it and it worked nicely.
Example:
$image = base64_decode(preg_replace('#^data:image/\w+;base64,#i', '', $req->image));
$thumbnail = WideImage::load($image)
    ->resize(300, 300, 'inside')
    ->crop('center', 'center', 300, 300);
Library Documentation: http://wideimage.sourceforge.net/