My application allows users to upload a bunch of images which are later sent as email attachments. The problem shows up when the file size of the images being sent is bigger than the email server accepts.
What I want to do is reduce the file size of each image until it is under 5 MB, and this is how I tried to do it:
/*..some code...*/
$img_quality = 75;
while (filesize($path) >= 5242880) {
    $img_string = file_get_contents($path);
    $img = imagecreatefromstring($img_string);
    $path = $this->getFilePath($file, $file_section, $entity_id);
    imagejpeg($img, $path, $img_quality); // re-encode at the current quality
    imagedestroy($img); // free the GD resource between iterations
    $img_quality--;
}
/*..some code...*/
//Functions I am calling
public function getFilePath($upload, $section, $id = null) {
    $path = base_path('../upload').$this->downloadOverwrite($upload, $section, $id);
    if(!$upload) {
        return $path;
    }
    if(!file_exists($path)) {
        return null;
    }
    $tmppath = @tempnam("tmp", "myapp");
    file_put_contents($tmppath, file_get_contents($path));
    return $tmppath;
}
public function downloadOverwrite($upload, $section, $id = null, $config = []) {
    $section = !empty($upload['entity']) ? $upload['entity'] : $section;
    $id = !empty($upload['entity_id']) ? $upload['entity_id'] : $id;
    $path = empty($id) ? "/$section" : "/$section/$id";
    if(!empty($upload)) $path .= "/{$upload['fs_name']}";
    return $path;
}
This code actually works, but if the image file is very large it takes too long to compress it down to the desired size. Is there a better way to do this?
Actually yes, there is a package which is really popular for manipulating images in PHP and has good integration with Laravel:
http://image.intervention.io/getting_started/installation
With this package you do as below:
$ php composer.phar require intervention/image
and after adding the aliases (needed for Laravel < 5.5) you publish the vendor files:
$ php artisan vendor:publish --provider="Intervention\Image\ImageServiceProviderLaravelRecent"
Then you can replace your image handling with this:
// create instance
$img = Image::make('public/foo.jpg');
// resize image to fixed size
$img->resize(300, 200);
to resize, or do anything else, like preventing upsizing:
// prevent possible upsizing
$img->resize(null, 400, function ($constraint) {
$constraint->aspectRatio();
$constraint->upsize();
});
or anything else you would like to do according to the documentation.
Hope this helps.
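For the original problem (getting the file under 5 MB), a minimal sketch with Intervention Image v2 could look like the following; the $path variable and the starting quality of 90 are assumptions, not part of the code above:
$maxBytes = 5242880; // 5 MB
$quality  = 90;      // assumed starting point
$img = Image::make($path);
// encode() returns the image object; casting it to string yields the binary
// data, so the size can be checked in memory instead of rewriting the file
while ($quality > 10 && strlen((string) $img->encode('jpg', $quality)) >= $maxBytes) {
    $quality -= 10; // stepping by 10 converges much faster than by 1
}
$img->save($path, $quality);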
In case someone has a similar problem, here is the loop that I used in the end and that worked pretty well for me. It may not be the best solution out there, but it's good enough:
$image_width = getimagesize($path)[0];
while (filesize($path) >= 5242880) {
    $image_width -= 50; // shrink the width by 50px each pass
    $img_string = file_get_contents($path);
    $img = imagecreatefromstring($img_string);
    $path = $this->getFilePath($file, $file_section, $entity_id);
    $scaled = imagescale($img, $image_width); // height is derived automatically
    imagejpeg($scaled, $path);
    imagedestroy($img);    // free both GD resources between iterations
    imagedestroy($scaled);
}
I am lowering the width of the image while the height is lowered automatically, respecting the aspect ratio. This results in a pretty fast file size drop while keeping the quality of the image at approximately the same level as before scaling.
Related
I want to get the resolution of an AI file, but it only returns 72, while the resolution is supposed to be 300. If I change the resolution of the image using Imagick, the dimensions come out far smaller. How can I get the actual resolution of the file?
In case I can't use Imagick for that, is there a library or framework that can give me the precise resolution of a file (regardless of type)?
I have my code as:
try {
    $imagick = new Imagick($imagePath);
    $data = $imagick->identifyImage();
} catch (Exception $e) {
    echo json_encode('ERROR: ' . $e->getMessage());
}
if ($data['units'] == 'PixelsPerCentimeter') {
    $imageResolution = $imagick->getImageResolution(); // temp image resolution
    if (!empty($imageResolution['y'])) {
        $dpi = intval(round($imageResolution['y'] * 2.54, 2)); // convert to pixels per inch
    }
} else {
    $dpi = $data['resolution']['x'];
}
$width = round($data['geometry']['width'] / $dpi, 3);
$height = round($data['geometry']['height'] / $dpi, 3);
I am using the following code to save an image from a URL, but sometimes the image URL is bad and there is no image there, or there is an issue with the image and it saves a zero-size file.
<?php
file_put_contents("/var/www/html/images/" . $character . ".jpg",
    file_get_contents($image));
I need to find a way to stop this happening, as saving zero-size files creates a problem.
I have tried this, but it still seems to happen:
$filesize = file_put_contents("/var/www/html/images/" . $character . ".jpg",
    file_get_contents($image));
if (($filesize < 10) || ($filesize == "")) {
    echo "Error";
}
Could anyone recommend a more reliable way to do this?
The Imagick package has methods for doing this:
Imagick::getImageGeometry() - returns the width and height of an image, or throws an exception.
function isValidImage($filename)
{
    if (!file_exists($filename)) return false;
    if (filesize($filename) == 0) return false;
    try {
        $image = new Imagick($filename);
        $img = $image->getImageGeometry();
    } catch (ImagickException $e) {
        return false; // not a readable image
    }
    return ($img['width'] > 0 && $img['height'] > 0);
}
EDIT: I have updated my answer with more checks
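A hypothetical usage in the download code from the question ($tmpPath is an assumed temporary location, not part of the original code):
file_put_contents($tmpPath, file_get_contents($image));
if (isValidImage($tmpPath)) {
    rename($tmpPath, "/var/www/html/images/" . $character . ".jpg");
} else {
    unlink($tmpPath); // discard the zero-size or corrupt download
}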
I have tried getting the image size by URL, using the get_headers() function. Here is an example:
function checkImageSize($imageUrl) {
    if (!empty($imageUrl)) {
        $file_headers = @get_headers($imageUrl, 1); // gives all header values
        // for the image size, we use the Content-Length header
        $sizeInKB = round($file_headers['Content-Length'] / 1024, 2);
        return $sizeInKB;
    } else {
        return 0;
    }
}
$imageSize = checkImageSize($imageUrl);
if ($imageSize <= $conditionalSize) {
    // upload code
} else {
    // error msg
}
I'm making an image gallery website where users can upload any image and the images are displayed on the frontend. I need to compress the images without visibly affecting their quality, to reduce their size so that page load speed doesn't suffer too much. I'm using the following code to upload an image:
$rules = array('file' => 'required');
$destinationPath = 'assets/images/pages';
$validator = Validator::make(array('file' => $file), $rules);
if ($validator->passes()) {
    $filename = time() . $uploadcount . '.' . $file->getClientOriginalExtension();
    $file->move($destinationPath, $filename);
    return $filename;
} else {
    return '';
}
The best and easiest way I found to compress uploaded images is this package:
https://github.com/spatie/laravel-image-optimizer
You need to optimize the image for web usage, as a user may upload an image that is way too large (either in file size or resolution). You may also want to remove the metadata from the images to decrease the size even more. Intervention Image is perfect for resizing/optimizing images for web usage in Laravel. You need to optimize the image before it is saved, so that the optimized version is used when loading the web page.
Intervention Image
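A minimal sketch of that approach, assuming Intervention Image v2 and the $file, $destinationPath and $filename variables from the question; the 1600px width and quality 80 are illustrative values:
$img = Image::make($file->getRealPath());
$img->resize(1600, null, function ($constraint) {
    $constraint->aspectRatio(); // keep proportions
    $constraint->upsize();      // never enlarge small uploads
});
// re-encoding with the GD driver also strips most EXIF metadata
$img->save($destinationPath . '/' . $filename, 80);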
https://tinypng.com provides an API service for compressing images. All you need to do is install their PHP library in Laravel and get a developer key from their website. After that, by adding the code below, you can compress your uploaded image. In the code, I am assuming you have stored your file under the 'storage' directory.
$filepath = public_path('storage/profile_images/'.$filename);
\Tinify\setKey("YOUR_API_KEY");
$source = \Tinify\fromFile($filepath);
$source->toFile($filepath);
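The same API can also scale the image down while compressing. This variant uses the resize option from Tinify's PHP reference; the "fit" method and the 1000x1000 box are illustrative choices:
$source = \Tinify\fromFile($filepath);
$resized = $source->resize(array(
    "method" => "fit",   // fit within the box, keeping aspect ratio
    "width"  => 1000,
    "height" => 1000,
));
$resized->toFile($filepath);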
Here is a link to a blog which explains how to upload and compress images in Laravel: http://artisansweb.net/guide-upload-compress-images-laravel
Using core PHP:
function compress($source_image, $compress_image)
{
    $image_info = getimagesize($source_image);
    if ($image_info['mime'] == 'image/jpeg') {
        $source_image = imagecreatefromjpeg($source_image);
        imagejpeg($source_image, $compress_image, 20); // JPEG quality: 0-100
    } elseif ($image_info['mime'] == 'image/png') {
        $source_image = imagecreatefrompng($source_image);
        imagepng($source_image, $compress_image, 3); // PNG compression level: 0-9
    }
    return $compress_image;
}
public function store(Request $request)
{
    $image_name = $_FILES['image']['name'];
    $tmp_name = $_FILES['image']['tmp_name'];
    $directory_name = public_path('/upload/image/');
    $file_name = $directory_name . $image_name;
    move_uploaded_file($tmp_name, $file_name);
    $compress_file = "compress_" . $image_name;
    $compressed_img = $directory_name . $compress_file;
    $compress_image = $this->compress($file_name, $compressed_img);
    unlink($file_name); // keep only the compressed copy
}
I have a function that uploads files up to 8 MB, but now I also want to compress or at least rescale larger images, so that my output image is no bigger than 100-200 KB and 1000x1000px. How can I implement compression and proportional rescaling in my function?
function uploadFile($file, $file_restrictions = '', $user_id, $sub_folder = '') {
    global $path_app;
    $new_file_name = generateRandomString(20);
    if($sub_folder != '') {
        if(!file_exists('media/'.$user_id.'/'.$sub_folder.'/')) {
            mkdir('media/'.$user_id.'/'.$sub_folder, 0777);
        }
        $sub_folder = $sub_folder.'/';
    }
    else {
        $sub_folder = '';
    }
    $uploadDir = 'media/'.$user_id.'/'.$sub_folder;
    $uploadDirO = 'media/'.$user_id.'/'.$sub_folder;
    $finalDir = $path_app.'/media/'.$user_id.'/'.$sub_folder;
    $fileExt = explode(".", basename($file['name']));
    $uploadExt = $fileExt[count($fileExt) - 1];
    $uploadName = $new_file_name.'_cache.'.$uploadExt;
    $uploadDir = $uploadDir.$uploadName;
    $restriction_ok = true;
    if(!empty($file_restrictions)) {
        if(strpos($file_restrictions, $uploadExt) === false) {
            $restriction_ok = false;
        }
    }
    if($restriction_ok == false) {
        return '';
    }
    else {
        if(move_uploaded_file($file['tmp_name'], $uploadDir)) {
            $image_info = getimagesize($uploadDir);
            $image_width = $image_info[0];
            $image_height = $image_info[1];
            if($file['size'] > 8000000) {
                unlink($uploadDir);
                return '';
            }
            else {
                $finalUploadName = $new_file_name.'.'.$uploadExt;
                rename($uploadDirO.$uploadName, $uploadDirO.$finalUploadName);
                return $finalDir.$finalUploadName;
            }
        }
        else {
            return '';
        }
    }
}
For the rescaling I use a function like this:
// given maximum dimensions, this tries to fill them as well as possible
function dimensions($width, $height, $maxWidth, $maxHeight)
{
    // get new sizes
    if ($width > $maxWidth) {
        $height = round($maxWidth * $height / $width);
        $width = $maxWidth;
    }
    if ($height > $maxHeight) {
        $width = round($maxHeight * $width / $height);
        $height = $maxHeight;
    }
    // return array with new size
    return array('width' => $width, 'height' => $height);
}
The compression is done by a PHP function:
// set limits
$maxWidth = 1000;
$maxHeight = 1000;
// read source
$source = imagecreatefromjpeg($originalImageFile);
// get the target dimensions for the destination
$dims = dimensions(imagesx($source), imagesy($source), $maxWidth, $maxHeight);
// prepare destination
$dest = imagecreatetruecolor($dims['width'], $dims['height']);
// copy in high quality
imagecopyresampled($dest, $source, 0, 0, 0, 0,
    $dims['width'], $dims['height'], imagesx($source), imagesy($source));
// save file
imagejpeg($dest, $destinationImageFile, 85);
// clear both copies from memory
imagedestroy($source);
imagedestroy($dest);
You will have to supply $originalImageFile and $destinationImageFile. This comes from a class I use, so I edited it quite a lot, but the basic functionality is there. I left out any error checking, so you still need to add that. Note that the 85 in imagejpeg() is the quality: lower values give smaller files at the cost of more compression artifacts.
You can use a simple one-line solution through the ImageMagick library; the command looks like this:
$image = "path to image";
$res = "50%"; // resize option, e.g. "25%", "50%" or explicit dimensions
exec("convert ".$image." -resize ".$res." ".$image);
With this you can also rotate, resize and apply many other image customizations.
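If the paths can contain spaces or user-supplied characters, a hedged variant of the same call with shell escaping (still assuming ImageMagick's convert binary is on the PATH):
exec('convert ' . escapeshellarg($image) . ' -resize ' . escapeshellarg($res) . ' ' . escapeshellarg($image));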
Take a look at imagecopyresampled(); the manual also has an example of how to implement it. For compression, take a look at imagejpeg(): the third parameter sets the quality of the image, where 100 means best quality and biggest file. If you skip it, the default quality is 75, which looks good and compresses well.
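A minimal sketch combining the two functions mentioned above; $src and $dst are hypothetical file paths, a JPEG source is assumed, and the 1000px cap matches the limits from the question:
list($w, $h) = getimagesize($src);
$scale = min(1, 1000 / max($w, $h)); // cap the longest side at 1000px
$nw = (int) round($w * $scale);
$nh = (int) round($h * $scale);
$in  = imagecreatefromjpeg($src);
$out = imagecreatetruecolor($nw, $nh);
imagecopyresampled($out, $in, 0, 0, 0, 0, $nw, $nh, $w, $h);
imagejpeg($out, $dst, 75); // 75 is also GD's default quality
imagedestroy($in);
imagedestroy($out);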
I am trying to build a script that retrieves a list of thumbnail images from an external link, much like Facebook does when you share a link and can choose the thumbnail image that is associated with that post.
My script currently works like this:
file_get_contents on the URL
preg_match_all to match any <img src="" in the contents
Works out the full URL to each image and stores it in an array
If there are < 10 images, it loops through and uses getimagesize to find the width and height
If there are > 10 images, it loops through and uses fread and imagecreatefromstring to find the width and height (for speed)
Once all widths and heights are worked out, it loops through and adds to a new array only the images that meet a minimum width and height (so only larger images are kept, since smaller images are less likely to be descriptive of the URL)
Each image has its new dimensions worked out (scaled down proportionally) and they are returned:
<img src="'.$image[0].'" width="'.$image[1].'" height="'.$image[2].'"><br><br>
At the moment this works fine, but there are a number of problems I can potentially have:
SPEED! If the URL has many images on the page, it will take considerably longer to process.
MEMORY! Using getimagesize or fread & imagecreatefromstring stores the whole image in memory; any large images on the page could eat up the server's memory and kill my script (and server).
One solution I have found is to retrieve the image width and height from the header of the image without having to download the whole file, though I have only found code that does this for JPEGs (it would need to support GIF & PNG too).
Can anyone make any suggestions to help me with either problem mentioned above, or perhaps you can suggest another way of doing this I am open to ideas... Thanks!
Edit: Code below:
// Example images array
$images = array('http://blah.com/1.jpg', 'http://blah.com/2.jpg');
// Find the image sizes
$image_sizes = $this->image_sizes($images);
// Find the images that meet the minimum size
for ($i = 0; $i < count($image_sizes); $i++) {
    if ($image_sizes[$i][0] >= $min || $image_sizes[$i][1] >= $min) {
        // Scale down the original image size
        $dimensions = $this->resize_dimensions($scale_width, $scale_height, $image_sizes[$i][0], $image_sizes[$i][1]);
        $img[] = array($images[$i], $dimensions['width'], $dimensions['height']);
    }
}
// Output the images
foreach ($img as $image) echo '<img src="'.$image[0].'" width="'.$image[1].'" height="'.$image[2].'"><br><br>';
/**
 * Retrieves the image sizes
 * Uses the getimagesize() function or the filesystem for speed increases
 */
public function image_sizes($images) {
    $out = array();
    if (count($images) < 10) {
        foreach ($images as $image) {
            list($width, $height) = @getimagesize($image);
            if (is_numeric($width) && is_numeric($height)) {
                $out[] = array($width, $height);
            }
            else {
                $out[] = array(0, 0);
            }
        }
    }
    else {
        foreach ($images as $image) {
            $handle = @fopen($image, "rb");
            $contents = "";
            if ($handle) {
                while (true) {
                    $data = fread($handle, 8192);
                    if (strlen($data) == 0) break;
                    $contents .= $data;
                }
                fclose($handle);
                $im = @imagecreatefromstring($contents);
                if ($im) {
                    $out[] = array(imagesx($im), imagesy($im));
                    imagedestroy($im); // only destroy a valid resource
                }
                else {
                    $out[] = array(0, 0);
                }
            }
            else {
                $out[] = array(0, 0);
            }
        }
    }
    return $out;
}
/**
 * Calculates restricted dimensions with a maximum of $goal_width by $goal_height
 */
public function resize_dimensions($goal_width, $goal_height, $width, $height) {
    $return = array('width' => $width, 'height' => $height);
    // If the ratio > goal ratio and the width > goal width, resize down to goal width
    if ($width/$height > $goal_width/$goal_height && $width > $goal_width) {
        $return['width'] = floor($goal_width);
        $return['height'] = floor($goal_width/$width * $height);
    }
    // Otherwise, if the height > goal, resize down to goal height
    else if ($height > $goal_height) {
        $return['width'] = floor($goal_height/$height * $width);
        $return['height'] = floor($goal_height);
    }
    return $return;
}
getimagesize() reads only the header, but imagecreatefromstring() reads the whole image. An image read by GD, ImageMagick or GraphicsMagick is stored as a bitmap, so it consumes width × height × (3 or 4) bytes, and there's nothing you can do about that.
The best possible solution for your problem is to make a curl multi-request (see http://ru.php.net/manual/en/function.curl-multi-select.php), and then process the received images one by one with GD or any other library. To lower memory consumption a bit, you can store the image files on disk instead of in memory.
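A minimal sketch of that parallel-fetch idea, assuming $images is the URL array from the question and PHP >= 5.4 for getimagesizefromstring(); error handling is omitted:
$mh = curl_multi_init();
$handles = array();
foreach ($images as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}
// run all transfers in parallel
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) curl_multi_select($mh);
} while ($active && $status == CURLM_OK);
// read the dimensions from the downloaded bytes
$sizes = array();
foreach ($handles as $i => $ch) {
    $info = @getimagesizefromstring(curl_multi_getcontent($ch));
    $sizes[$i] = $info ? array($info[0], $info[1]) : array(0, 0);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);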
The only idea that comes to mind for your current approach (which is impressive) is to check the HTML for existing width and height attributes and skip the file read process altogether if they exist.