determine image file size php - php

I have this code for checking the maximum image size allowed. The one below is for 4 MB:
elseif (round($_FILES['image_upload_file']["size"] / 1024) > 4096) {
    $output['error'] = "You can upload file size up to 4 MB";
I don't understand this calculation, and the approaches I found on the internet are making it more confusing.
I want the same check for 8 MB.

PHP's $_FILES["image_upload_file"]["size"] variable returns the file size in BYTES. So, to check the file size you have two options:
Convert your limit into bytes and compare it with the $_FILES["image_upload_file"]["size"] value. For example, 5 MB ≈ 5,000,000 bytes, 6 MB ≈ 6,000,000 bytes, 8 MB ≈ 8,000,000 bytes, and so on. (These values are simplified; strictly speaking, 1 MB = 1,048,576 bytes.)
Convert the $_FILES["image_upload_file"]["size"] value into MB and compare that.
I prefer checking the value in bytes: it is easier, and you don't need to calculate anything.
In your example, the value is converted to KB first and then checked: $_FILES['image_upload_file']["size"] / 1024 returns the size in KB, and 4 MB = 4096 KB. So the code you found on the internet is also correct.
If you want to use that code for 8 MB, change the 4096 to 8192. It will work the same way.
Hope this makes the code clear.
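Putting that together, a byte-based check for an 8 MB limit might look like the sketch below (it reuses the question's hypothetical image_upload_file field and $output array):

```php
<?php
// 8 MB expressed in bytes: 8 * 1024 KB * 1024 bytes/KB = 8,388,608 bytes.
$max_bytes = 8 * 1024 * 1024;

// $_FILES[...]['size'] is already in bytes, so no conversion is needed.
if ($_FILES['image_upload_file']['size'] > $max_bytes) {
    $output['error'] = "You can upload file size up to 8 MB";
}
```

Comparing bytes to bytes avoids the division and rounding in the original snippet entirely.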

Related

Uploading base64_encoded image increasing file size from original size

I am using the code below to upload a base64-encoded image to S3.
$config = ['Expires' => '30 day','CacheControl' => 'max-age=31536000' ];
$path = 'website/'. $image;
Storage::getDriver()->put($path, base64_decode($b64str), $config);
After uploading, I noticed that the file in S3 has a larger file size than the original file.
Is there any way I can upload it at the same size?
I have attached screenshots of the actual file size and the S3 size.
Image 1: file size before the upload
Image 2: file size after the upload
Unfortunately, Base64 encoding increases the file size.
Each Base64 digit represents exactly 6 bits of data. So, three 8-bits bytes of the input string/binary file (3×8 bits = 24 bits) can be represented by four 6-bit Base64 digits (4×6 = 24 bits).
This means that the Base64 version of a string or file will be at most
133% the size of its source (a ~33% increase). The increase may be
larger if the encoded data is small. For example, the string "a" with
length === 1 gets encoded to "YQ==" with length === 4 — a 300%
increase.
Source: https://developer.mozilla.org/en-US/docs/Glossary/Base64
However, the size difference here seems too big. Could you show how you encode the images?
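As a quick sanity check of the ~33% figure, you can compare the length of some data before and after base64_encode. This is just a sketch; note that the asker's code calls base64_decode before uploading, so the stored object should be back to the original size if the input really was plain Base64 with no extra prefix.

```php
<?php
// Three input bytes become four Base64 characters, so the encoded
// string is ceil(n / 3) * 4 characters long, i.e. roughly 4/3 of n.
$raw = str_repeat("\x00", 300000); // ~300 KB of binary data
$encoded = base64_encode($raw);

echo strlen($raw), "\n";     // 300000
echo strlen($encoded), "\n"; // 400000 (a 33% increase)
echo base64_encode('a'), "\n"; // YQ== (the MDN example above)
```

If the stored file is much more than 33% larger, something other than Base64 (e.g. a data-URI prefix left in the string, or double encoding) is likely involved.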

When the user selects an image greater than 2 MB, PHP gets the size as zero

When the user selects an image of less than 2 MB, it shows the size and works correctly. When the size is greater than 2 MB, it does not get the size and shows an error message.
if ($_FILES['image']) {
    $file_size = $_FILES['image']['size'];
    if ($file_size > 2222222) {
        echo "Please select a file less than 2 MB";
    }
}
Please specify in more detail what you need.
If your question is why the limit is 2 MB, check php.ini:
upload_max_filesize: the maximum allowed size for uploaded files.
post_max_size: must be greater than or equal to upload_max_filesize.
If you want to limit the image size in PHP, see How to limit file upload type file size in PHP?
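A small sketch of how you might surface those limits from PHP instead of getting a silent zero size (it assumes the same image field name as the question):

```php
<?php
// The effective upload limits come from php.ini.
echo ini_get('upload_max_filesize'), "\n"; // e.g. "2M"
echo ini_get('post_max_size'), "\n";       // e.g. "8M"

// When a file exceeds upload_max_filesize, PHP does not abort:
// it sets 'size' to 0 and reports the reason in the 'error' slot.
if (isset($_FILES['image']) && $_FILES['image']['error'] === UPLOAD_ERR_INI_SIZE) {
    echo "The uploaded file exceeds upload_max_filesize in php.ini";
}
```

Checking $_FILES['image']['error'] against the UPLOAD_ERR_* constants is more reliable than testing the size, because an over-limit upload and an empty upload both report size 0.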

How can I check if a jpeg will fit in memory?

Opening a JPEG image using imagecreatefromjpeg can easily lead to fatal errors, because the memory needed exceeds the memory_limit.
A .jpg file that is less than 100 KB in size can easily exceed 2000x2000 pixels, which will take about 20-25 MB of memory when opened. "The same" 2000x2000px image may take up 5 MB on the disk at a different compression level.
So I obviously cannot use the filesize to determine if it can be opened safely.
How can I determine if a file will fit in memory before opening it, so I can avoid fatal errors?
According to several sources the memory needed is up to 5 bytes per pixel depending on a few different factors such as bit-depth. My own tests confirm this to be roughly true.
On top of that there is some overhead that needs to be accounted for.
But by examining the image dimensions - which can easily be done without loading the image - we can roughly estimate the memory needed and compare it with (an estimate of) the memory available like this:
$filename = 'black.jpg';
//Get image dimensions
$info = getimagesize($filename);
//Each pixel needs 5 bytes, and there will obviously be some overhead - In a
//real implementation I'd probably reserve at least 10B/px just in case.
$mem_needed = $info[0] * $info[1] * 6;
//Find out (roughly!) how much is available
// - this can easily be refined, but that's not really the point here
$mem_total = intval(str_replace(array('G', 'M', 'K'), array('000000000', '000000', '000'), ini_get('memory_limit')));
//Find current usage - AFAIK this is _not_ directly related to
//the memory_limit... but it's the best we have!
$mem_available = $mem_total - memory_get_usage();
if ($mem_needed > $mem_available) {
    die('That image is too large!');
}
//Do your thing
$img = imagecreatefromjpeg('black.jpg');
This is only tested superficially, so I'd suggest further testing with a lot of different images and using these functions to check that the calculations are fairly correct in your specific environment:
//Set some low limit to make sure you will run out
ini_set('memory_limit', '10M');
//Use this to check the peak memory at different points during execution
$mem_1 = memory_get_peak_usage(true);
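The quick str_replace parsing above treats the shorthand as decimal; php.ini shorthand actually uses binary multipliers (K = 1024, M = 1024^2, G = 1024^3). A slightly more faithful sketch of that one step:

```php
<?php
// Convert php.ini shorthand like "128M" or "1G" to bytes.
function ini_bytes(string $value): int {
    $value = trim($value);
    $unit = strtoupper(substr($value, -1));
    $number = (int)$value;
    switch ($unit) {
        case 'G': return $number * 1024 ** 3;
        case 'M': return $number * 1024 ** 2;
        case 'K': return $number * 1024;
        default:  return $number; // plain bytes, or "-1" for unlimited
    }
}

echo ini_bytes('10M'), "\n"; // 10485760
```

Note that ini_get('memory_limit') can also return "-1" (no limit), which the default branch passes through unchanged so the caller can special-case it.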

PHP - imagecreatefromjpeg uses 100M memory for <1M image

The image is less than 1 MB, but its dimensions are roughly 5500x3600. I am trying to resize the image down to something less than 500x500. My code is pretty simple.
$image = imagecreatefromjpeg("upload/1.jpg");
$width = imagesx($image);
$height = imagesy($image);
$pWidth = 500;
$pHeight = 334;
$image_p = imagecreatetruecolor($pWidth, $pHeight);
setTransparency($image,$image_p,$ext);
imagecopyresampled($image_p, $image, 0, 0, 0, 0, $pWidth, $pHeight, $width, $height);
I found out that to process this image, imagecreatefromjpeg uses 100 MB, measured with memory_get_usage.
Is there a better way to do imagecreatefromjpeg? Is there a workaround or a different function that uses less memory?
I did ask my server admins to increase the memory limit, but I doubt they will increase it to 100 MB or more. I am considering limiting the dimensions of the images users can upload, but not before exhausting all my options, as users will most likely upload photos they took themselves.
By the way, the following is the image I used, which uses 100 MB of memory.
What #dev-null-dweller says is the correct answer:
With 5500x3600 you will need at least 5500*3600*4 bytes in memory =~ 80MB. For large pictures Imagick extensions might have better performance than GD
There is no way to "improve" on that because that's the amount of memory needed to process the image. JPEG is a compressed format so its file size is irrelevant, it's the actual dimensions that count. The only way to deal with such an image inside GD is increasing the memory limit.
You may be able to do better using a library/command line client like ImageMagick if you have access to that - when you run ImageMagick from the command line, its memory usage won't count towards the memory limit. Whether you can do this, you'd need to find out from your web host or server admin.
Another idea that comes to mind is using an image resizing API that you send the image to. That would take the load completely off your server. This question has some pointers: https://stackoverflow.com/questions/5277571/is-there-a-cdn-which-provides-on-demand-image-resizing-cropping-sharpening-et
ImageMagick, on the command-line at least, is able to use a feature of libjpeg called "Shrink on Load" to avoid unnecessarily loading the whole image if you are only planning on scaling it down.
If I resize your image above up from its current 3412x2275 to the size you are actually dealing with, 5500x3600, like this
convert guys.jpg -resize 5500x3600 guys.jpg
I can now do some tests... first, a simple resize
/usr/bin/time -l convert guys.jpg -resize 500x334 small.jpg
0.85 real 0.79 user 0.05 sys
178245632 maximum resident set size <--- 178 MB
0 average shared memory size
0 average unshared data size
0 average unshared stack size
44048 page reclaims
0 page faults
0 swaps
you can see it uses a peak of 178 MB on my Mac.
If I now use the "Shrink on Load" feature I mentioned:
/usr/bin/time -l convert -define jpeg:size=500x334 guys.jpg -resize 500x334 small.jpg
0.06 real 0.04 user 0.00 sys
8450048 maximum resident set size <--- Only 8 MB
0 average shared memory size
0 average unshared data size
0 average unshared stack size
2381 page reclaims
33 page faults
0 swaps
you can see it only takes 8MB now. It's faster too!
I've been using vipsthumbnail for years. I use it to resize 700-megapixel images (an enormous 16,384 x 42,731, about 2.6 GB if loaded into PHP) to a viewable size. This takes about 10 seconds to generate a small preview that's 3000px tall.
https://www.libvips.org/API/current/Using-vipsthumbnail.html
By specifying one dimension, it keeps the original aspect ratio and limits the x or y, which ever is greater
$exec_command = sprintf('vipsthumbnail --size 3000 "%s" -o "%s"', $source, $destination);
exec( $exec_command, $output, $return_var );
if($return_var != 0) {
// Handle Error Here
}
I was still using PHP to resize smaller images, but my system recently generated a secondary image that was 15,011 x 15,011 (1.075 GB uncompressed). My PHP settings allow for 1 GB of RAM, and it was crashing! I had increased PHP's memory limit over time to deal with these images. I finally converted this function to use vipsthumbnail as well. These smaller images now take only about 100 ms each to generate. I should have done this a long time ago.
$exec_command = sprintf('vipsthumbnail --size 150 "%s" -o "%s"', $src, $dst);
exec( $exec_command, $output, $return_var );

imagecreatetruecolor fails with 1536x1962 but not with smaller dimensions

I have narrowed my problem down enough to show that the size of the file matters. I don't know when the file is too big, however. How can I catch the error? A 1.1 MB file causes imagecreatetruecolor to bail, but it chugs along just fine when processing a 688 KB file.
Thanks
From what I can tell from the PHP/GD documentation, this function creates a 24-bit RGB image with black as the default color. The width and height it takes as arguments are ints for the pixel dimensions. Therefore, to calculate the size, you can multiply them as follows to determine whether the raw image (before compression) is to blame:
1536 * 1962 = 3,013,632 pixels
3,013,632 * 24 = 72,327,168 bits
72,327,168 / 8 = 9,040,896 bytes
1024 * 768 = 786,432 pixels
786,432 * 24 = 18,874,368 bits
18,874,368 / 8 = 2,359,296 bytes
It seems unusual to me that this function would cause problems at a size of 1.1 MB, but perhaps you are referring to a compressed image such as a JPG, whose actual raw size could be much, much greater. (As you can see, a "small" image of 1024x768 is still well over 1.1 MB raw.)
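The arithmetic above is easy to reproduce in PHP (24 bits = 3 bytes per pixel):

```php
<?php
// Raw, uncompressed size of a 24-bit RGB image in bytes.
function raw_size_bytes(int $width, int $height): int {
    return intdiv($width * $height * 24, 8); // 3 bytes per pixel
}

echo raw_size_bytes(1536, 1962), "\n"; // 9040896
echo raw_size_bytes(1024, 768), "\n";  // 2359296
```

This matches the hand calculation: the 1536x1962 image needs about 9 MB of raw pixel data before GD's per-pixel overhead is even counted.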
The function will return false (with a warning) if you try to create an image that is too big. Just suppress the warning with @ and handle the failure yourself. For example,
$img = @imagecreatetruecolor(100000, 100000);
if ($img === false)
{
// Handle error
}
