I've been able to get EXIF data using exif_read_data(). According to the EXIF documentation in the PHP docs, there should be an ImageNumber tag (I understand it's not guaranteed), but I haven't been able to find anything like that in my test image (an unedited JPG from a Nikon D5100). Yet the same image does seem to carry shutter-count information, according to online shutter-count websites.
I'd really appreciate it if you could shed some light on what I'm possibly doing wrong. Or is the shutter count stored in some other place in the image metadata, or accessible by some other method?
EDIT:
Here's the code I tried. I'm trying to get ImageNumber, which apparently isn't available. But online tools show the shutter count for the same image. I'd like to get the same result using PHP (or even another language). Any help is appreciated.
$exif_data = exif_read_data($_FILES['fileToUpload']['tmp_name']);
print_r($exif_data);
As your example file shows, this value is specific to Nikon's MakerNote, and within that, specific to the D5100 model. Running ExifTool in verbose mode shows the structure:
> exiftool -v DSC_8725.JPG
...
JPEG APP1 (65532 bytes):
ExifByteOrder = MM
+ [IFD0 directory with 11 entries]
| 0) Make = NIKON CORPORATION
| 1) Model = NIKON D5100
...
| 9) ExifOffset (SubDirectory) -->
| + [ExifIFD directory with 41 entries]
...
| | 16) MakerNoteNikon (SubDirectory) -->
| | + [MakerNotes directory with 55 entries]
...
| | | 38) ShotInfoD5100 (SubDirectory) -->
| | | + [BinaryData directory, 8902 bytes]
...
| | | | ShutterCount = 41520
JPEG explained, see segment APP1:
Exif explained, see tag 0x927c:
Nikon's MakerNote explained, see tag 0x0091:
ShotInfoD5100 explained, see index 801
MakerNotes are proprietary: how data is stored there is up to each manufacturer. Documentation is rare - mostly hobbyists reverse-engineer that information - which is why only selected software can read it at all, and only for selected models. At this point you may realize that dozens of manufacturers with dozens of models exist, all of whose bytes you would have to interpret differently - a lot of work! As per exif_read_data()'s changelog, PHP (as of 7.2.0) nowhere claims to support Nikon's MakerNote at all.
You have to either parse the MakerNote yourself or find PHP code/a library/software that has already done that for you. As a last resort you could execute non-PHP software (such as ExifTool) to get what you want.
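If executing ExifTool is acceptable, a minimal sketch along these lines could shell out to it and read the value from its JSON output. This assumes the exiftool binary is installed and on PATH; the file name is the one from the example above.

```php
<?php
// Sketch: read Nikon's ShutterCount by shelling out to ExifTool.
// Assumes the `exiftool` binary is installed and on PATH.
function getShutterCount(string $path): ?int
{
    $json = shell_exec('exiftool -json -ShutterCount ' . escapeshellarg($path));
    if ($json === null) {
        return null; // exiftool missing, or it produced no output
    }
    $data = json_decode($json, true);
    return $data[0]['ShutterCount'] ?? null;
}

var_dump(getShutterCount('DSC_8725.JPG'));
?>
```

ExifTool already knows the per-model MakerNote layouts, so letting it do the parsing is usually far less work than decoding the bytes yourself.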
Related
I need to upload images and display their EXIF data.
I'm using exif_read_data() to get the EXIF data.
$data = exif_read_data($imageFile, 'EXIF', true);
Everything is fine, but:
$data['EXIF']['ExposureProgram'] = 3
I know ExposureProgram could be:
aperture priority
shutter priority
auto
manual
I can't find any information about this number or what it means.
Maybe someone can help me here?
Thanks
Why not check the Exif standard 2.32 itself? Page 50:
0 = Not defined
1 = Manual
2 = Normal program
3 = Aperture priority
4 = Shutter priority
5 = Creative program (biased toward depth of field)
6 = Action program (biased toward fast shutter speed)
7 = Portrait mode (for closeup photos with the background out of focus)
8 = Landscape mode (for landscape photos with the background in focus)
Other = reserved
Phil Harvey's ExifTool reference is also worth checking for additional improper values from vendors violating the standard(s):
(the value of 9 is not standard EXIF, but is used by the Canon EOS 7D)
9 = Bulb
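For convenience, that table can be mirrored as a simple PHP lookup. The labels below are shortened from the standard's wording, and value 9 is the non-standard Canon one:

```php
<?php
// ExposureProgram (Exif tag 0x8822) value-to-label lookup per Exif 2.32,
// plus the non-standard value 9 used by the Canon EOS 7D.
const EXPOSURE_PROGRAM = [
    0 => 'Not defined',
    1 => 'Manual',
    2 => 'Normal program',
    3 => 'Aperture priority',
    4 => 'Shutter priority',
    5 => 'Creative program',
    6 => 'Action program',
    7 => 'Portrait mode',
    8 => 'Landscape mode',
    9 => 'Bulb (non-standard, Canon EOS 7D)',
];

$value = 3; // e.g. from $data['EXIF']['ExposureProgram']
echo EXPOSURE_PROGRAM[$value] ?? 'reserved', "\n"; // prints "Aperture priority"
?>
```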
I need to identify only those PHP extensions that are necessary for the app to work. The idea is to remove all the PHP extensions that are not necessary for the app. Do you have any idea how I can do that?
The app is on PHP 8.0.14 - Laravel 8
I am not sure what the question is.
Try this code:
$full = explode("-", "PHP 8.0.14 - Laravel 8"); // first argument is the separator, in your case the "-" sign; second argument is the string to split
$extension = end($full); // gives everything after the "-" sign
$noextension = reset($full); // gives everything before the "-" sign
I think you need one of those.
Take a look at this project: https://github.com/maglnet/ComposerRequireChecker
It does what you are asking, based both on the dependencies you required through composer and on the code you wrote.
You can use it by running composer-require-checker check <path/to/composer.json>
The output will be something like this:
$ composer-require-checker check composer.json
ComposerRequireChecker 4.1.x-dev#fb2a69aa2b7307541233536f179275e99451b339
The following 2 unknown symbols were found:
+----------------------------------------------------------------------------------+--------------------+
| Unknown Symbol | Guessed Dependency |
+----------------------------------------------------------------------------------+--------------------+
| hash | ext-hash |
| Psr\Http\Server\RequestHandlerInterface | |
+----------------------------------------------------------------------------------+--------------------+
By looking at it you can tell that I must add ext-hash and psr/http-server-handler to my composer.json's require section.
Note that although ext-hash has shipped with standard PHP distributions for a while, it is good practice to declare it anyway, in case your software is executed on a non-standard/custom distribution.
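For example, the composer.json require section could declare both symbols found above (the version constraints here are illustrative, not taken from the checker's output):

```json
{
    "require": {
        "ext-hash": "*",
        "psr/http-server-handler": "^1.0"
    }
}
```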
I am using something called DAP (https://github.com/rapid7/dap) which helps deal with large file handling and outputs an ever growing list of data.
For example:
curl -s https://scans.io/data/rapid7/sonar.http/20141209-http.gz | zcat | head -n 10 | dap json + select vhost + lines
This command works correctly and will output 10 lines of IP addresses.
My question is how I can read this data from PHP - in effect, where a data feed is continuous/live (it will end at some point), how can I process each line I'm given?
I've tried piping to it, but the output isn't passed along to my script. I don't want to use exec because the data is constantly growing. I think it could be a stream, but I'm not sure that's the case.
For anyone else that finds themselves in the same situation - here is the answer that works for me (can be run directly from the command line also):
curl -s 'https://scans.io/data/rapid7/sonar.http/20141209-http.gz' | zcat | head -n 1000 | dap json + select vhost + lines | while read line ; do php /your_script/path/file.php $line ; done
Then pull out $argv[1] and the data is all yours.
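An alternative sketch, if you'd rather not spawn a new PHP process per line: pipe the whole feed into one long-running PHP script and read STDIN line by line. The function below is illustrative; the demo uses an in-memory stream so it is self-contained.

```php
<?php
// Sketch: consume a continuous line-oriented feed from a stream handle.
// In practice you would pipe into the script and pass STDIN, e.g.:
//   ... | dap json + select vhost + lines | php reader.php
function processLines($handle, callable $callback): int
{
    $count = 0;
    while (($line = fgets($handle)) !== false) {
        $callback(rtrim($line, "\r\n")); // strip the trailing newline
        $count++;
    }
    return $count;
}

// Self-contained demo with an in-memory stream standing in for STDIN.
$feed = fopen('php://memory', 'r+');
fwrite($feed, "1.2.3.4\n5.6.7.8\n");
rewind($feed);
processLines($feed, function (string $ip) {
    echo "got: $ip\n"; // handle one record here
});
?>
```

Because fgets() blocks until a full line arrives, this also copes with a feed that grows while you read it.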
I was able to install webp support for ImageMagick, but I'm missing some precise commands.
The basics are covered by:
$im = new Imagick();
$im->pingImage($src);
$im->readImage($src);
$im->resizeImage($width,$height,Imagick::FILTER_CATROM , 1,TRUE );
$im->setImageFormat( "webp" );
$im->writeImage($dest);
But I'm missing a lot of the fine-tuning options described in the ImageMagick command-line documentation here:
http://www.imagemagick.org/script/webp.php
Specifically:
How do I set the compression quality? (I tried setImageCompressionQuality and it does not work, i.e. the output is always the same size.)
How do I set the "method" (from 0 to 6)?
Thanks
EDIT: I followed @emcconville's advice below (thanks!) and neither the method nor the compression worked. So I'm starting to suspect my compilation of ImageMagick.
I tried using command line:
convert photo.jpg -resize 1170x1170\> -quality 50 photo.webp
When changing the 50 value for quality, the resulting file was always the same size. So there must be something wrong with my ImageMagick...
How do I set the "method" (from 0 to 6)?
Try this...
$im = new Imagick();
$im->pingImage($src);
$im->readImage($src);
$im->resizeImage($width,$height,Imagick::FILTER_CATROM , 1,TRUE );
$im->setImageFormat( "webp" );
$im->setOption('webp:method', '6');
$im->writeImage($dest);
How do I set compression quality? (I tried setImageCompressionQuality and it does not work, ie the output is always the same size)
Imagick::setImageCompressionQuality seems to work for me, but note that webp:lossless becomes enabled if the value is 100 or greater (see source). You can test toggling lossless to see how that impacts results.
$img->setImageFormat('webp');
$img->setImageCompressionQuality(50);
$img->setOption('webp:lossless', 'true');
Edit from comments
Try testing the image conversion to webp directly with the cwebp utility.
cwebp -q 50 photo.jpg -o photo.webp
This will also write some statistical information to stdout, which can help debug what's happening.
Saving file 'photo.webp'
File: photo.jpg
Dimension: 1170 x 1170
Output: 4562 bytes Y-U-V-All-PSNR 55.70 99.00 99.00 57.47 dB
block count: intra4: 91
intra16: 5385 (-> 98.34%)
skipped block: 5357 (97.83%)
bytes used: header: 86 (1.9%)
mode-partition: 2628 (57.6%)
Residuals bytes |segment 1|segment 2|segment 3|segment 4| total
macroblocks: | 0%| 0%| 0%| 98%| 5476
quantizer: | 45 | 45 | 43 | 33 |
filter level: | 14 | 63 | 8 | 5 |
Also remember that for some subject matter, lowering the compression quality doesn't always guarantee a file-size decrease. But those are extreme edge cases.
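If you suspect the build itself, you can first ask Imagick whether the webp delegate is present at all. A quick check, assuming the Imagick extension is loaded:

```php
<?php
// Quick sanity check: does this ImageMagick build know the webp format?
if (!extension_loaded('imagick')) {
    exit("Imagick extension not loaded.\n");
}

$formats = Imagick::queryFormats('WEBP');
if (empty($formats)) {
    echo "No webp support - rebuild ImageMagick against libwebp.\n";
} else {
    echo 'webp supported: ' . implode(', ', $formats) . "\n";
}
?>
```

If this prints nothing for WEBP, no amount of setOption() calls will help; the delegate is missing from the build.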
I have a gzipped text file that I'm trying to read within PHP (using gzopen/gzgets). The file is somewhat large, around 158,000 lines. The script works fine except when it gets to line 157,237 of the file, it reads in part of the line then acts as if it's reached EOF. I'm able to unzip the file and confirm the rest of the file does exist. I wrote a simple script to test:
<?php
$handle = gzopen('/path/to/file.gz', 'r');
while (true) {
    echo gzgets($handle, 4096);
}
?>
It reads in everything perfectly then suddenly gets to this line and prints:
GUAN XIN 508|R34745|CH|CGO|100|
and nothing else. It just sits there (the non-infinite-loop version, using while(!gzeof($handle)), exits at that point).
If I gunzip the file and go to that line, I see:
GUAN XIN 508|R34745|CH|CGO|100| | | | |BEGS| | | | |133|19| | | | | | | | | | | | |413669000|1|
So the data is there. Is there some sort of size limitation on the zlib functions that I'm not aware of?
UPDATE: I ran it through a 'cat -vet' to look for special characters... nothing.
Updated zlib to 1.2.7. We were running 1.2.3, and "large file" support was apparently added in 1.2.4.