I am developing an application, and for that I need to generate multiple copies of the same image, each with a unique base64 encoding.
However, I am not sure whether adding a random string like "aasasa" in the middle of my image data will corrupt the image or whether it will still display properly.
Is there any way to achieve this?
$imageDataEncoded = base64_encode(file_get_contents($_FILES["fileToUpload"]["tmp_name"]));
$imageData = base64_decode($imageDataEncoded);
$imageData="asasasas".$imageData;
$source = imagecreatefromstring($imageData);
Thanks
Ahmar
It's not entirely obvious why you would want to do this (maybe there is a better way to achieve what you're after). Anyway, I would try changing the image metadata. By changing the metadata you are "not changing" the image itself.
Further reading material:
http://php.net/manual/en/book.exif.php
http://php.net/manual/en/function.iptcembed.php
http://www.sno.phy.queensu.ca/~phil/exiftool/TagNames/EXIF.html
This can be achieved with an EXIF library, but be aware that these changes are considerably more performance-intensive and can slow down your load time significantly, especially if you do this on a lot of pages.
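For example, PHP's built-in IPTC support can stamp a unique keyword into each copy, so every copy's base64 string differs. A rough sketch, assuming the upload is a JPEG; iptc_make_tag() is a small helper adapted from the examples in the PHP manual, and the choice of tag 2#025 (Keywords) is arbitrary:

// Build a minimal IPTC record: 0x1C marker, record number, dataset number, length, value.
function iptc_make_tag($rec, $dat, $value)
{
    $length = strlen($value);
    $retval = chr(0x1C) . chr($rec) . chr($dat);
    if ($length < 0x8000) {
        $retval .= chr($length >> 8) . chr($length & 0xFF);
    } else {
        $retval .= chr(0x80) . chr(0x04) .
                   chr(($length >> 24) & 0xFF) . chr(($length >> 16) & 0xFF) .
                   chr(($length >> 8) & 0xFF)  . chr($length & 0xFF);
    }
    return $retval . $value;
}

$path        = $_FILES["fileToUpload"]["tmp_name"];        // must be a JPEG for iptcembed()
$iptcData    = iptc_make_tag(2, 25, uniqid("copy_", true)); // unique keyword per copy
$jpegWithTag = iptcembed($iptcData, $path);                  // returns the JPEG bytes as a string

$imageDataEncoded = base64_encode($jpegWithTag);             // now unique for every copy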
I have created a service that hides text inside photographs. For example:
$img_name = "myimage.jpeg";
$orig_contents = file_get_contents($img_name);
$msg = "My secret.";
$fp = fopen($img_name, "wb");
fwrite($fp, $orig_contents . $msg);
fclose($fp);
I'm curious: How much information can I hide inside photographs using this method? For example, could I embed chapters of a novel in an image file? I have added fairly large blocks of text without corrupting the image, but I'm wondering if PHP or image viewing applications impose limits on this.
(P.S. I am aware that this type of steganography is insecure; I'm just doing this for fun.)
You should take a look at steganography. And be aware that you are not hiding your data in the image: anyone who opens the image with a text editor would see your text somewhere in the file (in this case at the very end, which is even worse). If I were you, I'd do the following:
Encrypt your data with a decent algorithm and a strong key.
Create a function that distributes your data through the file in a pseudo-random way, so that no one would notice you're trying to hide something in it (keeping in mind that you have to recover it afterwards). In a regular bitmap image, you can use the last bit of each pixel to store your information, since that change would not be perceptible to the human eye if you compared the original image with the one containing hidden data; see the sketch after this list.
Pray the NSA isn't reading this, otherwise you could get into some serious trouble :)
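A minimal sketch of that last-bit idea, assuming GD is available and a truecolor PNG as the carrier; the helper name hide_message() and the 4-byte length prefix are my own conventions, not a standard:

function hide_message($srcPath, $dstPath, $msg)
{
    $img = imagecreatefrompng($srcPath);          // needs a lossless format
    imagepalettetotruecolor($img);                // make sure raw RGB values can be set
    $payload   = pack('N', strlen($msg)) . $msg;  // length prefix so the message can be read back
    $totalBits = strlen($payload) * 8;
    $bitIndex  = 0;
    for ($y = 0; $y < imagesy($img) && $bitIndex < $totalBits; $y++) {
        for ($x = 0; $x < imagesx($img) && $bitIndex < $totalBits; $x++) {
            $rgb  = imagecolorat($img, $x, $y);
            $bit  = (ord($payload[$bitIndex >> 3]) >> (7 - ($bitIndex & 7))) & 1;
            $blue = ($rgb & 0xFE) | $bit;         // overwrite only the lowest blue bit
            imagesetpixel($img, $x, $y, ($rgb & ~0xFF) | $blue);
            $bitIndex++;
        }
    }
    imagepng($img, $dstPath);                     // message silently truncates if the image is too small
    imagedestroy($img);
}

hide_message('original.png', 'hidden.png', 'My secret.');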
No, there's essentially no limit imposed by either PHP or the JPEG format on how much data you can add to an image using this method. This works because a JPEG decoder reads the image data from the start of the file up to the EOI (End of Image) marker; anything after that marker is simply ignored, so the trailing space can hold arbitrary extra data.
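To get a feel for that boundary in your own files, here is a quick check of how many bytes sit after the EOI marker; it assumes the appended payload itself never contains the 0xFF 0xD9 byte pair, which is true for plain ASCII text (the file name is just an example):

$contents = file_get_contents("myimage.jpeg");
$eoiPos   = strrpos($contents, "\xFF\xD9");          // last End-of-Image marker in the file
if ($eoiPos !== false) {
    $trailing = strlen($contents) - ($eoiPos + 2);
    echo "Bytes hidden after EOI: " . $trailing . "\n";
}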
One cool trick (which also works with GIF images) is that you can append a ZIP file to the end of an image, and the resulting file works as both a JPEG and a ZIP file. It will be readable by both image viewers and ZIP programs just by changing the file extension.
I think this is not the most secure way to do it. If you really want to hide a string in an image, you could use a specific pattern instead, for example changing one pixel every 10 pixels: the idea is simply to convert your image to an array of integers, loop through the array, and every 10 pixels set the value to the ASCII code of the next character.
Changing one pixel out of every ten won't add much visible noise.
To make it more secure, use an encoding of your own, i.e. your own map for the ASCII characters, like @fvdalcin proposed.
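A rough sketch of that idea with GD, hiding the bytes in the blue channel of every 10th pixel; the carrier file name is a placeholder, and note this variant is more visible than the single-bit version sketched above:

$img = imagecreatefrompng("carrier.png");
imagepalettetotruecolor($img);
$msg   = "My secret.";
$width = imagesx($img);
for ($i = 0; $i < strlen($msg); $i++) {
    $offset = $i * 10;                               // touch every 10th pixel only
    $x = $offset % $width;
    $y = (int) ($offset / $width);
    $rgb = imagecolorat($img, $x, $y);
    imagesetpixel($img, $x, $y, ($rgb & ~0xFF) | ord($msg[$i]));  // blue channel = ASCII code
}
imagepng($img, "carrier_with_message.png");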
I've been doing some speed optimization on my site using Page Speed and it gives recommendations like so:
Optimizing the following images could reduce their size by 35.3KiB (21% reduction).
Losslessly compressing http://example.com/.../some_image.jpg could save 25.3KiB (55% reduction).
How are the calculations done to get the file size reduction numbers? How can I determine if an image is optimized in PHP?
As I understand it, they seem to base this on the image quality (so saving at 60% in Photoshop or so is considered optimized?).
What I would like to do is once an image is uploaded, check if the image is fully optimized, if not, then optimize it using a PHP image library such as GD or ImageMagick. If I'm right about the number being based on quality, then I will just reduce the quality as needed.
How can I determine if an image is fully optimized in the standards that Page Speed uses?
Chances are they are simply using a standard compression level or working from some very simple rules to calculate image compression/quality. It isn't exactly what you were after, but what I often use on uploaded images and other dynamic content is a class called SimpleImage:
http://www.white-hat-web-design.co.uk/blog/resizing-images-with-php/
This will give you options to resize and adjust compression, and I think even change the image type (by which I mean .jpg to .png or .gif, anything you like). I worked in SEO, and page optimization was a huge part of my job. I generally tried to make images exactly the size they needed to be, no smaller or bigger, and to compress JS & CSS; that's really all most people need to worry about.
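It isn't what Page Speed itself does, but as a rough local heuristic you could re-encode an upload with GD at a fixed quality and keep it only if it shrinks noticeably. The paths, the quality value, and the 20% threshold below are arbitrary choices of mine:

$srcPath = "uploads/photo.jpg";                      // example path
$img     = imagecreatefromjpeg($srcPath);
$tmpPath = tempnam(sys_get_temp_dir(), "opt");
imagejpeg($img, $tmpPath, 85);                       // re-encode at quality 85
imagedestroy($img);

clearstatcache();
if (filesize($tmpPath) < 0.8 * filesize($srcPath)) { // arbitrary 20% saving threshold
    rename($tmpPath, $srcPath);                      // keep the smaller copy
} else {
    unlink($tmpPath);                                // original was already small enough
}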
It looks like you could use the PageSpeed Insights API to get the compression scores: https://developers.google.com/speed/docs/insights/v1/reference, though I'm guessing you'd want to run a quality check/compression locally, rather than sending everything through this API.
It also looks like they've got a standalone image optimizer available at http://code.google.com/p/page-speed/wiki/DownloadPageSpeed?tm=2, though this appears to be a Windows binary. They've got an SDK there as well, though I haven't looked into what it entails, or how easy it would be to integrate into an existing site.
I find myself manually encoding background images in the css in base64 often.
By "manually", I mean that I encode the image, copy the resulting string, paste it into the CSS file, and so on. This is stupid!
I came to the conclusion that writing a script in PHP or Python to do this automatically would not be difficult: it's just a matter of parsing the CSS, finding the image on disk, encoding it in base64, replacing the original string in the CSS file with the result, and saving a new file.
Then I thought: "how come nobody has already done this? Maybe it would be better to ask before doing it."
So here I am, does a similar solution exist?
Thanks
Well, Chris Coyier at CSS-Tricks published an article about Data URIs, where he explains how to use them and why they are useful. Near the end, he states that it's very easy to generate them on the fly with PHP, like so:
if you are using PHP (or PHP as CSS), you could create data URIs on the fly like this
<?php echo base64_encode(file_get_contents("../images/folder16.gif")) ?>
However, take note that you shouldn't use base64_encode on every image on a website: the string generated by base64_encode is about 33% larger than the original image. Data URIs are great when you have small images and you don't want to waste requests on them.
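For the batch case described in the question, here is a sketch of a small rewriter; the file names are placeholders, and image paths are assumed to be relative to the CSS file:

$cssFile = "style.css";                               // example input
$css     = file_get_contents($cssFile);

$css = preg_replace_callback(
    '/url\(\s*[\'"]?([^\'")]+\.(png|gif|jpe?g))[\'"]?\s*\)/i',
    function ($m) use ($cssFile) {
        $imgPath = dirname($cssFile) . '/' . $m[1];
        if (!is_file($imgPath)) {
            return $m[0];                             // leave anything we can't find untouched
        }
        $ext  = strtolower(pathinfo($imgPath, PATHINFO_EXTENSION));
        $mime = ($ext === 'jpg') ? 'image/jpeg' : 'image/' . $ext;
        $data = base64_encode(file_get_contents($imgPath));
        return "url(data:{$mime};base64,{$data})";
    },
    $css
);

file_put_contents("style.datauri.css", $css);         // write a new file, keep the original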
I have a lot of JPEG images which I want to optimize for the web, but I need a process that can also be applied to incoming images in real time. In other words, I don't want to use a service like Smush.it or drop them into Photoshop for manipulation, but I do want to know what I can do in PHP. I would prefer a solution that only requires PHP's image processing functions, but if necessary, and if it would provide a significant improvement, then a command-line tool like jpegcrush could be used as well.
I have read that simply by re-creating the image in PHP, the EXIF data is stripped. What other things can I do without degrading the actual quality? When I save in Photoshop using the 'Save for Web' feature, the savings are significant without a noticeable quality loss, so I was wondering if anyone knew exactly what operations are done in there. One other thing I have noticed is that images from YouTube are normally much larger, area-wise, than they need to be, yet they have very small file sizes; does anyone know what is going on there, or is this some secret technique?
If it makes any difference, the images I am working with are mostly 320x320 and I want to make them progressive JPEGs. Thanks in advance.
I'd use the PHP GD library to create a jpg with quality set to around 80:
imagejpeg ( resource $image [, string $filename [, int $quality ]] )
If you want to output Progressive JPEGs, you need to set interlacing on with imageinterlace().
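Put together, a minimal sketch (the file names and the quality value are placeholders):

$img = imagecreatefromjpeg("input.jpg");   // example input file
imageinterlace($img, true);                // write a progressive JPEG
imagejpeg($img, "output.jpg", 80);         // re-encode at roughly quality 80
imagedestroy($img);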
Are you looking for something more than this?
Assuming "optimizing for the web" is converting to GIF, PEAR does it if i'm not mistaken. You don't need fancy functions to do this.
check this question:
Convert jpg image to gif, png & bmp format using PHP
I have a script that gets the raw binary image data via url request. It then takes the data and puts it into mysql.
Pretty simple, right? Well, I'm inserting some 8,000 decent-sized 600x400 JPEGs, and for some odd reason some of the images are getting cut off. Maybe the part of my script that iterates through each image it needs to get is going too fast?
When I do a straight request to the URL I can see all the raw image data, but on my end, the data is cut off some way down the line.
Any ideas why?
Is something in the chain treating the binary data as a string, in particular a C style null-terminated string? That could cause it to get cut off at the first null byte ('\0').
Have you tried simply calling your script that pulls the binary image and dumping it out? If you see the image correctly, then the pulling part isn't the problem; it might be something to do with inserting.
Are you setting the headers correctly?
ie:
header('Content-Length: '.strlen($imagedata));
header('Content-Type: image/png');
...
A string datatype would definitely not be the optimum for storing images in a DB.
In fact I've seen several recommendations that the image should go in a folder somewhere in your filesystem and the DB contains only the address/file path.
This is a link to a page about inserting images.
It contains the suggestion about the filepath and that a blob datatype is better if the images must go in the database.
If it's a blob, then treating it as a string won't work.
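One common cause of truncated blobs is building the INSERT by concatenating the raw bytes into the SQL string, where unescaped characters can cut the value short. Here is a sketch using a prepared statement instead; the DSN, credentials, and table layout are made up for illustration. Also check that MySQL's max_allowed_packet is larger than your biggest image.

$pdo = new PDO("mysql:host=localhost;dbname=test", "user", "pass");
$imageData = file_get_contents("http://example.com/image.jpg");   // raw binary data

$stmt = $pdo->prepare("INSERT INTO images (data) VALUES (?)");
$stmt->bindParam(1, $imageData, PDO::PARAM_LOB);                   // bind as a LOB, not a plain string
$stmt->execute();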
If you make repeated requests to the same url, does the image eventually load?
If so, that points to a networking issue. Large packet support (TCP window scaling) may be enabled in your kernel (assuming Linux), and it doesn't work correctly with a lot of Windows clients. I've seen a similar issue with large (1+ MB) JavaScript libraries served from a Linux machine.
http://en.wikipedia.org/wiki/TCP_window_scale_option
http://support.microsoft.com/kb/314053