I am using Intervention/imagecache to cache my images.
However, the cached image loads slower than the source image file: almost 60-70ms of extra latency (tested in Chrome's Network inspector).
This is the code for how I load the image in Route.php:
Route::get('images/cars/{src}', function ($src) {
    $cacheimage = Image::cache(function ($image) use ($src) {
        return $image->make("images/products/" . $src);
    }, 1440);
    return Response::make($cacheimage, 200, array('Content-Type' => 'image/jpeg'));
});
In the Blade template:
<img src="{{ URL::asset('/images/cars/theimage.jpg') }}" alt="">
Any thoughts on a better way to cache images?
I have never used Laravel, but this is a general issue.
If you let the webserver handle the delivery of the image to the client, the PHP interpreter will not be started.
If you deliver something via PHP (which I assume, because you write about a cached image), you need the PHP interpreter. Then you need to execute the script and all its logic, which is always slower in a scripted language than in native code.
Your best bet is to save the image on the file system and link to it, instead of to a PHP script.
This means, for example:
Somewhere in your application you have a point where the original image is created. Now think about what versions of it you need. Resize, crop, and edit it as much as you want, then save each version you need in your file system. So instead of image.jpg you have an image-200x200-cropped-with-branding.jpg. At this point performance shouldn't matter much (the image will be viewed thousands of times, but created only once).
You want to have
<img src="/path/to/image-200x200-cropped-with-branding.jpg">
instead of
<img src="/image.php?param1=1&param2=2">
Just some additional thoughts, based upon the answer of Christian Gollhardt.
He is absolutely right, this is a general issue. But I did not like his approach of creating all the needed versions when the original image is created (or uploaded), because there is one big problem: what if, at some point in the future, you decide that your thumbnails should be 250x250 instead of 200x200 (or any other dimension)? So basically what I want is the flexibility offered by the ImageCache package without the performance drop-off.
I haven't actually implemented this, but my approach would be to use some kind of in-between helper function to include all your images within your views. In essence, the helper function would simulate the functionality of the image cache, but instead of processing all that logic during the actual image request, it would be processed during the page request. So by the time the actual image is requested by the user's browser, every image version has already been created on the server, and the link points to the actual image on the filesystem. Some pseudo code explains it better ...
e.g. within a show_profile.blade view
<h1>{{ $profile->name }}</h1>
<img src="{{ image_helper($profile->image->filename, 'small') }}">
helpers.php
function image_helper($filename, $version) {
    $path = "my/images/" . $version . '_' . $filename;
    if (!file_exists($path)) {
        // some other helper function ...
        create_image_version($filename, $version);
    }
    return $path;
}
function create_image_version($filename, $version) {
    // If you want to go that route, you would need some kind of mapping
    // that maps the $version (string) to a code block that actually knows
    // what to do when that version is requested.
    // E.g. if the version 'small' is requested,
    // create an image with a dimension of 100x100.
}
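That mapping could be sketched like this with Intervention Image (the version names, dimensions, and paths are made up for illustration):

function create_image_version($filename, $version) {
    // Map each version string to a callback that knows how to build it.
    $versions = [
        'small' => function ($image) { return $image->fit(100, 100); },
        'large' => function ($image) { return $image->fit(500, 500); },
    ];
    $image = \Intervention\Image\ImageManagerStatic::make('my/images/' . $filename);
    $image = $versions[$version]($image);
    $image->save('my/images/' . $version . '_' . $filename);
}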
I have a website with image upload/show functionality. All images are saved to the filesystem under a specific path.
I use the Yii2 framework in the project. There is no direct way to access the images; all of them are requested through a specific URL. ImageController processes the URL and decides whether the image needs resizing. ImageModel does the actual work, and the user gets the image content.
Here is the code snippet:
$file = ... // full path to image
...
$ext = pathinfo($file)['extension'];
if (file_exists($file)) {
    // return original
    return Imagine::getImagine()
        ->open($file)
        ->show($ext, []);
}
preg_match("/(.*)_(\d+)x(\d+)\.{$ext}/", $file, $matches);
if (is_array($matches) && count($matches)) {
    if (!file_exists("{$matches[1]}.{$ext}")) {
        throw new NotFoundHttpException("Image doesn't exist!");
    }
    $options = array(
        'resolution-units' => ImageInterface::RESOLUTION_PIXELSPERINCH,
        'resolution-x' => $matches[2],
        'resolution-y' => $matches[3],
        'jpeg_quality' => 100,
    );
    return Imagine::resize("{$matches[1]}.{$ext}", $matches[2], $matches[3])
        ->show($ext, $options);
} else {
    throw new NotFoundHttpException('Wrong URL params!');
}
We don't discuss data caching in this topic.
So, I wonder about the efficiency of this approach. Is it OK to return all images through PHP, even when they haven't changed at all? Will it increase the server load?
Or maybe I should save the images to another public directory and redirect the browser to it? How long would so many redirects take on a single page (there can be plenty of images)? And what about SEO?
I need advice. What is the best practice for solving such tasks?
You should consider using sendFile() or xSendFile() for sending files - they should be much faster than loading the image with Imagine and displaying it via show(). But for that you need to have the final image saved on disk, so we're back to:
We don't discuss data caching in this topic.
Well, this is actually the first thing you should care about. Sending an image through PHP will be significantly less efficient than letting the webserver do it (but still pretty fast, although this may depend on your server configuration). Involving the framework will be much slower still (bootstrapping the framework takes time). But this is all irrelevant if you resize the image on every request - that will be the main bottleneck here.
As long as you don't have some requirement that makes it impossible (like needing to check whether the user has the right to see the image before displaying it), I would recommend saving images to a public directory and linking to them directly (without any redirection). It will save you much pain handling the things a webserver already does for static files (cache headers, 304 responses, etc.) and it will be the most efficient solution.
If this is not possible, create a simple PHP file which will only send the file to the user, without bootstrapping the whole framework.
If you really need the whole framework, use sendFile() or xSendFile() for sending the file.
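A minimal sketch of that last option in Yii2 (the action name and cache path are hypothetical):

public function actionImage($name)
{
    // Hypothetical location of the pre-generated image.
    $path = Yii::getAlias('@webroot/cache/' . basename($name));
    if (!is_file($path)) {
        throw new \yii\web\NotFoundHttpException("Image doesn't exist!");
    }
    // Let Yii stream the file; 'inline' makes the browser render it
    // instead of offering it as a download.
    return Yii::$app->response->sendFile($path, $name, [
        'mimeType' => \yii\helpers\FileHelper::getMimeTypeByExtension($name),
        'inline' => true,
    ]);
}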
The most important things are:
Do not use Imagine for anything other than generating the image thumbnail (which should be generated only once and then cached).
Do not link to a PHP page which only redirects to the real image served by the webserver. It will not reduce server load compared to serving the image by PHP (you have already paid the price of handling the request in PHP), and your website will work slower for clients (which may affect SEO) because of the additional request needed to get the actual image.
If you need to serve images by PHP, make sure you set cache headers and that it plays well with the browser cache - you don't want the same images downloaded on every page refresh. A bare-bones sketch follows below.
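Honoring If-Modified-Since without any framework might look like this (the path is illustrative):

<?php
$path = '/var/www/cache/thumb_image.jpg'; // illustrative path
$mtime = filemtime($path);

header('Cache-Control: public, max-age=86400');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');

// Answer with 304 if the browser's cached copy is still current.
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    && strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    http_response_code(304);
    exit;
}

header('Content-Type: image/jpeg');
readfile($path);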
I'm building a web-based system which will host loads and loads of high-res images, and they will be available for sale. Of course, I will never display the high-res image; instead, when browsing, people will only see a low-resolution, watermarked image. Currently the workflow is as follows:
A PHP script handles the high-res image upload; when an image is uploaded, it's automatically resized to a low-res image and to a thumbnail image as well, and both files are saved on the server (no watermark is added).
When people are browsing, the page displays the thumbnail of the image; on click, it enlarges and displays the low-res image with the watermark. At the moment I apply the watermark on the fly whenever the low-res image is opened.
My question is, what is the correct way:
1) Should I save a second copy of the low-res image, created only when it is accessed for the first time? I mean, if somebody accesses the image, I add the watermark on the fly, then display the image and store it on the server. The next time the same image is accessed, if a watermarked copy exists, I just display that copy; otherwise I apply the watermark on the fly. (In case watermark.png changes, just delete the watermarked images and they will be recreated as they are accessed.)
2) Should I keep applying watermarks on the fly like I'm doing now?
My biggest question is how big the difference is between a PHP file_exists() and adding a watermark to an image, something like:
$image = new Imagick();
$image->readImage($workfolder . $event . DIRECTORY_SEPARATOR . $cat . DIRECTORY_SEPARATOR . $mit);
$watermark = new Imagick();
$watermark->readImage($workfolder . $event . DIRECTORY_SEPARATOR . "hires" . DIRECTORY_SEPARATOR . "WATERMARK.PNG");
$image->compositeImage($watermark, Imagick::COMPOSITE_OVER, 0, 0);
All low-res images are 1024x1024 JPGs with a quality setting of 45% and all unnecessary filters removed, so the file size of a low-res image is about 40-80 KB.
It is somewhat related to this question, just the scale and the scenarios are a bit different.
I'm on a dedicated server (Xeon E3-1245v2 CPU, 32 GB RAM, 2 TB storage); the site does not have big traffic overall, but it has HUGE spikes from time to time. When images are released we get a few thousand hits per hour with people browsing through the images, downloading, purchasing, etc. So while during normal usage I'm sure that generating on the fly is the right approach, I'm a bit worried about the spike periods.
I need to mention that I'm using the ImageMagick library for image processing, not GD.
Thanks for your input.
UPDATE
None of the answers were a full, complete solution, but that is fine since I never looked for one. It was a hard decision which one to accept and to whom to award the bounty.
@Ambroise-Maupate's solution is good, but it still relies on PHP to do the job.
@Hugo Delsing proposes using the web server for serving cached files, lowering the number of calls to the PHP script, which means fewer resources used; on the other hand, it's not really storage friendly.
I will use a mixed solution of the two answers, relying on a cron job to remove the garbage.
Thanks for the directions.
Personally I would create a static/cookieless subdomain, in a CDN kind of way, to handle these kinds of images. The main reasons are:
Images are only created once
Only accessed images are created
Once created, an image is served from cache and is a lot faster.
The first step would be to create a website on a subdomain that points to an empty folder. Use the settings of IIS/Apache or whatever you run to disable sessions for this new website. Also set some long caching headers on the site, because the content shouldn't change.
The second step would be to create an .htaccess file containing the following.
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*) /create.php?path=$1 [L]
This will make sure that if somebody accesses an existing image, the webserver will serve it directly without PHP interfering. Every request for a non-existing image will be handled by the create.php script, which is the next thing you should add.
<?php
function NotFound()
{
    if (!headers_sent()) {
        $protocol = (isset($_SERVER['SERVER_PROTOCOL']) ? $_SERVER['SERVER_PROTOCOL'] : 'HTTP/1.0');
        header($protocol . ' 404 Not Found');
        echo '<h1>Not Found</h1>';
        exit;
    }
}

$p = $_GET['path'];
//has path
if (strlen($p) <= 1)
    NotFound();

$clean = explode('?', $p);
$clean = explode('#', $clean[0]);
$params = explode('/', substr($clean[0], 1)); //drop first /

//I use a check for two, because I don't allow images in the root folder
//I also use the path to determine how it should look
//E.g.: thumb/125/90/imagecode.jpg
if (count($params) < 2)
    NotFound();

$type = $params[0];
//I use the type to handle different methods. For this example I only used the full-sized image
//You could use the same to handle thumbnails or cropped/watermarked versions
switch ($type) {
    //case "crop": if (Crop($params)) return; else break;
    //case "thumb": if (Thumb($params)) return; else break;
    case "image": if (Image($params)) return; else break;
}
NotFound();
?>
<?php
/*
Just some example to show how you could create a response.
Since you already know how to create thumbs, I'm not going into details.
$params would contain, for example:
Array
(
    [0] => image
    [1] => imagecode.JPG
)
*/
function Image($params) {
    $tmp = explode('.', $params[1]);
    if (count($tmp) != 2)
        return false;

    $code = $tmp[0];

    //WARNING!! SQL INJECTION
    //USE PROPER DB METHODS TO GET REALPATH, THIS IS JUST AN EXAMPLE
    $query = "SELECT realpath FROM images WHERE Code='" . $code . "'";
    //exec query here to $row

    $realpath = $row['realpath'];
    $f = file_get_contents($realpath);
    if (strlen($f) <= 0)
        return false;

    //create the folder structure
    @mkdir($params[0]);
    //if you had more folders, continue creating the structure
    //@mkdir($params[0].'/'.$params[1]);

    //store the image, so a second request won't hit this script
    file_put_contents($params[0] . '/' . $params[1], $f);

    //you could directly optimize the image for the web to make it even better
    //optimizeImage($params[0].'/'.$params[1]);

    //now serve the file to the browser, because even the first request needs to show the image
    $finfo = finfo_open(FILEINFO_MIME_TYPE);
    header('Content-Type: ' . finfo_file($finfo, $params[0] . '/' . $params[1]));
    echo $f;
    return true;
}
?>
I would suggest you create watermarked images on the fly and cache them at the same time, as everybody has suggested.
Then you could create a garbage-collector PHP script that is executed every day (using cron). This script browses your cache folder and reads each image's last access time, which can be done using the fileatime() PHP function. When a cached watermarked image has not been accessed within 24 or 48 hours, it just deletes it.
With this method, you can handle spike periods, as images are cached at the first request. AND you will save HDD space, as your garbage-collector script will delete unused images for you.
This method will only work if your server partition has atime updates enabled.
See http://php.net/manual/en/function.fileatime.php
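A minimal sketch of such a garbage collector, run daily from cron (the cache directory and the 48-hour lifetime are assumptions):

<?php
$cacheDir = '/var/www/cache/watermarked'; // assumed cache location
$maxAge = 48 * 3600;                      // 48 hours
$now = time();

foreach (glob($cacheDir . '/*.jpg') as $file) {
    // fileatime() only reflects reads if atime updates are enabled.
    if ($now - fileatime($file) > $maxAge) {
        unlink($file);
    }
}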
For most scenarios, lazily applying the watermark would probably make the most sense (generate the watermarked image on the fly when requested, then cache the result). However, if you have big spikes in demand you are creating a mechanism to DOS yourself, so in that case create the watermarked version on upload.
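A sketch of the lazy variant with Imagick, reusing the paths from the question (the cache path is made up, and the cache folder is assumed to exist):

$source = $workfolder . $event . DIRECTORY_SEPARATOR . $cat . DIRECTORY_SEPARATOR . $mit;
$cached = $workfolder . 'cache' . DIRECTORY_SEPARATOR . $cat . '_' . $mit; // made-up cache path

if (!file_exists($cached)) {
    // First access: composite the watermark once and cache the result.
    $image = new Imagick($source);
    $watermark = new Imagick($workfolder . $event . DIRECTORY_SEPARATOR . 'hires' . DIRECTORY_SEPARATOR . 'WATERMARK.PNG');
    $image->compositeImage($watermark, Imagick::COMPOSITE_OVER, 0, 0);
    $image->writeImage($cached);
}

header('Content-Type: image/jpeg');
readfile($cached); // every later request ends up here directly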
Considering your HDD storage capacity and the spikes:
I would only create a watermarked image if it is viewed (so yes, on the fly). That way you don't use too much space on a bunch of files that might never be viewed.
I would not watermark thumbnails; I would rather add an overlay element that fakes a watermark and protects the image from being saved. That overlay would apply to all thumbnails without creating a second image.
This way all your thumbnails are "watermarked" (faked, with another element on top).
Then, if one of these thumbnails is viewed, it generates a watermarked image (only once), since after it is generated you load the new watermarked image.
This would be the most efficient way to deal with your HDD storage and the spikes.
The other option would be to upgrade your hosting services. GoDaddy offers unlimited storage and bandwidth for about $50 a year.
I'm trying to work out a way to clear the cache of all manipulations of an image that have been created using Intervention/ImageCache's manipulation templates (and likely, in the future, the cache function as well).
I have some manipulation templates that run custom filters on the image:
'thumbnail' => function($image) {
    return $image->filter(new ImageFilters\Fit(150));
},
The filter gets the focal point from the database and then runs the appropriate functions, using the focal point for the position as needed.
public function applyFilter(Image $image)
{
    $fp = Media::where('file_name', $image->filename)->pluck('focal_point');
    $fp = $fp ?: 'center';
    return $image->fit($this->width, $this->height, function () { return true; }, $fp);
}
This all works fine, but when the focal point is changed by the user, the image isn't updated because of the cache. I need a way of clearing the cache for that specific image across the board.
From what I can tell, each manipulated image is given its own unique cache key made up of an MD5 hash, but I can't find any way of targeting all manipulations of a single image.
Even though the question lies quite far in the past, I would like to present a solution:
// get an instance of the ImageCache class
$imageCache = new \Intervention\Image\ImageCache();
// get a cached image from it and apply all of your templates / methods
$image = $imageCache->make($pathToImage)->filter(new TemplateClass);
// remove the image from the cache by using its internal checksum
Cache::forget($image->checksum());
Execute these lines whenever you make a change to your image that affects the applied filters (e.g. a new focal point).
The next time the cached image is requested, it will be created again with the new values.
Hope this helps,
Richard
Clearing the cache for one image is not possible. The focal_point string must be part of the callback, so that it is included in the cache key.
You can do this by passing it into the filter, for example via a GET parameter:
'thumbnail' => function($image) {
    $fp = Input::get('fp', 'center');
    return $image->filter(new ImageFilters\Fit(150, $fp));
},
Then request your image like this:
/imagecache/thumbnail/foo.jpg?fp=center
On my server, I have three files per image.
A thumbnail file, which is cropped to 128 by 128.
A small file, which I aspect fit to a max of 160 by 240.
A large file, which I aspect fit to a max of 960 by 540.
My method for returning these URLs to three20's gallery looks like this:
- (NSString*)URLForVersion:(TTPhotoVersion)version {
    switch (version) {
        case TTPhotoVersionLarge:
            return _urlLarge;
        case TTPhotoVersionMedium:
            return _urlSmall;
        case TTPhotoVersionSmall:
            return _urlSmall;
        case TTPhotoVersionThumbnail:
            return _urlThumb;
        default:
            return nil;
    }
}
After having logged when these various values are called, the following happens:
When the thumbnail page loads, only thumbnails are requested (as expected).
When an image is tapped, the thumbnail appears, not the small image.
After that thumbnail appears, the large image is loaded directly (without the small image ever being displayed).
What I want to happen is the following:
This is the same (thumbnails load as expected on the main page)
When the image is tapped, the small image is loaded first
Then after that, the large image is loaded.
Or, the following
Thumbnails
Straight to large image.
The problem with the thumb is that I crop it so it is square.
This means that when a thumbnail image is displayed in the main viewer (after the thumb was tapped), it is oversized, and when the large image loads, it immediately scales down to fit.
That looks really bad, and to me it would make far more sense if it loaded the thumbs in the thumbnail view, and then the small image followed by the large image in the detail view.
Does anyone have any suggestions on how to fix this?
Is the best way simply to make the thumbs the same aspect ratio?
I would appreciate any advice on this issue.
Looking at the three20 source I can see that TTPhotoView loads the preview image using the following logic:
- (BOOL)loadPreview:(BOOL)fromNetwork {
    if (![self loadVersion:TTPhotoVersionLarge fromNetwork:NO]) {
        if (![self loadVersion:TTPhotoVersionSmall fromNetwork:NO]) {
            if (![self loadVersion:TTPhotoVersionThumbnail fromNetwork:fromNetwork]) {
                return NO;
            }
        }
    }
    return YES;
}
The problem is that, as your small image is on the server and not available locally, the code skips it and uses the thumbnail for the preview.
I would suggest that your best solution would be to edit the thumbnails so that they have the same aspect ratio as the large images. This is what the developer of this class seems to have expected!
I think you have four ways to go here:
1. modify the actual loadPreview implementation of TTPhotoView so that it implements the logic you want (i.e., allow loading the small version from the network);
2. subclass TTPhotoView and override loadPreview to the same effect as above;
3. pre-cache the small versions of your photos; i.e., modify/subclass TTThumbView so that when TTPhotoVersionThumbnail is set, it pre-caches the TTPhotoVersionSmall version; in this case, the image being already present locally, loadPreview will find it without needing to go out to the network; as an aside, you might do the pre-caching at any time that you see fit for your app; to pre-cache the image you would create a TTButton with the proper URL (this will deal with both the TTURLRequest and the cache for you);
4. otherwise, you could do the crop on the fly from the small version to the thumbnail version by using this UIImage category; in this case you should also tweak the way your TTThumbView is drawn by overriding its imageForCurrentState method so that the cropping is applied when necessary. Again, either you modify TTThumbView directly or you subclass it; alternatively, you can define layoutSubviews in your photo view controller and modify each of your TTThumbViews there:
- (void)layoutSubviews {
    [super layoutSubviews];
    for (NSInteger i = 0; i < _thumbViews.count; ++i) {
        TTThumbView* tv = [_thumbViews objectAtIndex:i];
        [tv contentForCurrentState].image = <cropped image>;
    }
}
If you prefer not using the private method contentForCurrentState, you could simply do:
[tv addSubview:<cropped image>];
As you can see, each option has its pros and cons. Options 1 and 2 are the easiest to implement, but the small version will be loaded from the network, so it could add some delay; the same holds true for option 4, although the approach is different. Option 3 gives you the most responsive implementation (no additional delay from the network, since you pre-cache), but it is possibly the most complex solution to implement (either you download the image and cache it yourself, or use TTButton to do that for you, which is kind of not very "clean").
Anyway, hope it helps.
I would like to generate a dynamic image from a script and then have it loaded by the browser without being persisted on the server.
However, I cannot do this by setting the image's src="script.php", since that would require running the script that just generated the page and its data all over again, just to get the final data that will generate the graph.
Is there a way to do this that is similar to setting the image's src="script.php", but which is called from within another script and just sends the image without saving it? I need access to the data that is used in the generation of the markup in order to create this dynamic image.
Or, if not, what is the easiest way to destroy the image once the page is loaded? A quick AJAX call?
Is there any way to cache certain data for a limited time frame so that it is available to some other script?
Any ideas would be greatly appreciated, as I'm having a really hard time finding the right solution to this...
Thanks!
You can inline the image into an <img> tag if you need to, like this:
<?php
// $final_image_data holds your raw image bytes, generated by GD
$base64_data = base64_encode($final_image_data);
echo "<img src=\"data:image/png;base64,{$base64_data}\" ... />";
?>
That should work in all modern browsers, and in IE8. It doesn't work well with some email clients though (Outlook, for one).
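To get the raw bytes out of GD in the first place, you can buffer the output (a minimal, self-contained sketch):

<?php
// Build a small example image with GD.
$im = imagecreatetruecolor(100, 40);
$white = imagecolorallocate($im, 255, 255, 255);
imagestring($im, 5, 10, 10, 'Hello', $white);

// imagepng() writes to the output stream, so capture it into a string.
ob_start();
imagepng($im);
$final_image_data = ob_get_clean();
imagedestroy($im);
?>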
Also, another solution I found is to store the image in a session variable, which is then served by a PHP script referenced in the image tag. This allows a user-specific image to be served and then removed from memory by the script... It also avoids messy img src="" tags...
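That serving script could be as small as this sketch (the script name and session key are made up):

<?php
// image_from_session.php (made-up name), referenced as <img src="image_from_session.php">
session_start();

if (!isset($_SESSION['dynamic_image'])) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/png');
echo $_SESSION['dynamic_image'];

// Drop the image from the session once it has been served.
unset($_SESSION['dynamic_image']);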
Hopefully that is helpful to someone.
Use a rewrite rule.
RewriteRule ^magicimage.jpg$ /myscript.php
Then simply echo your image data from GD instead of writing it to disk - which is as simple as not providing a filename to the appropriate image*() function.
myscript.php
<?php
// $w and $h are your desired image dimensions
$im = imagecreatetruecolor($w, $h);
//...do gd stuff...
header('Content-type: image/jpeg');
//this outputs the content directly to the browser
//without creating a temporary file or anything
imagejpeg($im);
And finally, utilize the above
display.php
<img src="magicimage.jpg">