I'm using a Symfony2 bundle from Gregwar to resize up to 12 images at a time from a user upload: https://github.com/Gregwar/ImageBundle
I'm resizing them to four different sizes, as these sizes are needed for mobile, desktop, thumbnail, etc. Each resize takes time, of course, but with 12 × 4 = 48 resizes it can take a while: easily over 30 seconds, which is beyond the default PHP execution timeout and isn't an acceptable wait time for an end user.
I want the resizing to be done at upload, so the new sizes are available to the user immediately rather than later as a batch process.
This seems like it may be a common problem. So what can I do to improve my situation?
Should I use a different library?
Or reduce my image sizes from four down to perhaps two, improving processing speed but sacrificing user experience?
Is this normal? Could it be a hardware issue? On my local machine it's even slower.
PHP memory is set to 256 MB. I use a ServerGrove VPS with PHP 5.3.
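For context, the per-upload work looks roughly like this (a sketch only, assuming Gregwar/Image's fluent open()/resize()/save() API; paths and sizes are hypothetical):

```php
<?php
// Illustrative sketch only: paths and sizes are assumptions, and the
// null-height "preserve aspect ratio" behaviour is assumed from the
// Gregwar/Image API.
use Gregwar\Image\Image;

$files = glob('/uploads/incoming/*.jpg');                 // up to 12 uploads
$sizes = ['thumb' => 150, 'mobile' => 480, 'desktop' => 1200, 'full' => 1920];

foreach ($files as $path) {
    foreach ($sizes as $name => $width) {                 // 4 variants each
        Image::open($path)
            ->resize($width, null)                        // assumed: null height keeps ratio
            ->save("/uploads/{$name}/" . basename($path));
    }
}
```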
I have moved away from trying to solve this on my server and instead now use a dedicated EC2 instance with Amazon to resize images.
Similar to this approach: http://sumitbirla.com/2011/11/how-to-build-a-scalable-caching-resizing-image-server/
If the GD extension is installed on your server, you can use the imagecopyresized function.
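A minimal sketch of that approach (file names, target width, and JPEG quality are assumptions):

```php
<?php
// Minimal GD resize sketch; file names and sizes are illustrative.
list($w, $h) = getimagesize('upload.jpg');
$newW = 320;
$newH = (int) round($h * $newW / $w);   // keep aspect ratio

$src = imagecreatefromjpeg('upload.jpg');
$dst = imagecreatetruecolor($newW, $newH);
imagecopyresized($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
imagejpeg($dst, 'upload_320.jpg', 85);

imagedestroy($src);
imagedestroy($dst);
```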
I am creating a simple file manager for a CMS. I included a function that calculates the total size of a folder, but I noticed that it increases load times considerably.
My first idea was an option to enable or disable those calculations and only display the size of individual files. But now I am thinking about the impact on the server if, hypothetically, the option is activated and the directory contains a folder of 1 TB or more (assuming the file system allows it).
What do you think would happen to the server if it received a request to perform those big calculations? Would it be better to remove the option?
PHP can overload the system if you do the calculation in PHP alone. Instead, use your operating system's native tools to handle the calculation; as far as I know, the OS caches its indexing results, so calculating a folder's size is not a problem.
reference: how-to-get-directory-size-in-php
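For example, on Linux you might shell out to du (a minimal sketch; assumes GNU coreutils and that shell_exec() is not disabled):

```php
<?php
// Minimal sketch: delegate the size calculation to the OS's `du` tool.
// Assumes GNU coreutils (`-b` reports bytes) and an enabled shell_exec().
function folderSize($path)
{
    $output = shell_exec('du -sb ' . escapeshellarg($path));
    return (int) $output; // du prints "<bytes>\t<path>"
}

echo folderSize('/var/www/uploads') . " bytes\n";
```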
I am using ImageMagick in my PHP program on Linux to process some images. In one case we take layers from a PSD and display certain layers, colorized. In others, we overlay 3 or so PNG images and then return a final image. We use PHP, but we run the commands through the system command, as at the command prompt, and then serve the image to the browser.

It seems very slow. Caching is not an option because of the myriad of possible combinations. The images are not that large, yet it sometimes takes 5 seconds per image to process, and the server seems to do them only one at a time (I assume the browser is asking for more than one image concurrently). We have put the site on a very beefy server.

Is there somewhere I can allocate more memory or processing power to these ImageMagick commands like "convert"?
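For reference, the kind of shell-out described would look something like this (a hedged sketch; file names are hypothetical, and ImageMagick's -limit option is one real knob for its internal resource ceilings):

```php
<?php
// Hedged sketch of the described workflow: composite PNG layers with
// ImageMagick's `convert`, called via system(). File names are hypothetical.
// `-limit` raises ImageMagick's internal memory/map ceilings.
$cmd = 'convert -limit memory 512MiB -limit map 1GiB '
     . 'base.png layer1.png -composite layer2.png -composite result.png';
system($cmd, $exitCode);

if ($exitCode === 0) {
    header('Content-Type: image/png');
    readfile('result.png');
}
```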
My website shows the same image twice, one normal and the other blurred, and I'm wondering which method is better in terms of speed: create two images at upload (saving one normal image and one blurred image to the server), or upload only one image and blur the second one on the fly using GD?
If you're using GD, I would do it at upload and save them as flat files.
Apache and other web servers can serve flat files remarkably fast.
However, I would look into using http://www.graphicsmagick.org/ for the image manipulation. It's much, much faster and more efficient than ImageMagick, and most certainly than PHP's GD.
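If you do stick with GD, the blur-at-upload approach might look like this (a minimal sketch; paths, quality, and the number of blur passes are assumptions):

```php
<?php
// Minimal sketch of blur-at-upload with GD. Paths and quality are
// assumptions; GD's Gaussian blur is mild, so it is applied repeatedly.
$src = imagecreatefromjpeg('/uploads/photo.jpg');

imagejpeg($src, '/uploads/photo_normal.jpg', 90);   // the untouched copy

for ($i = 0; $i < 10; $i++) {
    imagefilter($src, IMG_FILTER_GAUSSIAN_BLUR);
}
imagejpeg($src, '/uploads/photo_blurred.jpg', 90);  // the blurred copy

imagedestroy($src);
```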
I have an image GD script which is currently using around 9MB of memory.
I get a lot of traffic, so it sometimes uses up a huge amount of RAM on my server.
Is there any way to reduce GD's memory usage?
Or at least make the script run faster, so that it releases the memory it uses sooner?
I have tried changing the image quality; it had no effect.
I also tried changing the image's pixel dimensions; that reduced the memory usage, but not by much.
Thanks.
It's impossible to tell without seeing the code, but unless it contains major mistakes, the answer is probably no.
What you might be able to do is use the external ImageMagick binary instead - it runs outside PHP's script memory limit - but that is an entirely different technology and would require you to rewrite your code.
I assume you are already caching GD's results so not every request causes it to run?
Try to avoid using GD on the fly if you are concerned about memory limits.
It's hard to solve the problem without seeing code, but I can make a suggestion.
Have a different process handle the images. For example, if you want to resize images, don't resize them every time a user accesses a page; instead, run a cron job or scheduler in a quiet window that periodically resizes all the images that need it and saves the results, so there is less overhead (see the sketch below).
If you provide more code, you will get better help.
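A hypothetical cron-driven version of that suggestion (paths, the target width, and the pending/resized queue layout are all assumptions):

```php
<?php
// resize_pending.php - hypothetical cron script, e.g. run via:
//   */5 * * * * php /var/www/bin/resize_pending.php
// Paths, target width, and the queue layout are assumptions.
foreach (glob('/var/www/uploads/pending/*.jpg') as $file) {
    list($w, $h) = getimagesize($file);
    $newW = 800;
    $newH = (int) round($h * $newW / $w);   // keep aspect ratio

    $src = imagecreatefromjpeg($file);
    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($dst, '/var/www/uploads/resized/' . basename($file), 85);

    imagedestroy($src);
    imagedestroy($dst);
    unlink($file);                          // dequeue the processed file
}
```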
I am the tech intern for an online independent newspaper, and the writers on the staff are not tech-savvy. They don't quite understand how web pages work, and often they upload and include images straight from their digital cameras, or scanned from original media. These images become a burden when there are 10 images on the front page at 3.5 MB each.
We are trying to work out some sort of training method for teaching them how to resize and optimize the images they want to include in their articles, but like I said, they are not very tech savvy, and any method we attempt to employ may go way over their heads.
So, I wanted to know if it is outside of reason to attempt to resample and cache the images included in the articles using a PHP function and the GD library, in order to streamline the amount of data that has to be transferred per article.
I think it's possible, I'm just trying to figure out if it would be worth it to just take the time and effort to train the writers, or if creating an automated process would be better.
You'd be better off doing the GD image processing during the upload process. GD can take up quite a bit of resources, so processing each image on every request would not be a preferable solution. If you can't do it during the upload process, you should cache all the resampled images and use those if/when available.
It's certainly possible, and I'd be very surprised if Joomla! doesn't already have modules that do just that.
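Here's a hedged sketch of that resample-and-cache fallback (the cache path and the 600px width cap are assumptions):

```php
<?php
// Hedged sketch of resample-on-first-request with a flat-file cache.
// The cache directory and the 600px width cap are assumptions.
function resampledPath($original)
{
    $cached = '/var/www/cache/' . md5($original) . '.jpg';
    if (!file_exists($cached)) {
        list($w, $h) = getimagesize($original);
        $newW = min($w, 600);
        $newH = (int) round($h * $newW / $w);

        $src = imagecreatefromjpeg($original);
        $dst = imagecreatetruecolor($newW, $newH);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
        imagejpeg($dst, $cached, 85);
        imagedestroy($src);
        imagedestroy($dst);
    }
    return $cached; // serve this instead of the multi-megabyte original
}
```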
With the current web site I'm working on, I needed to answer a similar question. I opted for the Joomla add-on Easy Gallery. The two standout features for me are the automated thumbnail creation and the image resize feature. The sizes are configurable from the component's configuration page. You'll have a thumbnail, a resized image, and the original image with each upload.
Easy Gallery is a Joomla 1.0 component, so if you're running a Joomla 1.5 install you will need to turn on legacy mode. Work seems to be in progress on a native 1.5 version.
I also found a couple of places where the generated thumbnails weren't being displayed. I raised the question on the Easy Gallery forum, and managed to work out the answer for myself.