Can file paths be replaced by memory resources? - php

I'm writing a PHP script that uses montage, the ImageMagick tool that creates tiled arrays of images. The montage syntax is straightforward:
montage image1.png image2.png image3.png -tile x1 -geometry 50x50 out.png
However, I'm generating these images on the fly and I would hate to have to write them to disk just to run this command. Is there some way I can pass these resources in without writing them to files?
(This isn't really a Montage question, but rather a general question that could apply to many different situations.)
Any help would be great!

As I commented, you could find a tmpfs file system (e.g. with df | grep tmpfs), such as /run, and put your files inside it. They will then live in virtual memory (and won't usually need any disk I/O). Of course, the space they take up consumes virtual memory resources (and so is limited).
BTW, on many Linux systems, writing small files doesn't cause much disk I/O anyway, because they sit in the file system cache.
Of course, any tmpfs file system loses all its content at every reboot, so you don't want to keep important unrecoverable data inside.
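In PHP, that approach could look roughly like this; the use of /dev/shm (a tmpfs mount on most Linux systems) and the variable names are assumptions made for this sketch:
$dir = '/dev/shm/montage_' . uniqid();   // /dev/shm is tmpfs on most Linux systems
mkdir($dir);

$paths = [];
foreach ($blobs as $i => $blob) {        // $blobs: the PNG data generated on the fly
    $path = "$dir/img$i.png";
    file_put_contents($path, $blob);     // lands in RAM-backed tmpfs, not on a physical disk
    $paths[] = escapeshellarg($path);
}

$out = "$dir/out.png";
exec('montage ' . implode(' ', $paths) . ' -tile x1 -geometry 50x50 ' . escapeshellarg($out));
$result = file_get_contents($out);       // the finished montage, back in memory

array_map('unlink', glob("$dir/*"));     // release the tmpfs space right away
rmdir($dir);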

ImageMagick has APIs for both C and PHP. The PHP Imagick extension appears to have montage hooks:
http://us3.php.net/manual/en/imagick.montageimage.php
General info on PHP Imagick usage:
http://us3.php.net/manual/en/imagick.examples-1.php
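A rough sketch of what an all-in-memory montage could look like with Imagick (assuming the generated PNGs are held as blobs in $pngBlobs; the geometry values mirror the command line from the question):
$stack = new Imagick();
foreach ($pngBlobs as $blob) {
    $stack->readImageBlob($blob);          // no temporary files involved
}

// Roughly equivalent to: montage ... -tile x1 -geometry 50x50 out.png
$montage = $stack->montageImage(
    new ImagickDraw(),                     // default drawing settings
    'x1',                                  // tile geometry: a single row
    '50x50+0+0',                           // thumbnail geometry
    Imagick::MONTAGEMODE_UNFRAME,
    '0x0+0+0'                              // no frame
);

$montage->setImageFormat('png');
echo $montage->getImageBlob();             // stream or store it wherever needed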

Related

PHP Imagick processing images takes long time

I'm processing 5MB-6MB images and reducing them to under 600KB in file size using PHP Imagick, usually on the order of 3,000 to 5,000 images at a time. However, the process is taking 8-12 hours to complete.
I tried two different ways of handling this: 1) retrieving the images remotely using Guzzle Pool and storing them locally, then running the conversion process, and 2) retrieving the images remotely into an ImageMagick object, processing them, then saving locally. Either method takes a huge amount of time to complete. The resizing and saving step below is the same for both methods, apart from reading the image from file when I already have it saved locally.
$imagick = new Imagick();
$imagick->readImageBlob($source);
$imagick->setImageFormat('jpg');
$imagick->setOption('jpeg:extent', '600kb');
$imagick->stripImage();
$imagick->writeImage($destination);
I'm wondering if there is something else I can do to speed things up.
I have several suggestions, and you can probably combine them if you wish. It would have helped if you had specified your OS and the image types in your question...
First suggestion - process in parallel
Do them in parallel, with GNU Parallel. So instead of the following sequential code:
for f in *.png; do
./Process "$f"
done
you would do:
parallel ./Process {} ::: *.png
Second suggestion - use "shrink on load"
If you are reducing the pixel dimensions of your images, e.g. from 5,000x3,000 pixels down to 800x600 pixels, and they are JPEGs, you can use "shrink on load":
magick -define jpeg:size=1600x1200 BigBoy.jpg -define jpeg:extent=600kb BiteSize.jpg
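Since the question uses Imagick rather than the command line, the same hint can be set there too; a sketch, assuming the source is a JPEG blob in $source and the target box is 800x600:
$imagick = new Imagick();
$imagick->setOption('jpeg:size', '1600x1200');   // "shrink on load" hint; must be set before reading
$imagick->readImageBlob($source);
$imagick->thumbnailImage(800, 600, true);        // final resize to the target box
$imagick->setImageFormat('jpg');
$imagick->setOption('jpeg:extent', '600kb');
$imagick->stripImage();
$imagick->writeImage($destination);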
Third suggestion - consider vips
Consider using vips, which I suspect will be much faster and lighter on resources:
vipsthumbnail BigBoy.jpg -s 800x600 -o BiteSize.jpg
Add --vips-leak to the command above to see how much memory it used.

ImageMagick convert process causing deadly CPU load and killing available swap space in an instant

Today a customer's server went down after a user uploaded a 16MB JPG, which then got processed by an automated, cron-controlled script that has been running flawlessly for months, successfully processing at least a thousand images.
After some investigation I figured out that this particular image is indeed giving ImageMagick trouble. I downloaded the image to my local system (OS X Mavericks, ImageMagick 6.8.6-6 Q16) and ran the exact same convert commands on it:
convert test.jpg icc:profile.icc
convert test.jpg -crop 3567x5340+125+0 +repage -resample 200x200 -resize 3937x5906 -compress lzw -profile ./profile.icc out.tif
Immediately everything stalls, the system gets slow, and after a few seconds I get this message:
And in the terminal it says:
convert: unable to extend cache `test.jpg': No space left on device # error/cache.c/OpenPixelCache/3661.
My drive has a good 73 GB of free space, although I am sure that "swap space" is not exactly referring to that. Then again, I have never encountered such an issue before.
So much for local testing.
On the server (CentOS 6.5 Final, ImageMagick 6.8.6 Q16), the main symptom is a high CPU load caused by the convert process, which leads to a pile-up of further convert processes that get spawned as users navigate yet-uncached areas of the website, triggering convert runs for thumbnail generation. Those, however, never exceed even one percent of CPU usage. The convert process that is supposed to work through the image in question, on the other hand, I have seen climb to almost 100%.
Side info
I use a normal call to exec() to run convert. Forking it off as a separate process doesn't seem like much of an option (I'm not even sure if PCNTL is available), nor would it help much, other than that the convert process would then just keep running in circles until infinity==null (so to speak).
PHP's memory limit is set to 128M on the server (on my test machine I called the process manually from the terminal).
PHP's max_execution_time is set to 300, since we need a certain amount of buffer to ensure proper processing of a whole stack of images. Note that this system has worked stably and has properly processed entire packs of images, often more than 25 at a time (all of which were larger than the image in question).
Note
Even though I would like to, I cannot provide the original image due to copyright issues. But perhaps someone has run into the same or a similar issue and can give valuable input that may serve to solve the problem, or at least to find out what exactly causes it.
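Not a confirmed fix for this particular image, but one knob worth experimenting with from the exec() call is ImageMagick's own resource limits, so a single convert cannot exhaust RAM and swap (the numbers here are placeholders):
// Cap ImageMagick's pixel cache; beyond the memory/map limits it spills to a disk-backed cache.
$cmd = 'convert -limit memory 512MiB -limit map 1GiB -limit disk 4GiB '
     . escapeshellarg($src) . ' -crop 3567x5340+125+0 +repage '
     . '-resample 200x200 -resize 3937x5906 -compress lzw '
     . '-profile ./profile.icc ' . escapeshellarg($dst);
exec($cmd, $output, $exitCode);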

How can I speed up Image Magick image processing commands in Linux?

I am using ImageMagick in my PHP program on Linux to process some images. In one case we are taking layers from a PSD and displaying certain layers and colorizing them. In others, we are overlaying three or so PNG images and then returning a final image. We are using PHP, but we use the system command to run the commands just as at the command prompt, and then serve the resulting image to the browser.
It seems very slow. Caching is not an option because of the myriad of possible combinations. The images are not that large, yet it sometimes takes 5 seconds per image to process, and the server seems to handle them only one at a time (I am assuming that the browser is probably asking for more than one image concurrently). We have put the site on a very beefy server. Is there somewhere I can allocate more memory or processing power to these ImageMagick commands such as "convert"?
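For reference, the workflow described above (compositing a few PNGs with convert and streaming the result straight to the browser, with no intermediate file) looks roughly like this in PHP; the file names and overlay options are placeholders:
$cmd = 'convert base.png overlay1.png -composite overlay2.png -composite png:-';
header('Content-Type: image/png');
passthru($cmd);                      // write the composited PNG directly to the response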

7zip string/stream compression PHP/C? No file name / date store in .7z archive, every byte count

I want to compress one small file/piece of data, and only the resulting file size matters.
There is no need to store file information such as the filename, size, date, etc. If I use rar/7zip/zip from the CLI, that file information gets added to the archive, which is no good for me.
I am looking for the best compression solution purely in terms of file size.
In PHP I can use gzdeflate() or bzcompress() to compress a string and then save it to a file as compressed data. I am looking for an equivalent or a CLI solution.
Environment: Linux, 32/64 bit.
I would like to use 7zip/7za in the same way, for string/stream compression.
If I use a binary version of 7z, for example: 7za a -mx9 output.7z input.dat
then the resulting .7z contains file/date/size information and the file size is bigger.
How can I use 7zip, or some other better compressor, the way bzcompress or gzdeflate work: compressing the data stream only, without any file information?
Maybe I cannot actually use 7zip from PHP, because it is not supported yet.
Can someone recommend or write a small C/C++ CLI application (or one in another language) usable from the Linux CLI that compresses one input file into one output file?
For example, I want to shell-exec something like:
7zcpp input.dat output.7z
or
7zcpp -mx9 input.dat output.7z
Summary: compression speed is not important, only a better (smaller) output file size. I want to compress only one file (string/stream), every byte counts, and there is no need for filename/date information inside the archive. I can use any compressor better than 7zip, but I think 7zip is one of the best at the moment. Any ideas or recommendations?
Thanks
7Zip memory compression functions - you could check out their SDK and look through the functions; it is available for C, C++, Java and C# (too bad there is no PHP, unless you create your own PHP extension to bring 7z functionality into your app). In the LZMA SDK, go through the path C\Util\Lzma and find LzmaUtil.c; it is a perfect example to help you. I have personally used zlib, but LZMA compresses to about 8% of the data set size compared to zlib's 12%, even though zlib is much faster. Since you don't care about speed, LZMA is the best fit for you.
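Since the question already mentions gzdeflate() and bzcompress(), here is a minimal PHP sketch of producing a bare compressed stream with no filename/date/size fields; the level-9 settings and file names are just for illustration:
$data = file_get_contents('input.dat');

$deflated = gzdeflate($data, 9);      // raw DEFLATE stream: no header, no file metadata
$bzipped  = bzcompress($data, 9);     // bzip2 stream: small fixed header, no file metadata

// Keep whichever came out smaller; remember which format you chose for decompression.
file_put_contents('output.bin',
    strlen($deflated) < strlen($bzipped) ? $deflated : $bzipped);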

JPEG Optimization from PHP running on hosted accounts

I am looking for a loss-less JPEG optimization tool that I can include in a PHP-based photo gallery software.
I already checked the following question, but ruled out the tools mentioned there for reasons I'll explain below:
Tools for JPEG optimization?
The command line tools jpegtran and jpegoptim are not available in the average PHP hosting account. The web-based tools (www.smush.it, www.kraken.io) all have a limit of 1 MB for processed images.
The 1 MB limit is a problem, because the gallery software delivers images based on browser-size and I also want to support Full HD and even larger displays. Depending on content, photos prepared for such resolutions can get larger than 1 MB.
Any hints on a solution are appreciated!
OK, I found my answer in another Stack Overflow question:
JPG File Size Optimization - PHP, ImageMagick, & Google's Page Speed
ImageMagick already does the Huffman optimization. I assumed it didn't, because my ImageMagick files were still larger than the ones from jpegtran & Co. However, that was only because I did not strip the metadata.
Cheers, Robert
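For reference, what the accepted approach boils down to in Imagick is roughly the following (file names are placeholders; note that re-saving through Imagick re-encodes the JPEG data, so it is not byte-for-byte lossless the way jpegtran is):
$im = new Imagick('photo.jpg');
$im->stripImage();                        // drop EXIF, comments and embedded profiles
$im->writeImage('photo_optimized.jpg');   // ImageMagick applies Huffman optimization on write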
There is no solution.
Either use some command-line utility or increase the memory limit.
(And, to let you know, it's not the target image's file size but the uploaded image's dimensions that put you in trouble with the memory limit.)
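To make the last point concrete, the memory needed to decode an image scales with its pixel dimensions, not its compressed file size; a rough back-of-the-envelope sketch:
// Rough estimate: a decoded truecolor bitmap needs about width * height * 4 bytes (plus overhead),
// regardless of how small the JPEG/PNG file on disk is.
$width  = 5000;                                        // example upload dimensions
$height = 3000;
echo round($width * $height * 4 / 1048576) . " MB\n";  // ~57 MB, easily past a 64M/128M memory_limit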
