Batch image processing cronjob in PHP

I need to import images from a third-party source. The import file looks like this:
// import.csv
group1;nameofimage1;http://www.site.com/image/image1.jpg;nameofimage2;http://www.site.com/image/image2.jpg; etc...
There can be up to 20 images per line.
I have a cronjob that reads the file and processes it (it loops over each line, parses it, then uses cURL to fetch each image, etc.; that part I have working).
The images provided are far too big for what I need, and I have to resize each one to 50% of its original size. I tried using the GD library, but it takes a very long time to complete.
Is that normal? What can I use to make it faster?
Thanks.

The GD library is not optimized for big images; actually, I would not recommend using GD at all unless you have no other options.
ImageMagick is your wizard here :)
One other small point: it is better not to use PHP for this task at all. You can use the command-line ImageMagick tools instead. Just add another line to your cron/bash script to resize the images automatically when needed.
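For example, a minimal sketch (the directory and file names are assumptions, and it assumes ImageMagick's convert/mogrify binaries are installed) that delegates the resize to ImageMagick from the existing cron script:

<?php
// Resize one downloaded image in place to 50% of its original dimensions
// by shelling out to ImageMagick, so GD never touches the large file.
// Equivalent standalone cron line: mogrify -resize 50% /path/to/images/*.jpg
function resize_half($path)
{
    $arg = escapeshellarg($path);
    exec("convert $arg -resize 50% $arg", $output, $status);
    return $status === 0;
}

resize_half('/path/to/images/image1.jpg');   // hypothetical downloaded file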

Related

How to connect ghostscript with imagecreatefrompng in PHP?

I am currently using ghostscript, via exec() from PHP, to create a PNG image that is saved to a file.
And then I use imagecreatefrompng() to load it into PHP. I am wondering if there is a way to connect these directly, without first saving to a file and then reading it back. Does this work with the expect:// wrapper? If so, could somebody provide an example? I am not sure about the syntax.
Another solution might be imagecreatefromstring(), if one can get the image data as a string from Ghostscript. I am looking for an example of how to do this. The image files I create are not that large, typically 600 x 150 pixels.
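One possible sketch of the imagecreatefromstring() route (the input file name and the png16m device are assumptions; it relies on Ghostscript's standard -sOutputFile=- option, which sends the rendered image to stdout where shell_exec() can capture it):

<?php
// Render input.pdf straight into a GD image without a temporary PNG on disk.
// -q keeps Ghostscript's banner off stdout so only the PNG bytes come back.
$cmd = 'gs -q -dBATCH -dNOPAUSE -dSAFER -sDEVICE=png16m -r150 -sOutputFile=- '
     . escapeshellarg('input.pdf');
$png = shell_exec($cmd);
$im  = is_string($png) ? imagecreatefromstring($png) : false;
if ($im === false) {
    die('Ghostscript did not return a usable image');
}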

Using FFMpeg av_read_frame in PHP?

I'm hoping to incorporate FFMpeg in a PHP script. My understanding is that the two best solutions are to use the command line via exec() and extract results or use PHP-FFMpeg (https://github.com/PHP-FFMpeg/PHP-FFMpeg).
What I want to do is walk through a video frame-by-frame, and I think I need to use av_read_frame(). What's the best way to use that from PHP? I don't think it's available from the command line or PHP-FFMpeg. Should I write a C program (using XCode) to do what I want and call it from my PHP script?
Thanks!
I ended up learning the command-line interface to FFmpeg. (It wasn't as hard as I feared!) To walk through a video frame by frame, I extracted every frame as a JPEG file with a single ffmpeg command via exec(), processed each one, then deleted them when I was done. Easy.
If the video size is a concern, FFmpeg can generate a range of JPEG files using the -ss and -to options, for example 0-20 seconds, then 20-40 seconds, and so on, at roughly 200 MB of files per set.
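For reference, a rough sketch of that workflow (the file name input.mp4, the frames/ directory, and the 0-20 second window are assumptions; -ss/-to and the numbered-JPEG output pattern are standard ffmpeg features):

<?php
// Dump every frame of the first 20 seconds of input.mp4 as numbered JPEGs,
// walk through them one by one, then delete them as described above.
@mkdir('frames');
exec('ffmpeg -i ' . escapeshellarg('input.mp4') . ' -ss 0 -to 20 frames/frame_%05d.jpg');

foreach (glob('frames/frame_*.jpg') as $frame) {
    // ... examine $frame here ...
    unlink($frame);
}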

extract images from PDF with PHP

The thing is that the client wants to upload a PDF with images as a way of batch-processing multiple images at once.
I already looked around, and out of the box PHP can't read PDFs.
What are my alternatives?
I already know the host has not installed ImageMagick or any PDF library, and the exec function is disabled. That basically leaves me with nothing to work with, I guess?
Does anyone know of an online service that can do this, with an API of sorts?
Thanks in advance.
AFAIK, there is no PHP module to do it. There is a command line tool, pdfimages (part of xpdf). For reference, here's how that works:
pdfimages -j source.pdf image
This extracts all images from source.pdf as image-000.jpg, image-001.jpg, and so on. Note that -j writes JPEG output only for images stored as JPEG (DCT) inside the PDF; other images come out as PBM/PPM.
Possible Options
Being a command line tool, you need exec (or system, passthru, any of the command executing functions built into PHP). As your environment doesn't have that, I see four options:
1. Beg that exec be turned on for you (your hosting provider can limit what you can exec to a single command)
2. Change the design -- how about a ZIP upload?
3. Roll your own, using the source code of pdfimages as a model
4. Let pdfimages do the heavy lifting, by running it on a remote host you do control
Regarding #3, I don't think rolling your own, for a very narrowly defined set of requirements, would be too difficult. I seem to recall that image boundaries in PDF are well defined: read the file up to a boundary, cut at the end of the boundary, base64_decode, write to a file, and repeat. However, that may be too much...
If rolling your own is too complicated, then option #4 is kind of like what Joel Spolsky describes for working with complicated Excel objects (see the numbered list under the bold heading "Let Office do the heavy work for you").
Find a cheap hosting environment (e.g. Amazon EC2) that lets you use exec and curl
Install pdfimages
Write a PHP script that takes a URL to a PDF, fetches that PDF with cURL, writes it to disk, passes it to pdfimages, then returns the URLs of the resulting images.
An example exchange could look like this:
GET http://www.cheaphost.com/pdfimages.php?extract=http://www.limitedhost.com/path/to/uploaded.pdf
Content-type: text/html
<html>
<body>
<ul>
<li>http://www.cheaphost.com/pdfimages.php?retrieve=ab9895v/image-000.jpg</li>
<li>http://www.cheaphost.com/pdfimages.php?retrieve=ab9895v/image-001.jpg</li>
</ul>
</body>
</html>
So a single pdfimages.php script (running on the host with exec functionality) can both extract images and give you access to the extracted images. When extracting, it fetches the PDF you point it at, runs pdfimages on it, and gives you back a list of URLs to call to retrieve the extracted images. When retrieving, it simply returns the image itself.
You would need to deal with cleanup; perhaps the thing to do would be to delete each image after retrieval. You would also need to handle security: I don't know what's in these images, but the content might need to be wrapped in SSL, with other precautions taken.
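A very rough sketch of such a pdfimages.php (purely illustrative: the work/ directory, the random job id, and the complete lack of authentication are assumptions you would need to harden as noted above):

<?php
// pdfimages.php -- runs on the cheap host that *does* allow exec().
// ?extract=<url-to-pdf> : fetch the PDF, run pdfimages, list the image URLs
// ?retrieve=<id>/<file> : stream one extracted image back, then delete it

$workDir = __DIR__ . '/work';

if (isset($_GET['extract'])) {
    $id  = bin2hex(random_bytes(4));          // job id for this extraction
    $dir = "$workDir/$id";
    mkdir($dir, 0700, true);

    // pull the uploaded PDF over from the limited host
    file_put_contents("$dir/in.pdf", file_get_contents($_GET['extract']));

    // extract the embedded images next to it
    exec('pdfimages -j ' . escapeshellarg("$dir/in.pdf") . ' ' . escapeshellarg("$dir/image"));

    header('Content-Type: text/html');
    echo "<html><body><ul>\n";
    foreach (glob("$dir/image-*") as $img) {
        echo '<li>http://www.cheaphost.com/pdfimages.php?retrieve=' . $id . '/' . basename($img) . "</li>\n";
    }
    echo "</ul></body></html>";
} elseif (isset($_GET['retrieve'])) {
    // crude sanitising only -- lock this down properly in real use
    $path = $workDir . '/' . basename(dirname($_GET['retrieve'])) . '/' . basename($_GET['retrieve']);
    header('Content-Type: image/jpeg');       // assumes -j produced JPEGs
    readfile($path);
    unlink($path);                            // cleanup after retrieval
}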
You can use pdfimages and install it this way:
apt install poppler-utils
Then use it this way to get all the images as PNG files:
pdfimages -png mypdf.pdf image
Images will be placed in the same folder under image-000.png, image-001.png, etc.
There are many options available, including some to change the output format; see the pdfimages manual page for more information.
I hope this helps!

Save audio file in chunk from original file

I want to offer a demo version of an audio file. I upload the file in MP3 format to the server with a PHP script. In demo mode it should play only the first 30 seconds; for registered users it should play the whole file.
There is a solution using the FFmpeg library: split the file into frames while uploading and save one copy for the demo and one original. But I need a different solution, because FFmpeg is not available on my shared server.
Please help me to solve this problem.
You can't cut an MP3 (reliably) on frames alone. There is something called a bit reservoir, where basically frames can rely on other frames. Most frames will be using this feature. If you cut on the frame, you will undoubtedly end up with a corrupt file, and have a few artifacts in your clip.
To do what you are asking, you have two options:
Parse the MP3 data and cut only on frames not using the bit reservoir
Write an encoder that can cut on any frame and fix the bit reservoir so that prior and latter frames are not needed.
Neither of these is a good option. Call your web host and get FFmpeg installed. If they won't put it on there for you, get a new hosting provider. Don't waste your time with anything else.
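That said, once FFmpeg is installed, producing the 30-second preview is a one-liner from PHP. A minimal sketch, assuming the file names and the 128 kbps bitrate (re-encoding with libmp3lame avoids the bit-reservoir problem that cutting raw frames would cause):

<?php
// Create a 30-second MP3 preview of the uploaded file.
// Re-encoding (rather than copying frames) sidesteps bit-reservoir corruption at the cut.
$src  = escapeshellarg('uploads/full.mp3');
$demo = escapeshellarg('uploads/demo.mp3');
exec("ffmpeg -y -i $src -t 30 -c:a libmp3lame -b:a 128k $demo", $out, $status);
if ($status !== 0) {
    error_log('ffmpeg failed to create the demo clip');
}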

PHP Thumbnails

I was looking for a way to dynamically create thumbnails using PHP and GD, but every time I select a large image (around 10 megapixels, about 4-5 MB) it gives this error:
images/Surabhi_Cow.jpgimages/tn/Surabhi_Cow.jpg
Fatal error: Allowed memory size of 31457280 bytes exhausted (tried to allocate 10368 bytes) in C:\Program Files\xampp\htdocs\MySite\Staff\test.php on line 51
Changing memory_limit in php.ini to 60M does the trick, but my host only allows a memory_limit of 32M. What other options do I have to generate thumbnails on the fly?
I checked phpThumb() but don't really get it. So any other options are welcome!
You want to use ImageMagick. It is much more efficient in handling large images than GD.
If all you want to do is generate thumbnails, I recommend this nice little script called imagethumb.php. You can download it here: http://www.olivo.net/software/imagethumb/
This script produces excellent thumbnails with absolutely no pixelation. It accepts a height or width argument that you append to the URL that calls the script. It's really really easy to use and comes with documentation (which you'll read for all of 2 minutes).
I tried other thumbnailing scripts such as "ThumbsUp" (for example) before landing on this one. BTW, it also renders .png images and also .gif (if I recall correctly). The cache feature will make it easier on your server if you have large files. Also, I assume that your server has the GD library or ImageMagick installed. Good Luck ;)
As the others have said, if the images are that big it's time to drop GD and switch to ImageMagick. One word of warning though: do it all on the command-line - the class wrappers out there are wheels in need of damn good re-inventing, every last one.
Consider using a command-line based approach. For example, you can invoke ImageMagick from the command-line to resize images.
Other than that, in pure PHP, it's hard to see how you can edit images that are larger (in RGB format) than your RAM...
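A sketch of that command-line route from PHP (the file names are taken from the error message above; -thumbnail and -limit are standard ImageMagick switches, and the 200x200 and 32MB figures are only illustrative):

<?php
// Let ImageMagick decode the 10-megapixel JPEG outside PHP's memory_limit,
// while capping ImageMagick's own memory use.
$src = escapeshellarg('images/Surabhi_Cow.jpg');
$dst = escapeshellarg('images/tn/Surabhi_Cow.jpg');
exec("convert -limit memory 32MB -limit map 64MB $src -thumbnail 200x200 $dst");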
I was doing some research on the topic and found Imagick much more efficient for manipulating bigger images. With GD you'll either exceed the allowed memory or the maximum execution time. A better approach is to use the Imagick library; check the info on how to generate thumbnails with PHP on the fly using Imagick.
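If the Imagick extension is available on the host, a minimal sketch looks like this (file names are from the question above; passing 0 as the height to thumbnailImage() keeps the aspect ratio, and the method also strips profiles to keep the output small):

<?php
// Generate the thumbnail with the Imagick extension instead of GD.
$img = new Imagick('images/Surabhi_Cow.jpg');
$img->thumbnailImage(200, 0);    // 200px wide, height 0 = preserve aspect ratio
$img->writeImage('images/tn/Surabhi_Cow.jpg');
$img->clear();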
