How to create videos from images with PHP?

Let's say I have 10 images and I want to combine them into a slideshow-style video.
For example, I want to show each image for 5 seconds and then continue with the next image for another 5 seconds.
If possible, it would be perfect to include music and some descriptive text too.
Is there sample code for this, maybe using the ffmpeg library?

My first thought was to shell out to the ffmpeg command with something like this.
Creating a Video from Images
ffmpeg can be used to stitch several images together into a video.
There are many options, but the following example should be enough to
get started. It takes all images that have filenames of
XXXXX.morph.jpg, where X is numerical, and creates a video called
"output.mp4". The qscale option specifies the picture quality (1 is
the highest, and 32 is the lowest), and the "-r" option is used to
specify the number of frames per second.
ffmpeg -r 25 -qscale 2 -i %05d.morph.jpg output.mp4
(The website that this blurb was taken from is gone. Link
has been removed.)
Here 25 means 25 images per second. You could set this to 1 to show each image for one second, or use decimals, e.g. 0.5 to show each image for two seconds (0.2 would give the 5 seconds per image asked about above).
You can then combine a video and audio stream with something like this.
ffmpeg -i video.mp4 -i audio.mp3 -c:v copy -c:a aac -b:a 128k final.mp4
Of course choose your appropriate codecs. If you want an mp4 use libx264 for video and aac (built into ffmpeg and no longer "experimental") for audio.
Just remember that if you choose a method like this, ffmpeg sends its output to stderr by default, which matters when you try to read it from PHP. It can be redirected to stdout if you prefer.
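As a rough, untested sketch of doing that from PHP (the file names, the 5-second duration, and the audio.mp3 track are placeholders based on the question above):
<?php
// Step 1: numbered images -> silent slideshow video (2>&1 so PHP can read ffmpeg's stderr output).
$framerate = '1/5';            // one image every 5 seconds
$images    = '%05d.morph.jpg'; // 00001.morph.jpg, 00002.morph.jpg, ...
$cmd = sprintf(
    'ffmpeg -y -framerate %s -i %s -c:v libx264 -pix_fmt yuv420p video.mp4 2>&1',
    escapeshellarg($framerate),
    escapeshellarg($images)
);
exec($cmd, $log, $status);
// Step 2: mux in an audio track, stopping at the shorter stream.
if ($status === 0) {
    $cmd = sprintf(
        'ffmpeg -y -i video.mp4 -i %s -c:v copy -c:a aac -b:a 128k -shortest final.mp4 2>&1',
        escapeshellarg('audio.mp3')
    );
    exec($cmd, $log, $status);
}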

The first thing that came to mind for me was ImageMagick. I've used it with PHP for a lot of image manipulation, and I know it supports reading a decent number of video formats; according to its documentation it supports writing to some too.

Yes, ffmpeg is the right solution for you. I recently made something similar: a video site with animated thumbnails. I used ffmpeg to put the images together into an animated GIF, but the output can be whatever you need. Unfortunately, in my searches on this topic I have not found any sample code that combines all the points you are after, so I suppose you will have to experiment with ffmpeg manually. In my project I used PHP Video Toolkit (http://sourceforge.net/projects/phpvideotoolkit/) in some parts to make things a bit easier.

You can use blend effect with ffmpeg:
ffmpeg -framerate 20 \
-loop 1 -t 0.5 -i 1.jpg \
-loop 1 -t 0.5 -i 2.jpg \
-loop 1 -t 0.5 -i 3.jpg \
-loop 1 -t 0.5 -i 4.jpg \
-c:v libx264 \
-filter_complex " \
[1:v][0:v]blend=all_expr='A*(if(gte(T,0.5),1,T/0.5))+B*(1-(if(gte(T,0.5),1,T/0.5)))'[b1v]; \
[2:v][1:v]blend=all_expr='A*(if(gte(T,0.5),1,T/0.5))+B*(1-(if(gte(T,0.5),1,T/0.5)))'[b2v]; \
[3:v][2:v]blend=all_expr='A*(if(gte(T,0.5),1,T/0.5))+B*(1-(if(gte(T,0.5),1,T/0.5)))'[b3v]; \
[0:v][b1v][1:v][b2v][2:v][b3v][3:v]concat=n=7:v=1:a=0,format=yuv420p[v]" -map "[v]" out.mp4
You can check the link below for more ffmpeg effects:
https://github.com/letungit90/ffmpeg_memo
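To build the same filter graph from PHP for an arbitrary list of images, here is a hedged, untested sketch (the image names, the 0.5-second duration, and out.mp4 are placeholders matching the command above):
<?php
$images = ['1.jpg', '2.jpg', '3.jpg', '4.jpg']; // placeholder list
$dur    = '0.5'; // seconds per image / blend, as in the command above
$blend  = "A*(if(gte(T,$dur),1,T/$dur))+B*(1-(if(gte(T,$dur),1,T/$dur)))";
$inputs = '';
$chains = [];
$concat = '';
foreach ($images as $i => $img) {
    $inputs .= sprintf(' -loop 1 -t %s -i %s', $dur, escapeshellarg($img));
    if ($i > 0) {
        // Blend image i with image i-1 into an intermediate stream [b{i}v].
        $chains[] = sprintf("[%d:v][%d:v]blend=all_expr='%s'[b%dv]", $i, $i - 1, $blend, $i);
        $concat  .= sprintf('[b%dv]', $i);
    }
    $concat .= sprintf('[%d:v]', $i);
}
$n = 2 * count($images) - 1; // original segments plus blend segments
$filter = implode('; ', $chains) . sprintf('; %sconcat=n=%d:v=1:a=0,format=yuv420p[v]', $concat, $n);
$cmd = sprintf(
    'ffmpeg -y -framerate 20%s -c:v libx264 -filter_complex %s -map "[v]" out.mp4 2>&1',
    $inputs,
    escapeshellarg($filter)
);
exec($cmd, $log, $status);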

Related

FFMpeg joining audio files together, and the last file as the background [duplicate]

I need to concatenate 2 small audio files in a loop and add background music, all in a single command.
I am able to concatenate the two audio files with background music; the command below works.
ffmpeg -i 1.mp3 -i 2.mp3 -i background.mp3
-filter_complex "[0:0][1:0]concat=n=2:v=0:a=1,volume=1dB,aformat=fltp, pan=stereo|c0=c0|c1=c0[a0];
[2]volume=0.5dB,aformat=fltp,pan=stereo|c0=c0|c1=c1[a1];[a0][a1]amix=inputs=2:duration=longest,aformat=fltp[a]"
-map "[a]" -strict -2 -y output.mp3
but I want to loop the concatenated files until the end of the background music; the background music is roughly 5 times longer than the concatenated files.
It would be great if someone could suggest a single-command solution.
I know about the amovie filter, but unfortunately it is not possible to use it here, because amovie requires a file name, which as far as I know is not possible with concatenated files.
Can anyone help me achieve this?
Thanks
Use the concat demuxer.
Create a text file (list.txt) containing:
file 1.mp3
file 2.mp3
and then,
ffmpeg -f concat -stream_loop -1 -i list.txt -i background.mp3
-filter_complex "[0]volume=1dB,aformat=fltp,pan=stereo|c0=c0|c1=c0[a0];
[1]volume=0.5dB,aformat=fltp,pan=stereo|c0=c0|c1=c1[a1];
[a0][a1]amix=inputs=2:duration=shortest,aformat=fltp[a]"
-map "[a]" -strict -2 -y output.mp3
Both 1.mp3 and 2.mp3 should have the same properties: sampling rate, channel layout, etc.
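To run this from PHP, a hedged, untested sketch (file names match the example above and are placeholders):
<?php
// Write the concat list, then loop it against the background track.
file_put_contents('list.txt', "file 1.mp3\nfile 2.mp3\n");
$cmd = 'ffmpeg -f concat -stream_loop -1 -i list.txt -i background.mp3 '
     . '-filter_complex "[0]volume=1dB,aformat=fltp,pan=stereo|c0=c0|c1=c0[a0];'
     . '[1]volume=0.5dB,aformat=fltp,pan=stereo|c0=c0|c1=c1[a1];'
     . '[a0][a1]amix=inputs=2:duration=shortest,aformat=fltp[a]" '
     . '-map "[a]" -strict -2 -y output.mp3 2>&1';
exec($cmd, $output, $status);
if ($status !== 0) {
    echo implode("\n", $output); // ffmpeg's error log, captured via 2>&1
}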

Generating a waveform using ffmpeg

I am trying to generate a waveform image using ffmpeg.
I have successfully made a waveform image; however, it doesn't look very nice.
I have been looking around for ways to style the image to make it look nicer, but I have been unable to find any information or tutorials on this.
I am using PHP and shell_exec to create the waveform.
I am aware that there are PHP libraries that can do this, but due to the file format that would be a lengthy process.
The code I am using is as follows:
$command = 'convertvid\bin\ffmpeg -i Temp\\'.$file.' -y -lavfi showwavespic=split_channels=0:s='.$width.'x50 Temp\\'.$PNGFileName;
shell_exec($command);
Basically, I would like to add a line through the middle, since there are blank spots at the moment, and I would like to be able to set the background and wave colours.
Default waveform
ffmpeg -i input.wav -filter_complex showwavespic -frames:v 1 output.png
Notes
Notice the segment of silent audio in the middle (see "Fancy waveform" below if you want to see how to add a line).
The background is transparent.
Default colors are red (left channel) and green (right channel) for a stereo input. The color is mixed where the channels overlap.
You can change the channel colors with the colors option, such as "showwavespic=colors=blue|yellow". See the ffmpeg documentation for a list of valid color names, or use hexadecimal notation, such as #ffcc99.
See the showwavespic filter documentation for additional options.
If you want a video instead of an image use the showwaves filter.
Fancy waveform
ffmpeg -i input.mp4 -filter_complex \
"[0:a]aformat=channel_layouts=mono, \
compand=gain=-6, \
showwavespic=s=600x120:colors=#9cf42f[fg]; \
color=s=600x120:color=#44582c, \
drawgrid=width=iw/10:height=ih/5:color=#9cf42f@0.1[bg]; \
[bg][fg]overlay=format=auto,drawbox=x=(iw-w)/2:y=(ih-h)/2:w=iw:h=1:color=#9cf42f" \
-frames:v 1 output.png
Explanation of options
aformat downsamples the audio to mono. Otherwise, by default, a stereo input would result in a waveform with a different color for each channel (see Default waveform example above).
compand modifies the dynamic range of the audio to make the waveform look less flat. It makes a less accurate representation of the actual audio, but can be more visually appealing for some inputs.
showwavespic makes the actual waveform.
color source filter is used to make a colored background that is the same size as the waveform.
drawgrid adds a grid over the background. The grid does not represent anything; it is just for looks. The grid color is the same as the waveform color (#9cf42f), but its opacity is set to 10% (@0.1).
overlay will place [bg] (what I named the filtergraph for the background) behind [fg] (the waveform).
Finally, drawbox will make the horizontal line so any silent areas are not blank.
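Since the question runs ffmpeg from PHP via shell_exec, here is a hedged, untested sketch wrapping the fancy-waveform command above (the input and output names are placeholders):
<?php
$input  = 'input.mp4';    // placeholder
$output = 'waveform.png'; // placeholder
$filter = "[0:a]aformat=channel_layouts=mono,compand=gain=-6,"
        . "showwavespic=s=600x120:colors=#9cf42f[fg];"
        . "color=s=600x120:color=#44582c,"
        . "drawgrid=width=iw/10:height=ih/5:color=#9cf42f@0.1[bg];"
        . "[bg][fg]overlay=format=auto,"
        . "drawbox=x=(iw-w)/2:y=(ih-h)/2:w=iw:h=1:color=#9cf42f";
$cmd = sprintf(
    'ffmpeg -y -i %s -filter_complex %s -frames:v 1 %s 2>&1',
    escapeshellarg($input),
    escapeshellarg($filter),
    escapeshellarg($output)
);
$log = shell_exec($cmd); // ffmpeg writes its log to stderr, captured via 2>&1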
Gradient example
Using the gradients filter:
ffmpeg -i input.mp3 -filter_complex "gradients=s=1920x1080:c0=000000:c1=434343:x0=0:x1=0:y0=0:y1=1080,drawbox=x=(iw-w)/2:y=(ih-h)/2:w=iw:h=1:color=#0000ff[bg];[0:a]aformat=channel_layouts=mono,showwavespic=s=1920x1080:colors=#0068ff[fg];[bg][fg]overlay=format=auto" -frames:v 1 output.png
Color background
ffmpeg -i input.opus -filter_complex "color=c=blue[color];aformat=channel_layouts=mono,showwavespic=s=1280x720:colors=white[wave];[color][wave]scale2ref[bg][fg];[bg][fg]overlay=format=auto" -frames:v 1 output.png
The scale2ref filter automatically makes the background the same size as the waveform.
Image background
Of course you can use an image or video instead for the background:
ffmpeg -i audio.flac -i background.jpg -filter_complex \
"[1:v]scale=600:-1,crop=iw:120[bg]; \
[0:a]showwavespic=s=600x120:colors=cyan|aqua[fg]; \
[bg][fg]overlay=format=auto" \
-q:v 3 showwavespic_bg.jpg
Getting waveform stats and data
Use the astats filter. Many stats are available: RMS, peak, min, max, difference, etc.
RMS level per audio frame
Example to get standard RMS level measured in dBFS per audio frame:
ffprobe -v error -f lavfi -i "amovie=input.wav,astats=metadata=1:reset=1" -show_entries frame_tags=lavfi.astats.Overall.RMS_level -of csv=p=0 > rms.log
Peak level per second
Add the asetnsamples filter.
ffprobe -v error -f lavfi -i "amovie=input.wav,asetnsamples=44100,astats=metadata=1:reset=1" -show_entries frame_tags=lavfi.astats.Overall.Peak_level -of csv=p=0
Same as above but with timestamps
ffprobe -v error -f lavfi -i "amovie=input.wav,asetnsamples=44100,astats=metadata=1:reset=1" -show_entries frame=pkt_pts_time:frame_tags=lavfi.astats.Overall.Peak_level -of csv=p=0
Output to file
Just append > output.log to the end of your command:
ffprobe -v error -f lavfi -i "amovie=input.wav,asetnsamples=44100,astats=metadata=1:reset=1" -show_entries frame_tags=lavfi.astats.Overall.RMS_level -of csv=p=0 > output.log
JSON
ffprobe -v error -f lavfi -i "amovie=input.wav,asetnsamples=44100,astats=metadata=1:reset=1" -show_entries frame_tags=lavfi.astats.Overall.RMS_level -of json > output.json
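If you need those values inside PHP, a hedged, untested sketch of reading the JSON variant (input.wav is a placeholder):
<?php
$cmd = 'ffprobe -v error -f lavfi -i "amovie=input.wav,asetnsamples=44100,'
     . 'astats=metadata=1:reset=1" '
     . '-show_entries frame_tags=lavfi.astats.Overall.RMS_level -of json';
$data = json_decode(shell_exec($cmd), true);
$levels = [];
foreach ($data['frames'] ?? [] as $frame) {
    // One RMS value (in dBFS) per analysed chunk; silent chunks may report "-inf".
    $levels[] = $frame['tags']['lavfi.astats.Overall.RMS_level'];
}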

Convert 720p Mp4 to 480p with php ffmpeg

I need to convert a 720p or 1080p video to a 480p MP4. I found the code below:
ffmpeg -i input -vf scale=-1:480 -vcodec mpeg4 -qscale 3 output.mp4
Please help me; I am unable to give it the input video, e.g.:
$video='/path/to/mp4/video';
exec('ffmpeg -i $video -vf scale=-1:480 -vcodec mpeg4 -qscale 3 output.mp4');
Why is the above code not working?
I used that code and it worked flawlessly. Maybe you weren't in the directory with the video files? You should replace YOURVID.mkv with your file (or /path/to/your/file.mkv), and replace /path/to/output.mkv with the file name or path for the converted 480p output. The easiest thing to do is to be in the directory with the original 720p/1080p video, use just the filename, and output to that same directory with a plain output file name rather than a path; then all the files end up in one directory. I may have over-explained it, sorry. And thank you for posting this; it did indeed work great for me on my FreeBSD machine with ffmpeg installed.
ffmpeg -i YOURVID.mkv -vf scale=-1:480 -vcodec mpeg4 -qscale 3 /path/to/new/output.mp4
I have an old 32-bit Intel Atom Acer netbook; it's not much good for anything. I have Batocera installed on it and I need 480p video to play on it.
P.S. With a 320GB HDD, the previously mentioned netbook, some retro game ROMs and some 480p video files, it makes an excellent mobile entertainment center.
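Back to the PHP part of the question: in exec('ffmpeg -i $video ...') the single quotes stop PHP from interpolating $video, so ffmpeg literally receives the text $video as its input. A hedged, untested sketch that interpolates and escapes the path (paths are placeholders):
<?php
$video  = '/path/to/mp4/video';       // placeholder, as in the question
$output = '/path/to/output_480p.mp4'; // placeholder
$cmd = sprintf(
    'ffmpeg -y -i %s -vf scale=-1:480 -vcodec mpeg4 -qscale 3 %s 2>&1',
    escapeshellarg($video),
    escapeshellarg($output)
);
exec($cmd, $log, $status);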

how to convert series of jpegs to flv using imagemagik and php?

I have a series of JPEGs, and I want to make an FLV or MPG from all of the images. How can I do it using ImageMagick and PHP?
exec('convert image1.jpg image2.jpg one.flv') makes a blank FLV.
Well, I would jump straight into using ffmpeg. You can also do it using ImageMagick; however, the docs state you need ffmpeg installed, so why have the middleman?
I haven't tested this, fair warning.
/* cmd, image series, codec, bitrate, framerate, optional -s WidthxHeight, and output filename */
/* For MPEG-4 */
exec('ffmpeg -f image2 -i image%d.jpg -vcodec mpeg4 -b 800k -r 12 video.avi');
/* For FLV */
exec('ffmpeg -f image2 -i image%d.jpg -vcodec flv -b 800k -r 12 video.flv');
If you want to use the outdated mpeg2 or mpeg1 formats you can do that as well.
I would suggest connecting via SSH and testing these commands; hopefully you have ffmpeg installed.
Running ffmpeg -formats will show you which formats are supported.
See the docs:
http://ffmpeg.org/ffmpeg.html#Video-and-Audio-file-format-conversion
and this great answer which I stole various things from:
Image sequence to video quality

how to convert video from one format to another using php

Hi, I want to include a video download option on my webpage. I am using ffmpeg, but it seems to work very slowly. Is there any other way to do this, or how can I speed up ffmpeg?
I am using this code to get the frames from the video.
To convert the video:
$call="ffmpeg -i ".$_SESSION['video_to_convert']." -vcodec libvpx -r 30 -b ".$quality." -acodec libvorbis -ab 128000 -ar ".$audio." -ac 2 -s ".$size." ".$converted_vids.$name.".".$type." -y 2> log/".$name.".txt";
$convert = (popen("start /b ".$call, "r"));
pclose($convert);
To get a frame from the video:
exec("ffmpeg -vframes 1 -ss ".$time_in_seconds." -i $converted_vids video_images.jpg -y 2>);
But this code does not generate any error; it just keeps loading continuously.
Cache or pre-generate the output format.
Use the ffmpeg-php library. It should speed up some processes compared with manually calling the ffmpeg command-line tool using exec.
I'd first of all take PHP out of the equation and time how long it takes to do what you're after via the command line.
Once you're happy that it works the way you'd like, make sure you've tweaked your script's execution time (see http://php.net/manual/en/function.set-time-limit.php) to accommodate what's likely to take a while.
Consider an async approach if it's getting in the way of UX.
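A hedged, untested sketch of one async approach on a Unix-like host (the output path and log file are placeholders): start ffmpeg in the background so the request returns immediately, then check the output file later.
<?php
$input  = escapeshellarg($_SESSION['video_to_convert']); // as in the question's code
$output = escapeshellarg('converted/output.webm');       // placeholder
$log    = escapeshellarg('log/convert.log');             // placeholder
// Redirecting output and appending & lets exec() return without waiting for ffmpeg.
exec("nohup ffmpeg -y -i $input -c:v libvpx -c:a libvorbis $output > $log 2>&1 &");
// Later (e.g. from an AJAX poll), check whether the output file exists and has
// stopped growing to decide when the conversion has finished.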
Ta
