Getting progress for multiple exec() processes in real time - PHP

I have a PHP script that executes ffmpeg, and then ffmpeg starts working. I can execute this script 4 times and spawn 4 processes. I would like to get progress in real time. How do I get the output of each process and identify which process it belongs to?
$h = fopen('php://stdin', 'r'); $str = fgets($h);
I know that line will read whatever output is on stdin, but how do you separate it? How do I continuously poll for output? Once a process is started, how do you access its output on a continuous basis? I know you need Ajax, but which PHP method will allow you to access that info and separate it from other processes?
Some have suggested redirecting output to a text file and reading its contents, but is that the only way? Why keep creating more text files for each process? Another suggestion was to pipe output with 2>&1 | php script.php after the command, but that only captures output at the start of the process, since the script executes once.
I have been struggling with this for a while and am getting closer, so any expert help would be so appreciated.
Edit:
$process = exec("/usr/local/bin/ffmpeg -i /home/g/Desktop/cave.wmv -deinterlace -acodec libfaac -ab 96k -ar 44100 -vcodec libx264 -s 480x320 -f flv /home/g/Desktop/file.flv");

This is a bit complex. As far as I know, exec() blocks the PHP thread until it completes. I think ffmpeg can write its progress to a file. You'd need to parse that file to get the progress, and you'll want to call that with Ajax.
You need to figure out how you can get ffmpeg to report its progress; the rest is easy :P

If only 4 processes should be running at a time, I really see no reason not to create 4 text files and direct the output of ffmpeg into those. After that it's just a question of reading those text files with PHP.
If you intend to make this operable over HTTP, you could create two scripts: start_process.php and process_status.php. The first would take an ID as input, start an ffmpeg process, continuously read its progress, and report it to a MySQL database. process_status.php would take the same ID as input and report the status from MySQL to the user. You could hook that file up to Ajax and get live updates.
You would have to polish the solution by checking for double entry of the same ID, perhaps automatically creating an ID and returning it from start_process.php, etc.
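A rough sketch of what start_process.php could look like, assuming a reasonably recent ffmpeg that supports -progress and a hypothetical jobs table (id, seconds_done, finished); none of these names come from the answer itself:

<?php
// start_process.php -- sketch only; table, columns and paths are assumptions.
// Launches ffmpeg for job $id and writes its progress to MySQL while it runs.
$id  = (int) ($_GET['id'] ?? 0);
$pdo = new PDO('mysql:host=localhost;dbname=converter', 'user', 'pass');
$pdo->prepare('INSERT INTO jobs (id, seconds_done, finished) VALUES (?, 0, 0)')->execute([$id]);

$src = escapeshellarg("/videos/{$id}.wmv");
$dst = escapeshellarg("/videos/{$id}.flv");
// -progress pipe:1 makes ffmpeg emit newline-terminated key=value progress lines on stdout.
$proc = popen("ffmpeg -nostats -progress pipe:1 -i $src -vcodec libx264 -f flv $dst 2>/dev/null", 'r');

$update = $pdo->prepare('UPDATE jobs SET seconds_done = ? WHERE id = ?');
while (($line = fgets($proc)) !== false) {
    // Progress lines look like "out_time=00:01:23.456000".
    if (preg_match('/^out_time=(\d+):(\d+):(\d+(?:\.\d+)?)/', trim($line), $m)) {
        $update->execute([$m[1] * 3600 + $m[2] * 60 + (float) $m[3], $id]);
    }
}
pclose($proc);
$pdo->prepare('UPDATE jobs SET finished = 1 WHERE id = ?')->execute([$id]);

process_status.php would then just SELECT seconds_done and finished for the given ID and echo them as JSON for the Ajax poller.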

You should probably create an array for the four processes. Start your processes with proc_open() and put each returned resource in your array. You could then poll the array, looking at the output for each process resource.
http://us2.php.net/manual/en/function.proc-open.php
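A rough sketch of that idea: keep the proc_open() resources in an array keyed by an index and poll each process's stderr pipe (ffmpeg writes its progress to stderr). The commands are placeholders.

<?php
// Sketch: poll four ffmpeg processes started with proc_open(); commands are placeholders.
$commands = [
    'ffmpeg -i in1.wmv -f flv out1.flv',
    'ffmpeg -i in2.wmv -f flv out2.flv',
    'ffmpeg -i in3.wmv -f flv out3.flv',
    'ffmpeg -i in4.wmv -f flv out4.flv',
];

$procs = [];
foreach ($commands as $i => $cmd) {
    $pipes = [];
    // Only capture stderr (descriptor 2); that is where ffmpeg reports progress.
    $procs[$i] = ['proc' => proc_open($cmd, [2 => ['pipe', 'w']], $pipes), 'pipes' => $pipes];
    stream_set_blocking($pipes[2], false);            // do not block while polling
}

while ($procs) {
    foreach ($procs as $i => $p) {
        $chunk = stream_get_contents($p['pipes'][2]); // whatever is available right now
        if ($chunk !== false && $chunk !== '') {
            echo "process $i: " . trim($chunk) . PHP_EOL;  // the array key identifies the process
        }
        if (!proc_get_status($p['proc'])['running']) {
            fclose($p['pipes'][2]);
            proc_close($p['proc']);
            unset($procs[$i]);
        }
    }
    usleep(250000); // poll four times per second
}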

Related

How to speedup screenshot capture of an rtsp stream via ffmpeg?

I want to capture a fresh screenshot from an rtsp stream and pass the data on via php whenever an ajax endpoint is called.
Currently I am simply running
$resp = exec( 'ffmpeg -rtsp_transport tcp -y -i '.$post['link'].' -vframes 1 '.$path.$imagename);
The problem is this takes a lot of time to generate a screenshot, about 4 seconds per capture if the stream is good. If the stream drops frames or throws an error, I won't get a response and the script will time out.
Is there a way to optimize this or allow for fault tolerance, and overall speed things up so I can get a screenshot every second? If there's a better solution than ffmpeg, I am willing to try it as well!
try:
ffmpeg -i "rtsp://192.168.1.73/live/ch00_1" -s 320x240 -f image2 -vf fps=fps=3 /tmp/ipcam_73_%04d.jpg
(example generates 3 images per second)
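If you keep an ffmpeg process like that running in the background, the Ajax endpoint no longer has to start ffmpeg at all; it just serves the newest frame. A sketch, assuming the /tmp/ipcam_73_*.jpg naming from the example above:

<?php
// screenshot.php -- sketch: assumes the continuous ffmpeg capture above is already
// running and writing /tmp/ipcam_73_0001.jpg, /tmp/ipcam_73_0002.jpg, ...
$frames = glob('/tmp/ipcam_73_*.jpg');
if (!$frames) {
    http_response_code(503);          // no frame yet; the stream may be down
    exit('no frame available');
}

// Serve the most recently written frame instead of re-running ffmpeg per request.
usort($frames, function ($a, $b) { return filemtime($b) - filemtime($a); });
header('Content-Type: image/jpeg');
readfile($frames[0]);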

How to mix/merge 2 mp3 files in PHP?

I need a tool to join 2 mp3 files into 1.
The first file ( background ) will be some sound ( music, submitted by user ),
second file will be a machine (google) speaking text, which was submitted ( by user ).
In output I need a single mp3 file, with background music and speaking robot text playing together.
I can't really find any standalone PHP solution, like a library or something similar; I only find shell commands, etc.
So are there some libraries?
Or is there some shell command that works on all OSes to combine the files?
Or how do I complete the task?
Based on this question, you should be able to install FFMPEG onto your server (hopefully not a shared host?) and use:
//Reduce background volume, assuming it's input2.mp3
shell_exec('ffmpeg -i input2.mp3 -af "volume=0.3" background.wav');
//Merge the two
shell_exec('ffmpeg -y -i input1.mp3 -i background.wav -filter_complex amerge -c:a libmp3lame -q:a 4 output.mp3');
Then you can simply serve up the output.mp3 file. If this is being performed by PHP running under apache (or other web host, instead of CLI) you'll need to make sure your www-data user has read access to the input files, and write access to the output directory.
All your temporary output, like the background, should be saved as .wav, and only converted back to .mp3 for the final output. Recompressing the audio at each step may result in a very poor final output.
I am making assumptions about how FFMPEG works here, so you may need to consult the documentation for it in order to build a functioning or more efficient set of commands.
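Wrapped in PHP with escaped paths and a check on each exit code, that might look like this sketch (same file names as above; adjust the filters to taste):

<?php
// Sketch: the same two ffmpeg steps as above, with escaped paths and exit-code checks.
$speech     = escapeshellarg('input1.mp3');      // the Google TTS file
$music      = escapeshellarg('input2.mp3');      // the user-submitted background track
$background = escapeshellarg('background.wav');
$output     = escapeshellarg('output.mp3');

exec("ffmpeg -y -i $music -af \"volume=0.3\" $background 2>&1", $out1, $code1);
if ($code1 !== 0) {
    die('volume step failed: ' . implode("\n", $out1));
}

exec("ffmpeg -y -i $speech -i $background -filter_complex amerge -c:a libmp3lame -q:a 4 $output 2>&1", $out2, $code2);
if ($code2 !== 0) {
    die('merge step failed: ' . implode("\n", $out2));
}
// output.mp3 is now ready to be served.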
You can simply do this:
file_put_contents('combined.mp3',
    file_get_contents('file1.mp3') .
    file_get_contents('file2.mp3'));

test if file opened from php is still open

I have a PHP script running on Debian that calls the ping command and redirects the output to a file using exec():
exec('ping -w 5 -c 5 xxx.xxx.xxx.xxx > /var/f/ping/xxx.xxx.xxx.xxx_1436538580.txt &');
The PHP script then has a while loop that scans the /var/f/ping/ folder and checks to see if the ping has finished writing to it. I tried checking the output using:
exec('lsof | grep /var/f/ping/xxx.xxx.xxx.xxx_1436538580.txt');
to see if the file was still open, but it takes lsof about 10-15 seconds to return its results, which is too slow for what we need. Ideally it should be able to check this within 2 or 3 seconds.
Is there a faster/better way to test if the ping has completed?
Using grep with lsof is probably the slowest way, as lsof will scan everything. You can narrow the scope that lsof uses to one directory by doing:
lsof +D /var/f/ping
or similar.
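In PHP that check could look like this (a sketch, using the file name from the question):

<?php
// Sketch: is the ping's output file still open? Scan only /var/f/ping with lsof +D.
$target = '/var/f/ping/xxx.xxx.xxx.xxx_1436538580.txt';

exec('lsof +D ' . escapeshellarg('/var/f/ping') . ' 2>/dev/null', $lines);
$stillOpen = false;
foreach ($lines as $line) {
    if (strpos($line, basename($target)) !== false) {
        $stillOpen = true;
        break;
    }
}
echo $stillOpen ? "ping still running\n" : "ping finished\n";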
There's a good and easy-to-read overview of lsof usage here:
http://www.thegeekstuff.com/2012/08/lsof-command-examples/
Alternatively, you could experiment with:
http://php.net/manual/en/function.fam-monitor-file.php
and see if that meets your requirements better.
You need a deferred-queue pattern for this kind of task: run the pings in the background from cron and keep a table or file with the job statuses.
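A minimal way to keep such a per-job status in a file: have the backgrounded command itself drop a marker when it finishes, so the check becomes a plain file_exists(). A sketch (the paths come from the question; the .done suffix is made up):

<?php
// Sketch: the subshell writes a ".done" marker only after ping has exited,
// so completion can be detected with a cheap file_exists() check.
$out  = '/var/f/ping/xxx.xxx.xxx.xxx_1436538580.txt';
$done = $out . '.done';

exec('(ping -w 5 -c 5 xxx.xxx.xxx.xxx > ' . escapeshellarg($out)
   . '; touch ' . escapeshellarg($done) . ') > /dev/null 2>&1 &');

// Later, e.g. from the polling loop or a cron-driven status checker:
if (file_exists($done)) {
    echo "ping finished, results are in $out\n";
} else {
    echo "still running\n";
}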

Call ffmpeg via PHP but dont wait on it

I have an upload script that handles the file upload and then calls exec('ffmpeg -i ...').
I call ffmpeg 3 times: once to convert to mp4, once to convert to webm, and once to get a thumbnail.
The problem is that if an investigator uploads ALL his footage, he doesn't want to have to sit there and wait for it to convert before he can do anything else. So how can I send the 3 commands and then exit the script?
I also want to run exec('rm -r /path/to/original') so the original video is deleted, but obviously I have to wait for the conversions to complete before that can run.
You can use the ampersand character after the command to run it in the background and get your prompt (control) back right away.
I suggest creating a bash script like:
#!/bin/sh
ffmpeg -i $1 -ab 64k -vb 420k $1.mp4
ffmpeg -i $1 -ab 64k -vb 420k $1.webm
Then run it from php like:
system ('myscript.sh tempfile &');
The following example uses an if structure to decide whether to include sound. The first parameter is the filename; if the second one equals 0 there is no sound on the mp4, and if the third one is 0 there is no sound on the webm.
#!/bin/sh
if [ $2 -eq 0 ]
then
ffmpeg -i $1 -an -vb 420k $1.mp4
else
ffmpeg -i $1 -ab 64k -vb 420k $1.mp4
fi
if [ $3 -eq 0 ]
then
ffmpeg -i $1 -an -vb 420k $1.webm
else
ffmpeg -i $1 -ab 64k -vb 420k $1.webm
fi
You can use it from PHP like:
system ('myscript.sh ' . $tempfile . ' ' . (0 + $mp4sound) . ' ' . (0 + $webmsound) . ' &');
... but remember you can even start a PHP file instead of a shell script, like:
system("php converter.php \"$tempfile\" \"$mp4sound\" \"$webmsound\" &");
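converter.php isn't shown in the answer; it would just read its three arguments from $argv, roughly like this sketch:

<?php
// converter.php -- sketch of the PHP worker mentioned above; it receives
// <tempfile> <mp4sound> <webmsound> on the command line.
if ($argc < 4) {
    fwrite(STDERR, "usage: php converter.php <tempfile> <mp4sound> <webmsound>\n");
    exit(1);
}
list(, $tempfile, $mp4sound, $webmsound) = $argv;

$in   = escapeshellarg($tempfile);
$mp4  = escapeshellarg($tempfile . '.mp4');
$webm = escapeshellarg($tempfile . '.webm');
$mp4Audio  = ((int) $mp4sound  === 0) ? '-an' : '-ab 64k';   // 0 means "no sound"
$webmAudio = ((int) $webmsound === 0) ? '-an' : '-ab 64k';

exec("ffmpeg -i $in $mp4Audio -vb 420k $mp4");
exec("ffmpeg -i $in $webmAudio -vb 420k $webm");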
Handle the file upload and conversion separately.
After the file has been uploaded:
Put it into a job queue.
Run a cron job that checks the queue every minute and whether the currently processing job has finished; if so, do the conversion.
After conversion, call unlink('/path/to/original') to delete the original video.
Update the job status.
Not sure if this makes sense; it's just something I'm thinking of that is better than doing everything at once.
Adding a little more work behind the scenes eases the pain up front.
Try handling all ffmpeg conversions separately in one page, e.g. convert.php.
Once the upload is done, send a hidden request to convert.php?id=someid.
At the top of convert.php use ignore_user_abort(TRUE);
The process will keep running even if the user closes the browser.
If the user has not closed the page, you can stay on it and update the status via an Ajax response.
I hope this helps to give you an idea.
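A sketch of what that convert.php could look like; the status files and path scheme are made up:

<?php
// convert.php?id=someid -- sketch: keep converting even if the browser is closed.
ignore_user_abort(true);   // keep running after the user navigates away
set_time_limit(0);         // conversions may take longer than max_execution_time

$id  = (int) ($_GET['id'] ?? 0);
$src = "/uploads/{$id}.orig";                  // hypothetical path scheme

file_put_contents("/tmp/job_{$id}.status", 'converting mp4');
exec('ffmpeg -i ' . escapeshellarg($src) . ' ' . escapeshellarg("/uploads/{$id}.mp4"));

file_put_contents("/tmp/job_{$id}.status", 'converting webm');
exec('ffmpeg -i ' . escapeshellarg($src) . ' ' . escapeshellarg("/uploads/{$id}.webm"));

file_put_contents("/tmp/job_{$id}.status", 'done');
unlink($src);   // only remove the original after both conversions have finished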
Perhaps you can move the code that does the conversion into a script that is being called by a cron job.
For example, the user can upload a video to a directory and perhaps insert a record into a database table that contains all unconverted videos.
The cron job then runs every x minutes, checks the database for any unconverted videos, and then converts them. After that is done, the script can then delete the original videos and delete the row from the database.
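A sketch of such a cron script, assuming a hypothetical unconverted_videos table with id and path columns:

<?php
// cron_convert.php -- sketch of the cron-driven converter; run it every x minutes.
$pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$rows = $pdo->query('SELECT id, path FROM unconverted_videos')->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    $src = escapeshellarg($row['path']);
    $dst = escapeshellarg($row['path'] . '.mp4');
    exec("ffmpeg -i $src $dst", $out, $code);
    if ($code === 0) {
        unlink($row['path']);                                        // delete the original
        $pdo->prepare('DELETE FROM unconverted_videos WHERE id = ?') // and drop its row
            ->execute([$row['id']]);
    }
}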
If you are running on Linux, a trailing ampersand to your command in the exec()-call will start the process in the background.
You probably also want to redirect the output from ffmpeg to /dev/null
ffmpeg -i (...) > /dev/null 2>&1 &
The same approach could be used to call a shell script if you wanted to clean up after converting.
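In PHP, starting the backgrounded command described above would look roughly like this (paths and ffmpeg options are placeholders):

<?php
// Sketch: start the conversion in the background and return immediately.
// exec() comes back as soon as the shell has detached the command, because
// nothing is left writing to PHP's output stream.
$input  = '/uploads/footage.mov';    // placeholder paths
$output = '/uploads/footage.mp4';

exec('ffmpeg -i ' . escapeshellarg($input) . ' ' . escapeshellarg($output)
   . ' > /dev/null 2>&1 &');
// The upload script can exit here while ffmpeg keeps running.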

Is it possible to send FFMPEG output to a php file?

I want to send ffmpeg output to a php file so I can use a regex and write the result into a database. This will allow me to handle progress for multiple uploads. Does anyone know how to do this? Can it be done? Currently I can execute a php file with parameters after the ffmpeg command, and I can get ffmpeg to write to a txt file, but can I send the output to the php file and execute it?
execute php file with parameters
&& php /opt/lampp/htdocs/xampp/site/update_db.php ".$parameter1." ".$parameter2.";
Write output to txt file
ffmpeg command and filepath to converted 1> /home/g/Desktop/output.txt 2>&1
Can something like this be done?
ffmpeg command and filepath to converted 1> php /opt/lampp/htdocs/xampp/site/update_db.php ".$output." 2>&1
Yes, you can read STDIN.
http://php.net/manual/en/features.commandline.io-streams.php
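A sketch of that STDIN approach: pipe ffmpeg's merged output into the script, e.g. ffmpeg ... 2>&1 | php update_db.php <job_id>, and parse it as it arrives. The jobs table and the regex are assumptions:

<?php
// update_db.php -- sketch: reads ffmpeg's output from STDIN and records progress.
$jobId = (int) ($argv[1] ?? 0);
$pdo   = new PDO('mysql:host=localhost;dbname=converter', 'user', 'pass');
$stmt  = $pdo->prepare('UPDATE jobs SET seconds_done = ? WHERE id = ?');

// ffmpeg separates its progress updates with carriage returns rather than newlines,
// so split on "\r" instead of relying on fgets().
while (($chunk = stream_get_line(STDIN, 4096, "\r")) !== false) {
    if (preg_match('/time=(\d+):(\d+):(\d+(?:\.\d+)?)/', $chunk, $m)) {
        $stmt->execute([$m[1] * 3600 + $m[2] * 60 + (float) $m[3], $jobId]);
    }
}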
If it were me, I'd just execute FFMPEG from within PHP. You have a bit more flexibility that way, but I know that isn't desirable for every application.
You could use exec to call ffmpeg, then use the content of the output parameter to get the returned output.
But doing so only allows you to get the output once the program's execution has terminated:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
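For completeness, the exec() form with the output parameter looks like this sketch; $lines is only populated after ffmpeg has exited, so it is fine for a final log but useless for live progress:

<?php
// Sketch: collect ffmpeg's full output once it has finished.
exec('ffmpeg -i input.wmv output.flv 2>&1', $lines, $exitCode);
if ($exitCode === 0) {
    echo end($lines), PHP_EOL;   // e.g. the last stats line, ready for a regex / DB update
}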
