I have an upload script that handles the file upload and then calls exec(ffmpeg -i......
I call ffmpeg 3 times: once to convert to mp4, once to convert to webm, and once to get a thumbnail.
The problem is that if an investigator uploads ALL his footage, he doesn't want to sit there and wait for it to convert before he can do anything else. So how can I send the 3 commands and then exit the script?
I also want to run exec(rm -r /path/to/original) so the original video is deleted, but obviously I have to wait for the conversions to complete before that can run.
You can use the ampersand character after the command to run it in the background and get your prompt (control) back right away.
I suggest creating a shell script like:
#!/bin/sh
# Convert the uploaded file ($1) to both formats, one after the other.
ffmpeg -i "$1" -ab 64k -vb 420k "$1.mp4"
ffmpeg -i "$1" -ab 64k -vb 420k "$1.webm"
Then run it from PHP like:
system ('myscript.sh tempfile &');
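Note that the ampersand alone may not be enough from PHP: system() will still wait for the command unless its output is redirected. In practice you would do something like this (a sketch; escapeshellarg guards the filename):
system('./myscript.sh ' . escapeshellarg($tempfile) . ' > /dev/null 2>&1 &');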
The following example uses an if structure to decide whether to include sound. The first parameter is the filename; if the second one equals 0 there is no sound on the mp4, and if the third one is 0 there is no sound on the webm.
#!/bin/sh
# $1 = input file, $2 = mp4 sound flag, $3 = webm sound flag
if [ "$2" -eq 0 ]
then
    # -an disables the audio stream entirely
    ffmpeg -i "$1" -an -vb 420k "$1.mp4"
else
    ffmpeg -i "$1" -ab 64k -vb 420k "$1.mp4"
fi
if [ "$3" -eq 0 ]
then
    ffmpeg -i "$1" -an -vb 420k "$1.webm"
else
    ffmpeg -i "$1" -ab 64k -vb 420k "$1.webm"
fi
You can use it from PHP like:
system('./myscript.sh ' . escapeshellarg($tempfile) . ' ' . (0 + $mp4sound) . ' ' . (0 + $webmsound) . ' &');
... but remember, you can even start a PHP file instead of a shell script, like:
system ("php converter.php \"$tempfile\" \"$mp4sound\" \"$webmsound\" &');
Handle the file upload and conversion separately
After the file has been uploaded:
Put it into a job queue.
Run a cron job every minute that checks the queue and, if the currently processing job has finished, starts converting the next one.
After conversion, call unlink('/path/to/original') to delete the original video.
Update the job status.
Not sure if that makes sense, but I think it's better than doing everything at once: a little more work behind the scenes eases the pain up front.
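A minimal sketch of such a cron worker, assuming a hypothetical video_jobs table with id, path, and status columns (the table, columns, and credentials are placeholders, not from the original answer):
<?php
// convert_worker.php - run from cron every minute.
// Assumes a hypothetical video_jobs(id, path, status) table.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Do nothing if a job is already being converted.
if ($pdo->query("SELECT COUNT(*) FROM video_jobs WHERE status = 'converting'")->fetchColumn() > 0) {
    exit;
}

// Grab the oldest pending job, if any.
$job = $pdo->query("SELECT id, path FROM video_jobs WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();
if (!$job) {
    exit;
}
$pdo->prepare("UPDATE video_jobs SET status = 'converting' WHERE id = ?")->execute([$job['id']]);

$in   = escapeshellarg($job['path']);
$mp4  = escapeshellarg($job['path'] . '.mp4');
$webm = escapeshellarg($job['path'] . '.webm');
exec("ffmpeg -y -i $in -ab 64k -vb 420k $mp4", $out, $rc1);
exec("ffmpeg -y -i $in -ab 64k -vb 420k $webm", $out, $rc2);

if ($rc1 === 0 && $rc2 === 0) {
    unlink($job['path']);   // delete the original only after both conversions succeeded
    $status = 'done';
} else {
    $status = 'failed';
}
$pdo->prepare("UPDATE video_jobs SET status = ? WHERE id = ?")->execute([$status, $job['id']]);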
Try handling all ffmpeg conversions separately in one page, e.g. convert.php.
Once the upload is done, send a hidden request to convert.php?id=someid.
At the top of convert.php, use ignore_user_abort(TRUE);
The process will keep running even if the user closes the browser.
If the user stays on the page, you can keep updating the status there with Ajax responses.
I hope this helps give you an idea.
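A rough sketch of what convert.php could look like (the path scheme and status handling are assumptions for illustration):
<?php
// convert.php?id=someid
ignore_user_abort(true);   // keep running even if the browser disconnects
set_time_limit(0);         // conversions may exceed the default execution limit

$id  = basename($_GET['id']);   // crude sanitization, for illustration only
$src = "/uploads/$id";          // hypothetical path scheme

// Convert, then delete the original only if ffmpeg succeeded.
exec('ffmpeg -y -i ' . escapeshellarg($src) . ' -ab 64k -vb 420k ' . escapeshellarg("$src.mp4"), $out, $rc);
if ($rc === 0) {
    unlink($src);
}
// Record $rc somewhere (a file or database row keyed by $id) so the
// Ajax polling described above has a status to report.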
Perhaps you can move the code that does the conversion into a script that is called by a cron job.
For example, the user can upload a video to a directory and perhaps insert a record into a database table that contains all unconverted videos.
The cron job then runs every x minutes, checks the database for unconverted videos, and converts them. After that is done, the script can delete the original videos and remove their rows from the database.
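For reference, the crontab entry for such a script might look like this (the path, interval, and log location are placeholders):
# run the converter every 5 minutes, appending its output to a log
*/5 * * * * php /var/www/cron/convert_videos.php >> /var/log/convert_videos.log 2>&1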
If you are running on Linux, a trailing ampersand to your command in the exec()-call will start the process in the background.
You probably also want to redirect the output from ffmpeg to /dev/null; otherwise PHP will still wait for the command to finish:
ffmpeg -i (...) > /dev/null 2>&1 &
The same approach could be used to call a shell script if you wanted to clean up after converting.
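Combining the two ideas, a sketch of a background convert-and-clean-up call from PHP (the && chaining means the original is removed only if both conversions succeed; $orig is assumed to hold the uploaded file's path):
<?php
$f = escapeshellarg($orig);
$cmd = "ffmpeg -i $f -ab 64k -vb 420k $f.mp4 && "
     . "ffmpeg -i $f -ab 64k -vb 420k $f.webm && "
     . "rm -- $f";
// The subshell is backgrounded and its output discarded, so exec() returns immediately.
exec("($cmd) > /dev/null 2>&1 &");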
I want to capture a fresh screenshot from an RTSP stream and pass the data on via PHP whenever an Ajax endpoint is called.
Currently I am simply running
$resp = exec('ffmpeg -rtsp_transport tcp -y -i ' . escapeshellarg($post['link']) . ' -vframes 1 ' . escapeshellarg($path . $imagename));
The problem is this takes a lot of time to generate a screenshot, about 4 seconds per capture when the stream is good. If the stream drops frames or causes an error, I won't get a response and the script will time out.
Is there a way to optimize this or allow for fault tolerance, and generally speed things up so I can get a screenshot every second? If there's a better solution than ffmpeg, I'm willing to try it as well!
try:
ffmpeg -i "rtsp://192.168.1.73/live/ch00_1" -s 320x240 -f image2 -vf fps=fps=3 /tmp/ipcam_73_%04d.jpg
(this example generates 3 images per second; a single long-running process also avoids re-opening the RTSP connection for every capture)
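The idea is that the long-running ffmpeg process keeps writing frames, and the Ajax endpoint just returns the newest file instead of spawning ffmpeg per request. A minimal sketch of that endpoint (paths follow the example above):
<?php
// Serve the most recent frame written by the long-running ffmpeg process.
$frames = glob('/tmp/ipcam_73_*.jpg');
if (!$frames) {
    http_response_code(503);   // no frame available yet
    exit;
}
usort($frames, function ($a, $b) { return filemtime($b) - filemtime($a); });
header('Content-Type: image/jpeg');
readfile($frames[0]);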
I seem to have a problem with PHP's system command: I run a command with it, and if the return code is above 0 I report failure; otherwise I carry on.
It never seems to return on ffmpeg commands below a certain duration (generally 3-5 second videos that are very quick to encode). Is this ffmpeg not returning properly, or the system command?
An example command:
system('ffmpeg -i /home/test.wmv -f flv 340x160 -vcodec libx264 export.flv', $returnCode);
if($returnCode > 0) { error(); }
The only way to get around this seems to be to run a timer and check the log files if nothing has come back after a certain amount of time, but if anyone has any pointers, they'd be gratefully received.
Have you checked the code syntax? It seems you're closing the string early in the first parameter.
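One way to see what is actually happening is to capture ffmpeg's output instead of discarding it. A sketch (note the original command also appears to be missing -s before 340x160; without it, ffmpeg parses 340x160 as a second output filename):
<?php
// Merge stderr into the captured output so failures are visible.
exec('ffmpeg -y -i /home/test.wmv -f flv -s 340x160 -vcodec libx264 export.flv 2>&1', $output, $returnCode);
if ($returnCode > 0) {
    error_log("ffmpeg exited with $returnCode:\n" . implode("\n", $output));
    error();
}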
I've always wondered if it is possible to parse the thumbs db files, which in Windows 7 are located in:
C:\Users\%userdata%\AppData\Local\Microsoft\Windows\Explorer
In Windows XP they used to be located in each folder, but I assume I would have to traverse these to find the directory I want, etc. I'm aware there are ways to generate thumbnails using ffmpeg and such, but I want to find a way in PHP to parse that db file, since Windows has already generated thumbs for me. It's not in plain text (which I was hoping for).
You could use a parser like vinetto via PHP's exec.
I gave up trying to parse these out and instead used ffmpeg to generate thumbs, like this. The system call grabs a frame about 10 seconds into each mp4 video in my recordings folder and saves it as a 320x240 image named after the file. I had a bunch of videos, so I had to increase PHP's max execution time to handle it.
foreach (glob("F:\\Recordings\\*.mp4") as $filename) {
    $path = pathinfo($filename);
    // Grab one frame roughly 10 seconds in, saved as <name>.jpg at 320x240.
    system("c:\\ffmpeg\\bin\\ffmpeg.exe -itsoffset -10 -i \"$filename\" -vcodec mjpeg -vframes 1 -an -f rawvideo -s 320x240 \"" . $path['filename'] . ".jpg\"");
}
I have a PHP script that executes ffmpeg, which then starts working. I can execute this script 4 times and spawn 4 processes. I would like to get progress in real time. How do I get the output of each process and identify which process is which?
$h = fopen('php://stdin', 'r'); $str = fgets($h);
I know that line will read whatever is on stdin, but how do you separate it? How do I continuously poll for output? Once a process is started, how do you access its output on a continuous basis? I know you need Ajax, but which PHP method will let you access that info and separate it from the other processes?
Some have suggested directing output to a text file and reading its contents, but is this the only way? Why keep creating more text files for each process? Another suggested redirecting output with 2>&1 | php script.php after the command, but this only captures output at the start of the process, since the file executes once.
I have been struggling with this for a while and am getting closer, so any expert help would be so appreciated.
$process = exec("/usr/local/bin/ffmpeg -i /home/g/Desktop/cave.wmv -deinterlace -acodec libfaac -ab 96k -ar 44100 -vcodec libx264 -s 480x320 -f flv /home/g/Desktop/file.flv");
This is a bit complex. As far as I know, exec blocks the PHP thread until it completes. I think ffmpeg can write its progress to a file. You'd need to parse that file to get the progress, and you'll want to call that with Ajax.
You need to figure out how to get ffmpeg to report its progress; the rest is easy :P
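For what it's worth, ffmpeg has a -progress option that writes machine-readable key=value lines (out_time, progress, and so on) to a file or URL while encoding; the exact keys depend on your ffmpeg version. A sketch, assuming one progress file per job:
<?php
// Start the encode in the background, writing progress for this job to its own file.
$log = "/tmp/ffmpeg_progress_$id.txt";
exec('ffmpeg -y -i ' . escapeshellarg($in) . ' -progress ' . escapeshellarg($log)
   . ' ' . escapeshellarg($out) . ' > /dev/null 2>&1 &');

// Later, e.g. from an Ajax endpoint: read the last reported timestamp.
$lines = @file($log, FILE_IGNORE_NEW_LINES) ?: [];
foreach (array_reverse($lines) as $line) {
    if (strpos($line, 'out_time=') === 0) {
        echo substr($line, 9);   // e.g. 00:00:12.480000
        break;
    }
}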
If only 4 processes should be running at a time, I really see no reason not to create 4 text files and direct the output of ffmpeg into those. After that it's just a question of reading those text files with PHP.
If you intend to make this operable over HTTP, you could create two scripts: start_process.php and process_status.php. The first would take an ID as input, start an ffmpeg process, continuously read its progress, and report it to a MySQL database. process_status.php would take the same ID as input and report the status from MySQL to the user. You could hook that file up to Ajax and get live updates.
You would have to polish the solution by checking for double entry of the same ID, perhaps automatically creating an ID and returning it from start_process.php, etc.
You should probably create an array for the four processes. Start your processes with proc_open and put the returned resource in the array. You can then poll the array, reading the output of each process resource.
http://us2.php.net/manual/en/function.proc-open.php
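A minimal sketch of that approach (the $commands array of ffmpeg command lines is a placeholder):
<?php
// Launch each command with proc_open and keep its stderr pipe
// (ffmpeg reports progress on stderr) for non-blocking polling.
$spec  = [2 => ['pipe', 'w']];              // we only need stderr
$procs = [];
foreach ($commands as $id => $cmd) {
    $pipes = [];
    $procs[$id] = ['proc' => proc_open($cmd, $spec, $pipes), 'err' => $pipes[2]];
    stream_set_blocking($pipes[2], false);  // don't let one process stall the loop
}

while ($procs) {
    foreach ($procs as $id => $p) {
        $chunk = fread($p['err'], 8192);
        if ($chunk !== false && $chunk !== '') {
            echo "[$id] $chunk";            // output is now tagged per process
        }
        if (!proc_get_status($p['proc'])['running']) {
            fclose($p['err']);
            proc_close($p['proc']);
            unset($procs[$id]);
        }
    }
    usleep(250000);                         // poll four times a second
}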
I have a script calling a command to run an ffmpeg conversion on an uploaded video. However, it works only some of the time: sometimes the form finishes submitting and the ffmpeg process is running; at other times the ffmpeg command fails to run at all. Here is the command I'm running in an exec() function:
ffmpeg -i "uploaded_file -b 450k "converted_file" >/dev/null 2>&1 &
Can anyone explain why this will only work on certain tries and not on others?
What if ffmpeg fails and throws an error? Right now you're sending all output to /dev/null, so you'll never know.
Change >/dev/null into >>/tmp/ffmpeglog to keep a log:
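Applied to the command above (keeping the question's filenames):
ffmpeg -i "uploaded_file" -b 450k "converted_file" >>/tmp/ffmpeglog 2>&1 &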