System command doesn't return - PHP

I seem to have a problem with PHP's system command: I run a command with it, and if the return code is above 0 I report failure; otherwise I carry on.
It never seems to return for FFmpeg commands below a certain length (generally 3-5 second videos that are very quick to encode). Is this FFmpeg not returning properly, or the system command?
An example command:
system('ffmpeg -i /home/test.wmv -f flv 340x160 -vcodec libx264 export.flv', $returnCode);
if($returnCode > 0) { error(); }
The only way to get round this seems to be to run a timer and check log files if nothing has come back after a certain time, but any pointers would be gratefully received.

Have you checked the command syntax? It looks like 340x160 isn't attached to any flag (you probably meant -s 340x160), so ffmpeg treats it as an extra output filename.
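One way to diagnose this is to capture ffmpeg's output with exec(). ffmpeg writes its messages to stderr, so redirect it into stdout to see why the return code is nonzero. A sketch, with the size attached to -s as suggested above; the paths are illustrative:

```php
<?php
// Sketch: capture ffmpeg's output so a nonzero return code can be
// diagnosed. ffmpeg logs to stderr, hence the 2>&1 redirection.
// The paths and the -s flag here are illustrative assumptions.
$cmd = 'ffmpeg -i /home/test.wmv -s 340x160 -f flv -vcodec libx264 export.flv';
exec($cmd . ' 2>&1', $output, $returnCode);
if ($returnCode > 0) {
    // Keep the last few lines of ffmpeg's log for the error report.
    error_log(implode("\n", array_slice($output, -10)));
}
```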

Related

How to speed up screenshot capture of an rtsp stream via ffmpeg?

I want to capture a fresh screenshot from an rtsp stream and pass the data back via PHP whenever an ajax endpoint is called.
Currently I am simply running:
$resp = exec('ffmpeg -rtsp_transport tcp -y -i ' . escapeshellarg($post['link']) . ' -vframes 1 ' . escapeshellarg($path . $imagename));
The problem is this takes a lot of time to generate a screenshot, about 4 seconds per capture if the stream is good. If the stream drops frames or errors, I get no response and the script times out.
Is there a way to optimize this or allow for fault tolerance, and speed things up overall so I can get a screenshot every second? If there's a better solution than ffmpeg, I am willing to try it as well!
try:
ffmpeg -i "rtsp://192.168.1.73/live/ch00_1" -s 320x240 -f image2 -vf fps=fps=3 /tmp/ipcam_73_%04d.jpg
(example generates 3 images per second)
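Building on that: if ffmpeg is already writing numbered frames continuously, the ajax endpoint only needs to return the newest file instead of spawning ffmpeg on every request, which removes the per-capture startup cost. A sketch (the glob pattern matches the example command above; the helper name is our own):

```php
<?php
// Sketch: return the newest frame that a continuously running ffmpeg
// has already written, instead of capturing one on demand.
function latestFrame(string $pattern): ?string {
    $frames = glob($pattern);
    if ($frames === false || count($frames) === 0) {
        return null; // stream not producing frames
    }
    sort($frames); // zero-padded sequence numbers sort lexically in order
    return end($frames);
}

$latest = latestFrame('/tmp/ipcam_73_*.jpg');
if ($latest !== null) {
    header('Content-Type: image/jpeg');
    readfile($latest);
}
```

You would also want a small cleanup job (or ffmpeg's own segment options) so old frames don't accumulate in /tmp.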

PHP Exec (ffmpeg) fails on IIS every other request

<?php
exec("ffmpeg.exe -i something.mp4 -ss 1 -t 1 -r 1 -s 320x240 -y something.jpg");
?>
Calling this script results in a server error 500 every other request.
PHP 7.01, IIS 10.
I have already ruled out that the problem is related to the specific ffmpeg parameters of my call.
The execution takes less than 1 second, so it can't be a PHP or IIS script-execution timeout.
No matter how much time passes between one "call" to the script and the next, the odd-numbered calls result in error 500 and the even-numbered calls are just fine.
Note that when I say "call" I actually refer to calling the script (i.e. http://server/script.php ) - whereas if I put 2,3, or 100 calls to Exec within the same script, they will all succeed.
Edit: Quite randomly, I tried to trigger a timeout by calling the same exec("ffmpeg ...") line 100 times in a loop. To my surprise, the error 500 disappeared. So I removed the loop and added a similar pause with a call to sleep(10): the error 500 returned, and it was instant, as if the server failed to run the script even before parsing it. Now I am totally lost.
Any hint?
Well, it seems that changing the FastCGI protocol for PHP from Named Pipe to TCP fixed the problem.
It would still be interesting, though, to understand what makes named pipes fail immediately every other time. Setting named-pipe flushing didn't help.

Call ffmpeg via PHP but don't wait on it

I have an upload script that handles the file upload and then calls exec('ffmpeg -i ...').
I call ffmpeg 3 times: once to convert to mp4, once to convert to webm and once to get a thumbnail.
The problem is that if an investigator uploads ALL his footage, he doesn't want to sit there and wait for it to convert before he can do anything else. So how can I send the 3 commands and then exit the script?
I also want to run exec('rm -r /path/to/original') so the original video is deleted, but obviously I have to wait for the conversions to complete before that can run.
You can use the ampersand character after the command to run it in the background and get your prompt (control) back right away.
I suggest creating a shell script like:
#!/bin/sh
ffmpeg -i "$1" -ab 64k -vb 420k "$1.mp4"
ffmpeg -i "$1" -ab 64k -vb 420k "$1.webm"
Then run it from PHP like:
system('sh myscript.sh tempfile > /dev/null 2>&1 &');
Redirecting the output matters: without it, system() keeps waiting for the background process's output pipe to close.
The following example uses an if structure to decide whether to include sound. The first parameter is the filename; if the second one equals 0 there is no sound on the mp4, and if the third one is zero there is no sound on the webm.
#!/bin/sh
if [ "$2" -eq 0 ]; then
    ffmpeg -i "$1" -an -vb 420k "$1.mp4"
else
    ffmpeg -i "$1" -ab 64k -vb 420k "$1.mp4"
fi
if [ "$3" -eq 0 ]; then
    ffmpeg -i "$1" -an -vb 420k "$1.webm"
else
    ffmpeg -i "$1" -ab 64k -vb 420k "$1.webm"
fi
You can use it from PHP like:
system('sh myscript.sh ' . escapeshellarg($tempfile) . ' ' . (0 + $mp4sound) . ' ' . (0 + $webmsound) . ' > /dev/null 2>&1 &');
... but remember you can even start a PHP file instead of a shell script:
system("php converter.php \"$tempfile\" \"$mp4sound\" \"$webmsound\" > /dev/null 2>&1 &");
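A note on all of these system(... '&') variants: if the background process inherits PHP's output pipe, system() still waits for it, so the redirection to /dev/null is essential. A small helper (the function name is our own, not a standard API) that builds a safely quoted fire-and-forget command:

```php
<?php
// Sketch: build a fire-and-forget shell command string. The output
// redirection is what lets PHP return immediately; without it,
// system()/exec() wait for the background process's pipe to close.
function backgroundCommand(string $script, array $args): string {
    $quoted = array_map('escapeshellarg', $args);
    return $script . ' ' . implode(' ', $quoted) . ' > /dev/null 2>&1 &';
}

// Usage (hypothetical values):
// system(backgroundCommand('sh myscript.sh', [$tempfile, $mp4sound, $webmsound]));
```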
Handle the file upload and conversion separately.
After the file has been uploaded:
Put it into a job queue.
Run a cron job every minute that checks the queue and whether the currently processing job has finished; if so, start the next conversion.
After the conversion, call unlink('/path/to/original') to delete the original video.
Update the job status.
Not sure if this makes sense; the idea is that it's better than doing everything at once. Add a little more work behind the scenes to ease the pain up front.
Try handling all ffmpeg conversions separately in one page, e.g. convert.php.
Once the upload is done, send a hidden request to convert.php?id=someid.
At the top of convert.php use ignore_user_abort(TRUE);
The process will keep running even if the user closes the browser.
If the user stays on the page, update the status there with ajax responses.
I hope this helps you get an idea.
Perhaps you can move the code that does the conversion into a script that is called by a cron job.
For example, the user can upload a video to a directory and perhaps insert a record into a database table that contains all unconverted videos.
The cron job then runs every x minutes, checks the database for any unconverted videos, and then converts them. After that is done, the script can then delete the original videos and delete the row from the database.
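The cron worker described above can be sketched like this, assuming a hypothetical videos table with id, path and converted columns; the actual ffmpeg call is injected as a callable so the queue logic stays separate from the conversion:

```php
<?php
// Sketch: cron-driven worker for the queue described above. The table
// and column names are assumptions; $convert wraps the ffmpeg call and
// returns true on success.
function processUnconverted(PDO $db, callable $convert): int {
    $done = 0;
    $rows = $db->query('SELECT id, path FROM videos WHERE converted = 0')
               ->fetchAll(PDO::FETCH_ASSOC);
    foreach ($rows as $row) {
        if ($convert($row['path'])) {
            $stmt = $db->prepare('UPDATE videos SET converted = 1 WHERE id = ?');
            $stmt->execute([$row['id']]);
            @unlink($row['path']); // delete the original once converted
            $done++;
        }
    }
    return $done;
}
```

Run it from crontab every minute; add a lock file if a single conversion can outlast the interval.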
If you are running on Linux, a trailing ampersand to your command in the exec()-call will start the process in the background.
You probably also want to redirect the output from ffmpeg to /dev/null
ffmpeg -i (...) > /dev/null 2>&1 &
The same approach could be used to call a shell script if you wanted to clean up after converting.

Best way to ping client

Right, I have a PHP script at work where the server pings a client. The problem I am facing is that sometimes the server cannot contact the client, although when I ping the client manually it pings successfully.
The ping command I am using is: ping -q -w 3 -c 1 <ipaddresshere>
What would be the best way of pinging the clients two or three times, leaving a two- or three-second gap after a failed ping before retrying?
As you are in a Unix environment, you can always write and then call a shell script to handle the looping and waiting, though I'm surprised you can't do that inside PHP.
Also, I'm not sure about your sample ping command; the two environments I checked assign different meanings to the options you mention than what you seem to intend. Try man ping or ping --help.
The script below should give you a framework for implementing a ping retry, but I can't spend a lot of time on it.
cat pingCheck.sh
#!/bin/bash -vx
IPaddr=$1
: ${maxPingTries:=3}
echo "maxPingTries=${maxPingTries}"
pingTries=0
while ${keepTryingToPing:-true} ; do
    if ping -n 3 -r 1 "${IPaddr}" ; then
        keepTryingToPing=false
    else
        sleep ${sleepSecs:-3}
        if (( ++pingTries >= maxPingTries )) ; then
            printf "Exceeded count on ping attempts = ${maxPingTries}\n" 1>&2
            keepTryingToPing=false
        fi
    fi
done
I hope this helps.
P.S. As you appear to be a new user: if you get an answer that helps you, please remember to mark it as accepted, and/or vote it up (or down) according to its usefulness.
For PHP, you can try PEAR's Net_Ping package.
Here is a link guiding you through it:
http://www.codediesel.com/php/ping-a-server-using-php/
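In plain PHP, the retry loop the question asks for can be sketched like this. The flags are Linux iputils ping: -c 1 sends one packet, -W 2 waits up to two seconds for a reply; other platforms use different options:

```php
<?php
// Sketch: ping a host with retries and a gap between attempts.
function pingWithRetry(string $ip, int $maxTries = 3, int $gapSeconds = 2): bool {
    for ($i = 0; $i < $maxTries; $i++) {
        exec('ping -q -c 1 -W 2 ' . escapeshellarg($ip) . ' 2>&1', $out, $rc);
        if ($rc === 0) {
            return true; // host replied
        }
        if ($i < $maxTries - 1) {
            sleep($gapSeconds); // pause before the next attempt
        }
    }
    return false;
}
```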

Getting progress for multiple exec() processes in real time

I have a PHP script that executes ffmpeg, which then starts working. I can execute this script 4 times and spawn 4 processes. I would like to get progress in real time. How do I get each process's output and identify which process it came from?
$h = fopen('php://stdin', 'r'); $str = fgets($h);
I know that line will read whatever is on stdin, but how do you separate it? How do I continuously poll for output? Once a process is started, how do you access its output on a continuous basis? I know you need ajax, but what PHP method will let you access that info and separate it from the other processes?
Some have suggested directing output to a text file and reading its contents, but is this the only way? Why keep making more text files for each process? Another suggestion was redirecting output using 2>&1 | php script.php after the command, but this only gets output at the start of the process since the file executes once.
I have been struggling with this for a while and am getting closer, so any expert help would be much appreciated.
Edit:
$process = exec("/usr/local/bin/ffmpeg -i /home/g/Desktop/cave.wmv -deinterlace -acodec libfaac -ab 96k -ar 44100 -vcodec libx264 -s 480x320 -f flv /home/g/Desktop/file.flv");
This is a bit complex. As far as I know, exec blocks the PHP thread until the command completes. I think ffmpeg can write its progress to a file; you'd need to parse that file to get the progress, and call that with Ajax.
You need to figure out how to get ffmpeg to report its progress, and the rest is easy :P
If only 4 processes should be running at a time, I really see no reason not to create 4 text files and directing the output of ffmpeg into those. After that it's just a question of reading those text files with PHP.
If you intend to make this operable over HTTP, you could create two scripts: start_process.php and process_status.php. The first would take an ID as input, start an ffmpeg process, continuously read its progress and report it to a MySQL database. process_status.php would take the same ID as input and report the status from MySQL to the user. You could hook that file up to Ajax and get live updates.
You would have to polish the solution with checking for double entry of the same ID, perhaps automatically creating an ID and returning it from start_process.php etc.
You should probably create an array for the four processes. Start your processes with proc_open and put each returned resource in the array. You can then poll the array, looking at the output of each process resource.
http://us2.php.net/manual/en/function.proc-open.php
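A sketch of that proc_open approach: start every command, poll their pipes with stream_select, and key each chunk of output by the process it came from. For ffmpeg you would append 2>&1 to each command, since it reports progress on stderr; the placeholder commands here just echo:

```php
<?php
// Sketch: run several commands concurrently and collect their output,
// keyed by an id, by polling the pipes with stream_select.
function runAndCollect(array $commands): array {
    $procs = [];
    $pipes = [];
    $output = [];
    foreach ($commands as $id => $cmd) {
        $spec = [1 => ['pipe', 'w'], 2 => ['pipe', 'w']];
        $procs[$id] = proc_open($cmd, $spec, $p);
        stream_set_blocking($p[1], false);
        $pipes[$id] = $p;
        $output[$id] = '';
    }
    while ($pipes) {
        $read = [];
        foreach ($pipes as $p) {
            $read[] = $p[1];
        }
        $write = null;
        $except = null;
        if (stream_select($read, $write, $except, 1) === false) {
            break;
        }
        foreach ($pipes as $id => $p) {
            $chunk = fread($p[1], 8192);
            if ($chunk !== false && $chunk !== '') {
                $output[$id] .= $chunk; // progress text, keyed by process
            }
            if (feof($p[1])) {
                fclose($p[1]);
                fclose($p[2]);
                proc_close($procs[$id]);
                unset($pipes[$id]);
            }
        }
    }
    return $output;
}
```

For live updates, each loop iteration could instead parse ffmpeg's progress lines and write per-id status rows somewhere an Ajax poller can read them, rather than accumulating everything and returning at the end.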
