Run an ffmpeg process in the background - PHP

I want to use ffmpeg to convert video to .flv in PHP. I currently have this working, but it hangs the browser until the upload and conversion are finished. I have been looking at the PHP docs on how to run an exec() process in the background while tracking it via the returned PID. Here is what I found:
// Run a Linux command in the background and return the PID created by the OS
function run_in_background($Command, $Priority = 0)
{
    if ($Priority)
        $PID = shell_exec("nohup nice -n $Priority $Command > /dev/null & echo $!");
    else
        $PID = shell_exec("nohup $Command > /dev/null & echo $!");
    return $PID;
}
There is also a trick I use to check whether the background task is still running, using the returned PID:
// Verifies whether a process is running on Linux
function is_process_running($PID)
{
    exec("ps $PID", $ProcessState);
    return (count($ProcessState) >= 2);
}
Am I supposed to create a separate .php file which then runs from the PHP CLI to execute one of these functions? I just need a little nudge to get this working and then I can take it from there.
Thanks!

Am I supposed to create a separate .php file which then runs from the php cli to execute one of these functions?
This is probably the way I would do it:
the PHP webpage adds a record to the database to indicate "this file has to be processed"
and displays a message to the user; something like "your file will be processed soon"
In CLI, have a batch process pick up the newly inserted files:
first, mark the record as "processing"
do the ffmpeg thing
mark the file as "processed"
And, on the webpage, you can show the user what state their file is in (see the sketch after this list):
if it has not been processed yet
if it's being processed
or if it's been processed -- you can then give them the link to the new video file.
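As a minimal sketch of that status page, assuming a hypothetical videos table with a status column, and a PDO connection in $pdo:
$stmt = $pdo->prepare('SELECT status, out_file FROM videos WHERE id = ?');
$stmt->execute(array($_GET['id']));
$video = $stmt->fetch();

switch ($video['status']) {
    case 'pending':    echo 'Your file will be processed soon.'; break;
    case 'processing': echo 'Your file is being processed.';     break;
    case 'processed':  echo 'Done! <a href="' . htmlspecialchars($video['out_file']) . '">Download</a>';
}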
Here are a couple of other thoughts:
The day your application becomes bigger, you can have:
one "web server"
many "processing servers"; in your application, it's the ffmpeg work that will require lots of CPU, not serving web pages, so being able to scale that part is nice (that's another reason to "lock" files by marking them as "processing" in the DB: that way, you will not have several processing servers trying to process the same file)
You only use PHP on the web server to generate web pages, which is the job of a web server
Heavy / long processing is not the job of a web server!
The day you want to switch to something other than PHP for the "processing" part, it will be easier.
Your "processing script" would have to be launched every couple of minutes; you can use cron for that, if you are on a Linux-like machine.
Edit: a bit more information, after seeing the comment:
As the processing part is done from CLI, and not from Apache, you don't need any kind of "background" manipulation: you can just use shell_exec, which will return the whole output of the command to your PHP script once it's finished doing its job.
For the user watching the web page saying "processing", it will seem like background processing; and, in a way, it will be, as the processing will be done by another process (maybe even on another machine).
But, for you, it'll be much simpler :
one webpage (nothing "background")
one CLI script, with no background stuff either.
Your processing script could look something like this, I suppose:
// Fetch information from the DB about one file to process,
// and mark it as "processing"

// These would be fetched / determined from the data you just fetched from the DB
$in_file  = 'in-file.avi';
$out_file = 'out-file.flv';

// Launch the ffmpeg processing command (will probably require more options ^^ )
// The PHP script will wait until it's finished:
//   no background work,
//   no need for any kind of polling
$output = shell_exec('ffmpeg -i ' . escapeshellarg($in_file) . ' ' . escapeshellarg($out_file));

// File has been processed:
// store the "output name" in the DB,
// and mark the record as "processed"
Easier than what you first thought, isn't it? ;-)
Just don't worry about the background stuff anymore: the only important thing is that the processing script is launched regularly, from crontab.
Hope this helps :-)

You don't need to write a separate PHP script to do this (though you may want to later if you implement some sort of queuing system).
You're almost there. The only problem is that the shell_exec() call blocks while it waits for the shell to return. You can avoid this if you redirect all output from the command to either a file or /dev/null and background the task (with the & operator).
So your code would become:
// Run a Linux command in the background
// (no PID is returned this time -- see the note below)
function run_in_background($Command, $Priority = 0)
{
    if ($Priority) {
        shell_exec("nohup nice -n $Priority $Command 2> /dev/null > /dev/null &");
    } else {
        shell_exec("nohup $Command 2> /dev/null > /dev/null &");
    }
}
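Hypothetical usage (file names are placeholders); the call returns immediately, so the browser is no longer held up:
run_in_background('ffmpeg -i ' . escapeshellarg('in-file.avi') . ' ' . escapeshellarg('out-file.flv'));
echo 'Conversion started; check back later.';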
I don't think there is any way to retrieve the PID, unfortunately.


How to make a non-blocking php exec call?

I need to echo text to a named pipe (FIFO) in Linux. Even though I'm running in the background with '&' and redirecting all output to /dev/null, the shell_exec call always blocks.
There are tons of answers to pretty much exactly this question all over the internet, and they all basically point to the following php manual section:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
And sure enough, when I try the non-blocking approach (backgrounding and redirecting to /dev/null) with other commands like sleep, PHP successfully executes without hanging. But when echoing to the FIFO, PHP hangs, even though running the same command with bash produces no visible output and immediately returns to the shell.
In bash, I can run:
bash$ { echo yay > fifo & } &> /dev/null
bash$ cat fifo
yay
[1]+ Done echo yay > fifo
but when running the following php file with php echo.php:
<?php
shell_exec("{ echo yay > fifo & } &> /dev/null");
?>
it hangs, unless I first open fifo for reading.
So my question is: why is this blocking, when sleep isn't? In addition, I want to know what is happening behind the scenes. When I put the '&' in the php call, even though the shell_exec call blocks, the echo call clearly doesn't block whatever bash session php invoked it on, because when I Ctrl+C out of php, I can read 'yay' from the FIFO (if I don't background the echo command, the FIFO contains no text after Ctrl+C). This suggests that perhaps php is waiting on the PID of the echo command before going to the next instruction. Is this true?
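Side note: the blocking can be reproduced without shell_exec at all. This hypothetical snippet (assuming a fifo created beforehand with mkfifo) blocks on fopen, because opening a FIFO for writing does not return until another process opens it for reading:
<?php
// Assumes `mkfifo fifo` was run beforehand
echo "Opening fifo for writing...\n";
$fp = fopen('fifo', 'w'); // blocks until a reader opens the other end
fwrite($fp, "yay\n");
fclose($fp);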
I've been trying something similar and in the end came up with this solution:
/**
 * Runs a shell command on the server as the current PHP user (in CLI mode,
 * that is the user you are logged in with).
 * If a command is run in the background, the method returns the location of
 * the temp file that captures the output -- in that case you will have to
 * remove the temporary file manually.
 */
static public function command($cmd, $show_output = true, $escape_command = false, $run_in_background = false)
{
    if ($escape_command)
        $cmd = escapeshellcmd($cmd);

    $f = trim(`mktemp`);
    passthru($cmd . ($show_output ? " | tee $f" : " > $f") . ($run_in_background ? ' &' : ''));

    return $run_in_background ? $f : trim(`cat $f ; rm -rf $f`);
}
The trick is to write the output to a temporary file and return its contents when the command has finished (blocking behavior), or just return the file path (non-blocking behavior). Also, I'm using passthru rather than shell_exec, because shell_exec buffers all output until the command exits, and that blocking behavior makes interactive sessions impossible.
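Hypothetical usage, assuming the method above lives on a class named Shell (the class name and commands are placeholders):
// Blocking: returns the command's output once it finishes
$listing = Shell::command('ls -la', false);

// Non-blocking: returns the temp-file path immediately;
// read the file later for the output, and delete it yourself
$logFile = Shell::command('ffmpeg -i in.avi out.flv', false, false, true);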

Big XML file import to database PHP

I am facing a problem that somehow I don't see the solution to. I have an XML file that needs to be imported into a custom DB structure. When the user uploads / imports the file, the AJAX POST waits until the import is finished, but this could take 5 hours or more, I don't know. What is the best way to handle this UI issue?
I was thinking about a threaded upload: splitting the file into multiple parts and uploading each with its own thread (pthreads, but I'm having problems with installation on CentOS 7 / PHP 7).
Or is there any other way to import the file in the background, so that whenever the user refreshes the page there would be a status log output, letting them know when the import is finished and whether it was successful?
You would want to run them using a background job (a detached process); this way the end user gets a confirmation message right away, and you send an email when the long-running task is complete. Then they don't have to wait for it to finish. As I mentioned in the comments, I have a class I wrote for this on my GitHub:
https://github.com/ArtisticPhoenix/MISC/blob/master/BgProcess.php
But it passes the args as a path because it's set up for CodeIgniter, so you would have to change that or split the arguments up within your code.
Anyway, the basics are similar to running a cron job. The implementation varies depending on the OS of the server, but on Linux the command looks like this:
php -f "path/to/phpfile.php" "{args}" > /dev/null &
The > /dev/null part sends the output to null (throws it away) and the & runs it as a non-blocking process, meaning the script starting the command can continue on. So, using an example like this:
// ... other code before starting the background job ...
exec('php -f "path/to/phpfile/xmlProcessor.php" "testXML/2" > /dev/null &');
// ... code to tell the user the job has started ...
// This runs right after the call, without waiting for that process to finish.
Then in xmlProcessor.php you would have this
<?php
$args    = explode('/', $argv[1]);
$file    = $args[0];
$user_id = $args[1];

// ... code to process the XML ...
// ... email the user confirmation of completion ...
http://php.net/manual/en/reserved.variables.argv.php
As I said, typically you would call it this way:
exec( 'php -f "path/to/phpfile/xmlProcessor.php" "testXML" "2" > /dev/null &');
And access the arguments using:
$argv[1] // = testXML
$argv[2] // = 2
But because I use this with CI, it does its routing for me to a special controller and handles all that. The nice thing about my class is that it should find the PHP executable in most cases, and it has Windows compatibility built in (which was a pain in the ...).
Using that class, you would just call it like this:
$command = new BgProcess( "path/to/phpfile/xmlProcessor.php", "testXML", 2);
echo $command;
This would output 'php -f "path/to/phpfile/xmlProcessor.php" "testXML/2" > /dev/null &' after starting the process (the return value is just for debugging).
Basically, you're running a separate background job with PHP via the command line.

PHP Recurring Operation

For an iOS push notification server, I am implementing a web service that checks a feed on the net for a particular price.
So I need my PHP to keep checking that price (every 20 seconds or so) and comparing values.
I was wondering (forgive my ignorance, I just started with PHP today): is a cron job the way people do this? Or is there some special way to fire a PHP script that runs until it's killed and repeats a task?
Thanks!
John
If PHP was your preferred route, a simple script such as the following can be set to run indefinitely in the background (name this grabber.php):
#!/usr/bin/php
<?php
do {
    // Grab the data from your URL
    $data = file_get_contents("http://www.example.com/data.source");

    // Write the data out somewhere so your push notifications script can read it
    file_put_contents("/path/to/shared/data.store", $data);

    // Wait and do it all over again
    sleep(20);
} while (true);
And to start it (assuming you're on a unixy OS):
$ chmod u+x grabber.php
$ ./grabber.php > /path/to/a/file/logging/script/output.log 2>&1 &
That & at the end sends the process to run in the background.
PHP is probably overkill for this, however; perhaps a simple bash script would be better:
#!/bin/bash
# Download the data and write it to a file ('data-file'), every 20 seconds
while true; do
    data=$(curl -L http://www.example.com/data.source)
    echo "$data" > data-file
    sleep 20
done
$ chmod u+x grabber.sh
$ ./grabber.sh > /path/to/logger.log 2>&1 &
That is possible by setting up a cron job on your server.
Log in to your web hosting control panel, e.g. cPanel, create a new cron job, and add the path to the PHP file that you want to run, e.g. php /home/[your username]/public_html/rss/import_feeds.php. There is a field where you can set how often you would like the PHP script to run.
Run a PHP file in a cron job using CPanel

php system() shell_exec() hangs the browser [duplicate]

Possible Duplicate:
Asynchronous shell exec in PHP
I need to run a Java program in the background.
process.php contains:
shell_exec("php php_cli.php")
php_cli.php contains:
shell_exec("java -jar BiForce.jar settings.ini > log.txt");
I am calling process.php asynchronously using AJAX.
When I click the link on the webpage that calls the AJAX function (to run process.php), the webpage shows "loading". When I click other links at the same time, it does not respond.
The Java program takes about 24 hours to finish executing, so the user will not wait until the execution ends.
The problem is that the browser keeps on loading and does not go to other pages when a link is clicked.
I also tried system(), but had the same problem...
Help will be greatly appreciated.
shell_exec waits for the command to finish, so that's what your script is doing.
If your command doesn't take long, then your script won't either.
You can call another PHP script from your original one, without waiting for it to finish:
$processId = shell_exec(
    "nohup " .                   // Run the command, ignoring hangup signals.
    "nice " .                    // "Adjusted niceness" :) Read nice --help.
    "/usr/bin/php " .            // Path to your PHP executable.
    "-c /path/to/php.ini " .     // Path to your PHP config.
    "-f /var/www/php_cli.php " . // Path to the script you want to execute.
    "action=generate " .         // An argument for the script.
    "> /process.log " .          // Log file.
    "& echo $!"                  // Background the process and return only its PID.
);
It is then possible to detect whether the script has finished by using this command:
exec('ps ' . $processId, $processState);
// exec stores the output of the command in $processState
// (the second parameter is filled by reference).
// The first line of the ps output is the header; the second line is the
// process entry itself, which is present only while the process is running.
if (count($processState) < 2) {
    // Process has ended.
}
You could call the command from the page displayed, appending an & at the end:
shell_exec("java -jar BiForce.jar settings.ini > log.txt &");
This way the process is launched in the background.
Also, there is no need (unless required by your application) to create a process.php which itself calls php via a shell_exec. You could achieve the same functionality via an include of the other file.
As in normal shell scripting you can use the ampersand to background the process:
shell_exec("java -jar BiForce.jar settings.ini > log.txt &");
See Asynchronous shell exec in PHP .
First, you might want to redesign this concept. I am not sure exactly what these programs do, but clearly this can lead to potential problems...
This is what I suggest you do instead of starting external processes via PHP:
Your AJAX call creates (or reuses) a file in some temporary directory, probably using the user session to generate the file name (see the sketch at the end of this answer)
some data is written to the file, and the request ends
Your jar is launched separately, and runs indefinitely
At regular intervals, the Java program scans the temporary directory for new files, or checks whether some file has been modified
it parses the file, and executes the 24-hour-long process, or adjusts any previous execution if necessary
Along the same lines, you can even use sockets to communicate with that Java program, or any other means.
The advantage of having the Java program running all the time, instead of starting a new process each time, is being able to reuse system resources within the lifetime of the application; for example, if your program uses DB connections, or any data, cache, etc.
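Here is a minimal sketch of the job-file idea from the PHP side; the directory, file format and field names are all hypothetical, and the Java side would poll the same directory:
<?php
// Hypothetical AJAX endpoint: write a small job file and return immediately
$jobDir = '/tmp/jobs'; // spool directory watched by the Java program
$jobId  = session_id() . '-' . uniqid();
$job    = array('settings' => 'settings.ini', 'requested_at' => time());

file_put_contents("$jobDir/$jobId.json", json_encode($job));
echo json_encode(array('status' => 'queued', 'job' => $jobId));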

php process forking and get the child process id

Objective:
My script will download a remote file upon form submission. Since the file might be big, I would like to fork off a process and let the user get on with their life.
Example of a command:
wget -q --limit-rate=1k --tries=10 "http://helios.gsfc.nasa.gov/image_euv_press.jpg" -O /web/assets/content/image/image_euv_press.jpg
Method tried:
pcntl forking:
$pid = pcntl_fork();
if ($pid == -1) {
    exit;
} else if ($pid) {
    // We are the parent process; the pid is the child pid, right?
    return $pid;
} else {
    // We are the child process
    exec($command . ' > /dev/null');
    // > /dev/null &
    posix_kill(getmypid(), 9);
    return;
}
I do get the PID, but then there is a risk that the forked process becomes a zombie, and since I am using nginx -> php-fpm (tested and confirmed: after running this several times there were a lot of defunct php-fpm processes), I would have to restart the server just to eliminate the zombies. This would leave me open to PID exhaustion (I am guessing).
Background process:
exec($command . ' > /dev/null &');            // background process
$proc = passthru("ps aux | grep '$command'"); // search for the command
echo var_dump($proc);
$pid = (int)(next(explode(' ', $proc)));      // parse the PID (not implemented)
Question:
the background process method works, but it's not clean. Is there a better way to fork off a process to download, and get that wget command's PID so I can kill it later?
I have tried echoing $! after doing the exec just to get the PID, but exec('echo $!') doesn't return anything; I think it's because every exec runs in a different "space".
If I add '> /dev/null 2>/dev/null &' to the end of the command in my terminal, it returns something like: [3] 30751, but through PHP's exec there is no way to capture that returned PID.
Thank you.
While not a direct answer, the following links might help you get it done:
The Mysteries Of Asynchronous Processing With PHP - Part 3
Practical PHP: Process control
As an alternative to PHP's native pctl functions, consider using Gearman:
Gearman provides a generic application framework to farm out work to other machines or processes that are better suited to do the work. It allows you to do work in parallel, to load balance processing, and to call functions between languages. It can be used in a variety of applications, from high-availability web sites to the transport of database replication events.
Try the following command:
exec("ps -C $command -o pid=", $pids);
But I recommend using Zend Server Job Queue, which exists for exactly these purposes.
Try adding the "echo $!" to the same execution flow as the launched background process, i.e. something like this:
shell_exec("$command & echo $!");
