PHP exec in background using & is not working - php

I am using this code on Ubuntu 13.04:
$cmd = "sleep 20 &> /dev/null &";
exec($cmd, $output);
It actually just sits there for 20 seconds and waits :/ Usually it works fine when using & to send a process to the background, but on this machine PHP just won't do it.
What could be causing this?

Try
<?php
$cmd  = '/bin/sleep';
$args = array('20');

$pid = pcntl_fork();
if ($pid == -1) {
    die("Could not fork\n");
}
if ($pid == 0) {
    // child: detach from the controlling terminal, then replace this
    // process image with the command to run
    posix_setsid();
    pcntl_exec($cmd, $args, $_ENV);
    // child becomes the standalone detached process
}
// parent continues immediately
echo "DONE\n";
I tested it and it works.
Here you first fork the PHP process and then execute your task in the child.
Or, if the pcntl module is not available, use:
<?php
$cmd = "sleep 20 &> /dev/null &";
// escapeshellarg() quotes the command safely for the bash -c call
exec('/bin/bash -c ' . escapeshellarg($cmd));

The REASON this doesn't work is that exec() executes the string you pass into it. & is interpreted by a shell as "run in the background", but since you don't start a shell in your exec() call, the & is simply passed along with the 20 to the /bin/sleep executable, which most likely just ignores it.
The same applies to the redirection of output: that is also parsed by the shell, not by exec().
So you either need to find a way to fork your process (as described above), or a way to run the subprocess through a shell.
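A minimal sketch of that second option, assuming bash is available at /bin/bash; nohup and the full output redirection are added so that exec() returns immediately and the command survives the web request:
<?php
// Run the command through an explicit shell so that '&' and the output
// redirection are actually interpreted; escapeshellarg() keeps the quoting safe.
$cmd = 'sleep 20';
exec('/bin/bash -c ' . escapeshellarg("nohup $cmd > /dev/null 2>&1 &"));
// exec() returns here right away; the sleep keeps running in the background.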

My workaround for this on Ubuntu 13.04 with Apache2 and any version of PHP: libssh2-php. I just ran nohup $cmd & inside a local SSH session opened from PHP, and it ran just fine in the background. Of course this requires putting certain security measures in place, such as enabling SSH access for the webserver user (so it has exec-like permissions) and then only allowing localhost to log in to that webserver SSH account.
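For reference, a rough sketch of that approach using the ssh2 extension (the host, credentials and command below are placeholders, and the PECL ssh2 package must be installed):
<?php
// Open a local SSH session as a user that is allowed to run background jobs.
$conn = ssh2_connect('127.0.0.1', 22);
ssh2_auth_password($conn, 'webuser', 'secret');
// Start the long-running command detached inside that session.
$stream = ssh2_exec($conn, 'nohup sleep 20 > /dev/null 2>&1 &');
fclose($stream);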

Related

How can I run the exec function in PHP

I can't add * to my code to find the file.
This code works:
exec("mediaconvert -t wav -i /home/20220228/11/23401.rec -o /var/www/html/test.mp3");
If I add the *, it doesn't work:
exec("mediaconvert -t wav -i /home/20220228/11/*01.rec -o /var/www/html/test.mp3");
P.S. There is only one file in that path, and when I execute the command from the shell it works. Please help me.
Filename expansion and other bash-specific features may not work in other shells (e.g. a plain POSIX shell). If your command containing * is not executed in bash or a compatible shell, it won't work as expected. You need to verify which environment/shell your PHP installation executes commands in.
Run the following test script:
<?php
exec('echo "$SHELL"', $out);
var_dump($out);
When I run the PHP script directly on the CLI, I get "/bin/bash" for the shell that's called. When I run it via the browser, curiously I get "/sbin/nologin" instead. There are different environments for the apache user, which executes PHP for browser calls, and for the "actual" user logging in via SSH. The bash shell is not available to the Apache user by default.
These results are from a CentOS 7 server running Apache 2.4/PHP 8.1.4. Your mileage may vary. Bottom line: if the command you are executing depends on bash-specific features, it must execute in a bash environment, or in another shell that supports the required features.
If bash is not available, your other option is using e.g. glob to get files matching the pattern in your directory, and then loop over them while executing the command for each specific file.
Edit: As pointed out by @Sammitch (see comments), /sbin/nologin is a common "shell name" choice for non-login users, and such setups most likely use /bin/sh. This should still allow for filename expansion/globbing. Testing a browser script call with exec('ls *.php', $out); shows the wildcard functioning as expected.
You may find this question/answer relevant: Use php exec to launch a linux command with brace expansion.
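A sketch of the approach from that linked answer, applied to the command above (this assumes bash is installed and reachable by the PHP/Apache user):
<?php
// Let bash expand the * wildcard instead of relying on PHP's default shell.
$cmd = 'mediaconvert -t wav -i /home/20220228/11/*01.rec -o /var/www/html/test.mp3';
exec('/bin/bash -c ' . escapeshellarg($cmd), $out, $code);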
I recommend you do the opposite: first get the files you want to use as input, then call exec(). For instance:
$input_files = ...
exec("mediaconvert -t wav -i " . $input_files . " -o /var/www/html/test.mp3");
You can find the files with the glob() function and then call exec() for each one. You can try a similar solution with the following code:
$input_files = '/home/20220228/11/*01.rec';
foreach (glob($input_files) as $filename) {
    // escapeshellarg() protects against odd characters in the expanded file names
    exec("mediaconvert -t wav -i " . escapeshellarg($filename) . " -o /var/www/html/test.mp3");
}

calling exec() in php

I just migrated a website from a Linux server without cPanel to a server with cPanel. My problem is that when I hit start to launch a process that executes a PHP file, it will not run the file. If I run the file from the terminal, everything works.
The code that calls the file:
if ($do === "start_service") {
    create_marker_file("$service_running_marker");
    system_bg("php $dir/$service_script");
    $message = "Started Service!";
    $running = true;
    $status  = "Running";
}
system_bg() is a function:
function system_bg($command) {
    exec('bash -c "exec nohup setsid ' . $command . ' > /dev/null 2>&1 &"');
}
If I do ps ax | grep service.php in the terminal, I see this:
/opt/cpanel/ea-php70/root/usr/bin/php-cgi /home/xsocial1/public_html/xsmp/service.php
When I run the file in the terminal, this is what I see from ps ax | grep service.php:
/opt/cpanel/ea-php70/root/usr/bin/php service.php
I think my problem is that I need the server to invoke the CLI binary rather than the CGI binary.
This is correct: you will need to figure out the CLI PHP path and use that to start the process. This is sadly not exactly standardised, but on a *nix machine you will usually get the right one with PHP_BINDIR . '/php' (PHP_BINDIR being a built-in PHP constant).
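Applied to the system_bg() call from the question, that would look roughly like the sketch below (untested on cPanel; it assumes the web SAPI's PHP_BINDIR points at the directory that also contains the CLI binary):
<?php
// Build the CLI binary path from the PHP_BINDIR constant instead of plain "php",
// which on this server resolves to the CGI binary.
$phpCli = PHP_BINDIR . '/php';
system_bg(escapeshellarg($phpCli) . ' ' . escapeshellarg("$dir/$service_script"));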
You may also want to check that the webserver user has the privileges to run bash.
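A quick way to test that from a page served by the same webserver (just a sanity check, nothing more):
<?php
// If bash is usable by the webserver user, $code is 0 and $out contains "bash-ok".
exec('bash -c "echo bash-ok" 2>&1', $out, $code);
var_dump($code, $out);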

launching php script running on server (and opening sockets) from a php page

I need your precious help on a matter I have been spending hours on.
Scope: Apache2 and PHP running on a Raspberry Pi.
Premise: my limited knowledge of the Linux environment!
The objective: launching a long-running PHP script, which opens sockets, from another PHP script running as a web page. In other terms, the application is a chat and I need to start the PHP server script from a web page, for my convenience.
The issue: if I run it from the console, logged in as "pi", with the following command
php -q /var/www/chatSocket.php > /var/www/tmp/socketProcessOutput.txt 2>&1 & echo $!
it works like a charm, but if I try to do so from a script, with the following (don't mind the concatenated strings and the assignment of output to variables; removing them made no difference):
$result .= "Result of pkill (killed process): " .shell_exec('sudo pkill -f SongWebSocket.php') ."\n";
$result .= "Launching new process: id returned:". shell_exec('php -q /var/www/chatSocket.php > /var/www/tmp/socketProcessOutput.txt 2>&1 & echo $!') ."\n";
$result .= "Checking running SongWebSocket.php process:" ."\n";
$result .= shell_exec('ps -A aux| grep -e SongWebSocket.php -e USER') ."\n";
... it does not work (it seems to launch the script, but the sockets are not opened).
Any clue why this happens?
Also, and this may be due to my limited knowledge of Linux, why do I get a different output from the command
ps aux| grep -e SongWebSocket.php -e USER
depending on whether I launch it from the shell as user pi or from the script as the www-data user?
I look forward to your help. Thanks in advance!
Marco.
The www-data user doesn't have the permission, I guess. Why not add "sudo" to every shell_exec line? (sudo starts the program with root permissions.) It's not pretty and not secure, but it might work on your local home network: sudo php ..., sudo ps -A aux, etc. In addition, you should make sure that PHP's safe_mode is off; you can check that by adding phpinfo(); to your PHP code.
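In code, that suggestion would look something like the sketch below. Note this assumes www-data has been granted passwordless sudo for these commands in /etc/sudoers, which is only reasonable on a private test network:
<?php
// Same calls as in the question, just prefixed with sudo (insecure outside a LAN).
$result  = "Result of pkill (killed process): " . shell_exec('sudo pkill -f SongWebSocket.php') . "\n";
$result .= "Launching new process: id returned: " . shell_exec('sudo php -q /var/www/chatSocket.php > /var/www/tmp/socketProcessOutput.txt 2>&1 & echo $!') . "\n";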

Run Script in Background Linux Server w/ PHP exec()

I'm trying to trigger a PHP script to run in the background using the exec() function but I cannot get it to work. I've read countless posts on stack overflow and other forums and tried many variations to no avail.
Server Info:
Operating System: Linux
PHP: 5.2.17
Apache Version: 2.2.23
Home Directory: /home1/username
I'm currently using the code:
exec("/home1/username/php /home1/username/public_html/myscript.php > /dev/null &");
When I run the above script I get no error_log and no error in my cPanel error log; however, the script definitely doesn't execute. When I browse to http://www.mydomain.com/myscript.php it runs and e-mails me instantly. Any idea why this isn't working, or how I can find out what error is being produced?
Update: cPanel Process Manager output
exec("php /home1/username/php /home1/username/public_html/myscript.php > /dev/null &");
Produces:
27183 php /home1/username/php /home1/username/public_html/myscript.php
27221 [sh]
27207 php /home1/username/php /home1/username/public_html/myscript.php
27219 php /home1/username/php /home1/username/public_html/myscript.php
27222 php /home1/username/php /home1/username/public_html/myscript.php
27224 php /home1/username/php /home1/username/public_html/myscript.php
27249 sh -c php /home1/username/php /home1/username/public_html/myscript.php > /dev/null &
Is that normal? The script appears to hang around for a long time even though it should finish very quickly.
I couldn't get exec() working with php. Even when I got shell access to the server, the command just hung. I decided to use wget instead, which accomplishes the same thing. Works great :)
exec("wget http://www.mydomain.com/myscript.php > /dev/null &");
Have you tried invoking the php CLI directly?
exec("php /home1/username/php /home1/username/public_html/myscript.php > /dev/null &");
You will not need the #!, which would output to the browser if called through Apache.
EDIT.
It looks like your script is working, but your PHP script executing in the background is hanging (not exiting). Try this variation:
exec("php /home1/username/php /home1/username/public_html/myscript.php > /dev/null 2>&1 &");
What does "> /dev/null 2>&1" mean?
Since you want to run myscript from your command line, why not do this:
exec('(/home1/username/public_html/myscript.php) > /dev/null &',$r,$s);
And write this as a first line in the myscript.php:
#!/home1/username/php -n
<?php
//script goes here
?>
That should work. The hashbang tells the system which programme to use to run the script that follows, so you don't need to add it to your exec() call (note that the script file must be executable, e.g. via chmod +x, for the hashbang to take effect). Also, it's safer (and therefore better) to put brackets around the full script call, so the shell knows exactly what output has to be redirected to which stream, avoiding any issues that might occur, especially when libs or packages like PHP-GTK are installed on the server (hence the -n option).

How to check if a php script is still running

I have a PHP script that listens on a queue. Theoretically, it's never supposed to die. Is there something to check if it's still running? Something like Ruby's God ( http://god.rubyforge.org/ ) for PHP?
God is language-agnostic, but it would be nice to have a solution that works on Windows as well.
I had the same issue of wanting to check whether a script is running, so I came up with this and run it as a cron job. It grabs the running processes as an array and cycles through each line, checking for the file name. It seems to work fine. Replace #user# with your script's user.
exec("ps -U #user# -u #user# u", $output, $result);
foreach ($output AS $line) if(strpos($line, "test.php")) echo "found";
In linux run ps as follows:
ps -C php -f
You could then do in a php script:
$output = shell_exec('ps -C php -f');
if (strpos($output, "php my_script.php") === false) {
    shell_exec('php my_script.php > /dev/null 2>&1 &');
}
The above code lists all running php processes in full, then checks whether "my_script.php" is in the list. If it isn't, it starts the process without waiting for it to terminate, and carries on with what it was doing.
Just append a second command after the script. When/if it stops, the second command is invoked. Eg.:
php daemon.php 2>&1 | mail -s "Daemon stopped" you@example.org
Edit:
Technically, this invokes the mailer right away, but only completes the command when the PHP script ends. Doing this captures the output of the PHP script and includes it in the mail body, which can be useful for debugging what caused the script to halt.
Simple bash script
#!/bin/bash
while true; do
    if ! pidof -x script.php > /dev/null; then
        php script.php &
    fi
    sleep 10   # avoid a busy loop
done
Not for Windows, but...
I've got a couple of long-running PHP scripts that have a shell script wrapping them. You can optionally return a value from the PHP script that is checked in the shell script to decide whether to exit, restart immediately, or sleep for a few seconds and then restart.
Here's a simple one that just keeps running the PHP script till it's manually stopped.
#!/bin/bash
clear
date
php -f cli-SCRIPT.php
echo "wait a little while ..."; sleep 10
exec $0
The "exec $0" restarts the script, without creating a sub-process that will have to unravel later (and take up resources in the meantime). This bash script wraps a mail-sender, so it's not a problem if it exits and pauses for a moment.
Here is what I did to combat a similar issue. This helps if anyone else has a parameterized PHP script that cron executes frequently, but where only one execution should be running at any time. Add this to the top of your PHP script, or create a common method.
$runningScripts = (int) shell_exec('ps -ef | grep ' . strtolower($parameter) . ' | grep ' . dirname(__FILE__) . ' | grep ' . basename(__FILE__) . ' | grep -v grep | wc -l');
if ($runningScripts > 1) {
    die();
}
You can write in your crontab something like this:
0 3 * * * /usr/bin/php -f /home/test/test.php my_special_cron
Your test.php file should look like this:
<?php
php_sapi_name() == 'cli' || exit;
if (isset($argv[1])) {
    // exit if another instance with the same process key is already running
    substr_count(shell_exec('ps -ax'), $argv[1]) < 3 || exit;
}
// your code here
That way you will have only one active instance of the cron job with my_special_cron as the process key. So you can add more jobs within the same PHP file:
test.php system_send_emails sendEmails
test.php system_create_orders orderExport
Inspired by Justin Levene's answer and improved, as ps -C doesn't work on macOS, which I needed in my case. You can use this in a PHP script (maybe just before you need the daemon alive); tested on both Mac OS X 10.11.4 and Ubuntu 14.04:
$daemonPath = "FULL_PATH_TO_DAEMON";
$runningPhpProcessesOfDaemon = (int) shell_exec("ps aux | grep -c '[p]hp ".$daemonPath."'");
if ($runningPhpProcessesOfDaemon === 0) {
shell_exec('php ' . $daemonPath . ' > /dev/null 2>&1 &');
}
Small but useful detail: Why grep -c '[p]hp ...' instead of grep -c 'php ...'?
Because the grep command itself would otherwise be counted as a process matching our pattern. Using a character class for the first letter of "php" makes the grep command line no longer match the pattern it searches for.
One possible solution is to have it listen on a port using the socket functions. You can check that the socket is still listening with a simple script. Even a monitoring service like Pingdom could monitor its status. If it dies, the socket is no longer listening.
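A minimal sketch of such a check, assuming the queue listener binds to 127.0.0.1:9000 (adjust the host and port to whatever your script actually uses):
<?php
// Try to connect to the port the daemon should be listening on.
$conn = @fsockopen('127.0.0.1', 9000, $errno, $errstr, 2);
if ($conn === false) {
    echo "daemon down: $errstr ($errno)\n"; // restart it or send an alert here
} else {
    fclose($conn);
    echo "daemon alive\n";
}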
Plenty of solutions.. Good luck.
If you have your hands on the script, you can just have it write a timestamp to the database every X iterations, and then let a cron job check whether that value is up to date.
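Roughly like this, assuming a MySQL table named heartbeat with name and last_seen columns (all of the names here are made up for illustration):
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Inside the long-running script: refresh the heartbeat every so often.
$pdo->prepare("UPDATE heartbeat SET last_seen = NOW() WHERE name = 'queue_worker'")->execute();

// In the cron job: alert or restart if the heartbeat is older than 5 minutes.
$age = $pdo->query("SELECT TIMESTAMPDIFF(SECOND, last_seen, NOW()) FROM heartbeat WHERE name = 'queue_worker'")
           ->fetchColumn();
if ($age === false || $age > 300) {
    echo "worker looks dead, restart it\n";
}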
troelskn wrote:
Just append a second command after the script. When/if it stops, the second command is invoked. Eg.:
php daemon.php | mail -s "Daemon stopped" you@example.org
This will call mail each time a line is printed in daemon.php (which should be never, but still.)
Instead, use the double ampersand operator to separate the commands, i.e.
php daemon.php && mail -s "Daemon stopped" you@example.org
If you're having trouble checking for the PHP script directly, you can make a trivial wrapper and check for that. I'm not sufficiently familiar with Windows scripting to show how it's done there, but in Bash it'd look like...
wrapper_for_test_php.sh
#!/bin/bash
php test.php
Then you'd just check for the wrapper like you'd check for any other bash script: pidof -x wrapper_for_test_php.sh
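If you want to do that same check from PHP rather than from the shell, a small sketch (using the wrapper name above and pidof's exit code):
<?php
// pidof exits with 0 when at least one matching process is found.
exec('pidof -x wrapper_for_test_php.sh', $out, $code);
echo $code === 0 ? "running\n" : "not running\n";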
I have used cmder for Windows, and based on this script I came up with the following one, which I managed to deploy on Linux later.
#!/bin/bash
clear
date
while true
do
    php -f processEmails.php
    echo "wait a little while for 5 seconds..."
    sleep 5
done
