I am trying to execute a script on a remote server, using the Symfony Process component.
$process = new Process('ssh -q root@remoteserver <path 2 script>');
$process->run();
When I execute this, I get a NULL value.
Please note: the same command works fine when I run it in bash.
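A first thing to check: run() returns the exit code, and getOutput() only contains stdout, so when ssh fails (for example because the web-server user has no key for root@remoteserver) everything useful lands in the error output. A minimal diagnostic sketch, assuming the same command as above:
use Symfony\Component\Process\Process;

$process = new Process('ssh -q root@remoteserver <path 2 script>');
$process->run();

if (!$process->isSuccessful()) {
    // -q silences most ssh warnings; consider dropping it while debugging
    echo 'exit code: ' . $process->getExitCode() . "\n";
    echo $process->getErrorOutput();
} else {
    echo $process->getOutput();
}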
I have a Laravel App with Short Schedule package installed and a crontab to execute it.
Inside the executed method I have a shell_exec() call with a command inside it.
The problem is that when the code runs automatically via cron, the shell_exec() doesn't work. The output is null.
When I run the method directly using php artisan, or by simply running the scheduler manually with php artisan, it works.
To have a clear vision of what's going on:
Inside the crontab -e I have the following
* * * * * php /var/www/html/project/artisan short-schedule:run --lifetime=60
The cron executes this method which has the following code:
$shell_command = '/home/paul/elrondsdk/erdpy --verbose tx new --receiver xxxx --send --pem asdfa.pem --gas-limit 300 --nonce 12';
$output = shell_exec($shell_command);
Log::info('output', [$output]);
If it runs automatically using the cron, the $output is NULL.
If I run the command manually, I get a proper output.
Initially I thought I needed to specify the exact path in the shell_exec call because cron doesn't set up the PATH. Unfortunately that did not solve my problem.
Thanks!
I tried running the command manually, and also running php artisan schedule:run and waiting, and it worked!
It doesn't work ONLY when it is run by cron.
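One thing that helps debug this kind of setup: shell_exec() returns NULL both on failure and when the command prints nothing to stdout, and it never captures stderr. Under cron the environment is minimal (different PATH and HOME), so erdpy may be failing before it prints anything. A hedged sketch that redirects stderr into the captured output, so the cron run logs its own failure reason (the HOME prefix is an assumption, useful if erdpy relies on it):
$shell_command = '/home/paul/elrondsdk/erdpy --verbose tx new --receiver xxxx --send --pem asdfa.pem --gas-limit 300 --nonce 12';

// append 2>&1 so error output is captured instead of silently lost
$output = shell_exec('HOME=/home/paul ' . $shell_command . ' 2>&1');
Log::info('output', [$output]);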
I am using Symfony 4 with the App Engine flex environment. I wrote a Symfony console command which is meant to run long-term. The docs say that GAE has supervisord in place, so I can use it to manage the script. How can I make sure the worker is actually running?
I've created the file additional-supervisord.conf with content:
[program:custom-worker]
command = php bin/console app:my-console-command
stdout_logfile = /dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile = /dev/stderr
stderr_logfile_maxbytes=0
user = www-data
autostart = true
autorestart = true
priority = 5
stopwaitsecs = 20
But I can't see anything in the log, and I don't know whether the command is running properly or not. I also SSH'd into the instance and checked the processes: supervisord is running, but among the php processes I can't see my script running. So I assume it does not work.
How can I check the supervisord logs and track what's going on with the worker? Would appreciate any advice.
Assuming you have stored your additional-supervisord.conf file correctly in the root of your project, give the full path to the console executable inside your config:
[program:custom-worker]
command = php %(ENV_APP_DIR)s/bin/console your:command
Further, you can log the command's stdout & stderr to a file as follows:
[program:custom-worker]
command = /bin/sh -c "php %(ENV_APP_DIR)s/bin/console custom:command 1>%(ENV_APP_DIR)s/app/logs/custom.command.out.log 2>&1"
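On the monitoring half of the question: assuming you can still SSH into the instance, supervisorctl status should list custom-worker with its state and PID, and supervisorctl tail custom-worker shows its recent stdout (supervisorctl tail custom-worker stderr for the error log). Note that with stdout_logfile pointed at /dev/stdout as above, the output goes to the container log rather than a file, so tailing may show nothing; pointing stdout_logfile at a real file while debugging is the simplest check. A program that dies on startup typically shows up in supervisorctl status as FATAL or as a rapid series of restarts.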
file_put_contents('/opt/lampp/htdocs/imslivedec/crontab.txt','0 11 15 1 * /opt/lampp/htdocs/imslivedec/sendmail.php'.PHP_EOL,FILE_APPEND);
shell_exec('crontab /opt/lampp/htdocs/imslivedec/crontab.txt');
When I run this exact same command in the terminal, the new cron job gets created, but when I do it from the PHP script using shell_exec(), the command is not executed. May I know why? Am I doing anything wrong here? shell_exec() works fine when I tried to create a new folder.
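Two hedged things to check. First, shell_exec() returns NULL on failure and discards stderr, so any error message from crontab is being lost; redirecting stderr makes it visible:
// capture crontab's stderr so a failure explains itself
$output = shell_exec('crontab /opt/lampp/htdocs/imslivedec/crontab.txt 2>&1');
var_dump($output);
Second, crontab installs the file for the user PHP runs as, which under Apache/XAMPP is the web-server user rather than your login user, so the job won't appear in your own crontab -l; check with crontab -u <webserver-user> -l as root (XAMPP's Apache commonly runs as daemon, but that's an assumption). Separately, the job line itself will only run if sendmail.php is executable with a PHP shebang, or if it is prefixed with the php binary in the crontab entry.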
I need to execute a long-running Laravel process in the background to consume the Twitter Streaming API. Effectively, the php artisan CLI command I need to run is
nohup php artisan startStreaming > /dev/null 2>&1 &
If I run that myself in the command line it works perfectly.
The idea is that I can click a button on the website which kicks off the stream by executing the long-running artisan command (it needs to run in the background because the Twitter Streaming connection is never-ending). Going via the command line works fine.
Calling the command programmatically, however, doesn't work. I've tried calling it silently via callSilent() from another command, as well as using Symfony\Component\Process\Process to run the artisan command, or a shell script which runs the above command, but I can't figure it out.
Update
If I queue the command which opens the stream connection, it results in a Process Timeout for the queue worker
I effectively need a way to run the above command from a PHP class/script but where the PHP script does not wait for the completion/output of that command.
Help much appreciated
The Symfony Process Component by default will execute the supplied command within the current working directory, getcwd().
The value returned by getcwd() will not be the Laravel install directory (the directory that contains artisan), so the command will most likely return an "artisan: command not found" message.
It isn't mentioned in the Process component documentation, but if you take a look at the class file, you can see that the constructor allows us to provide a working directory as the second parameter:
public function __construct(
    $commandline,
    $cwd = null,
    array $env = null,
    $input = null,
    $timeout = 60,
    array $options = array()
)
You could execute your desired command asynchronously by supplying the second parameter when you initialise the class:
use Symfony\Component\Process\Process;
$process = new Process('php artisan startStreaming > /dev/null 2>&1 &', 'path/to/artisan');
$process->start();
I had the same problem and I solved it with pure PHP, using the proc_open function.
My code:
$descriptionProcOpen = [
    0 => ["pipe", "r"], // stdin: the child process reads from this pipe
    1 => ["pipe", "w"], // stdout: the child process writes to this pipe
    2 => ["pipe", "w"], // stderr: the child process writes to this pipe
];
proc_open("php " . base_path() . "/artisan your:command {$argument}", $descriptionProcOpen, $pipes);
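One hedged caveat with this approach: when the handle returned by proc_open() is cleaned up at the end of the request, PHP waits for the child the same way proc_close() does, so for a long-running artisan command you likely still want to background it and close the pipes yourself:
$process = proc_open(
    "php " . base_path() . "/artisan your:command {$argument} > /dev/null 2>&1 &",
    $descriptionProcOpen,
    $pipes
);

// close the pipes so nothing in this request blocks on the child's output
foreach ($pipes as $pipe) {
    fclose($pipe);
}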
How about queuing the execution of the command via Laravel's built-in queues?
$this->callSilently('mail:send', [
'user' => 1, '--queue' => 'default'
]);
https://laravel.com/docs/9.x/artisan#calling-commands-from-other-commands
I am using this code on Ubuntu 13.04:
$cmd = "sleep 20 &> /dev/null &";
exec($cmd, $output);
But it actually sits there for 20 seconds and waits :/ Usually it works fine when using & to send a process to the background, but on this machine PHP just won't do it :/
What could be causing this??
Try
<?php
$cmd = '/bin/sleep';
$args = array('20');

$pid = pcntl_fork();
if ($pid == 0) {
    // child: start a new session so it detaches from the terminal,
    // then replace this PHP process image with the command;
    // the child becomes a standalone detached process
    posix_setsid();
    pcntl_exec($cmd, $args, $_ENV);
}

// parent returns immediately
echo "DONE\n";
I tested it and it works.
Here you first fork the PHP process and then execute your task in the child.
Or, if the pcntl module is not available, use:
<?php
$cmd = 'sleep 20 &> /dev/null &';
// run through bash explicitly, so the &> redirection and trailing & are
// interpreted; escapeshellarg() is safer than addslashes() for quoting
exec('/bin/bash -c ' . escapeshellarg($cmd));
The REASON this doesn't work as expected is that exec() runs the string you pass it through /bin/sh, not bash, and &> is bash-specific syntax. /bin/sh parses sleep 20 &> /dev/null & differently, so stdout is never actually redirected, and exec() then blocks waiting for the command's output stream to close, which takes the full 20 seconds.
The same applies to any redirection of output: it has to be written in syntax the invoked shell actually understands.
So, you either need to find a way to fork your process (as described above), or run the subprocess through bash so the backgrounding and redirection are interpreted as intended.
My workaround to do this on Ubuntu 13.04, with Apache2 and any version of PHP: libssh2-php. I just ran nohup $cmd & inside a local SSH session opened from PHP, and it ran just fine in the background. Of course this requires putting certain security protocols in place, such as enabling SSH access for the webserver user (so it effectively has exec-like permissions) and then only allowing localhost to log in to that webserver SSH account.
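For reference, a minimal sketch of that workaround using the ssh2 extension's functions; the host, account name, and password here are placeholder assumptions (key-based auth via ssh2_auth_pubkey_file would work as well):
<?php
$cmd = 'sleep 20';

// hypothetical local account set up for the webserver, per the note above
$connection = ssh2_connect('localhost', 22);
ssh2_auth_password($connection, 'www-data', 'secret');

// the remote login shell interprets nohup/& as usual, so this returns at once
$stream = ssh2_exec($connection, 'nohup ' . $cmd . ' > /dev/null 2>&1 &');
fclose($stream);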