PHP running multiple scripts concurrently

I have an array of server objects like this:
Array
(
    [0] => (
            [id] => 1
            [version] => 1
            [server_addr] => 192.168.5.210
            [server_name] => server1
        )

    [1] => (
            [id] => 2
            [server_addr] => 192.168.5.211
            [server_name] => server2
        )
)
By running the code below, I'm able to get the desired output:
foreach ($model as $server) {
    $cpu_usage    = shell_exec('sudo path/to/total_cpu_usage.sh '.$server->server_addr);
    $memory_usage = shell_exec('sudo path/to/total_memory_usage.sh '.$server->server_addr);
    $disk_space   = shell_exec('sudo path/to/disk_space.sh '.$server->server_addr);
    $inode_space  = shell_exec('sudo path/to/inode_space.sh '.$server->server_addr);
    $network      = shell_exec('sudo path/to/network.sh '.$server->server_addr);
    exec('sudo path/to/process.sh '.$server->server_addr, $processString);
    $processArray = array();
    foreach ($processString as $i) {
        $row = explode(" ", preg_replace('/\s+/', ' ', $i));
        array_push($processArray, $row);
    }
    $datetime = shell_exec('sudo path/to/datetime.sh '.$server->server_addr);
    echo $cpu_usage;
    echo $memory_usage;
    echo $disk_space;
    ......
}
My scripts all look similar to this:
#!/bin/bash
if [ "$1" == "" ]
then
    echo "To start monitor, please provide the server ip:"
    read IP
else
    IP=$1
fi
ssh root@$IP "date"
But the whole process takes around 10 seconds for 5 servers, compared to less than 2 seconds for a single server. Why is that? Is there any way to reduce the time? My guess is that each exec/shell_exec call waits for the output to be assigned to the variable before the loop can continue. I tried to google a bit, but most of the answers cover commands that don't return any output at all... I need the output, though.

You can run your scripts simultaneously with popen() and grab the output later with fread().
//execute
foreach ($model as $server) {
    $server->handles = [
        popen('sudo path/to/total_cpu_usage.sh '.$server->server_addr, 'r'),
        popen('sudo path/to/total_memory_usage.sh '.$server->server_addr, 'r'),
        popen('sudo path/to/disk_space.sh '.$server->server_addr, 'r'),
        popen('sudo path/to/inode_space.sh '.$server->server_addr, 'r'),
        popen('sudo path/to/network.sh '.$server->server_addr, 'r'),
    ];
}
//grab and store the output, then close the handles
foreach ($model as $server) {
    $server->cpu_usage   = fread($server->handles[0], 4096);
    $server->mem_usage   = fread($server->handles[1], 4096);
    $server->disk_space  = fread($server->handles[2], 4096);
    $server->inode_space = fread($server->handles[3], 4096);
    $server->network     = fread($server->handles[4], 4096);
    foreach ($server->handles as $h) {
        pclose($h);
    }
}
//print everything
print_r($model);
I tested similar code that executes 5 scripts which each sleep for 2 seconds, and the whole thing took only 2.12 seconds instead of 10.49 seconds with shell_exec().
Update 1: Big thanks to Markus AO for pointing out an optimization potential.
Update 2: Modified the code to remove the possibility of overwrite.
The results are now inside $model.
This can also show which server refused the connection, in case that issue about sshd is affecting you.

On Linux, all you need to do is add > /dev/null & at the end. You won't get the output, but it will run as a background (async) process.
shell_exec('sudo path/to/datetime.sh '.$server->server_addr.' > /dev/null &');
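If you still need the output, one variation (a sketch, assuming the PHP user may write to the system temp directory) is to redirect each command into its own file, let everything run in the background, and read the files back afterwards:
$outfiles = array();
foreach ($model as $server) {
    // one temp file per background command
    $out = tempnam(sys_get_temp_dir(), 'mon_');
    $outfiles[$server->server_addr] = $out;
    shell_exec('sudo path/to/datetime.sh '.$server->server_addr.' > '.$out.' 2>&1 &');
}
sleep(5); // crude: wait long enough for the slowest script, or poll the files instead
foreach ($outfiles as $addr => $out) {
    echo $addr, ': ', file_get_contents($out), PHP_EOL;
    unlink($out);
}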
See also this background-process script from my GitHub (it has Windows-compatible background processes):
https://github.com/ArtisticPhoenix/MISC/blob/master/BgProcess.php
Cheers!

I don't know how to make your logic faster, but I can tell you how I usually track the running time of my scripts. At the beginning of the script put something like $start = date('c');, and at the end simply echo ' start='.$start; echo ' end='.date('c');
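A minimal sketch of that idea (the microtime(true) variant is just my assumption, for sub-second precision):
$start = microtime(true); // or date('c') if whole seconds are enough
// ... the work being measured goes here ...
echo ' start=' . date('c', (int) $start);
echo ' elapsed=' . round(microtime(true) - $start, 2) . 's';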

Yes you're correct: your PHP script is waiting for each response before moving onward.
I presume you're hoping to run the requests to all servers simultaneously, instead of waiting for each server to respond. In that case, assuming you're running a thread-safe version of PHP, look into pthreads. One option is to use cURL multi-exec for making asynchronous requests. Then there's also pcntl_fork that may help you out. Also see this & this thread for possible thread/async approaches.
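For what it's worth, here is a minimal pcntl_fork() sketch of that idea (it assumes the pcntl extension and a CLI run, and uses one of the scripts from the question as a stand-in; an illustration, not a drop-in solution):
<?php
// Fork one child per server, let each child run the shell script and write its
// output to a temp file, then collect everything in the parent.
$servers = array('192.168.5.210', '192.168.5.211'); // stand-ins for $server->server_addr
$results = array();
foreach ($servers as $addr) {
    $tmp = tempnam(sys_get_temp_dir(), 'mon_');
    $results[$addr] = $tmp;
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    }
    if ($pid === 0) { // child: run the script, store the output, and exit
        $out = shell_exec('sudo path/to/total_cpu_usage.sh ' . escapeshellarg($addr));
        file_put_contents($tmp, (string) $out);
        exit(0);
    }
}
while (pcntl_waitpid(-1, $status) > 0) {
    // parent: loop until every child has exited
}
foreach ($results as $addr => $tmp) {
    echo $addr, ' => ', file_get_contents($tmp), PHP_EOL;
    unlink($tmp);
}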
Aside from that, do test and benchmark the shell scripts individually to see where the bottlenecks are, and whether you can speed them up. That may be easier than thread/async setups in PHP. If you have issues with network latency, write an aggregator shell script that executes the other scripts and returns the results in one request, and only call that from your PHP script.

Related

Get all user processes via script run by cron job

I have a cron job that runs every minute and executes processchecker.php. The script processchecker.php then checks the user processes for one whose command line contains the filename backgroundprocess.php.
This works perfectly if I trigger these files manually by visiting their respective URLs.
The problem comes in when I automate the process as a cron job, which for some reason does not return the processes I am looking for. Cron jobs seem to run with no user account, and I suspect I need a way of listing all processes, especially those started by the cron job itself.
processcheck.php
<?php
exec("ps aux", $output, $result);
$found = 0;
foreach ($output as $line) {
    if (strpos($line, "backgroundprocess.php") !== false) {
        $found = $found + 1;
    }
}
if ($found == 0) {
    // service not running, start it all over again
    if (!$pid = shell_exec("nohup php backgroundprocess.php > /dev/null 2>&1 & echo $!")) return false;
} else {
    // service is already running
}
?>
From what I am seeing, exec("ps aux", $output, $result); is not fetching processes started by the cron job itself... and therefore my background process will always be started over and over.
Please note that all of this is on a remote VPS and I am using cPanel.
EDIT
Result is 0
Output is
Array
(
[0] => USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
[1] => rycjptbb 1 0.0 0.0 6324 600 ? SN 14:51 0:00 jailshell (rycjptbb) [init] ell -c nohup php public_html/processchecker.php > /dev/null & echo $!
[2] => rycjptbb 3 1.0 0.0 248940 12308 ? SN 14:51 0:00 php public_html/processchecker.php
[3] => rycjptbb 4 0.0 0.0 110236 1112 ? RN 14:51 0:00 ps aux
)
From what I understand, you just want to check that no other instance of the same process is currently running.
For example, if process1.php runs every minute and a single run can take, let's say, 3 minutes, you don't want another instance to start in the second minute.
If that is your case, you can check whether another process with the same name exists like this:
function is_other_process_exists()
{
    $my_name = 'xxx'; // your process name, e.g. process1.php
    $ps = `ps gax`;
    $psLines = explode("\n", $ps);
    array_pop($psLines);          // drop the trailing empty line
    $myLines = array();
    foreach ($psLines as $psLine) {
        if (strstr($psLine, $my_name)) {
            $myLines[] = $psLine;
        }
    }
    if (count($myLines) > 1) {    // more than just the current process matched
        $myPid = posix_getpid();
        echo "process is already running (my own process id is {$myPid})";
        exit;
    }
}

How can I assign values to my C program with PHP

I'm trying to run a C program that adds two numbers from PHP in a web browser. But when I run the command
exec("gcc name.c -o a & a") it
returns a garbage result like sum is : 8000542.00 and never asks for any input.
I want to feed the inputs to scanf from the browser. Please suggest how I can resolve my problem.
I have tried this, but couldn't get it to work:
$desc = array(0 => array('pipe', 'w'), 1 => array('pipe', 'r'));
$cmd = "C:\xampp\htdocs\add.exe";
$pipes = array();
$p = proc_open($cmd, $desc, $pipes);
if (is_resource($p))
{
    echo stream_get_contents($pipes[0]);
    fclose($pipes[0]);
    $return_value = proc_close($p);
    echo $return_value;
}
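For reference, here is a hedged sketch of what a working call might look like, assuming add.exe reads two numbers with scanf and prints the sum: descriptor 0 is the child's stdin and 1 is its stdout, so the pipe directions are the opposite of the attempt above, and the input has to be written before the output is read.
$desc = array(
    0 => array('pipe', 'r'), // the child reads its stdin from this pipe
    1 => array('pipe', 'w'), // the child writes its stdout to this pipe
    2 => array('pipe', 'w'), // capture stderr as well
);
$cmd = 'C:\xampp\htdocs\add.exe';
$p = proc_open($cmd, $desc, $pipes);
if (is_resource($p)) {
    fwrite($pipes[0], "3 5\n");          // the two numbers scanf is waiting for
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]); // e.g. "sum is : 8.00"
    fclose($pipes[1]);
    fclose($pipes[2]);
    echo proc_close($p);
}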

PHP Script on timer, can't get file to cache

I've been stuck on this for a couple of days now, and I'm really having trouble getting this script to function correctly.
I have a very basic starting script, which outputs a random page of text/html/php every time the page refreshes.
<?php
$pages = array(1 => 'text1-1.php', 2 => 'text1-2.php', 3 => 'text1-3.php', 4 => 'text1-4.php');
$key = array_rand($pages);
include($pages[$key]);
?>
My goal is to have a script that only changes the output every 1 or 2 days (or whatever time I specify), so no matter how many times you refresh the page, the output won't change until the timer expires.
I have tried the following, pieced together from tips people have given me, but no matter what I try, the script outputs something different every time the page is refreshed.
I think the problem is the file isn't caching, but I don't understand why.
If there are any other problems you can see, I would be grateful for some pointers. :)
Thank you for any help you can offer. :)
<?php
$pages = array(1 => 'text1-1.php', 2 => 'text1-2.php', 3 => 'text1-3.php', 4 => 'text1-4.php');
$cachefile = "cache/timer.xml";
$time = $key = null;
$time_expire = 24*60*60;
if (is_file($cachefile)) {
    list($time, $key) = explode(' ', file_get_contents($cachefile));
}
if (!$time || time() - $time > $time_expire) {
    $key = rand(0, count($pages)-1);
    file_put_contents($cachefile, time().' '.$key);
}
include($pages[$key]);
?>
How about this method to generate your random number:
srand(floor(time()/60/60/24/2));
$key = rand(0,count($pages)-1);
It fixes the seed for two days (technically for 48 hours, not necessarily matching two full days) so the first call to rand() always returns the first number based on that seed.
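Put together with the $pages array from the question (note the keys start at 1, so the range is 1..count($pages)), a minimal version might look like this:
<?php
// Seed rand() with the current 48-hour window, so every request in that
// window picks the same page. No cache file needed.
$pages = array(1 => 'text1-1.php', 2 => 'text1-2.php', 3 => 'text1-3.php', 4 => 'text1-4.php');
srand((int) floor(time() / 60 / 60 / 24 / 2)); // same seed for 48 hours
$key = rand(1, count($pages));                 // keys start at 1 in this array
include $pages[$key];
?>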
Have you checked to make sure the file is actually created? Does the "cache" directory exist? Can your web server process write to it? Note that file_put_contents will only issue a WARNING if it cannot create the file; no error will be produced, and the script will appear to run without a problem if your server is set to not show warnings.
I absolutely agree the file is not being written; your code works fine for me.
Without cache/:
Warning: file_put_contents(cache/timer.xml): failed to open stream: No such file or directory in ...
With cache/ and write permissions:
$ php test.php
text1-1.php
$ php test.php
text1-1.php
$ php test.php
text1-1.php
$ php test.php
text1-1.php
$ php test.php
text1-1.php
$ php test.php
text1-1.php
$ php test.php
text1-1.php
$ php test.php
text1-1.php
$ php test.php
text1-1.php
$ php test.php
text1-1.php
Replace
if(!$time || time() - $time > $time_expire) {
With
if (! $time || (time () - $time) > $time_expire) {
Also, mt_rand() is better than rand(); you might want to change that too.
Edit 1
Since your array does not start from 0, you should also replace
$key = rand(0,count($pages)-1);
With
$key = mt_rand( 1, count ( $pages ));
Or
make your array
$pages = array (
0 => 'text1-1.php',
1 => 'text1-2.php',
2 => 'text1-3.php',
3 => 'text1-4.php'
);
I tested your script now... it works perfectly fine. Let me know if you need anything else.
Thanks
:)

How to make PHP list all Linux users?

I want to build a PHP-based site that automates some commands on my Ubuntu server.
The first thing I did was go to the sudoers file and add the user www-data, so I can execute PHP commands with root privileges!
# running the web apps with root power!!!
www-data ALL=(ALL) NOPASSWD: ALL
then my PHP code was
<?php
$command = "cat /etc/passwd | cut -d\":\" -f1";
echo 'running the command: <b>'.$command."</b><br />";
echo exec($command);
?>
It returns only one user (the last user)! How do I make it return all users?
Thank you.
From the PHP manual on exec():
Return Values
The last line from the result of the command. If you need to execute a command and have all the data from the command passed directly back without any interference, use the passthru() function.
To get the output of the executed command, be sure to set and use the output parameter.
So you have to do something similar to this:
<?php
$output = array();
$command = "cat /etc/passwd | cut -d\":\" -f1";
echo 'running the command: <b>'.$command."</b><br />";
exec($command, $output);
echo implode("<br />\n", $output);
?>
As @benjamin explains, there is no need to be root or sudo, and no need for SUID.
Just pure PHP. I used the field names from posix_getpwnam.
function getUsers() {
    $result = [];
    /** @see http://php.net/manual/en/function.posix-getpwnam.php */
    $keys = ['name', 'passwd', 'uid', 'gid', 'gecos', 'dir', 'shell'];
    $handle = fopen('/etc/passwd', 'r');
    if (!$handle) {
        throw new \RuntimeException("failed to open /etc/passwd for reading! ".print_r(error_get_last(), true));
    }
    while (($values = fgetcsv($handle, 1000, ':')) !== false) {
        $result[] = array_combine($keys, $values);
    }
    fclose($handle);
    return $result;
}
It returns an array containing all users, formatted like this:
[
[
'name' => 'root',
'passwd' => 'x',
'uid' => '0',
'gid' => '0',
'gecos' => 'root',
'dir' => '/root',
'shell' => '/bin/bash',
],
[
'name' => 'daemon',
'passwd' => 'x',
'uid' => '1',
'gid' => '1',
'gecos' => 'daemon',
'dir' => '/usr/sbin',
'shell' => '/usr/sbin/nologin',
],
...
]
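For example, to print only the login names:
foreach (getUsers() as $user) {
    echo $user['name'], PHP_EOL;
}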
As Matt S said, allowing www-data root access on your server is an incredibly bad idea. The slightest compromise of your web applications could allow anyone full control of your system.
A better idea would be to write separate scripts for specific actions and then use SUID permissions. That way a specific user (in this case, www-data) can make small, well-defined changes to the system through those scripts. Still not a good idea, though. You may be able to work around it with suPHP, but security remains a major concern.
/etc/passwd is readable by anyone, so you should be able to execute your command without having any special rights (unless PHP prevents it?).

linux worker script/queue (php)

I need a binary/script (php) that does the following.
Start n processes of X in the background and maintain that number of processes.
An example:
n = 50
initially 50 processes are started
a process exits
49 are still running
so 1 should be started again.
P.S.: I posted the same question on SV, which probably makes me very unpopular.
Can you use a Linux crontab and write the number of currently running processes to a DB or file?
If a DB, the advantage is that you can use a procedure to lock the table and write the number of processes.
But to background the script you should use & at the end of the call:
# php -f pro.php &
Pseudocode:
for (i=1; i<=50; i++)
    myprocess
endfor
while true
    while ( $(ps --no-headers -C myprocess|wc -l) < 50 )
        myprocess
    endwhile
endwhile
If you translate this to php and fix its flaws, it might just do what you want.
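A rough PHP translation of that pseudocode might look like this (a sketch: 'myprocess' stands in for the actual worker command, and the redirection/backgrounding is my addition so that exec() returns immediately):
<?php
$target = 50;
$cmd    = 'myprocess';
for ($i = 0; $i < $target; $i++) {
    exec($cmd . ' > /dev/null 2>&1 &');     // launch the initial pool in the background
}
while (true) {
    $running = (int) trim(shell_exec("ps --no-headers -C $cmd | wc -l"));
    for ($i = $running; $i < $target; $i++) {
        exec($cmd . ' > /dev/null 2>&1 &'); // top the pool back up to $target
    }
    sleep(1);                               // avoid a busy loop
}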
I would go in the direction that andres suggested. Just put something like this at the top of your pro.php file...
$this_file = __FILE__;
$final_count = 50;
$processes = `ps auwx | grep "php -f $this_file"`;
$processes = explode("\n", $processes);
if (count($processes) > $final_count + 3) {
    exit;
}
//... Remaining code goes here
Have you tried making a PHP Daemon before?
http://kevin.vanzonneveld.net/techblog/article/create_daemons_in_php/
Here's something in Perl I have in my library (and hey, let's be honest, I'm not going to rig this up in PHP just to give you something working in that language this moment. I'm just using what I can copy / paste).
#!/usr/bin/perl
use threads;
use Thread::Queue;

my @workers;
my $num_threads = shift;
my $dbname = shift;
my $queue = new Thread::Queue;

for (0..$num_threads-1) {
    $workers[$_] = new threads(\&worker);
    print "TEST!\n";
}

while ($_ = shift @ARGV) {
    $queue->enqueue($_);
}

sub worker() {
    while ($file = $queue->dequeue) {
        system('./4parser.pl', $dbname, $file);
    }
}

for (0..$num_threads-1) { $queue->enqueue(undef); }
for (0..$num_threads-1) { $workers[$_]->join; }
Whenever one of those system calls finishes up, it moves on to dequeuing. Oh, and damned if I know why I did 0..$num_threads-1 instead of the normal my $i = 0; $i < ... idiom, but I did it that way that time.
I have two solutions to propose. Both restart a child process on exit, reload child processes on the USR1 signal, wait for the children to exit on SIGTERM, and so on.
The first is based on the swoole PHP extension. It is very performant, async, and non-blocking. Here's the usage example code:
<?php
use Symfony\Component\Process\PhpExecutableFinder;
require_once __DIR__.'/../vendor/autoload.php';
$phpBin = (new PhpExecutableFinder)->find();
if (false === $phpBin) {
    throw new \LogicException('Php executable could not be found');
}
$daemon = new \App\Infra\Swoole\Daemon();
$daemon->addWorker(1, $phpBin, [__DIR__ . '/console', 'quartz:scheduler', '-vvv']);
$daemon->addWorker(3, $phpBin, [__DIR__ . '/console', 'enqueue:consume', '--setup-broker', '-vvv']);
$daemon->run();
The daemon code is here
The other is based on the Symfony Process library. It does not require any extra extensions. The usage example and daemon code can be found here.
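As an illustration of that second approach, a minimal supervisor loop with symfony/process might look like this (a sketch, assuming composer require symfony/process and a placeholder worker.php):
<?php
use Symfony\Component\Process\Process;
require __DIR__ . '/vendor/autoload.php';

$target  = 5;                     // number of workers to maintain
$command = ['php', 'worker.php']; // placeholder worker script
$workers = [];
while (true) {
    // drop the processes that have finished
    $workers = array_filter($workers, function (Process $p) {
        return $p->isRunning();
    });
    // start replacements until we are back at $target
    while (count($workers) < $target) {
        $p = new Process($command);
        $p->start();
        $workers[] = $p;
    }
    usleep(200000); // check five times a second
}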
