PHP not executing shell script and overloading Apache

I want to execute my book generator script from PHP.
I tried running the script in my Linux terminal and everything worked fine:
sudo /home/repoadmin/apple/arduino.raamatu.generaator.sh
I used an execute function from Stack Overflow to see what it echoes while executing (I also tried plain exec() and shell_exec()):
function liveExecuteCommand($cmd)
{
    while (@ob_end_flush()); // end all output buffers if any
    $proc = popen("$cmd 2>&1 ; echo Exit status : $?", 'r');
    $live_output = "";
    $complete_output = "";
    while (!feof($proc))
    {
        $live_output = fread($proc, 4096);
        $complete_output = $complete_output . $live_output;
        echo "$live_output";
        @flush();
    }
    pclose($proc);
    // get exit status
    preg_match('/[0-9]+$/', $complete_output, $matches);
    // return exit status and intended output
    return array(
        'exit_status' => intval($matches[0]),
        'output' => str_replace("Exit status : " . $matches[0], '', $complete_output)
    );
}
define('__FILEDIR__','/home/repoadmin/apple');
chdir(__FILEDIR__);
readdir(__FILEDIR__);
liveExecuteCommand("sudo /home/repoadmin/apple/arduino.raamatu.generaator.sh");
closedir(__FILEDIR__);
I added www-data ALL=NOPASSWD:/home/repoadmin/apple/arduino.raamatu.generaator.sh to /etc/sudoers so the script can be run with sudo.
My Apache server freezes when I execute this PHP script. When I restart Apache, the output I get is this:
/home/repoadmin/apple/arduino/arduino_raamat.pdf Could not open input file: /home/www/xxx/xxx/bookcreator.php?lang=et&toc=arduinotoc&book=book --2017-07-14 15:34:11-- http://xxx.xxx.xx/bookcreator.php Resolving xxx.xxx.xx (xxx.xxx.xx)... 195.xxx.xxx.xxx Connecting to xxx.xxx.xx (xxx.xxx.xx)|195.xxx.xxx.xxx|:80... connected. HTTP request sent, awaiting response... 200 OK Length: unspecified [text/html] Saving to: '/home/repoadmin/apple/bookcreator_output.html' 0K .
What could be the problem and how can I fix it?

Related

Printing out one line to screen from popen in PHP?

I'm making a batch megadownload site in PHP. I parse the input, validate it, then pass it to a bash script. I pipe the output to the web page, but the page's contents look like this.
8lX0JsBi.part01.rar: 0.00% - 0 bytes of 1000.0 MiB
8lX0JsBi.part01.rar: 0.09% - 917.7 KiB (939752 bytes) of 1000.0 MiB (916.7 KiB/s)
8lX0JsBi.part01.rar: 0.63% - 6.3 MiB (6566240 bytes) of 1000.0 MiB (5.3 MiB/s)
8lX0JsBi.part01.rar: 1.38% - 13.8 MiB (14430560 bytes) of 1000.0 MiB (7.0 MiB/s)
8lX0JsBi.part01.rar: 2.30% - 23.0 MiB (24129888 bytes) of 1000.0 MiB (9.0 MiB/s)
I'd like it to show only a single line that updates in place, like this:
8lX0JsBi.part01.rar: 2.30% - 23.0 MiB (24129888 bytes) of 1000.0 MiB (9.0 MiB/s)
Here is the function I'm using to pipe stdout to the webpage.
<?php
/**
 * Execute the given command, displaying console output live to the user.
 * @param  string $cmd command to be executed
 * @return array       exit_status : exit status of the executed command
 *                     output      : console output of the executed command
 */
function liveExecuteCommand($cmd)
{
    while (@ob_end_flush()); // end all output buffers if any
    $proc = popen("$cmd 2>&1 ; echo Exit status : $?", 'r');
    $live_output = "";
    $complete_output = "";
    while (!feof($proc))
    {
        $live_output = fread($proc, 4096);
        $complete_output = $complete_output . $live_output;
        echo "$live_output";
        @flush();
    }
    pclose($proc);
    // get exit status
    preg_match('/[0-9]+$/', $complete_output, $matches);
    // return exit status and intended output
    return array(
        'exit_status' => intval($matches[0]),
        'output' => str_replace("Exit status : " . $matches[0], '', $complete_output)
    );
}
?>
You have a few options:
Using clear, though this will clear the entire console.
system('clear');
echo '5%';
sleep(5);
system('clear');
echo '10%';
Or using \r, which will overwrite the last printed line.
echo "5%";
sleep(5);
echo "\r10%";
You can also use cursor manipulation and character counting, or the output buffer, but I think that would be a bit much for this.
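For illustration, here is a minimal CLI sketch of the \r approach (the label and progress values are made up); in a terminal the same line is rewritten on each update:
<?php
// Sketch of the "\r" technique: keep rewriting one line instead of
// printing a new line for every progress update.
for ($pct = 0; $pct <= 100; $pct += 10) {
    // "\r" moves the cursor back to the start of the line; the trailing
    // spaces make sure a shorter string fully overwrites a longer one.
    printf("\rDownloading: %3d%% complete   ", $pct);
    usleep(200000); // stand-in for real work
}
echo PHP_EOL;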

sudo_exec returns nothing

I'm trying to ping www.google.de with shell_exec and store the result in a variable, but I get nothing back from shell_exec.
<?php
$ping = 'sudo ping -c 4 ';
$url = 'www.google.de';
$command = $ping . $url;
$ping_result = shell_exec($command);
$datei = fopen("/var/www/myProject/result_ping", "w") or die("Could not open file!");
sleep(10);
if ($datei == false)
{
    $ping_result = "Cannot open file!";
}
else
{
    fwrite($datei, $ping_result);
    fclose($datei);
}
echo $command;     // Output: sudo ping -c 4 www.google.de
echo $ping_result; // Output: nothing
?>
The file result_ping has full permissions (chmod 777).
Maybe the webserver is not allowed to execute ping?
Add 2>&1 to your command to make sure you're not missing an error message that shell_exec would otherwise discard:
$command = $ping . $url . ' 2>&1';
shell_exec will return NULL in case of error. With that modification you redirect any error message to normal output, forcing shell_exec to return every message you would normally see in a console session.
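As a rough sketch of how both points combine (command and hostname taken from the question):
<?php
// Sketch: merge stderr into stdout with 2>&1 and distinguish
// "the command printed an error" from "shell_exec failed entirely".
$command = 'sudo ping -c 4 www.google.de 2>&1';
$ping_result = shell_exec($command);

if ($ping_result === null) {
    // shell_exec could not run the command at all
    echo "shell_exec failed\n";
} else {
    // now includes any error text the command printed
    echo $ping_result;
}
?>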

Impose time limit to popen/fgets in PHP

I want to impose a time limit on a process that is opened with popen and read with fgets in PHP.
I have the following code:
$handle = popen("tail -F -n 30 /tmp/pushlog.txt 2>&1", "r");
while (!feof($handle)) {
    $buffer = fgets($handle);
    echo "data: " . $buffer . "\n";
    @ob_flush();
    flush();
}
pclose($handle);
I tried without success:
set_time_limit(60);
ignore_user_abort(false);
The process is as follows:
1. The browser sends a GET request and waits for an answer in HTML5 server-sent events format.
2. The request is received by the AWS load balancer and forwarded to an EC2 instance.
3. The answer is the last 30 lines of the file.
4. The browser receives them as 30 messages and the connection stays open.
5. If the tail command outputs a new line it is forwarded; otherwise fgets waits indefinitely until tail produces a new line.
6. After 60 seconds of network inactivity (no new lines in 60 seconds) the AWS load balancer closes the connection to the browser. The connection to the EC2 instance is not closed.
7. The browser detects that the connection is closed, opens a new one, and the process goes back to step 1.
As these steps describe, the connection between the AWS load balancer and the EC2 instance is never closed, so after a few hours or days there are hundreds and hundreds of tail and httpd processes running and the server stops answering.
It does look like an AWS load balancer bug, but I don't want to open a case with Amazon and wait for a fix.
My temporary solution is to kill the tail processes with sudo kill before the server becomes unstable.
I think PHP doesn't stop the script because PHP is "blocked" waiting for fgets to finish.
I know the AWS load balancer idle timeout is configurable, but I want to keep the default value; even a higher limit would not fix the problem.
I don't know whether I should change the question to "How to execute a process in Linux with a time limit / timeout?".
PHP 5.5.22 / Apache 2.4 / Linux Kernel 3.14.35-28.38.amzn1.x86_64
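As a rough illustration of that "time limit / timeout" idea (a sketch only, assuming GNU coreutils' timeout is available on the instance), the tail process itself can be given a hard lifetime slightly above the load balancer's 60-second window:
$handle = popen("timeout 70 tail -F -n 30 /tmp/pushlog.txt 2>&1", "r");
while (!feof($handle)) {
    $buffer = fgets($handle);
    echo "data: " . $buffer . "\n";
    @ob_flush();
    flush();
}
// timeout kills tail after 70 seconds, so feof() eventually returns true
// and the script exits instead of blocking in fgets() forever.
pclose($handle);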
Tested with PHP 5.5.20:
//Change configuration.
set_time_limit(0);
ignore_user_abort(true);

//Open pipe & set non-blocking mode.
$descriptors = array(0 => array('file', '/dev/null', 'r'),
                     1 => array('pipe', 'w'),
                     2 => array('file', '/dev/null', 'w'));
$process = proc_open('exec tail -F -n 30 /tmp/pushlog.txt 2>&1',
                     $descriptors, $pipes, NULL, NULL) or exit;
$stream = $pipes[1];
stream_set_blocking($stream, 0);

//Call stream_select with a 10 second timeout.
$read = array($stream); $write = NULL; $except = NULL;
while (!feof($stream) && !connection_aborted()
       && stream_select($read, $write, $except, 10)) {
    //Print out all the lines we can.
    while (($buffer = fgets($stream)) !== FALSE) {
        echo 'data: ' . $buffer . "\n";
        @ob_flush();
        flush();
    }
}

//Clean up.
fclose($stream);
$status = proc_get_status($process);
if ($status !== FALSE && $status['running'] === TRUE)
    proc_terminate($process);
proc_close($process);
Rather than using a process file pointer, I went with my "multitasking" approach. I use this code to spawn other "processes"; it's kind of a multitasking cheat.
I call a script, hang.php, that just hangs for 90 seconds: sleep(90).
You may want to adjust the stream and stream_select timeouts.
Create stream(s)
header('Content-Type: text/plain; charset=utf-8');
$timeout = 20;
$result = array();
$sockets = array();
$buffer_size = 8192;
$id = 0;
$stream = stream_socket_client("ispeedlink.com:80", $errno, $errstr, $timeout,
          STREAM_CLIENT_ASYNC_CONNECT|STREAM_CLIENT_CONNECT);
if ($stream) {
    $sockets[$id++] = $stream;  // supports multiple sockets
    $http = "GET /testbed/hang.php HTTP/1.0\r\nHost: ispeedlink.com\r\n\r\n";
    fwrite($stream, $http);
}
else {
    echo "$id Failed\n";
}
Additional scripts can be run by adding another stream: $sockets[$id++] = $stream;
The loop below will put anything that is read into the $result[$id] array.
Monitor the streams:
while (count($sockets)) {
    $read = $sockets;
    stream_select($read, $write = NULL, $except = NULL, $timeout);
    if (count($read)) {
        foreach ($read as $r) {
            $id = array_search($r, $sockets);
            $data = fread($r, $buffer_size);
            if (strlen($data) == 0) { // either reads data or EOF
                echo "$id Closed: " . date('h:i:s') . "\n\n\n";
                fclose($r);
                unset($sockets[$id]);
            }
            else {
                $result[$id] .= $data;
            }
        }
    }
    else {
        echo 'Timeout: ' . date('h:i:s') . "\n\n\n";
        break;
    }
}
echo system('ps auxww');
When I want to kill a process I use system('ps auxww') to get the pid, then kill it with system("kill $pid").
kill.php
header('Content-Type: text/plain; charset=utf-8');
//system('kill 220613');
echo system('ps auxww');

Run shell command asynchronously using PHP and node

I need to execute a shell program that runs a rather long process, and I don't want to wait until that process has ended before my PHP script carries on. So far I have tried:
1: Pure PHP
exec("longCommand &");
2: Node and PHP
exec("/usr/local/bin/node nodeLauncher.js &");
Node:
var spawn = require('child_process').spawn,
proc = spawn('longCommand', ['&']);
console.log('return');
In both cases the script carries on only after "longCommand" has returned. Am I doing something wrong?
From PHP's page on exec():
If a program is started with this function, in order for it to
continue running in the background, the output of the program must be
redirected to a file or another output stream. Failing to do so will
cause PHP to hang until the execution of the program ends.
That means, unless you direct the output to a file, exec() is blocking and will pause execution of your PHP script until the command you issued exits.
You can redirect the output to a file, or if you don't care about the output, redirect it to /dev/null.
Finally, yet another alternative could be to fork a new PHP process and exec the command from there. You can fork a new process using pcntl_fork.
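A minimal sketch of both approaches (longCommand is the placeholder from the question; pcntl_fork requires the pcntl extension and a CLI context):
<?php
// 1) Redirect all output so exec() returns immediately instead of waiting.
exec('longCommand > /dev/null 2>&1 &');

// 2) Or fork: the child runs the long command, the parent carries on.
$pid = pcntl_fork();
if ($pid === 0) {
    exec('longCommand > /dev/null 2>&1');
    exit(0); // child is done
}
// parent continues here without waiting for the child
?>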
For Node, try passing the detached option:
var spawn = require('child_process').spawn,
proc = spawn('longCommand', ['&'], { detached: true } );
Node documentation on spawn
Although the filenames used here may seem weird, have a look at my working prototype of the raw code below. I can't post the other parts because they contain my private DB passwords.
LINK: http://affiliateproductpromotions.net/sml1r.php
<?php
if (isset($_GET['y']))
    $y = false;
else
    $y = true;

if (isset($_GET['count']))
{
    echo getCount($_GET['f'], $y);
    exit;
}

if (isset($_GET['stop']) && $_GET['stop'] == 'true')
{
    $fr = fopen("huhu.txt", "w");
    fwrite($fr, "<script>document.getElementById('send').disabled=false;document.getElementById('stop').disabled=true;document.getElementById('process').innerHTML='<b style=color:GREY>Current Status: Stopped!</b>';document.getElementById('stop').style='width:90px;color:LIGHTYELLOW;background-color:GREY';document.getElementById('send').style='width:90px;color:LIGHTYELLOW;background-color:BLUE';</script>");
    fclose($fr);
    include('../semail/killexec.php');
    sleep(2);
    //exit;
}
else
{
    header("Connection: close");
    ignore_user_abort(); // optional
    ob_start();
    echo ('Text the user will see');
    $size = ob_get_length();
    header("Content-Length: $size");

    function run_in_background($Command, $Priority = 0)
    {
        if ($Priority)
            $PID = shell_exec("nohup nice -n $Priority $Command > /dev/null 2>&1 & echo $!");
        else
            $PID = shell_exec("nohup $Command > /dev/null 2>&1 & echo $!");
        return ($PID);
    }

    function is_process_running($PID)
    {
        exec("ps $PID", $ProcessState);
        return (count($ProcessState) >= 2);
    }

    //ob_end_clean();
    echo("Running hmmsearch. . .");
    $ps = run_in_background("hmmsearch $hmmfile $fastafile > $outfile");
    $fpf = fopen("pid.txt", "w");
    fwrite($fpf, exec('ps ' . $ps));
    fclose($fpf);

    while ($i <= getCount())
    {
        $fp2 = fopen("sent1email.txt", "w");
        fwrite($fp2, getEmailSent($i));
        fclose($fp2);
        $fp = fopen("haha.txt", "w");
        fwrite($fp, "$i\n");
        // echo("<br> [ ".$i++." ] ");
        // ob_flush(); flush();
        $i++;
        sleep(2);
        if ($i == getCount())
        {
            $fr = fopen("huhu.txt", "w");
            fwrite($fr, "<script>document.getElementById('send').disabled=false;document.getElementById('stop').disabled=true;document.getElementById('process').innerHTML='<b style=color:GREY>Current Status: Finished Sending!</b>';document.getElementById('stop').style='width:90px;color:LIGHTYELLOW;background-color:GREY';document.getElementById('send').style='width:90px;color:LIGHTYELLOW;background-color:BLUE';</script>");
            fclose($fr);
            sleep(1);
            include('../semail/killexec.php');
        }
        if ($i < getCount())
        {
            $fr = fopen("huhu.txt", "w");
            fwrite($fr, "<script>document.getElementById('send').disabled=true;document.getElementById('stop').disabled=false;document.getElementById('process').innerHTML='<b style=color:GREY>Current Status: Sending...</b>';document.getElementById('send').style='width:90px;color:LIGHTYELLOW;background-color:GREY';document.getElementById('stop').style='width:90px;color:LIGHTYELLOW;background-color:RED';</script>");
            fclose($fr);
            sleep(2);
        }
    }
    fclose($fp);
    //sleep(1);
    ob_end_flush(); // <-- this trash will not work
    flush(); // <--- if this garbage dont exist
    sleep(5); // <-- but dont worry, a collector is here...
}
?>

Error Reporting? Import .sql file from PHP and show errors

I am building a way of importing .SQL files into a MySQL database from PHP. This is used for executing batches of queries. The issue I am having is error reporting.
$command = "mysql -u $dbuser --password='$dbpassword' --host='$sqlhost' $dbname < $file";
exec($command, $output);
This is essentially how I am importing my .sql file into my database. The issue is that I have no way of knowing if any errors occurred within the PHP script executing this command. Successful imports are entirely indistinguishable from a failure.
I have tried:
Using PHP's SQL error reporting functions.
Adding the verbose argument to the command and examining the output. It simply returns the contents of the .sql file and that is all.
Setting errors to a user variable within the .sql file and querying it from the PHP script.
I hope I am not forced to write the errors into a temporary table. Is there a better way?
UPDATE:
If possible, I would much prefer to determine WHAT errors occurred, not simply IF one occurred.
$command = "mysql -u $dbuser --password='$dbpassword' --host='$sqlhost' $dbname"
. " < $file 2>&1";
exec($command, $output);
The error message you're looking for is probably printed to stderr rather than stdout. 2>&1 causes stderr to be included in stdout, and as a result, also included in $output.
Even better, use proc_open instead of exec, which gives you far more control over the process, including separate stdout and stderr pipes.
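For instance, a sketch of that proc_open variant (reusing the variables from the question) could look like this:
<?php
// Sketch: run the mysql client with the .sql file on stdin and capture
// stdout and stderr separately, plus the exit code.
$descriptors = array(
    0 => array('file', $file, 'r'),  // stdin: the .sql file
    1 => array('pipe', 'w'),         // stdout
    2 => array('pipe', 'w'),         // stderr
);
$cmd = "mysql -u $dbuser --password='$dbpassword' --host='$sqlhost' $dbname";
$process = proc_open($cmd, $descriptors, $pipes);

if (is_resource($process)) {
    $stdout = stream_get_contents($pipes[1]);
    $stderr = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exit_code = proc_close($process);

    if ($exit_code !== 0 || $stderr !== '') {
        echo "Import failed:" . PHP_EOL . $stderr;
    }
}
?>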
Try using shell_exec
$output = shell_exec( "mysql -u $dbuser --password='$dbpassword' --host='$sqlhost' $dbname < $file" );
// parse $output here for errors
From the manual:
shell_exec — Execute command via shell and return the complete output as a string
Note:
This function is disabled when PHP is running in safe mode.
EDIT: Full solution:
What you need to do is grab STDERR and discard STDOUT. Do this by adding 2>&1 1> /dev/null to the end of your command.
$output = shell_exec( "mysql -u $dbuser --password='$dbpassword' --host='$sqlhost' $dbname < $file 2>&1 1> /dev/null" );
$lines = explode( PHP_EOL, $output );
$errors = array();
foreach ( $lines as $line )
{
    if ( strtolower( substr( $line, 0, 5 ) ) == 'error' )
    {
        $errors[] = $line;
    }
}
if ( count( $errors ) )
{
    echo PHP_EOL . 'Errors occurred during import.';
    echo implode( PHP_EOL, $errors );
}
else
{
    echo 'No Errors' . PHP_EOL;
}
When a command finishes, the shell reports an exit status: 0 on success, or a non-zero number indicating a failure. In PHP you get that status through exec's third parameter:
exec( $command, $output, $result );
should do the trick. Check $result and handle it appropriately.
You have done everything but look at the PHP manual! There is an additional parameter for exec that returns the result status:
http://php.net/manual/en/function.exec.php
"If the return_var argument is present along with the output argument, then the return status of the executed command will be written to this variable."
exec($command, $output, $result);
if ($result === 0) {
    // success
} else {
    // failure
}
