I use phpseclib to SSH into my remote server from a web browser and execute a php file. Below is the code I use:
$ssh->exec('cd myfolder/; php main.php ' . $file, 'packet_handler');
function packet_handler() {
    echo "Completed";
    header("Location: exec_completed.php");
}
The main.php file gets executed without any issue. The problem is with returning the data after execution. I have the following questions:
I do a lot of processing in the main.php file and I need to show real-time progress of what the script does. When I execute the file through exec, only the first echo in main.php is printed and the execution stops. Is there any way to get real-time data from the executing script?
I followed the example below from phpseclib for callbacks, although my callback function packet_handler doesn't run after exec is executed. I want to redirect to another page once the main.php I execute through SSH has completed its execution. If I redirect to that page now, I get only partial results, because main.php has not finished executing. I tried using sleep(10), but main.php may sometimes take longer to run, so that didn't work. Please suggest any ideas.
Callback example from phpseclib:
<?php
include('Net/SSH2.php');

$ssh = new Net_SSH2('www.domain.tld');
if (!$ssh->login('username', 'password')) {
    exit('Login Failed');
}

function packet_handler($str)
{
    echo $str;
}

$ssh->exec('ping 127.0.0.1', 'packet_handler');
?>
For #1... doing $ssh->exec without the callback should work. If it doesn't I'd need to see the logs, which you can get by doing define('NET_SSH2_LOGGING', 2) at the top and then echo $ssh->getLog() afterwards. Posting the log at pastebin.com and then posting a link would be good. But that said, that won't get you real-time output either.
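For reference, here is a rough sketch of where those logging calls would sit; the 2 corresponds to phpseclib's NET_SSH2_LOG_COMPLEX constant, and the rest is the same setup as in the question:
<?php
// define the logging level before any SSH traffic happens
define('NET_SSH2_LOGGING', 2); // 2 == NET_SSH2_LOG_COMPLEX
include('Net/SSH2.php');

$ssh = new Net_SSH2('www.domain.tld');
if (!$ssh->login('username', 'password')) {
    exit('Login Failed');
}

$ssh->exec('cd myfolder/; php main.php ' . $file);
echo $ssh->getLog(); // this is the log to paste at pastebin.com
?>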
For #2... the callback function is mainly intended for real-time updates, and each call of the callback will most likely receive an incomplete chunk of the output. So having your callback output "Completed" and redirect the user to another location is, in all likelihood, incorrect.
Another approach that may work for you: use the interactive shell. Example:
http://phpseclib.sourceforge.net/ssh/examples.html#sudo
I don't know what your output is like. Maybe you could read() until you reach parts of the output that are guaranteed to appear. Or maybe you could use $ssh->setTimeout(5) and get updated output every five seconds or something. A rough sketch of that idea follows.
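This is only a sketch and not tested; it assumes you change main.php to print a marker line such as MAIN_DONE when it finishes, so the loop knows when to stop reading:
// $ssh is the already logged-in Net_SSH2 object from the question
$ssh->setTimeout(5); // read() returns after at most five seconds
$ssh->write("cd myfolder/; php main.php " . $file . "\n");

$output = '';
do {
    $chunk = $ssh->read();   // whatever the remote side has printed so far
    echo $chunk;
    flush();                 // push the partial output to the browser
    $output .= $chunk;
} while (strpos($output, 'MAIN_DONE') === false);

// main.php has finished; now you know it is safe to move on
// (e.g. via a JavaScript redirect, since headers are already sent)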
Related
I'm trying to find a way to echo the output of an exec call and flush it to the screen while the process is running. I have written a simple PHP script which accepts a file upload and then, if it is not the appropriate file type, converts the file using FFMPEG. I am doing this on a Windows machine. Currently my command looks like this:
$cmd = "ffmpeg.exe -i ..\..\uploads\\".$filename." ..\..\uploads\\".$filename.".m4v 2>&1";
exec( $cmd, $output);
I need something like this:
while( $output ) {
print_r( $output);
ob_flush(); flush();
}
I've read about using ob_flush() and flush() to clear the output buffer, but I only get output once the process has completed. The command works perfectly; it just doesn't update the page while converting. I'd like to have some output so the person knows what's going on.
I've set the time out:
set_time_limit( 10 * 60 ); // 10 minute time out
and would be very grateful if someone could point me in the right direction. I've looked at a number of solutions that come close on Stack Overflow, but none seem to have worked.
Since the exec call is a blocking call, you have no way of using buffers to get status.
Instead you could redirect the output of the system call to a log file. Let the client query the server for progress updates, in which case the server can parse the last lines of the log file to get information about the current progress and send it back to the client.
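A rough sketch of that idea, with hypothetical file names. The request that runs the conversion still blocks, but a second request can read the log in the meantime (ffmpeg's progress output is not strictly line-based, so treat this as an illustration):
// convert.php -- run ffmpeg and send everything it prints to a log file
exec("ffmpeg.exe -i input.avi output.m4v > conversion.log 2>&1");

// progress.php -- polled by the browser in a separate request;
// returns the last line of the log so far
$lines = @file('conversion.log', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
echo $lines ? end($lines) : 'starting...';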
exec() is a blocking call and will NOT return control to PHP until the external program has terminated. That means you cannot do anything to dump the output on a line-by-line basis, because PHP is suspended while the external app is running.
For what you want, you need to use proc_open(), which gives you pipes you can read from in a loop, e.g.
// proc_open() returns a process resource; its stdout is read via a pipe
$process = proc_open('.....', array(1 => array('pipe', 'w')), $pipes);
while ($line = fgets($pipes[1])) {
    print($line);
    flush();
}
proc_close($process);
There are two problems with this approach:
The first is that, as @Marc B notes, exec will block until it's finished. You'll have to devise some way of measuring progress.
The second is that using ob_flush() in this way amounts to holding the connection between server and client open and dribbling the data out a little at a time. This is not something the HTTP protocol was designed for, and while it might work sometimes, it's not going to work consistently: different browsers and different servers will time out differently. The better way to do it is via AJAX calls: using JavaScript's setTimeout() function (or setInterval()), make a call to the server periodically and have the server send back a progress report.
I am building a web service using PHP:
Basically,
The user sends a request to the server via HTTP (i.e. 'request.php').
The server starts PHP code asynchronously (i.e. 'update.php').
The connection with the user is finished.
The code 'update.php' is still running, and will finish after some time.
The code 'update.php' is finished.
The problem is getting PHP to run some external code asynchronously.
Is that possible? Is there another way to do it? With shell_exec?
Please, I need insights! An elegant way is preferable.
Thank you!
The best approach is using a message queue like RabbitMQ, or even a simple MySQL table.
Each time you add a new task in the front controller it goes into the queue. Then update.php, run by a cron job, fetches it from the queue, processes it, saves the results and marks the task as finished.
This will also help you distribute load over time, preventing a DoS caused by your own script.
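A minimal sketch of the MySQL-table variant; the table and column names are made up, and update.php is the script cron would run every minute:
// request.php -- queue the work instead of doing it inline
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO job_queue (payload, status) VALUES (?, 'pending')");
$stmt->execute(array(json_encode($payload))); // $payload is whatever the task needs

// update.php -- run by cron, e.g. * * * * * php /path/to/update.php
// (create $pdo the same way as above)
$job = $pdo->query("SELECT * FROM job_queue WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();
if ($job) {
    // ... do the long-running work described by $job['payload'] here ...
    $pdo->prepare("UPDATE job_queue SET status = 'done' WHERE id = ?")
        ->execute(array($job['id']));
}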
You could have the user connect to update.php, generate some sort of unique ID to keep track of the process, and then call fsockopen() on itself with a special GET variable to signify that it's doing the heavy lifting rather than user interaction. Close that connection immediately, and then print out the appropriate response to the user.
Meanwhile, look for the special GET variable you specified, and when present call ignore_user_abort() and proceed with whatever operations you need in that branch of the if clause. So here's a rough skeleton of what your update.php file would look like:
<?php
if ( isset($_GET['asynch']) ) {
    ignore_user_abort();
    // check for $_GET['id'] and validate,
    // then execute long-running code here
} else {
    // generate $id here
    $host = $_SERVER['SERVER_NAME'];
    $url  = "/update.php?asynch&id={$id}";
    if ( $handle = fsockopen($host, 80, $n, $s, 5) ) {
        $data = "GET {$url} HTTP/1.0\r\nHost: {$host}\r\n\r\n";
        fwrite($handle, $data);
        fclose($handle);
    }
    // return a response to the user
    echo 'Response goes here';
}
?>
You could build a service with PHP.
Or launch a PHP script using bash: system("php myScript.php param param2 &")
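A note on that last form: per the PHP manual, for the & to actually leave the child running in the background, the child's output has to be redirected (to a file or /dev/null), otherwise the calling request waits for it anyway. A hedged one-liner, with placeholder script name and parameters:
exec('php myScript.php ' . escapeshellarg($param) . ' ' . escapeshellarg($param2) . ' > /dev/null 2>&1 &');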
Look into worker processes with Redis/Resque or Gearman.
I have a script that shows a result, and then I call an include file to run at intervals. I would like the script's result to display in the browser, and then the include file will run in the background. Right now my browser is connecting but showing nothing. I would like the browser to echo "Done" and "Logging".
<?php
ob_implicit_flush(true);
require_once('syslog.php');
$syslog = new Syslog();
$line="My msg";
$hostname = gethostname();
$ip = $_SERVER['REMOTE_ADDR'];
$hostnameip = GetHostByName($ip);
$syslog->Send('127.0.9.1', $hostname." ".$hostnameip." ".$line);
echo "Done";
echo "Logging......";
ob_end_flush();
include('execute.php');
?>
I believe you'll be able to get what you want by calling the flush() function right after your echo statements.
The flush() function is described as follows:
Flushes the write buffers of PHP and whatever backend PHP is using (CGI, a web server, etc.). This attempts to push current output all the way to the browser, with a few caveats.
There are some special considerations when using this function. Certain Apache modules and client-side buffering can still get in the way, but for now I do believe this will help.
Try calling flush() after the last echo, before the include.
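In the script from the question that placement would look roughly like this (a sketch only; the rest of the code stays as it was):
ob_implicit_flush(true);
// ... syslog setup and $syslog->Send(...) as before ...
echo "Done";
echo "Logging......";
flush();                 // push the buffered output to the browser first
include('execute.php');  // the long-running part happens after the output is sent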
Here I am writing about my flow in detail.
I am doing an AJAX request on the same host.
The AJAX call goes to saveUser.php.
In saveUser.php, I have included Common.php.
Common.php has the createFile1() and createFile2() functions.
The createFile1() function just creates sample1.php for another purpose.
In createFile2(), I create a file sample2.php and run exec('php /home/public/sample2.php > /dev/null &'); to execute that file.
I call createFile1() and createFile2() from saveUser.php, which is invoked by the AJAX request.
As I have set the exec command to run in the background with '&' at the end, it returns without waiting for a response, and everything goes smoothly on the front end.
But when I check my server, the files sample1.php and sample2.php are getting created again and again. It seems the whole saveUser.php action is executed again and again until I stop the sample2.php process over SSH.
I checked the process list; each time 'php /home/public/sample2.php' has a new process ID, which confirms it is being executed again and again. If I remove the exec() call and run sample2.php from SSH, it works as expected and there is no such problem.
Please suggest what is going wrong. Is there a problem with the server configuration? I am using a HostGator shared account.
I am also including the same Common.php file in sample2.php, mentioning it in case it helps.
Thanks for your help in advance.
saveUser.php code
include_once dirname(__FILE__).'/Common.php';
createFile1();
createFile2();
echo 'saved successfully!';
Common.php code
function createFile1()
{
    $fileContent = file_get_contents('sendDmTemplate.php');
    $serviceFp = fopen('sendDmFiles/sample1.php', "w");
    fwrite($serviceFp, $fileContent);
    fclose($serviceFp);
}
function createFile2()
{
    $fileContent = file_get_contents(dirname(__FILE__).'/sampleFileTemplate.php');
    $serviceFp = fopen('sample2.php', "w");
    fwrite($serviceFp, $fileContent);
    fclose($serviceFp);
    exec('php '.dirname(__FILE__).'/sample2.php > /dev/null &');
}
sample2.php code
include_once dirname(__FILE__).'/Common.php';
echo 'hi';
This exact same thing happened to me. I was trying to have one PHP script call exec() on another PHP script, but for some strange reason it would just exec() itself again, creating an infinite loop. The currently accepted answer didn't get me anywhere, but I was able to get it to work by specifying an absolute path to php.
So, instead of
exec('php /home/public/sample2.php > /dev/null &');
it would be
exec('/usr/local/bin/php /home/public/sample2.php > /dev/null &');
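If you don't know where the CLI binary lives on your particular host, you can ask the shell from PHP first; the path varies from server to server:
exec('which php', $out);
print_r($out); // e.g. Array ( [0] => /usr/local/bin/php )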
Check the following:
1) Are you sure you are NOT calling your script saveUser.php multiple times? Maybe a coding error somewhere in the JavaScript XHR? Check this by looking in the (Apache?) log.
2) Are you sure your PHP executes all right without the -q? I use php -q pathtoscript.php.
3) If not 1 or 2: post the code of saveUser.php in here (or somewhere).
EDIT: I see your problem. The file you create includes Common.php again, and executes that again. Etc. <-- wrong. Oops, I wrote that too early. Looking into it again now.
4) Is it possible you use some error handling that redirects to your saveUser.php?
Edit:
5) A problem might arise from the fact that you are including the file that executes the command itself, in combination with include_once, but I am not sure. You could simply test this by changing the content of sample2.php through sampleFileTemplate.php: create a Common2.php (with the same content as Common.php) and use that one instead. Could you test-drive that setup and report back?
I have a PHP script which calls exec(). I've been having trouble all day: some code that calls the script works and some doesn't (exec() returns a 127 error code).
I have finally worked out that the code that is not working is the code that is being called from jQuery on my web page:
$('#next_button').click(function(event) {
    $.get('download_forms.php', function(data) {
        alert(data);
    });
});
However, if I type the URL for download_forms.php into the address bar of my browser, then exec() executes properly. I have tried running other scripts that call exec() from jQuery to test, and they all fail, but they work if typed into the address bar.
I don't see why this would be an issue. Whether I type the URL into Firefox's address bar or press the button on my webpage, an HTTP request is made.
Does anybody know what the difference could possibly be?
Note: I have tried different commands in exec() and they all fail when called from my jQuery (all the rest of the PHP code runs fine), but they work when the script's address is typed directly into the address bar.
Many thanks
Update
This is my download_forms.php code. The initial exec() was just to see if exec() worked at all. As above, it only executes properly if typed directly into the address bar.
include ('inc/session.inc.php');
require_once('Downloader.php');
exec('id', $output, $r);
echo var_dump($output);
echo($r);
try {
    $downloader = new Downloader();
    $saveMessages = $downloader->saveToDatabase();
    // exec() in the combineAndDownloadForms() method
    $downloadMessages = $downloader->combineAndDownloadForms();
} catch (Exception $e) {
    echo $e->getMessage();
}
Further Update
I made a hyperlink from my webpage to the download_forms.php page (i.e. an <a> tag), but exec() still doesn't execute. At least I know it's nothing to do with AJAX.
For what it's worth, I fixed this issue and all of the above was a red herring.
I hadn't noticed that the page I was on was using the secure HTTPS protocol, so the download_forms.php script was also being accessed over HTTPS, and it seems that the exec() and passthru() functions wouldn't execute commands on the server in those circumstances, which makes sense.
I changed the script so that the download_forms.php URL uses plain HTTP, and it's now functioning fine.
HTH