I need to build a system where a user sends a file to the server,
and PHP then runs a command-line tool using system() (for example, tool.exe userfile).
I need a way to get the PID of the process, so I know which user started the tool,
and a way to know when the tool has stopped.
Is this possible on a Windows Vista machine? I can't move to a Linux server.
Also, the code must continue running after the user closes the browser window.
Rather than trying to obtain the ID of a process and monitor how long it runs, I think that what you want to do is have a "wrapper" process that handles pre/post-processing, such as logging or database manipulation.
The first step is to create an asynchronous process that will run independently of the parent, and that can be started by a call to a web page.
To do this on Windows, we use WshShell:
$cmdToExecute = "tool.exe \"$userfile\"";
$WshShell = new COM("WScript.Shell");
$result = $WshShell->Run($cmdToExecute, 0, FALSE);
...and (for completeness) if we want to do it on *nix, we append > /dev/null 2>&1 & to the command:
$cmdToExecute = "/usr/bin/tool \"$userfile\"";
exec("$cmdToExecute > /dev/null 2>&1 &");
So, now you know how to start an external process that will not block your script, and will continue execution after your script has finished. But this doesn't complete the picture - because you want to track the start and end times of the external process. This is quite simple - we just wrap it in a little PHP script, which we shall call...
wrapper.php
<?php
// Fetch the arguments we need to pass on to the external tool
$userfile = $argv[1];
// Do any necessary pre-processing of the file here
$startTime = microtime(TRUE);
// Execute the external program
exec("C:/path/to/tool.exe \"$userfile\"");
// By the time we get here, the external tool has finished - because
// we know that a standard call to exec() will block until the called
// process finishes
$endTime = microtime(TRUE);
// Log the times etc and do any post processing here
So instead of executing the tool directly, we make our command in the main script:
$cmdToExecute = "php wrapper.php \"$userfile\"";
...and we should have a finely controllable solution for what you want to do.
N.B. Don't forget to escapeshellarg() where necessary!
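For instance, a minimal sketch of what that looks like (the filename is a made-up example containing a character that would otherwise break the command):

```php
<?php
// Hypothetical user-supplied filename containing a double quote
$userfile = 'report "march".txt';

// escapeshellarg() quotes the whole value and escapes any embedded
// quote characters, so it can be spliced safely into the command
$cmdToExecute = "php wrapper.php " . escapeshellarg($userfile);

echo $cmdToExecute, "\n";
```

Note that on *nix the argument comes out single-quoted, while on Windows escapeshellarg() quotes differently; the call itself is the same on both platforms.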
First of all, I am using Windows.
I have the following code:
index.php
<?php
error_reporting(E_ALL);
$tiempo_inicio = microtime(true);
exec('C:\wamp\bin\php\php5.5.12\php.exe -e C:\wamp\www\mail.php > /dev/null &');
$tiempo_fin = microtime(true);
echo ($tiempo_fin - $tiempo_inicio);
?>
mail.php
<?php
$tiempo_inicio = microtime(true);
$logs = fopen("test.txt","a+");
sleep(2);
$tiempo_fin = microtime(true);
fwrite($logs, ($tiempo_fin - $tiempo_inicio)."\n");
sleep(4);
$tiempo_fin = microtime(true);
fwrite($logs, ($tiempo_fin - $tiempo_inicio)."\n");
sleep(6);
$tiempo_fin = microtime(true);
fwrite($logs, ($tiempo_fin - $tiempo_inicio)."\n");
echo 'fin';
?>
But it does not work as I hope: what I want is to run the file in the background, without the user having to wait for it to complete.
What am I doing wrong?
You're talking about non-blocking execution (where one process doesn't wait on another). PHP really can't do that very well (natively, anyway) because it's designed around a single thread. Without knowing what your process does I can't comment precisely, but I can make some suggestions:
Consider asynchronous execution via AJAX. Pairing your script with JavaScript lets the client make the request, and lets your PHP script run freely while AJAX opens another request that doesn't block activity on the main page. Just be sure to give the user a visual indication that you're waiting on data.
pthreads (repo) - multi-threaded PHP; runs the work in another thread.
Gearman - similar to pthreads, but can be automated as well.
cron job - fully asynchronous; runs a process on a regular interval. It could do, say, the data aggregation, and your script then fetches the aggregated data.
I've had success in the past on Windows using pclose/popen and the Windows start command instead of exec. The downside to this is that it is difficult to react to any errors in the program you are calling.
I would try something like this (I'm not on a machine where I can test this today):
$command_string = 'C:\wamp\bin\php\php5.5.12\php.exe -f "C:\wamp\www\mail.php" -- variablestopass';
pclose(popen("start /B ".$command_string, 'r'));
The main reason is because I don't want to hold up the current PHP process. I want users to be able to navigate around during the script execution.
The script in question (importer.php) updates a txt file with a percentage as it completes, javascript intercepts this txt file and outputs the percentage using a timer every 5 seconds to keep the user updated (all in the form of a load bar).
I've been able to launch the script like so:
$cmd = '"C:\/path\/to\/v5.4\/php" importer.php';
pclose(popen($cmd, "r"));
exit;
This runs the script, but hangs the current process until importer.php completes. Is there a way to get out of the current process and launch this using another one instead?
I read that using & at the end of the cmd tells the script to not wait, but I believe this is a *nix command and since I'm running on a Windows box, I can't use it... unless perhaps there is an alternative for Windows?
According to the documentation at http://php.net/passthru you should be able to execute your command using that, as long as you redirect your output.
$cmd = '"C:\/path\/to\/v5.4\/php" importer.php';
// Use passthru here, and redirect the output to a temp text file.
passthru($cmd . '>%TEMP%\importerOutput.txt');
exit;
I was able to resolve this issue by using a WshShell object (WScript.Shell):
$WshShell = new COM("WScript.Shell");
$WshShell->Run('"C:\/path\/to\/v5.4\/php-win.exe" -f "C:\/path\/to\/code\/snippet\/importer.php" var1 var2 var3', 0, false);
Note: I have spaces in my file structure so I needed to add quotes around the paths to the files. I was also able to pass variables, var1, var2, and var3. I've also used \/ to escape my slashes.
I'll break the Run arguments down a bit for my case:
The first is the command you want to run (path to PHP, path to script, and variables to pass).
The second, 0, hides the window and activates another window (see the link below for more options).
The third, false, is a boolean value indicating whether the script should wait for the program to finish executing before continuing to the next statement in your script. If set to true, script execution halts until the program finishes.
For more information on WScript.Shell visit http://msdn.microsoft.com/en-us/library/d5fk67ky(v=vs.84).aspx for details.
Hope this helps someone else!
I'm tearing my hair out trying to figure out why this could possibly not work, when it works when executed from the command line. I'm sending a POV-Ray animation to the user after it is done rendering and the frames have been compiled into a GIF using ImageMagick. The email that is sent through mutt sleeps for an hour so that there is time for the user's animation to be compiled. (I've changed it to 120 seconds, however, so that I don't have to wait an hour to troubleshoot. For the same reason I made the animation only 3 frames until I figure out the issue.)
A new bash file, animation_done_bash.sh, is created each time a user clicks animate on my site, which in turn creates a new folder that stores the frames for the animation, with animation_done_bash.sh in it. To let the PHP page advance to the next page, the bash file is executed with its output redirected to /dev/null. I already tested whether that was the problem on the command line, and it works there, so I really have no idea what the issue is.
Below is the code, and the command it produces (I echoed it onto my PHP page to see what was wrong), which is executed with shell_exec:
$animation_done_email = "#!/usr/bin/bash \nsleep 120; mutt -s \"Animation Finished\" -a animation.gif -- ".$email." <email.txt;";
/*Creates new animation_done_bash each time user hits animate*/
$directory_bash = "./User_Files/Animation/".$location."/";
$filename_bash = 'animation_done_bash.sh';
$handler_bash = fopen($directory_bash.$filename_bash, "w");
fwrite($handler_bash, $animation_done_email);
fclose($handler_bash);
/*Need to change the permissions on bash file so it can execute*/
chmod($directory_bash."/animation_done_bash.sh", 0777);
$command_5 = 'cd /home/ouraccount/public_html/User_Files/Animation/'.$location.'/; bash animation_done_bash.sh > /dev/null 2>/dev/null &';
$shellOutput = shell_exec($command_5);
Where $email is the user's email and $location is the folder in which the frames are stored. email.txt is stored within the same folder.
Output:
cd /home/ouraccount/public_html/User_Files/Animation/ani_51/; bash animation_done_bash.sh > /dev/null 2>/dev/null &
Any guidance would be much appreciated. Thanks!
In these kinds of situations, pushing (invoking a script when the rendering operation is complete) is preferable to polling (periodically checking whether the rendering operation is complete).
If you cannot push, at least do it in one language only; don't create a hybrid of bash and PHP.
Here are two examples you could try, depending on your situation:
Example if rendering command returns after finishing:
<?php
/** PHP wrapper around rendering command X that mails the output **/
// Don't write attachment code yourself, use a class like PHPMailer: https://github.com/Synchro/PHPMailer
require_once('class.phpmailer.php');
// Ignore user browser close, rendering takes a long time
ignore_user_abort(true);
// On windows you'll also need to set_time_limit to something large. On Unix PHP doesn't count things like database queries and shell commands, on Windows it does
// Execute render command, don't forget to escape if you use user input
// Script will only continue once the renderer returns. If the renderer
// returns before rendering is finished, you cannot use this approach
$render_output = shell_exec('renderer input.file output.file');
// Could also be done in PHP for finer control and error handling
$imagemagick_output = shell_exec("convert output.file animation.gif");
unlink("output.file");
$mail = new PHPMailer();
$mail->addAttachment('animation.gif');
// etc.
unlink("animation.gif");
?>
Example if rendering command returns before finishing:
<?php
/** PHP wrapper around rendering command X that mails the output **/
// Don't write attachment code yourself, use a class like PHPMailer: https://github.com/Synchro/PHPMailer
require_once('class.phpmailer.php');
// Ignore user browser close
ignore_user_abort(true);
// Execute render command, don't forget to escape if you use user input
// If the renderer returns before the file is finished, use this approach
$render_output = shell_exec('renderer input.file output.file 2> error.file');
// Wait for rendering to finish
// Implement these functions yourself, e.g. with file_exists on output.file or error.file
while(!rendering_has_failed() && !rendering_is_finished()) {
sleep(15*60); // Low-resource wait for 15 minutes
}
// Could also be done in PHP for finer control and error handling
$imagemagick_output = shell_exec("convert output.file animation.gif");
unlink("output.file");
$mail = new PHPMailer();
$mail->addAttachment('animation.gif');
// etc.
unlink("animation.gif");
?>
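The rendering_has_failed() / rendering_is_finished() helpers in the second example are left to implement; here is one possible minimal sketch based on the file_exists() suggestion (output.file and error.file are the placeholder names carried over from the example):

```php
<?php
// The renderer is assumed to create output.file only once it is done
function rendering_is_finished() {
    return file_exists('output.file');
}

// A non-empty error.file is treated as a failed render
function rendering_has_failed() {
    return file_exists('error.file') && filesize('error.file') > 0;
}
```

How you detect completion depends entirely on the renderer; a lock file, a PID check, or a sentinel line in a log would all work with the same polling loop.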
I'll show some code first:
echo exec("compile\\save.exe Untitled.c tmpUntitled.c");
I have a program named save.exe, and I want to know if it has already stopped.
If it has stopped, OK... do something...
If not, there may be an error, or a loop...
Now I want to build some way to control the time the program uses, and put a limit on it (time limit exceeded, something like that...).
Anyone have a suggestion?
Edit[1]: save.exe is a program written in C, and it takes two parameters: source and destination.
popen just doesn't execute save.exe; I say this because it no longer generates the destination file (with exec it does).
exec() will suspend your script while the exec'd program is running. Control will be returned only when the external program terminates. If you want to see whether the program has hung up or is waiting for input or some such, use popen(), which returns a file handle from which you can read the program's output. At that point you can do a polling loop to see if there's been any output:
$app = popen("your shell command here", "r");
while ($output = fgets($app)) {
    // handle output
    sleep(5); // sleep 5 seconds between reads
}
pclose($app);
If you want to send something to the app as input, then you have to use proc_open(), which allows bi-directional communication between your script and the external program.
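A minimal proc_open() sketch of that bi-directional setup (cat is just a stand-in for the external program; the point is the pipe layout):

```php
<?php
// Descriptor spec: index 0 is the child's stdin, 1 its stdout, 2 its stderr
$descriptors = [
    0 => ['pipe', 'r'],  // we write to the child's stdin
    1 => ['pipe', 'w'],  // we read the child's stdout
    2 => ['pipe', 'w'],  // we read the child's stderr
];

$proc = proc_open('cat', $descriptors, $pipes);
if (is_resource($proc)) {
    fwrite($pipes[0], "hello\n"); // send input to the program
    fclose($pipes[0]);            // signal end-of-input

    $output = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    fclose($pipes[2]);

    $exitCode = proc_close($proc);
}
```

Since cat echoes its input, $output comes back as "hello\n" and $exitCode as 0; with a real program you would also read $pipes[2] for errors.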
So I am trying to execute a script from my PHP code, which lives in the page blah.php:
<?php
// ....
// just a basic web site that allows upload of a file...
?>
Inside, I use a system call:
if (system("blah.pl arg1") != 0)
{
print "error\n";
}
else
{
print "working on it..you will receive e-mail at completion\n";
}
It works great, but it waits until the script completes before it prints "working on it".
I am aware that I am calling a Perl script from PHP; that is not a typo.
How can I just start the program's execution and let it complete in the background?
The blah.pl script handles the e-mail notification.
Anyone?
Thanks, I appreciate it.
From system function documentation:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
Simply redirect the output of your Perl script and PHP should be able to run without having to wait for the script to finish.
system("blah.pl arg1 > /dev/null 2>&1 &");
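You can see the effect of the redirect-and-ampersand pattern by substituting a harmless sleep for the Perl script; without the trailing & and the redirect, system() would block for the full two seconds:

```php
<?php
$start = microtime(true);

// Backgrounded and redirected: the shell returns immediately,
// so system() does not wait for the sleep to finish
system('sleep 2 > /dev/null 2>&1 &');

$elapsed = microtime(true) - $start;
echo $elapsed < 1 ? "returned immediately\n" : "blocked\n";
```

(This is *nix syntax; on Windows the null device is NUL, and backgrounding needs something like start /B instead of the trailing &.)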
Can you add an ampersand to spawn the process in the background?
if (system("blah.pl arg1 &") != 0)
You could probably use the popen function
http://www.php.net/manual/en/function.popen.php
Since all the PHP command-line functions wait for a result, I recommend that you have a separate page to call the emailer script.
In other words, send a request to the email.php page (or whatever) using cURL or the file wrappers, which allow you to close the connection before receiving a result. The process will look something like this:
page_running.php
|
cURL call to (email.php?arg1=0)
| |
final output email.php
|
calls system("blah.pl arg1")
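A sketch of the cURL call from page_running.php, under the assumption that email.php lives on the same host (adjust the URL to your setup): a very short timeout makes the request fire-and-forget, and email.php should call ignore_user_abort(true) at the top so it keeps running after the disconnect.

```php
<?php
// Hypothetical URL; point it at wherever email.php actually lives
$ch = curl_init('http://localhost/email.php?arg1=0');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100);      // hang up after ~100 ms
curl_exec($ch); // the timeout here is expected, not an error
curl_close($ch);
```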