About exec() function and time - php

I'll show some code, first:
echo exec("compile\\save.exe Untitled.c tmpUntitled.c");
I have a program named save.exe, and I want to know whether it has already stopped.
If it stopped, OK... do something...
If not, there may be an error, or a loop...
Now I want to do the same kind of thing to control how long the program runs, and put a limit on it ("time limit exceeded", something like that...)
Does anyone have a suggestion?
Edit[1]: save.exe is a program written in C, and it takes two parameters: source and destination.
popen just doesn't execute save.exe; I say this because it no longer generates the destination file (with exec it happens).

exec() will suspend your script while the exec'd program is running. Control will be returned only when the external program terminates. If you want to see if the program's hung up or waiting for input or somesuch, you use popen(), which returns a filehandle from which you can read the program's output. At that point you can do a polling loop to see if there's been any output:
$app = popen("your shell command here", "r"); // "r" = read the program's output
while ($output = fgets($app)) {
    // handle output
    sleep(5); // sleep 5 seconds between reads
}
pclose($app);
If you want to send something to the app as input, then you have to use proc_open(), which allows bi-directional communication between your script and the external program.
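A minimal sketch of that proc_open() approach, assuming a placeholder command and placeholder input (neither comes from the original post):
// Descriptor spec: index 0 = the program's stdin, 1 = stdout, 2 = stderr
$descriptors = [
    0 => ['pipe', 'r'],   // we write to the program's stdin
    1 => ['pipe', 'w'],   // we read the program's stdout
    2 => ['pipe', 'w'],   // we read the program's stderr
];
$proc = proc_open('your shell command here', $descriptors, $pipes);
if (is_resource($proc)) {
    fwrite($pipes[0], "some input\n");   // send input to the program
    fclose($pipes[0]);
    while ($line = fgets($pipes[1])) {   // read its output line by line
        // handle output
    }
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);                   // returns the program's exit code
}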

Related

How can I run a script in another process on Windows Server 2008, using PHP v5.4?

The main reason is because I don't want to hold up the current PHP process. I want users to be able to navigate around during the script execution.
The script in question (importer.php) updates a txt file with a percentage as it completes; JavaScript reads this txt file on a 5-second timer and displays the percentage to keep the user updated (all in the form of a loading bar).
I've been able to launch the script like so:
$cmd = '"C:\/path\/to\/v5.4\/php" importer.php';
pclose(popen($cmd, "r"));
exit;
This runs the script, but hangs the current process until importer.php completes. Is there a way to get out of the current process and launch this using another one instead?
I read that adding & at the end of the command tells the shell not to wait, but I believe that's a *nix feature, and since I'm running on a Windows box I can't use it... unless perhaps there is an alternative for Windows?
According to the documentation at http://php.net/passthru you should be able to execute your command using that, as long as you redirect your output.
$cmd = '"C:\/path\/to\/v5.4\/php" importer.php';
// Use passthru here, and redirect the output to a temp text file.
passthru($cmd . '>%TEMP%\importerOutput.txt');
exit;
I was able to resolve this issue by using a WshShell object (WScript.Shell):
$WshShell = new COM("WScript.Shell");
$WshShell->Run('"C:\/path\/to\/v5.4\/php-win.exe" -f "C:\/path\/to\/code\/snippet\/importer.php" var1 var2 var3', 0, false);
Note: I have spaces in my file structure so I needed to add quotes around the paths to the files. I was also able to pass variables, var1, var2, and var3. I've also used \/ to escape my slashes.
I'll break the Run arguments down a bit for my case:
The first: the command you want to run (path to PHP, path to script, and variables to pass).
The second: 0 - hides the window and activates another window (link below for more options).
The third: false - a Boolean value indicating whether the script should wait for the program to finish executing before continuing to the next statement. If set to true, script execution halts until the program finishes.
For more information on WScript.Shell visit http://msdn.microsoft.com/en-us/library/d5fk67ky(v=vs.84).aspx for details.
Hope this helps someone else!

PHP exec - echo output line by line during progress

I'm trying to find a way in which I can echo out the output of an exec call, and then flush that to the screen while the process is running. I have written a simple PHP script which accepts a file upload and then converts the file if it is not the appropriate file type using FFMPEG. I am doing this on a windows machine. Currently my command looks like so:
$cmd = "ffmpeg.exe -i ..\..\uploads\\".$filename." ..\..\uploads\\".$filename.".m4v 2>&1";
exec( $cmd, $output);
I need something like this:
while( $output ) {
print_r( $output);
ob_flush(); flush();
}
I've read about using ob_flush() and flush() to clear the output buffer, but I only get output once the process has completed. The command works perfectly, It just doesn't update the Page while converting. I'd like to have some output so the person knows what's going on.
I've set the timeout:
set_time_limit( 10 * 60 ); // 10 minute time out
and would be very grateful if someone could point me in the right direction. I've looked at a number of solutions that come close on Stack Overflow, but none seem to have worked.
Since the exec call is a blocking call you have no way of using buffers to get status.
Instead you could redirect the output in the system call to a log file. Let the client query the server for progress update in which case the server could parse the last lines of the log file to get information about current progress and send it back to the client.
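A rough sketch of that idea, with hypothetical file names (convert.log and the ffmpeg input/output files are placeholders):
// The worker request: run ffmpeg and send everything it prints to a log file
$log = __DIR__ . '\\convert.log';
$cmd = 'ffmpeg.exe -i input.avi output.m4v > "' . $log . '" 2>&1';
exec($cmd);   // still blocks, but the progress now lands in convert.log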
exec() is a blocking call and will NOT return control to PHP until the external program has terminated. That means you cannot do anything to dump the output on a line-by-line basis, because PHP is suspended while the external app is running.
For what you want, you need to use proc_open(), which gives you pipes you can read from in a loop, e.g.:
$descriptors = [1 => ['pipe', 'w']];               // capture the program's stdout
$proc = proc_open('.....', $descriptors, $pipes);
while ($line = fgets($pipes[1])) {
    print($line);
    flush();
}
fclose($pipes[1]);
proc_close($proc);
There are two problems with this approach:
The first is that, as #Marc B notes, exec will block until it's finished. You'll have to devise some way of measuring progress.
The second is that using ob_flush() in this way amounts to holding the connection between server & client open and dribbling the data out a little at a time. This is not something that the HTTP protocol was designed for and while it might work sometimes, it's not going to work consistently - different browsers and different servers will time out differently. The better way to do it is via AJAX calls: using Javascript's setTimeout() function (or setInterval()), make a call to the server periodically and have the server send back a progress report.
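For the server side of that polling approach, a hedged sketch of a progress endpoint (progress.php and convert.log are hypothetical names) that the JavaScript timer could request every few seconds:
// progress.php - return the last line of the conversion log as JSON
$log   = __DIR__ . '\\convert.log';
$lines = @file($log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
header('Content-Type: application/json');
echo json_encode(['lastLine' => $lines ? end($lines) : '']);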

Windows PHP repeating script via popen

I'm trying to create a browser-started self-calling/repeating PHP script on Windows with PHP (currently 5.3.24 but soon will be latest). It will act as a daemon to monitor changes in a database (every few seconds, so cron/schedule is out) and then call other PHP scripts to perform work when changes are found. For the purposes of this question please ignore the fact that I'd be better off doing this in C# or some other language :)
To keep things simple I started out by trying to use popen to run a second PHP script in the background...
// BatchMonitor.php
SaveToMonitorTable(1); // save 1st test entry to see if the script reached this point
$Command = '"" "C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\Test.php --Instance=' . $Data->Instance;
pclose(popen("start /B $Command", "r"));
SaveToMonitorTable(2); // save 2nd test entry to see if the script reached this point
exit();
// Test.php
SaveToTestTable(1);
Sleep(10);
SaveToTestTable(2);
exit();
If I run BatchMonitor.php in the browser it works fine. As expected it will save 1 to the monitor table, call Test.php which saves 1 to the test table, the original BatchMonitor.php will continue without waiting for a response and save 2 to the monitor table before exiting, then 10 seconds later the test page saves 2 to the test table before exiting. The second script starts fine, the first script does not wait for a reply and all parameters are correctly passed between scripts. With everything working as intended I then changed the system to work as a repeating loop by calling itself (with delay) instead of another script...
// BatchMonitor.php
SaveToMonitorTable(1); // save 1st test entry to see if the script reached this point
$Command = '"" "C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\BatchMonitor.php --Instance=' . $Data->Instance;
pclose(popen("start /B $Command", "r"));
SaveToMonitorTable(2); // save 2nd test entry to see if the script reached this point
exit();
If I run BatchMonitor.php in the browser it runs once and that is it. It will save 1 to the database, wait 10 seconds and then save 2 to the database before exiting. The page returns successfully with no script or PHP errors but it doesn't repeat as it should.
Both BatchMonitor.php and Test.php use line-for-line identical functions to get the parameters, and both files run correctly and identically on the first iteration. If I use exec instead of popen then the page loops correctly with all logic working as expected (with the one obvious flaw of creating a never-ending chain of scripts awaiting response values that will never come).
Am I missing something obvious? Does popen have some sort of secret rule that prevents a page/process from opening duplicates of itself? Are there any alternatives to using popen or exec? I read about WScript.Shell but it might be a while before I can schedule that to get enabled so for now it's not an option and I'm hoping there is something more standard that I can use.
I don't feel like this should be your actual answer, but why do you abandon scheduled tasks/cron jobs because you want something done every X seconds? Having the script minute.php call 5seconds.php with, of course, 5-second intervals in between would create a repeated task every 5 seconds, right?
Strangely enough, you are already kind of using the same sort of mechanism from your browser.
My only concern would be to take the processing time into account and create a safe script which ensures no more than one '5seconds.php' can run at any given time.
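A rough sketch of that "only one instance at a time" guard, using an exclusive lock on a hypothetical lock file (flock() also works on Windows):
// 5seconds.php - bail out if a previous run is still busy
$lock = fopen(__DIR__ . '\\5seconds.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit;   // another instance still holds the lock
}
// ... do the monitoring work here ...
flock($lock, LOCK_UN);
fclose($lock);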

I want to execute more than one method at a time in php

Hi, please help me execute more than one method at a time in PHP.
Below is example:
<?php
function writeName()
{
    sleep(3);
    echo "Kai Jim Refsnes";
}

function b()
{
    sleep(3);
    echo "b";
}

b();
writeName();
?>
The above program takes 6 seconds to execute, but I want to run both methods simultaneously so that the program executes within 3 seconds (multi-threading).
With plain PHP it's not possible, because PHP executes sequentially. You could have a look at a job server like Gearman, or you could try to use forks (pcntl_fork()). It's not multi-threading, because there is no shared memory.
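A minimal pcntl_fork() sketch of that idea, reusing the functions from the question (the pcntl extension is POSIX-only, so this will not run on Windows):
$pid = pcntl_fork();
if ($pid === 0) {                 // child process runs one function
    b();
    exit(0);
}
writeName();                      // parent runs the other one at the same time
pcntl_waitpid($pid, $status);     // total wall time is roughly 3 seconds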
Sorry, but multithreading is not supported in PHP.
But you could start a PHP script which can run in the background using exec(). Just make sure you redirect its output elsewhere.
That should be the closest you can get to "multithreading" without additional tools. Here's what the manual says:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
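A hedged sketch of that suggestion, assuming a hypothetical worker.php and that the PHP binary is on the PATH:
// Windows: start /B detaches the process; its output must be redirected (NUL here)
pclose(popen('start /B php worker.php > NUL 2>&1', 'r'));
// Linux/macOS equivalent: the trailing & puts the command in the background
// exec('php worker.php > /dev/null 2>&1 &');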

Abandon Long Processes in PHP (But let them complete)

I have an HTML form that submits to a PHP page which initiates a script. The script can take anywhere from 3 seconds to 30 seconds to run - the user doesn't need to be around for this script to complete.
Is it possible to initiate a PHP script, immediately print "Thanks" to the user (or whatever) and let them go on their merry way while your script continues to work?
In my particular case, I am sending form-data to a php script that then posts the data to numerous other locations. Waiting for all of the posts to succeed is not in my interest at the moment. I would just like to let the script run, allow the user to go and do whatever else they like, and that's it.
Place your long-running work in another PHP script, for example
background.php:
sleep(10);
file_put_contents('foo.txt',mktime());
foreground.php
$unused_but_required = array();
proc_close(proc_open ("php background.php &", array(), $unused_but_required));
echo("Done);
You'll see "Done" immediately, and the file will get written 10 seconds later.
I think proc_close() works here because we're giving proc_open() no pipes and no file descriptors.
In the script you can set:
<?php
ignore_user_abort(true);
That way the script will not terminate when the user leaves the page. However, be very careful when combining this with
set_time_limit(0);
since then the script could execute forever.
You can use set_time_limit and ignore_user_abort, but generally speaking, I would recommend that you put the job in a queue and use an asynchronous script to process it. It's a much simpler and durable design.
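A rough sketch of that queue idea, assuming a hypothetical jobs table and placeholder credentials; the web request only records the work, and a separate worker script (run by cron or Task Scheduler) does the slow part:
// enqueue.php - called from the form handler, returns immediately
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')");
$stmt->execute([json_encode($_POST)]);
echo 'Thanks';
// worker.php - run periodically; picks up pending jobs and posts the data onwards
// $job = $pdo->query("SELECT * FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();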
You could try flush() and the related output-buffer functions to immediately send whatever is in the buffer to the browser.
There's an API wrapper around pcntl_fork() called php_fork.
But also, this question was on the Daily WTF... don't pound a nail with a glass bottle.
I ended up with the following.
<?php
// Ignore User-Requests to Abort
ignore_user_abort(true);
// Maximum Execution Time In Seconds
set_time_limit(30);
header("Content-Length: 0");
flush();
/*
Loooooooong process
*/
?>
