I am running a script every night and its output is only sent to a mail address, but I need to receive a copy of the output in my own mailbox. I registered a shutdown handler in the script and tried to send a mail using functions like ob_get_contents(), which does show data, but only the last thing I printed to the terminal.
cronMail('Cron', ob_get_contents());
The function called is just a simple function which adds the default receiver and sender and calls PHP's mail() function.
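A hypothetical version of that helper, just to make the example self-contained (the addresses are placeholders, not the real defaults):

function cronMail($subject, $body)
{
    $to      = 'me@example.com';                 # assumed default receiver
    $headers = "From: cron@example.com\r\n";     # assumed default sender
    return mail($to, $subject, $body, $headers);
}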
The output in the mail is:
array()
While the terminal has the following output:
Starting cron...
Exiting...
array()
Can anyone tell me how to receive the whole output? I start the output buffer with ob_start(), and after each line I call ob_flush() so the output is also sent to the browser if the script is called directly.
ob_flush() stands in your way; the description on the manual page is pretty clear about that: it flushes the buffer, that is, it sends the output and empties the buffer.
You do not want that. Remove the calls to it and you should be fine.
ob_start();
... your script without "ob_flush()" ...
$buffer = ob_get_clean(); # finally get the output buffer as string
echo $buffer; # pass output along for cron
cronMail('Cron', $buffer); # send your mail
This variant ensures that you get your own email while the output is still passed along to cron. That can be useful if the cronMail function does some error reporting of its own, so there is at least some way to debug further.
Another alternative is to register an output handler callback that stores the output as it goes; a rough sketch of that follows.
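A minimal sketch of that callback approach (the variable names are illustrative):

$collected = '';
ob_start(function ($chunk) use (&$collected) {
    $collected .= $chunk;  # remember every piece of output
    return $chunk;         # pass it through so it still reaches the terminal
});

# ... the nightly script echoes as usual, no ob_flush() needed ...

ob_end_flush();                # runs the callback on whatever is still buffered
cronMail('Cron', $collected);  # mail the complete transcript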
Related
I'm trying to find a way to echo the output of an exec() call and flush it to the screen while the process is running. I have written a simple PHP script which accepts a file upload and then converts the file with FFmpeg if it is not the appropriate file type. I am doing this on a Windows machine. Currently my command looks like this:
$cmd = "ffmpeg.exe -i ..\..\uploads\\".$filename." ..\..\uploads\\".$filename.".m4v 2>&1";
exec( $cmd, $output);
I need something like this:
while( $output ) {
print_r( $output);
ob_flush(); flush();
}
I've read about using ob_flush() and flush() to clear the output buffer, but I only get output once the process has completed. The command works perfectly, it just doesn't update the page while converting. I'd like to have some output so the person knows what's going on.
I've set the timeout:
set_time_limit( 10 * 60 ); // 10 minute time out
and would be very grateful if someone could point me in the right direction. I've looked at a number of solutions on Stack Overflow which come close, but none seem to have worked.
Since exec() is a blocking call, you have no way of using buffers to get status.
Instead, you could redirect the output of the system call to a log file and let the client query the server for a progress update; the server can then parse the last lines of the log file to get information about the current progress and send it back to the client.
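A rough sketch of that idea, assuming a log file next to the uploads and the Windows "start /B" trick to detach the process (adjust the paths and the detaching mechanism for your setup):

$log = '..\..\uploads\convert.log';   // assumed log location
$cmd = "ffmpeg.exe -i ..\..\uploads\\" . $filename
     . " ..\..\uploads\\" . $filename . ".m4v > " . $log . " 2>&1";

// "start /B" lets the command run in the background on Windows so this
// request can return immediately; on Unix you would append " &" instead.
pclose(popen('start /B ' . $cmd, 'r'));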
exec() is a blocking call and will NOT return control to PHP until the external program has terminated. That means you cannot do anything to dump the output on a line-by-line basis because PHP is suspended while the external app is running.
For what you want, you need to use proc_open(), which gives you pipes you can read from line by line, e.g.
$descriptors = array(1 => array('pipe', 'w'));   // capture the command's stdout
$proc = proc_open('.....', $descriptors, $pipes);
while ($line = fgets($pipes[1])) {
    print($line);
    flush();
}
proc_close($proc);
There are two problems with this approach:
The first is that, as #Marc B notes, exec() will block until it's finished. You'll have to devise some way of measuring progress.
The second is that using ob_flush() in this way amounts to holding the connection between server and client open and dribbling the data out a little at a time. This is not something the HTTP protocol was designed for, and while it might work sometimes, it's not going to work consistently; different browsers and different servers will time out differently. The better way to do it is via AJAX calls: using Javascript's setTimeout() function (or setInterval()), make a call to the server periodically and have the server send back a progress report.
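On the server side, that progress report can be very small; a sketch, assuming the conversion output is redirected into a log file as described in the other answer:

<?php
// progress.php (hypothetical) – returns the last line ffmpeg wrote to the log,
// which is usually enough to show the user what is currently going on.
$log = '..\..\uploads\convert.log';   // must match the redirect target
$lines = is_file($log) ? file($log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) : array();
echo $lines ? end($lines) : 'waiting for the conversion to start...';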
Let's say a user submits a form to submit.php; the server will return one of two possible responses: A or B. If the server is going to return B, then I'd like the server to send an email to an address.
<?php
if(B){
echo 'B';
}
mail($to, $subj, $msg);
?>
The problem is that it often takes some time for mail() to finish... the user only gets the response after the server finishes mail(). This is a very bad experience for my users.
Is there any way that the server can return response 'B' to the user immediately and then send the mail?
Take a look at the flush() function
Flushes the write buffers of PHP and whatever backend PHP is using (CGI, a web server, etc). This attempts to push current output all the way to the browser with a few caveats.
flush() may not be able to override the buffering scheme of your web server and it has no effect on any client-side buffering in the browser. It also doesn't affect PHP's userspace output buffering mechanism. This means you will have to call both ob_flush() and flush() to flush the ob output buffers if you are using those.
You can use a task queueing platform such as ActiveMQ to execute asynchronous tasks; it shouldn't be too complicated to set up.
Change the code to the following:
if (B) {
    echo 'B';
    ob_flush();   // flush PHP's output buffer first ...
    flush();      // ... then push it to the client
}
mail($to, $subj, $msg);   // the mail goes out after the response has been flushed
However, it should be noted that many things could prevent this from working (as quoted from the flush() documentation):
Several servers, especially on Win32, will still buffer the output from your script until it terminates before transmitting the results to the browser.
Server modules for Apache like mod_gzip may do buffering of their own that will cause flush() to not result in data being sent immediately to the client.
Even the browser may buffer its input before displaying it. Netscape, for example, buffers text until it receives an end-of-line or the beginning of a tag, and it won't render tables until the </table> tag of the outermost table is seen.
Some versions of Microsoft Internet Explorer will only start to display the page after they have received 256 bytes of output, so you may need to send extra whitespace before flushing to get those browsers to display the page.
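If Internet Explorer's 256-byte threshold turns out to be the culprit, padding the response before flushing (purely illustrative) usually works around it:

if (B) {
    echo 'B';
    echo str_repeat(' ', 256);   // padding for browsers that buffer small responses
    ob_flush();
    flush();
}
mail($to, $subj, $msg);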
Use PHP's register_shutdown_function():
<?php
if (B) {
    echo 'B';
}
register_shutdown_function(function () use ($to, $subj, $msg) {
    mail($to, $subj, $msg);
});
?>
register_shutdown_function: Register a function for execution on shutdown
The trick is to register a callback that already carries the parameters it needs; an anonymous function with a use clause does that here (the older create_function() trick is deprecated since PHP 7.2 and removed in PHP 8).
more about anonymous functions
I have a PHP page that, when called from a browser, displays a page and also generates a static HTML page, so http://somewebsite/create.php when run from a browser creates newpage.html.
I also run this page from the CLI (# php create.php). It creates newpage.html, but I get all the output on the screen; this slows down execution, and if I have to run it many times it can take hours.
Is there a way to run # php create.php and suppress all output to the screen?
Martyn
Check out Output Buffering
You begin with ob_start() to start buffering output, then you can use $output = ob_get_clean() to get everything that would have been displayed and store it in $output, so you can do whatever you want with it.
I believe there are $_SERVER variables you can use to check if you're running from the command line or not. (Obviously you can check for the HTTP headers that are normally there, they will not be set from the CLI)
var_dump($_SERVER) and see if there's anything intuitive looking :)
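Besides inspecting $_SERVER, the SAPI name also tells you directly whether you are on the command line; a small sketch of the whole idea:

<?php
// create.php – buffer the page and only print it when not running from the CLI.
$isCli = (php_sapi_name() === 'cli');   // reports 'cli' when run as "php create.php"

ob_start();
// ... existing code that renders the page and writes newpage.html ...
$output = ob_get_clean();

if (!$isCli) {
    echo $output;   // browsers still get the page, the CLI run stays quiet
}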
You can redirect the output on the command line to /dev/null, such as:
php create.php > /dev/null
This doesn't actually stop the output from being produced by PHP, only from being displayed.
If you actually want to stop the output from being produced, you could use PHP's output buffering: call ob_start(), capture the output into a variable with ob_get_clean() at the end of the code, and only echo it if a certain param is (or is not) passed to the script.
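For the "certain param" variant, the CLI arguments are available in $argv; a minimal sketch (the --quiet flag is made up):

// run as: php create.php --quiet
$quiet = isset($argv) && in_array('--quiet', $argv, true);

ob_start();
// ... existing page-generating code ...
$output = ob_get_clean();

if (!$quiet) {
    echo $output;
}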
I'm making a PHP IRC Bot, and it works great.
What I want to do, though, is have a live debugging option, which means I need to see the commands sent from an operator to the bot live. The problem is that, as long as the bot is running, no output is being sent, even if I echo, printf, or var_dump.
So my question is, how can I force PHP to send the current output buffer without waiting for the logic to finish (because theoretically, it won't finish ever :P)
EDIT:
Neither flush() nor ob_flush() seems to work; see this simple example: http://codepad.viper-7.com/ks7zEy
use flush();
You're looking for ob_flush
Just put this at the top of your page:
while (ob_get_level() > 0) {
    @ob_end_flush();
}
@ob_implicit_flush();
Final update
Seems like I did make a very simple error. Since I already have a stream implementation I can just not start reading from the stream :D
I'm trying to achieve fire-and-forget like functionality in PHP.
From php.net
<?php
ignore_user_abort(true);
header("Content-Length: 4");
header("Connection: Close");
echo "abcd";
flush();
sleep(5);
echo "Text user should not see"; // because it should have terminated
?>
This works if I open the script with a browser. (shows "abcd").
But if I open it with file_get_contents or some stream library it will wait for ~5 seconds and show the second text as well.
I'm using PHP 5.2.11 / Apache 2.0
Update
It seems there is some confusion about what I'm trying to accomplish.
I don't want to hide output using output buffers (that's stupid). I want to have the client terminate before the server starts a possibly lengthy process (sleep(5)), and I don't want the client to wait for it (this is what fire-and-forget means, sort of).
The use of output buffers is merely a side effect. I've amended the sample code without the use of output buffers.
What I don't understand is: why does this script behave differently when accessing it from the browser vs. fetching it in PHP with file_get_contents("http://dev/test.php") or some stream library? What I've seen in testing is that, for instance, stream_get_contents will actually block for 5 seconds before it returns any output at all, which is quite the opposite of what I want.
Update2
Some more results:
The browser somehow responds to the flush(); I can't figure out how to replicate this behavior with streams in PHP, my streams keep blocking.
I've tried fread and found that it behaves similarly to stream_get_contents.
Specifying a maxlength has no effect, it will still block for ~5 seconds.
Changing the blocking mode has no effect (other than generating a bunch more calls to stream_get_contents()). It will wait ~5 seconds before returning anything.
stream_set_read_buffer has no effect (tested on a PHP 5.3.5 server)
The second portion of text is showing up because you're stopping output buffering with ob_end_flush() and ob_end_clean(). When that happens PHP outputs content as normal. Try something like the following:
<?php
ob_start(); // turn on output buffering
print "Text the user will see.";
ob_flush(); // send above output to the user and keep output buffering on
print "Text the user will never see";
ob_end_clean(); // empty the buffer and turn off output buffering. your script should end here.
?>
It's important for ob_end_clean() to appear at the end of the script. It empties the buffer and does not send its contents to the user, thus keeping everything after ob_flush() hidden.
How do you access the script using file_get_contents? How do you access it with your browser? If you access the script without "http://", of course it will never get executed. Use the same URL as in the browser.
Edit:
The browser will render the page even before the connection is closed. Even if you flush, I don't think the connection is closed. You can fire up Wireshark and check. stream_get_contents and file_get_contents will block until they have all the output. Even if you flushed, they can't be sure that there isn't more content. Since the Content-Length header didn't seem to make {file,stream}_get_contents return earlier, you probably need to implement your own buffering, à la fopen(), fread(), fclose().
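A sketch of that manual approach, using the URL from the question (whether fread() returns before the sleep(5) still depends on buffering between the two ends):

<?php
// Read only the 4 announced bytes, then hang up; the server side keeps running.
$fp = fopen('http://dev/test.php', 'r');
$body = $fp ? fread($fp, 4) : false;
if ($fp) {
    fclose($fp);
}
echo $body;   // "abcd" if the flush made it through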