Hi, I'm running a process with popen():
$handle = popen('python scriptos.py', "r");
while (!feof($handle)) {
    $data = fgets($handle);
    echo "> " . $data;
}
And I'm only getting 3 lines from a process that returns 5 lines. When I run this exact command in the CLI I get the full output. It's as if it stops reading early (the process can take a while to complete and updates the last 2 lines while it works; it's a progress indicator).
Am I doing anything wrong? Is proc_open() more suitable (I've started looking into whether I can use that)?
The two missing lines are probably being written to STDERR, and popen() only returns a pointer for STDOUT.
You can either get a pointer for STDERR using proc_open(), or change the popen() line to
$handle = popen('python scriptos.py 2>&1', "r");
to redirect STDERR to STDOUT, so they are included in your output.
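If you do want the two streams separately, a minimal proc_open() sketch might look something like this (untested; it assumes the same python scriptos.py command and that the output is small enough that reading the two streams one after the other won't block):
<?php
$descriptors = array(
    1 => array('pipe', 'w'), // child's STDOUT
    2 => array('pipe', 'w'), // child's STDERR
);
$proc = proc_open('python scriptos.py', $descriptors, $pipes);
if (is_resource($proc)) {
    $stdout = stream_get_contents($pipes[1]);
    $stderr = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
    echo "> " . $stdout;
    echo "> " . $stderr;
}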
I have
$bytesCount = file_put_contents( "somefile.log", "some text\n", FILE_APPEND | LOCK_EX );
What happens if another process is writing** to somefile.log?
Does file_put_contents fail with a runtime error?
Does it fail with $bytesCount === false?
Or does it pause the script until the file is unlocked and then perform the write operation?
(**) or, more generally, if another process has an exclusive lock on the file
[I'm on a *nix platform with PHP 5.6]
When file_put_contents attempts to write to a locked file, it waits until the file is unlocked, then performs the write and returns the number of bytes written.
Proof:
I wrote a simple two-script test:
the first script writes a 100MB file (on a slow USB2-connected drive);
the second script appends a short string to the same file.
The core of both scripts is these four lines:
echo Milliseconds() . ": Start writing file on file\n";
$bytesCount = file_put_contents( "/Volumes/myHD/somefile.txt", $buffer, FILE_APPEND | LOCK_EX );
var_export( $bytesCount );
echo "\n" . Milliseconds() . ": Done writing on file\n";
Where Milliseconds() is a function that returns the current unix timestamp in milliseconds.
In the first script $buffer is a 100MB string; in the second script $buffer = "MORE-DATA\n"; (10 bytes).
Running the first script and quickly starting the second one results in this output:
Script 1:
$ php test1.php
1481892766645: Start writing file on file
100000000
1481892769680: Done writing on file
$
Script 2:
$ php test2.php
1481892766831: Start writing file on locked file
10
1481892769909: Done writing file on locked file
$
Note that:
the second script attempted writing 186 ms after the first one, but before the first script had finished writing, so the second script actually accessed a locked file;
the second script terminated writing 229 ms after the first one.
Checking the result after both scripts terminated execution:
$ stat -f%z /Volumes/myHD/somefile.txt
100000010
$
100MB + 10 bytes were written
$ tail -c 20 /Volumes/myHD/somefile.txt
0123456789MORE-DATA
$
The second script actually appended the string to the end of the file
It should be obvious that this approach should only be used if you're making one write; if you are writing to the same file multiple times, you should handle it yourself with fopen() and fwrite(), then fclose() when you are done writing (a sketch follows the benchmark below).
Benchmark below:
file_put_contents() for 1,000,000 writes - average of 3 benchmarks:
real 0m3.932s
user 0m2.487s
sys 0m1.437s
fopen(), then 1,000,000 fwrite() calls, then fclose() - average of 3 benchmarks:
real 0m2.265s
user 0m1.819s
sys 0m0.445s
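As a rough sketch of the fopen()/fwrite() approach (the file name and write count are made up for illustration):
$fh = fopen('somefile.log', 'a');        // open once, append mode
if ($fh !== false) {
    if (flock($fh, LOCK_EX)) {           // take the exclusive lock once
        for ($i = 0; $i < 1000000; $i++) {
            fwrite($fh, "some text\n");
        }
        flock($fh, LOCK_UN);             // release the lock
    }
    fclose($fh);                         // close when done writing
}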
For overwriting when using ftp, this is helpful:
/* create a stream context telling PHP to overwrite the file */
$options = array('ftp' => array('overwrite' => true));
$stream = stream_context_create($options);
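For example (hypothetical URL and credentials), you would then pass the context as the fourth argument:
file_put_contents('ftp://user:password@example.com/path/file.txt', $data, 0, $stream);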
http://php.net/manual/en/function.file-put-contents.php
I have a PHP script that reads text files. I use fgetc() to get every character one by one. I open the file to read from with fopen(), and then use the file handle returned by fopen() as the first argument to fgetc(). I tried to do the same thing reading from STDIN: I wanted to run the script in a terminal, give it the whole text (that was previously in a text file) and press Enter. I thought the script would read it and run just as if it had read from a text file, but it doesn't work. It only works if I type every single character on its own and press Enter after it. Why is that? Is there a way to make the script behave the way I wanted, so that I can give it the whole text in the terminal at once? Should I use different functions or something?
$inputFile = fopen($path, "r"); // "r" to read; "w" would truncate the file
while (($char = fgetc($inputFile)) !== false) {
    dosomething();
}
What I'm trying to do is replace $inputFile in fgetc() with STDIN.
See http://php.net/manual/en/features.commandline.io-streams.php, second comment
Note, without the stream_set_blocking() call, fgetcsv() hangs on STDIN, awaiting input from the user, which isn't useful as we're looking for a piped file. If it isn't here already, it isn't going to be.
<?php
stream_set_blocking(STDIN, 0);
$csv_ar = fgetcsv(STDIN);
I think it's the same for fgetc(). After all, the manual says:
string fgetc ( resource $handle ) - Gets a character from the given *file pointer*.
Emphasis mine.
See http://php.net/manual/en/function.fgetc.php
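Applying the same idea to fgetc() might look like this (an untested sketch; dosomething() is the placeholder from your own code, and the input is piped in, e.g. cat input.txt | php script.php):
<?php
stream_set_blocking(STDIN, 0);              // as in the comment above, for piped input
while (($char = fgetc(STDIN)) !== false) {
    dosomething();                          // process the character, e.g. using $char
}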
...
So I open a process with $process = proc_open("my_process", $descriptors, $pipes);
Then I write to the stdin of the process using fwrite($pipes[0], "some_command");
Then I have to close the pipe using fclose($pipes[0]); before I can read from the pipe's stdout using $output = stream_get_contents($pipes[1]);. If I don't close the pipe, my PHP script hangs on that call.
But once I have received the output from stdout, what if I want to send another command to the process? The stdin pipe is closed, so I have no way to send it. Is it possible to somehow send another command to the process?
It sounds like the other process is blocking waiting for EOL or EOF on STDIN. What are you trying to execute?
Regardless, there's a pretty good chance this will sort it out: Just append \n to the command you are sending to the other process.
E.g.
$process = proc_open("my_process", $descriptors, $pipes);
$command = "some_command";
fwrite($pipes[0], $command."\n");
// Fetch the contents of STDOUT
Now, one issue that you may also be running into is that you are using stream_get_contents() - which will wait for EOF before it returns. You may have to be a bit more intelligent about how you retrieve the data from $pipes[1], using fgets() and looking for a specific number of lines, or for a string that indicates the end of the output; see the sketch below.
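As an illustration only (my_process, some_command and the END marker are placeholders, not part of your setup), keeping STDIN open and reading with fgets() until a known marker might look like this:
$descriptors = array(
    0 => array('pipe', 'r'), // child's STDIN
    1 => array('pipe', 'w'), // child's STDOUT
);
$process = proc_open("my_process", $descriptors, $pipes);
if (is_resource($process)) {
    fwrite($pipes[0], "some_command\n");
    $output = '';
    while (($line = fgets($pipes[1])) !== false) {
        if (rtrim($line) === 'END') {   // hypothetical end-of-output marker
            break;
        }
        $output .= $line;
    }
    // $pipes[0] is still open, so another command can be sent:
    fwrite($pipes[0], "another_command\n");
    // ... read the response the same way, then clean up:
    fclose($pipes[0]);
    fclose($pipes[1]);
    proc_close($process);
}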
If you tell us what you are executing, I may be able to give you a more specific answer.
I want to read everything from a text file and echo it. But more lines might be written to the text file while I'm reading, so I don't want the script to exit when it has reached the end of the file; instead, I want it to wait forever for more lines. Is this possible in PHP?
This is just a guess, but try passing through (passthru) the output of a "tail -f".
But you will need to find a way to flush() your buffer.
IMHO a much nicer solution would be to build an AJAX page:
read the contents of the file into an array, store the number of lines in the session, and print the content of the file;
start an AJAX request every x seconds to a script which checks the file; if the line count is greater than the count stored in the session, append the new lines to the page (a sketch of such an endpoint is below).
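A rough sketch of such a polling endpoint (the file path and session key are illustrative):
<?php
// poll.php - returns the lines added since the last request
session_start();
$lines = file('/path/to/logfile.txt', FILE_IGNORE_NEW_LINES);
$seen  = isset($_SESSION['line_count']) ? $_SESSION['line_count'] : 0;
$new   = array_slice($lines, $seen);
$_SESSION['line_count'] = count($lines);
header('Content-Type: application/json');
echo json_encode($new);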
You could use popen() instead:
$f = popen("tail -f /where/ever/your/file/is 2>&1", 'r');
while (!feof($f)) {
    $buffer = fgets($f);   // fgets() keeps the trailing newline
    echo $buffer;
    flush();
    sleep(1);
}
pclose($f);
The sleep is important; without it you will sit at 100% CPU.
In fact, when you "echo" it, it goes to the output buffer. So what you want is to "append" the new content if it's added while the browser is still receiving output. And that is not possible directly (but there are some approaches to it).
I solved it.
The trick was to use fopen() and, when EOF is reached, move the cursor back to the previous position and continue reading from there.
<?php
$handle  = fopen('text.txt', 'r');
$lastpos = 0;
while (true) {
    if (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
        $lastpos = ftell($handle);   // remember how far we have read
    } else {
        fseek($handle, $lastpos);    // clear the EOF flag and retry from the same position
    }
}
?>
It still consumes quite a lot of CPU, though; I don't know how to solve that.
You may also use filemtime(): you get the latest modification timestamp, send the output, and at the end compare the stored filemtime with the current one.
Anyway, if you want the script to keep pace with the browser (or client), you should send the output in chunks (fread, flush), then check for any changes at the end. If there are changes, re-open the file and read from the latest position (you can get the position outside of the while (!feof()) loop). A rough sketch is below.
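A rough sketch of the filemtime() idea (the file name is taken from the answer above; clearstatcache() is needed so the timestamp is actually re-read):
<?php
$path      = 'text.txt';
$lastMtime = 0;
$lastPos   = 0;
while (true) {
    clearstatcache(true, $path);        // otherwise filemtime() returns a cached value
    $mtime = filemtime($path);
    if ($mtime !== $lastMtime) {        // the file changed since the last check
        $handle = fopen($path, 'r');
        fseek($handle, $lastPos);       // continue from where we stopped last time
        while (!feof($handle)) {
            echo fread($handle, 8192);
            flush();
        }
        $lastPos = ftell($handle);
        fclose($handle);
        $lastMtime = $mtime;
    }
    sleep(1);                           // avoid busy-waiting
}
?>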
I have a PHP script which executes a shell command:
$handle = popen('python last', 'r');
$read = fread($handle, 4096);
print_r($read);
pclose($handle);
I echo the output of the shell command. When I run this on the command line I get something like this:
[root@localhost tester]# python last
[last] ZVZX-W3vo9I: Downloading video webpage
[last] ZVZX-W3vo9I: Extracting video information
[last] ZVZX-W3vo9I: URL: x
[download] Destination: here.flv
[download] 0.0% of 10.09M at ---b/s ETA --:--
[download] 0.0% of 10.09M at 22.24k/s ETA 07:44
[download] 0.0% of 10.09M at 66.52k/s ETA 02:35
[download] 0.1% of 10.09M at 154.49k/s ETA 01:06
[download] 0.1% of 10.09M at 162.45k/s ETA 01:03
However, when I run that same command from PHP I get this output:
[last] ZVZX-W3vo9I: Downloading video webpage
[last] ZVZX-W3vo9I: Extracting video information
[last] ZVZX-W3vo9I: URL: x
[download] Destination: here.flv
As you can see, the bottom bit is missing, which is the bit I need!! The problem before was that the percentages were being updated on the same line, but I have now changed my Python script so that it writes a new line each time. But this made no difference! :(
This question is related to this one.
Thank you for any help.
Update
Needed to redirect output with "2>&1". Arul got lucky :P since I missed the deadline to pick the one true answer, which belonged to Pax!
You are only reading the first 4,096 bytes from the pipe; you'll need to place the fread/print_r in a loop and check for end-of-file using the feof() function.
$handle = popen('python last', 'r');
while(!feof($handle))
{
print_r(fread($handle, 4096));
}
pclose($handle);
The first step is to see where the output is going. The first thing I would do is choose a slightly smaller file so that you're not waiting around for seven minutes for each test.
Step 1/ See where things are being written in the shell. Execute the command python last >/tmp/stdout 2>/tmp/stderr then look at those two files. Ideally, everything will be written to stdout but that may not be the case. This gives you the baseline behavior of the script.
Step 2/ Do the same thing when run from PHP by using $handle = popen('python last >/tmp/stdout 2>/tmp/stderr', 'r');. Your PHP script probably won't get anything returned in this case but the files should still be populated. This will catch any changed behavior when running in a non-terminal environment.
If some of the output goes to stderr, then the solution should be as simple as $handle = popen('python last 2>&1', 'r');
Additionally, the doco for PHP states (my bolding):
Returns a file pointer identical to that returned by fopen(), except that it is unidirectional (may only be used for reading or writing) and must be closed with pclose(). This pointer may be used with fgets(), fgetss(), and fwrite().
So I'm not sure you should even be using fread(), although it's shown in one of the examples. Still, I think line-based input maps more to what you're trying to achieve.
Irrespective of all this, you should read the output in a loop to ensure you can get the output when it's more than 4K, something like:
$handle = popen ('python last 2>&1', 'r');
if ($handle) {
while(! feof ($handle)) {
$read = fgets ($handle);
echo $read;
}
pclose ($handle);
}
Another thing to look out for: if your output is going to a browser and it takes too long, the browser itself may time out, since it thinks the server-side connection has disappeared. If you find a small file working and your 10M/1-minute file not working, this may be the case. You can try flush(), but not all browsers will honor this.
It is much much easier to do this:
$output = `python last`;
var_dump($output);
The backticks (`) will execute the line and capture the output. Here is a test example:
File test.php:
<?php
echo "PHP Output Test 1\n";
echo "PHP Output Test 2\n";
echo "PHP Output Test 3\n";
echo "PHP Output Test 4\n";
echo "PHP Output Test 5\n";
?>
File capture.php:
<?php
$output = `php test.php`;
var_dump($output);
?>
Output from php capture.php:
string(80) "Test PHP Script
Test PHP Script
Test PHP Script
Test PHP Script
Test PHP Script
"
You can then split the output into an array based on line breaks:
$outputArray = explode("\n", $output);
OR use proc_open(), which gives you much more control than popen(), as you can specify how you want stdin, stdout and stderr to be handled.
OR you can fopen STDIN or php://stdin and then pipe the output to PHP:
$ python last | php script.php
I would go with option 1, using the backticks; it's the easiest way.
I would not be surprised to find that the progress report is omitted when the output is not going to a tty. That is, the PHP is capturing everything that is sent, but the progress report is not being sent.
There is ample precedent for commands behaving differently depending on where the output goes - starting with the good old ls command.
Of course, if you wrote the Python script that you're running, this is much less likely to be the cause of the trouble.
How can you verify whether this hypothesis is valid? Well, you could start by running the script at the command line with the standard output going to one file and the standard error going to another. If you see the progress information in one of those files, you know a whole lot more about what is going on. If you see the progress information on the screen still, then the script is probably echoing the progress information to /dev/tty, but when PHP runs it, there is no /dev/tty for the process. If you don't see the progress information at all (on screen or in a file), then my original hypothesis is probably verified.
Try running stream_get_contents() on $handle. It's a better way to work with resources where you don't know the exact size of what you're trying to retrieve.
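For example, a minimal sketch (assuming the same command as in the question):
$handle = popen('python last', 'r');
$output = stream_get_contents($handle);  // reads until the process closes its output
pclose($handle);
print_r($output);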
Could you read in a while loop instead of using fread()?
while( !feof($handle) )
echo fgets($handle);
You may have to flush() also.
What do you see with
python last | less
?
Maybe the output you want is emitted on STDERR. Then you have to start it this way:
$handle = popen('python last 2>&1', 'r');
The 2>&1 directs STDERR into STDOUT, which you are capturing with popen.
If you're just trying to show the progress of the python command inside a terminal window, then I would recommend the following instead:
<?php
system("python last > `tty`");
You won't need to capture the output then, and the user can even Ctrl+C the download without aborting the PHP script.
You are missing a flush call (in your Python app, and possibly your PHP app as well).
That is because when you use the standard streams stdin/stdout interactively (from the command line), they are in line-buffered mode (in short, the system flushes on each new line), but when you call the program from within another program, the streams are in fully buffered mode (nothing is output until the system buffer is full).
More info on this here: buffering in standard streams