PHP: set proc_open to output directly to user

I'm using proc_open with pdftk to pre-fill some forms with data. This works well, but rather than save the result to a file and then read the file back, I'd like to send the file directly to the user. I have that part working, so this isn't blocking me, but I'd like to direct the output of proc_open straight into the response stream so that I don't have to hold the value in memory in PHP. So far I have the following:
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "error.txt", "a")
);
$process = proc_open($command, $descriptorspec, $pipes);
if (is_resource($process)) {
    fwrite($pipes[0], $fdf);
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    proc_close($process);
}
I'd like to send the result directly to the user as the code above does, but without holding the value in a variable and then printing it, to save memory and reduce the time the code takes to run. Is there a way in PHP to pipe a stream straight to the output?
Does anyone know of a stream function in PHP that prints a stream's contents directly, or a setting for proc_open that does this? Either way, I fear this may not work at all, as I may need to add a Content-Length header for the PDF to display directly. If there is such a function, is there also a way to get the byte length of the stream without actually reading it in?

Since no one has posted an answer to this, I may as well post the one I've found.
Open a stream to php://output with fopen mode w, and use stream_copy_to_stream to send the data from the proc_open process to the user directly. There does not appear to be an easy way to get the length or size of the data in the stream, however.
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "error.txt", "a")
);
$process = proc_open($command, $descriptorspec, $pipes);
if (is_resource($process)) {
    fwrite($pipes[0], $fdf);
    fclose($pipes[0]);
    /* New code */
    if ($fout = fopen("php://output", "w")) {
        stream_copy_to_stream($pipes[1], $fout);
        fclose($fout);
    }
    fclose($pipes[1]);
    $result = proc_close($process);
}
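An alternative worth mentioning: fpassthru() dumps everything remaining on a stream straight to the output buffer, which saves opening php://output by hand. A minimal sketch, using `cat` as a stand-in for the real pdftk command and a short string in place of the FDF data:

```php
<?php
// Stream a child process's stdout straight to the client with fpassthru(),
// instead of buffering the whole document in a PHP string first.
$command = 'cat';          // stand-in for the real pdftk command line
$input   = 'hello world';  // stand-in for the FDF data

$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "error.txt", "a"),
);

$process = proc_open($command, $descriptorspec, $pipes);
if (is_resource($process)) {
    fwrite($pipes[0], $input);
    fclose($pipes[0]);

    // fpassthru() reads the pipe to EOF and echoes it directly.
    fpassthru($pipes[1]);

    fclose($pipes[1]);
    proc_close($process);
}
```

Like stream_copy_to_stream, this gives no way to know the byte length up front, so a Content-Length header still isn't possible without buffering.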

Related

Chain-processing stream of data

In my app, I need to generate rather large zip archives, and to store them encrypted for security reasons.
I have found a way to do so using streams, so as to reduce the amount of RAM required, and so far this is working rather well. I've piped ZipStream's output to an OpenSSL process call, which gives me the result I want, using the following code:
$cmd = "openssl enc -e -aes-256-cbc -K $myKey -iv $myIv -out $myFile";
$pipesStructure = [0 => array("pipe", "r"), 1 => array("pipe", "w"), 2 => array("pipe", "w")];
$process = proc_open($cmd, $pipesStructure, $pipes);
if (is_resource($process)) {
    $zipOptions = new Archive();
    $zipOptions->setOutputStream($pipes[0]);
    $zip = new ZipStream('archive.zip', $zipOptions);
    foreach ($myFiles as $f) {
        $zip->addFile($f->name, $f->content);
    }
    $zip->finish();
    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $return_value = proc_close($process);
}
Now, I would like, while I do all that, to compute the hash of the unencrypted zip archive. The idea is to insert an additional "processing block" through which the stream would pass:
[ ZipStream ] -----> [ Hash computing ] -----> [ Encryption ] ------> Final output
But so far I don't understand streams well enough, and I'm failing to achieve that.
I'm pretty sure I need to use hash_init, hash_update_stream, and hash_final, but I can't seem to feed the data to hash_update_stream.
I've initialized a hash context and added the following code before the loop that adds all the files to the zip, hoping hash_update_stream would simply read what goes through $pipes[0], but apparently that's not the way to do it:
...
hash_update_stream($hashContext, $pipes[0]);
...
Upon finalizing the hash, the output is the hash of an empty string ('').
Any pointers as to how to proceed?
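One way to do this (a sketch, not tested against ZipStream itself): register a user-space stream filter that updates a hash context with every bucket passing through it, and append it to the write side of the pipe you hand to ZipStream. Everything ZipStream writes then gets hashed on its way to openssl. Here a php://memory stream stands in for $pipes[0]; the HashingFilter class and 'hashing' filter name are mine:

```php
<?php
// A user-space stream filter that hashes every chunk passing through it.
class HashingFilter extends php_user_filter
{
    public static $context; // hash context shared with the caller

    public function filter($in, $out, &$consumed, $closing): int
    {
        while ($bucket = stream_bucket_make_writeable($in)) {
            hash_update(self::$context, $bucket->data);
            $consumed += $bucket->datalen;
            stream_bucket_append($out, $bucket); // pass the data on unchanged
        }
        return PSFS_PASS_ON;
    }
}

stream_filter_register('hashing', 'HashingFilter');

HashingFilter::$context = hash_init('sha256');

// Append the filter to the write side of the stream (for the real case,
// append it to $pipes[0] before handing the pipe to ZipStream).
$stream = fopen('php://memory', 'w+');
stream_filter_append($stream, 'hashing', STREAM_FILTER_WRITE);

fwrite($stream, 'zip bytes would go here');
fclose($stream);

echo hash_final(HashingFilter::$context), "\n";
```

hash_update_stream doesn't work here because the zip data is never sitting in $pipes[0] waiting to be read by PHP; the filter approach intercepts it as it is written instead.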

PHP proc_open - save a process handle to a file and retrieve it

I use the following code to open a process with proc_open, and to save the handle and the pipes to a file:
$command = "COMMAND_TO_EXECUTE";
$descriptors = array(
    0 => array("pipe", "r"), // stdin is a pipe that the child will read from
    1 => array("pipe", "w"), // stdout is a pipe that the child will write to
    2 => array("file", "error-output.txt", "a") // stderr is a file to write to
);
$pipes = array();
$processHandle = proc_open($command, $descriptors, $pipes);
if (is_resource($processHandle)) {
    $processToSave = array(
        "process" => $processHandle,
        "pipes" => $pipes
    );
    file_put_contents("myfile.bin", serialize($processToSave));
}
Then later, I need to retrieve this handle from the file; I've used this code:
$processArray = unserialize(file_get_contents("myfile.bin"));
$processHandle = $processArray["process"];
$pipes = $processArray["pipes"];
But when I var_dump $processHandle and $pipes after retrieving them from the file, I get integers instead of resources. Why?
var_dump($processHandle) -> int(0)
var_dump($pipes) -> array(2) { int(0), int(0) }
And at this point, of course, if I try to close the pipes I get an error: resource expected, integer given.
How can I make this work? (NOTE: this is the solution I'm looking for)
Alternatively, I could get the pid of the process and then use it to stop or kill the process or do anything else with it, but what about the pipes?
How can I read from / write to the process, or save its error output?
Thank you
Found the solution myself: it's not possible to serialize resources, and once the script has finished, those resource handles are freed.
The solution was to create a daemon listening on a port which, on request, launches and stops processes. Because the daemon is always running, it can maintain a list of process handles and stop a process when requested.
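For what it's worth, the pid half of the question works without a daemon: proc_get_status() returns the child's pid, which survives the script and can be written to a file and later fed to posix_kill(). The pipes really are lost, as found above. A sketch (file names are mine; note the pid may be that of the wrapping shell, depending on how the command is spawned):

```php
<?php
// Persist the child's pid: pids outlive the script, pipe resources do not.
$descriptors = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "error-output.txt", "a"),
);

$process = proc_open("sleep 60", $descriptors, $pipes);
if (is_resource($process)) {
    $status = proc_get_status($process);
    file_put_contents("mypid.txt", $status["pid"]);
}

// ... in a later request, signal the process by pid:
$pid = (int) file_get_contents("mypid.txt");
if ($pid > 0 && function_exists('posix_kill')) {
    posix_kill($pid, 15); // 15 = SIGTERM; requires the posix extension
}
```

Reading the child's output across requests would still need the child redirected to files (or the daemon approach), since the pipes die with the first script.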

PHP move file at certain time

I am trying to move a file from one folder to another at a specific time. In order to achieve this I am trying to use the Linux at command with a pipe:
`mv file /to/dest | at h:m d.m.y`
This is what I've written:
$move = "mv $filename /destination/folder";
$at = "at $my_datetime";
$res = execute_pipe($move,$at);
where the execute_pipe function is defined as following:
function execute_pipe($cmd1, $cmd2)
{
    $proc_cmd1 = proc_open($cmd1,
        array(
            array("pipe", "r"), // stdin
            array("pipe", "w"), // stdout
            array("pipe", "w")  // stderr
        ),
        $pipes);
    $output_cmd1 = stream_get_contents($pipes[1]);
    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $return_value_cmd1 = proc_close($proc_cmd1);

    $proc_cmd2 = proc_open($cmd2,
        array(
            array("pipe", "r"), // stdin
            array("pipe", "w"), // stdout
            array("pipe", "w")  // stderr
        ),
        $pipes);
    fwrite($pipes[0], $output_cmd1);
    fclose($pipes[0]);
    $output_cmd2 = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $return_value_cmd2 = proc_close($proc_cmd2);
    return $output_cmd2;
}
The problem is that the files get moved right away, ignoring the at command. What am I missing? Is there a better way to do this?
To me it seems like your problem has nothing to do with PHP. You're just using the shell incorrectly.
The Man page of at reads:
at and batch read commands from standard input or a specified file
which are to be executed at a later time, using /bin/sh.
But your usage of the shell executes your "mv file /destination" command and then pipes the OUTPUT of that command to at. On a successful move, that output will be nothing. So by using the pipe you actually move the file right away and tell at to do nothing at your specified time.
Read the man page of at by typing man at into your terminal to resolve the issue. Hint: if you want to use standard input, echo'ing your command might help ;)
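In PHP terms, that means writing the mv command text to at's standard input, rather than piping mv's output into at. A sketch, assuming `at` is installed; the pipe_into helper is mine (tested here with `cat`, which echoes stdin back, since `at` output depends on the system):

```php
<?php
// Feed $input to $cmd's stdin and return whatever $cmd prints.
function pipe_into($cmd, $input)
{
    $proc = proc_open($cmd, array(
        0 => array("pipe", "r"), // stdin
        1 => array("pipe", "w"), // stdout
        2 => array("pipe", "w"), // stderr
    ), $pipes);

    if (!is_resource($proc)) {
        return false;
    }

    fwrite($pipes[0], $input); // the command to schedule goes to stdin
    fclose($pipes[0]);

    $out = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);

    return $out;
}

// at reads the command to schedule from stdin, so:
// pipe_into("at $my_datetime", "mv $filename /destination/folder\n");
```

This replaces the two-command execute_pipe approach entirely: only one process is launched, and mv runs later, under at, not now.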

How to background a process via proc_open and have access to STDIN?

I'm happily using proc_open to pipe data into another PHP process.
Something like this:
$spec = array(
    0 => array('pipe', 'r'),
    // I don't need output pipes
);
$cmd = 'php -f another.php >out.log 2>err.log';
$process = proc_open( $cmd, $spec, $pipes );
fwrite( $pipes[0], 'hello world');
fclose( $pipes[0] );
proc_close($process);
In the other PHP file I echo STDIN with:
echo file_get_contents('php://stdin');
This works fine, but not when I background it. Simply by appending & to $cmd, I get nothing from STDIN. I must be missing something fundamental.
It also fails with fgets(STDIN).
Any ideas please?
You can't write to STDIN of a background process (at least, not in the normal way).
This question on Server Fault may give you some idea of how to work around this problem.
Unrelated: you say you don't need outputs in the spec, yet you specify them in your $cmd; you can write $spec like this instead:
$spec = array (
0 => array('pipe', 'r'),
1 => array('file', 'out.log', 'w'), // or 'a' to append
2 => array('file', 'err.log', 'w'),
);
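One workaround, if the input data is available up front: write it to a temp file and hand that file to the child as stdin in the descriptor spec, so the backgrounded process never depends on a live pipe from the parent. A sketch (the temp-file arrangement is mine, not from the answer above; `cat` stands in for `php -f another.php`):

```php
<?php
// Give a (potentially backgrounded) child its stdin from a file
// instead of a pipe, so the parent can exit without breaking it.
$tmp = tempnam(sys_get_temp_dir(), 'stdin');
file_put_contents($tmp, 'hello world');

$outFile = tempnam(sys_get_temp_dir(), 'out');

$spec = array(
    0 => array('file', $tmp, 'r'),     // child reads its stdin from the file
    1 => array('file', $outFile, 'w'), // child stdout goes to a file
    2 => array('file', '/dev/null', 'w'),
);

$process = proc_open('cat', $spec, $pipes); // stand-in for php -f another.php
proc_close($process); // with "... &" appended to the command you would not wait

echo file_get_contents($outFile); // prints "hello world"
unlink($tmp);
unlink($outFile);
```

With a pipe spec, backgrounding breaks because the shell detaches the child from the pipe before the parent writes; a file descriptor has no such race.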

How can I invoke the MySQL interactive client from PHP?

I'm trying to get
`mysql -uroot`;
to enter the MySQL interactive client just as executing
$ mysql -uroot
from the shell does.
It's okay if the PHP script exits afterwards (or before), but I need it to invoke the MySQL client.
I've tried using proc_open() and of course system(), exec() and passthru(). Wondering if anyone has any tips.
New solution:
<?php
$descriptorspec = array(
    0 => STDIN,
    1 => STDOUT,
    2 => STDERR
);
$process = proc_open('mysql -uroot', $descriptorspec, $pipes);
Old one:
Save for tab completion (you could probably get it in there if you read out bytes with fread instead of using fgets), this gets you on your way; lots left to tweak:
<?php
$descriptorspec = array(
    0 => array("pty"),
    1 => array("pty"),
    2 => array("pty")
);
$process = proc_open('mysql -uroot', $descriptorspec, $pipes);
stream_set_blocking($pipes[1], 0);
stream_set_blocking($pipes[2], 0);
stream_set_blocking(STDIN, 0);
do {
    echo stream_get_contents($pipes[1]);
    echo stream_get_contents($pipes[2]);
    while ($in = fgets(STDIN)) fwrite($pipes[0], $in);
} while (1);
I guess it does work, but it's waiting for some input. Try sending some SQL commands to its stdin. Of course, since the backtick operator doesn't support IO remapping, you'll need more complex process handling.
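For completeness, the pass-through version only needs proc_close() added so the PHP script waits until the client exits. A sketch wrapped in a helper (the function name is mine; the STDIN/STDOUT/STDERR constants exist only under the CLI SAPI):

```php
<?php
// Hand the child the script's own stdin/stdout/stderr and wait for it.
// CLI only: STDIN, STDOUT, and STDERR are not defined under web SAPIs.
function run_interactive($cmd)
{
    $descriptorspec = array(
        0 => STDIN,  // child shares the script's terminal directly,
        1 => STDOUT, // so line editing and prompts behave normally
        2 => STDERR,
    );

    $process = proc_open($cmd, $descriptorspec, $pipes);
    if (!is_resource($process)) {
        return -1;
    }
    // Block until the interactive program exits; return its exit code.
    return proc_close($process);
}

// run_interactive('mysql -uroot');
```

Because no pipes are created, there is nothing to pump in a loop; the child talks to the terminal itself, which sidesteps the fgets/fread polling of the old solution.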
