Reading COM (Serial Modem) in PHP
I need to read from a serial (COM) port (COM2 on Windows) with PHP.
The demo below is a work in progress. Reading is the problem: it only works some of the time.
Is there another way (no dio, no C++)? Maybe w32api_register_function() would be better?
function rs232init($com, $baudrate)
{
    // Backticks execute the Windows "mode" command to configure the port
    `mode $com: BAUD=$baudrate PARITY=N data=8 stop=1 xon=off`;
}

function send($comport, $char)
{
    $fp = fopen($comport, "w+");
    if (!$fp) {
        echo "not open for write";
    } else {
        fputs($fp, $char);
        fclose($fp);
    }
}
function read($comport2, $sek)
{
    $buffer = "";
    $fp2 = fopen($comport2, "r+");
    if (!$fp2) {
        echo "not open for read";
    } else {
        sleep($sek);
        $buffer .= fgets($fp2, 4096);
        fclose($fp2); // close before returning
    }
    return $buffer;
}
rs232init("com2","9600");
send("com2","3");
$a = read("com2","2");
echo $a;
The com2 device should be referenced as 'COM2:'
I should point out that there is a PHP serial class already available at http://www.phpclasses.org/package/3679-PHP-Communicate-with-a-serial-port.html.
I don't know what methods it uses internally, but perhaps it will make this a bit easier to get started.
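For what it's worth, here is a minimal sketch of how that class is commonly used. The method names below come from the phpSerial class's documented interface as I remember it, and the port and baud rate are assumptions taken from the question, so treat this as a starting point only:

include 'php_serial.class.php'; // the file shipped with the package

$serial = new phpSerial();
$serial->deviceSet('COM2');        // assumption: same port as in the question
$serial->confBaudRate(9600);       // assumption: same rate as in the question
$serial->confParity('none');
$serial->confCharacterLength(8);
$serial->confStopBits(1);
$serial->confFlowControl('none');

$serial->deviceOpen();
$serial->sendMessage('3');         // same payload the send() call writes
$reply = $serial->readPort();      // note: older versions of the class did not
                                   // implement reading on Windows
$serial->deviceClose();

echo $reply;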
I need to transfer files of any type or size over HTTP/GET in ~1k chunks. The resulting file hash needs to match the source file. This needs to be done in native PHP without any special tools. I have a basic strategy but I'm getting odd results. This proof of concept just copies the file locally.
CODE
<?php
$input = "/home/lm1/Music/Ellise - Feeling Something Bad.mp3";
$a = pathinfo($input);
$output = $a["basename"];

echo "\n> " . md5_file($input);

$fp = fopen($input, 'rb');
if ($fp) {
    while (!feof($fp)) {
        $buffer = base64_encode(fread($fp, 1024));
        // echo "\n\n" . md5($buffer);
        write($output, $buffer);
    }
    fclose($fp);
    echo "\n> " . md5_file($output);
    echo "\n";
}

function write($file, $buffer) {
    // echo "\n" . md5($buffer);
    $fp = fopen($file, 'ab');
    fwrite($fp, base64_decode($buffer));
    fclose($fp);
}
?>
OUTPUT
> d31e102b1cae9c73bbf5a12615a8ea36
> 9f03f6c88ed61c07cb534922d6d31864
Thanks in advance.
fread already advances the file pointer, so there's no need to track the position yourself. The same goes for fwrite, so consecutive calls automatically append to the given file. You could therefore simplify your approach to the following (code adapted from this answer on how to efficiently write a large input stream to a file):
$src = "a.test";
$dest = "b.test";
$fp_src = fopen($src, 'rb');
if ($fp_src) {
$fp_dest = fopen($dest, 'wb');
$buffer_size = 1024;
while(!feof($fp_src)) {
fwrite($fp_dest, fread($fp_src, $buffer_size));
}
fclose($fp_src);
fclose($fp_dest);
echo md5_file($src)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
echo md5_file($dest)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
}
If you want to keep both processes separated, you'd do:
$src = "a.test";
$dest = "b.test";
if (file_exists($dest)) {
unlink($dest); // So we don't append to an existing file
}
$fp = fopen($src,'rb');
if ($fp) {
while(!feof($fp)){
$buffer = base64_encode(fread($fp, 1024));
write($dest, $buffer);
}
fclose($fp);
}
function write($file, $buffer) {
$fp = fopen($file, 'ab');
fwrite($fp, base64_decode($buffer));
fclose($fp);
}
echo md5_file($src)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
echo md5_file($dest)."\n"; // 88e4af2f85080a280e7f00e50d96b7f7
As for how to stream files over HTTP, you might want to have a look at:
Streaming a large file using PHP
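Once the local round trip checks out, the HTTP/GET side can reuse the same loop: PHP's http:// stream wrapper (with allow_url_fopen enabled) lets fread pull a remote resource in ~1 KB chunks just like a local file. A sketch, with a placeholder URL:

$src  = 'http://www.example.com/source.bin'; // placeholder URL
$dest = 'local-copy.bin';

$fp_src  = fopen($src, 'rb');  // http:// stream wrapper
$fp_dest = fopen($dest, 'wb');

if ($fp_src && $fp_dest) {
    while (!feof($fp_src)) {
        fwrite($fp_dest, fread($fp_src, 1024)); // ~1k chunks, as required
    }
    fclose($fp_src);
    fclose($fp_dest);
    echo md5_file($dest) . "\n"; // compare against the source file's hash
}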
I'm using the following code to create and append data to a tar archive in PHP. The problem is that Phar does not take an exclusive lock on the tar file, which causes problems when I need writes to the same file to be atomic.
function phar_put_contents($fname, $archive, $data) {
    $i = 0;
    do {
        $fp = @fopen($archive . '.lock', 'w');
        if (!$fp) {
            usleep(25);
            continue;
        }
        if (flock($fp, LOCK_EX)) {
            try {
                $myPhar = new PharData($archive . '.tar', 0);
                $myPhar[$fname] = $data;
                $myPhar->stopBuffering();
                flock($fp, LOCK_UN) && @fclose($fp);
                @unlink($archive . '.lock');
                return true;
            } catch (Exception $e) {
                error_log($e->getMessage() . " in " . $e->getFile() . ":" . $e->getLine(), 0);
                unset($e);
                @flock($fp, LOCK_UN) && @fclose($fp);
            }
        }
    } while ($i++ < 8);
    return false;
}
Using a lock file seems like a "good" solution, but it's not optimal since my archives still get corrupted quite frequently.
OK, it seems the Phar and PharData classes in PHP are somewhat unfinished: they have neither lock() nor close(), which makes my approach of external locking unworkable.
The following code is what I used to try to have a function that appends data to a tar archive.
function phar_put_contents($fname, $archive, $data) {
    $i = 0;
    do {
        $fp = @fopen($archive . '.lock', 'w');
        if (!$fp) {
            usleep(25);
            continue;
        }
        if (flock($fp, LOCK_EX)) {
            try {
                file_put_contents('/tmp/' . $fname, $data);
                $tarCmd = "tar " . (file_exists($archive . ".tar") ? "-rf " : "-cf ")
                        . $archive . ".tar -C /tmp " . $fname;
                exec($tarCmd, $result, $status);
                if ($status != 0)
                    throw new Exception($tarCmd . implode("\n", $result));
                @unlink('/tmp/' . $fname);
                flock($fp, LOCK_UN) && @fclose($fp);
                @unlink($archive . '.lock');
                return true;
            } catch (Exception $e) {
                error_log($e->getMessage() . " in " . $e->getFile() . ":" . $e->getLine(), 0);
                unset($e);
                @flock($fp, LOCK_UN) && @fclose($fp);
            }
        }
    } while ($i++ < 8);
    return false;
}
Note that I'm using exec() to call the external tar binary. This was a necessity, since Phar flushes to the archive so unreliably that the tar file ends up broken when two instances of the code modify it at the same time.
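For completeness, a hypothetical call (the entry name, archive path, and payload are made up):

// Appends one entry to /var/data/reports.tar, retrying up to 8 times
// while another process holds the lock.
$ok = phar_put_contents('report-001.txt', '/var/data/reports', "some report data\n");
if (!$ok) {
    error_log('could not append to archive after 8 attempts');
}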
I am trying to send a request to a VSP200 device connected to COM port 8 of a Windows machine. I am using PHP's fopen() to open the COM port, but I am getting an error:
Warning: fopen(COM8:) [function.fopen]: failed to open stream
Can you please tell me what is wrong with my code?
$fp = fopen ("COM8:", "w+");
if (!$fp) {
echo 'not open';
}
else{
echo 'port is open for write<br/>';
$string .= '<STX>C30C10178C10100C103110606C103081000C10100C10101C100<ETX>';
fputs ($fp, $string );
echo $string;
fclose ($fp);
}
$fp = fopen ("COM8:", "r+");
if (!$fp) {
echo 'not open for read';
}
else{
echo '<br/> port is open for read<br/>';
$buffer = fread($fp, 128 );
echo $buffer;
fclose ($fp);
}
You should not include the trailing colon in the port name:
$fp = fopen ("COM8", "w+");
I'm using a simple unzip function (as seen below) for my files so I don't have to unzip files manually before they are processed further.
function uncompress($srcName, $dstName) {
    $string = implode("", gzfile($srcName));
    $fp = fopen($dstName, "w");
    fwrite($fp, $string, strlen($string));
    fclose($fp);
}
The problem is that if the gzip file is large (e.g. 50 MB), unzipping it takes a large amount of RAM.
The question: can I parse a gzipped file in chunks and still get the correct result? Or is there a better other way to handle the issue of extracting large gzip files (even if it takes a few seconds more)?
gzfile() is a convenience method that calls gzopen, gzread, and gzclose.
So, yes, you can manually gzopen() the file and gzread() it in chunks.
This will uncompress the file in 4kB chunks:
function uncompress($srcName, $dstName) {
    $sfp = gzopen($srcName, "rb");
    $fp = fopen($dstName, "w");

    while (!gzeof($sfp)) {
        $string = gzread($sfp, 4096);
        fwrite($fp, $string, strlen($string));
    }

    gzclose($sfp);
    fclose($fp);
}
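Usage stays a drop-in replacement for the original function; the file names here are just examples:

uncompress('big-dump.xml.gz', 'big-dump.xml'); // peak memory stays near the 4 kB buffer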
Try this:
function uncompress($srcName, $dstName) {
    $fp = fopen($dstName, "w");
    fwrite($fp, implode("", gzfile($srcName)));
    fclose($fp);
}
The $length parameter of fwrite() is optional, so strlen() isn't needed.
If you are on a Linux host, have the required privileges to run commands, and the gzip command is installed, you could try calling it with something like shell_exec.
Something a bit like this, I guess, would do:
shell_exec('gzip -d your_file.gz');
This way, the file wouldn't be unzipped by PHP itself.
As a side note:
Take care where the command is run from (or use a switch to tell gzip to decompress to a specific directory).
You might want to take a look at escapeshellarg too ;-) (see the sketch below)
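A minimal sketch combining both points (the path is an example):

$gzPath = '/var/data/upload.gz'; // example path
// escapeshellarg() quotes the path against spaces and shell metacharacters
shell_exec('gzip -d ' . escapeshellarg($gzPath));
// gzip writes the decompressed file next to the source, i.e. /var/data/upload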
As maliayas mentioned, that check may lead to a bug. I experienced the while loop exiting unexpectedly even though the gz file had been decompressed successfully. The whole code looks like this and works better for me:
function gzDecompressFile($srcName, $dstName) {
    $error = false;

    if ($file = gzopen($srcName, 'rb')) {               // open gz file
        $out_file = fopen($dstName, 'wb');              // open destination file

        while (($string = gzread($file, 4096)) != '') { // read 4 kB at a time
            if (!fwrite($out_file, $string)) {          // check if writing was successful
                $error = true;
            }
        }

        // close files
        fclose($out_file);
        gzclose($file);
    } else {
        $error = true;
    }

    return !$error;
}
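Called like this (file names are examples), the boolean return value makes failures explicit:

if (gzDecompressFile('backup.sql.gz', 'backup.sql')) {
    echo "decompressed OK\n";
} else {
    echo "decompression failed\n";
}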
I need to retrieve a small amount of data from a very large remote XML file that I access via http. I only need a portion of the file at the beginning, but the files I am accessing can often be so large that downloading them all will cause a timeout. It seems like it should be possible with fsockopen to pull only as much as needed before closing the connection, but nothing I have tried has worked.
Below is a simplified version of what I have been trying. Can anyone tell me what I need to do differently?
<?php
function socketopen($funcsite, $funcheader) {
    $k = 0; // line counter, local to the function
    $fp = fsockopen($funcsite, 80, $errno, $errstr, 5);
    $buffer = NULL;

    if ($fp) {
        fwrite($fp, "GET " . $funcheader . " HTTP/1.0\r\nHost: " . $funcsite . "\r\n\r\n");
        while (!feof($fp)) {
            $buffer = fgets($fp, 4096);
            echo $buffer;
            if ($k == 200) { // stop after ~200 lines
                break;
            }
            $k++;
        }
        fclose($fp);
    } else {
        print "No Response:";
    }
    return (html_entity_decode($buffer));
}

$site = "www.remotesite.com";
$header = "/bigdatafile.xml";
$data = socketopen($site, $header);
?>
This runs, but it always opens and downloads the entire remote file. (I actually use a different conditional than if ($k == x), but that shouldn't matter.)
Any help greatly appreciated. -Jim
Any reason not to use file_get_contents() instead?
$buffer = html_entity_decode(file_get_contents('http://www.remotesite.com/bigdatafile.xml', 0, null, $offsetBytes, $maxlenBytes));
You just need to specify $offsetBytes and $maxlenBytes. (Note that the PHP manual warns that seeking to an offset is not supported on remote streams, so $maxlenBytes is the part you can rely on here.)
Try this:
set_time_limit(0);
echo $buffer = html_entity_decode(file_get_contents('http://www.remotesite.com/bigdatafile.xml', 0, null, 1024, 4096));
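If the remote server honors HTTP Range requests (an assumption; not all servers do), a stream context can ask for just the first bytes, so the limit is enforced server-side instead of after the download:

// Ask the server for only the first 4096 bytes (requires Range support).
$context = stream_context_create(array(
    'http' => array(
        'method' => 'GET',
        'header' => "Range: bytes=0-4095\r\n",
    ),
));
$buffer = file_get_contents('http://www.remotesite.com/bigdatafile.xml', false, $context);
echo html_entity_decode($buffer);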
With this code you could download the entire RSS feed:
if (!$xml = simplexml_load_file("http://remotesite.com/bigrss.rss")) {
    throw new RuntimeException('Unable to load or parse feed');
} else {
    file_put_contents('mybigrss.rss', $xml->asXML()); // filename first, then data
}
But if you want to get just some parts, do the following:

$limit = 512000; // set a byte limit here
$s_handle = fopen('http://remotesite.com/bigrss.rss', 'rb'); // open the remote stream
$sourceData = fread($s_handle, $limit);
// your code etc.

Or, reading until EOF:

$source = '';
while (!feof($s_handle))
    $source .= fread($s_handle, 1024); // set chunk size