I have a C++ console app that uses wininet.h to go out to a URL, and download the contents of a web page.
The contents are usually just a single IP address. It goes here: http://www.whatismyip.com/automation/n09230945.asp
Everything works great.
Then I decided to create my own IP checker in PHP, using the following code:
<?php
$ip = $_SERVER['REMOTE_ADDR'];
header("Cache-Control: private");
header("Content-Type: text/html");
echo $ip;
?>
This looks correct in the browser, identical to whatismyip.com's results, but the C++ program prints a bunch of junk after the IP, then repeats the IP half cut off, and then prints more junk.
What is causing this? I tried analyzing the headers, but I can't spot the difference.
Also, I tried putting a plain .txt file on the server, and the C++ program reads it perfectly.
I also tried changing my headers to both plain/text and text/plain. Same result.
Thank you for your help!
Edit: Here's a portion of the C++ code:
HINTERNET OpenAddress = InternetOpenUrl(connect, "http://www...", NULL, 0, INTERNET_FLAG_PRAGMA_NOCACHE | INTERNET_FLAG_KEEP_CONNECTION, 0);

char DataReceived[16] = " ";
DWORD NumberOfBytesRead = 0;
while (InternetReadFile(OpenAddress, DataReceived, 16, &NumberOfBytesRead) && NumberOfBytesRead)
{
    cout << DataReceived;
}
cout expects a char array to be a null-terminated string. Because you're just reading bytes into a buffer and not null-terminating them at the end, cout carries on past the end of the bytes you've read and keeps dumping memory until it happens to hit a zero byte or memory protection kicks in.
With your code, what's happening is this, at a guess:
You assign a 16-byte area of memory.
You call InternetReadFile. This puts an IP address, say "127.0.0.1" in your buffer, without a null terminator.
You call cout with DataReceived. This is a char array, and cout therefore expects it to be a null-terminated string. It outputs every character from the start of the buffer, right past "127.0.0.1" and onwards until it finds a 0 in memory.
Because "127.0.0.1" was all there is to read, and your buffer was bigger than that, the next call to InternetReadFile leaves NumberOfBytesRead as zero, so your loop only happens once.
Don't know anything about InternetReadFile(), but I'd guess an approach like this should work if you're only grabbing a single line with an IP address in it:
char DataReceived[64]; // I guess I'm antsy about having plenty of room
DWORD NumberOfBytesRead = 0;
if (InternetReadFile(OpenAddress, DataReceived, sizeof(DataReceived) - 1, &NumberOfBytesRead)) {
    DataReceived[NumberOfBytesRead] = '\0'; // null-terminate what was actually read
    cout << DataReceived;
} else {
    // handle error condition
}
But fundamentally, I think your main problem is confusing a buffer that's just a bunch of bytes with a nice friendly null-terminated string. Once you understand that distinction, look for some existing examples of using InternetReadFile with it in mind, to see how they work and therefore what you need to do.
<?php
$ip = $_SERVER['REMOTE_ADDR'];              // The client's remote address
$fh = fopen("YourSecretFileHere.txt", "w"); // Open the log file for writing
fwrite($fh, 'IP Address: ' . $ip);          // Write the IP into the file
fclose($fh);                                // Close the file
if (filesize("YourSecretFileHere.txt") == 0) { // If the file is empty,
    echo "IP File Is Empty!";               // say so
} else {
    echo "";                                // otherwise print nothing
}
?>
Related
I'm trying to show the output from a socket, but the returned data is cut off.
<?php
$socket = '/var/run/qemu-server/121.serial1';
$sock = stream_socket_client('unix://'.$socket, $errno, $errstr);
fwrite($sock, $argv[1] . "\r\n");
$data = '';
while ($buffer = fread($sock, 8128)) $data .= $buffer;
echo $data;
fclose($sock);
?>
I need this output:
{"VMid":"121","Command":"ls /","Output":"bin\nboot\ndev\netc\nhome\nlib\nlib32\nlib64\nlibx32\nlost+found\nmedia\nmnt\nopt\nproc\nroot\nrun\nsbin\nsnap\nsrv\nswap.img\nsys\ntmp\nusr\nvar\n"}
But it only returns:
{"VMid":"121","Command":"ls /","Output"
I tried stream_set_read_buffer() and file_get_contents(), with no success.
I presume here that the server has not had time to fully respond by the time you poll. You can quickly test this theory by putting a sleep() after you send the instruction (fwrite) and before you poll (fread). That's a test, not a final solution (as you never know how long to sleep for).
What you generally need for sockets is a continuous poll (a while loop that basically never ends, but is under control so you can pause/exit etc.) and a continuous read buffer: append new content to the buffer, and when you either reach the end of the expected message or have read the number of bytes you expect, remove that content from the front of the buffer and leave the remainder for the next loop. You can, of course, bail out at this point if you have everything you need, and close the socket or return to polling later.
A common trick is to set the first two/four bytes of the message to the length of the payload, followed by the payload itself. You then constantly poll for those two/four bytes and read the content based on that. That's probably not possible with another system like QEMU, so you'll need to look instead for a delimiter: EOL/NL etc.?
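As a sketch of the length-prefix idea (this assumes a hypothetical framing protocol, not anything QEMU's serial socket actually provides):

```php
<?php
// Read exactly $length bytes from a socket, looping until complete.
function read_exact($sock, $length) {
    $data = '';
    while (strlen($data) < $length) {
        $chunk = fread($sock, $length - strlen($data));
        if ($chunk === false || $chunk === '') {
            break; // connection closed or error
        }
        $data .= $chunk;
    }
    return $data;
}

// Assumed framing: a 4-byte big-endian length, then the payload.
$header  = read_exact($sock, 4);
$length  = unpack('N', $header)[1]; // 'N' = unsigned 32-bit big-endian
$payload = read_exact($sock, $length);
?>
```

The point is that fread() returns whatever is available right now, not necessarily everything you asked for, so you must loop until you have the full message.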
I'm using SSH2 to establish a stream into a device running modified linux. After the stream is established, I set blocking to true and begin reading the output. Once I see the word "Last", I know that the system is ready to accept commands, so I send one. I then read the output generated by that command.
This all works perfectly, except, I have to manually close the stream. I'm pretty sure that I'm not getting an EOF or newline back and this is probably why, however, this is all new to me so I could be wrong.
Looking to exit once the output is done.
Here is what I'm looking for before I send the first command:
Last login: Tue May 7 06:41:55 PDT 2013 from 10.150.102.115
The loop that echoes the output. I have to check for the word "Last" and ignore it if it's seen more than once (it was causing the loop to repeat):
// Prevents premature termination
$lastCount = 1;
stream_set_blocking($stdio, true);
while ($line = fgets($stdio)) {
    $count++;
    flush();
    if (strstr($line, 'Last') && $lastCount == 1) {
        fwrite($stdio, $command . PHP_EOL);
        $lastCount--;
    }
    echo $line;
}
fclose($stdio);
Your mode is incorrect and should be set to 0.
If mode is 0, the given stream will be switched to non-blocking mode, and if 1, it will be switched to blocking mode. This affects calls like fgets() and fread() that read from the stream. In non-blocking mode an fgets() call will always return right away while in blocking mode it will wait for data to become available on the stream.
http://php.net/manual/en/function.stream-set-blocking.php
Looks like there is a trick to using blocking:
http://www.php.net/manual/en/function.stream-set-blocking.php#110755
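A hedged sketch of what a non-blocking read loop might look like under that advice. The exit condition (watching for a shell prompt ending in '$') is an assumption; it depends entirely on what your device prints when it's done:

```php
<?php
stream_set_blocking($stdio, 0); // non-blocking: fgets() returns immediately

$output = '';
while (true) {
    $line = fgets($stdio);
    if ($line === false || $line === '') {
        // No data available right now; avoid spinning at 100% CPU.
        usleep(100000); // 0.1 s
    } else {
        $output .= $line;
        echo $line;
    }
    // Assumed exit condition: the device shows its prompt again.
    if (substr(rtrim($output), -1) === '$') {
        break;
    }
}
fclose($stdio);
?>
```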
I faced a strange issue today.
For several months I used buffer flushing in PHP to send small string sizes to the client without problems.
Today I returned to the project and it turned out that my server won't send strings smaller than 512 bytes.
Here is my code:
<?php
echo "length:".$myUpcomingStringSize;
ob_flush();
flush();
sleep(1);
for($i = 0; $i < count($allLines); $++) {
echo $allLines[$i];
ob_flush();
flush();
}
?>
This code worked like a charm the whole last year. And now it doesn't anymore. I played around a bit and added some random characters. As soon as the string size becomes equal to or greater than 512 bytes, the server sends the buffer content.
Can anybody imagine the issue I have to solve here? Anyone else facing this issue? Or does
someone know how to configure this minimum packet size?
If you changed neither the program nor the server, you should assume that the program never worked as intended. Windows systems in particular are known to buffer output until a certain number of bytes is in the output buffer. This buffering happens at the system level and thus cannot be affected by any PHP configuration.
If you know that 512 Bytes is the minimum required for the output buffer to send, then you could use something like
define('MIN_OUTPUT_LENGTH', 512);
echo str_pad("length: $myUpcomingStringSize", MIN_OUTPUT_LENGTH, "\0"), "\n";
// (If you run into trouble with the null-bytes, use space character instead)
Notes
If you do not use "userspace" output buffering, then ob_flush(); is redundant.
If there is no delay in your for loop, then flushing between lines is not a good idea. Especially for mobile applications where the network tries to pack as much data as possible into a single packet.
There is a syntax error in your for loop header (The expression $++ is missing a variable identifier, probably i)
I'd like to store 0 to ~5000 IP addresses in a plain text file, with an unrelated header at the top. Something like this:
Unrelated data
Unrelated data
----SEPARATOR----
1.2.3.4
5.6.7.8
9.1.2.3
Now I'd like to find if '5.6.7.8' is in that text file using PHP. I've only ever loaded an entire file and processed it in memory, but I wondered if there was a more efficient way of searching a text file in PHP. I only need a true/false if it's there.
Could anyone shed any light? Or would I be stuck with loading in the whole file first?
Thanks in advance!
5000 isn't a lot of records. You could easily do this:
$addresses = explode("\n", file_get_contents('filename.txt'));
and search it manually and it'll be quick.
If you were storing a lot more I would suggest storing them in a database, which is designed for that kind of thing. But for 5000 I think the full load plus brute force search is fine.
Don't optimize a problem until you have a problem. There's no point needlessly overcomplicating your solution.
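For example (assuming one IP per line after the separator, and the filename from the question; the header lines won't match an IP anyway):

```php
<?php
// Load the whole file and split into lines.
$addresses = explode("\n", file_get_contents('filename.txt'));

// Trim whitespace/CR from each line, then do a strict linear search.
$found = in_array('5.6.7.8', array_map('trim', $addresses), true);

var_dump($found); // bool(true) if the address is present
?>
```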
I'm not sure if Perl's command-line tool needs to load the whole file to handle it, but you could do something similar to this:
<?php
...
$result = system("perl -ne 'print if /5\\.6\\.7\\.8/' yourfile.txt");
if ($result)
....
else
....
...
?>
Another option would be to store the IPs in separate files based on the first or second group:
# 1.2.txt
1.2.3.4
1.2.3.5
1.2.3.6
...
# 5.6.txt
5.6.7.8
5.6.7.9
5.6.7.10
...
... etc.
That way you wouldn't necessarily have to worry about the files being so large you incur a performance penalty by loading the whole file into memory.
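A sketch of the lookup under that layout (the directory name and bucket-file naming are assumptions, matching the examples above):

```php
<?php
function ip_in_store($ip, $dir = 'ips') {
    // Derive the bucket file from the first two octets, e.g. "5.6.txt".
    list($a, $b) = explode('.', $ip);
    $file = "$dir/$a.$b.txt";
    if (!is_file($file)) {
        return false; // no bucket file means the IP was never stored
    }
    // Each bucket is small, so loading it whole is cheap.
    return in_array($ip, array_map('trim', file($file)), true);
}
?>
```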
You could shell out and grep for it.
You might try fgets()
It reads a file line by line. I'm not sure how much more efficient this is, though. I'm guessing that if the IP is towards the top of the file it would be more efficient, and if it's towards the bottom, less efficient than just reading in the whole file.
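A minimal sketch of that approach, which stops as soon as a match is found:

```php
<?php
function ip_in_file($needle, $path) {
    $fp = fopen($path, 'r');
    if ($fp === false) {
        return false;
    }
    $found = false;
    while (($line = fgets($fp)) !== false) {
        if (trim($line) === $needle) {
            $found = true;
            break; // stop early; no need to read the rest of the file
        }
    }
    fclose($fp);
    return $found;
}
?>
```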
You could use the grep command with backticks in your PHP on a Linux server. Something like:
$searchFor = '5.6.7.8';
$file = '/path/to/file.txt';
$grepCmd = `grep -F $searchFor $file`; // -F treats the dots literally
echo $grepCmd;
I haven't tested this personally, but there is a snippet of code in the PHP manual that is written for large file parsing:
http://www.php.net/manual/en/function.fgets.php#59393
//File to be opened
$file = "huge.file";
//Open file (DON'T USE a+ pointer will be wrong!)
$fp = fopen($file, 'r');
//Read 16meg chunks
$read = 16777216;
//\n Marker
$part = 0;
while (!feof($fp)) {
    $rbuf = fread($fp, $read);
    for ($i = $read; $i > 0 || $n == chr(10); $i--) {
        $n = substr($rbuf, $i, 1);
        if ($n == chr(10)) break;
        //If we are at the end of the file, just grab the rest and stop loop
        elseif (feof($fp)) {
            $i = $read;
            $buf = substr($rbuf, 0, $i + 1);
            break;
        }
    }
    //This is the buffer we want to do stuff with, maybe throw to a function?
    $buf = substr($rbuf, 0, $i + 1);
    //Point marker back to last \n point
    $part = ftell($fp) - ($read - ($i + 1));
    fseek($fp, $part);
}
fclose($fp);
fclose($fp);
The snippet was written by the original author: hackajar yahoo com
Are you trying to compare the current IP with the IPs listed in the text file? The unrelated data wouldn't match anyway, so just use strpos() on the full file contents (file_get_contents()).
<?php
$file = file_get_contents('data.txt');
$pos = strpos($file, $_SERVER['REMOTE_ADDR']);
if ($pos === false) {
    echo "no match for $_SERVER[REMOTE_ADDR]";
} else {
    echo "match for $_SERVER[REMOTE_ADDR]!";
}
?>
I want to read everything from a text file and echo it. But more lines might be written to the file while I'm reading, so I don't want the script to exit when it reaches the end of the file; instead I want it to wait forever for more lines. Is this possible in PHP?
This is just a guess, but try passing through (passthru()) the output of a "tail -f".
You will need to find a way to flush() your buffer, though.
IMHO a much nicer solution would be to build an AJAX page:
read the contents of the file into an array, store the number of lines in the session, and print the content of the file.
Then start an AJAX request every x seconds to a script which checks the file; if the line count is greater than the session count, append the result to the page.
You could use popen() instead:
$f = popen("tail -f /where/ever/your/file/is 2>&1", 'r');
while (!feof($f)) {
    $buffer = fgets($f); // fgets() keeps the trailing newline
    echo $buffer;
    flush();
    sleep(1);
}
pclose($f);
The sleep is important; without it you will use 100% CPU time.
In fact, when you "echo" it, it goes to the buffer. So what you want is to "append" the new content if it's added while the browser is still receiving output, and this is not directly possible (though there are some approaches to it).
I solved it.
The trick was to use fopen(), and when EOF is reached, move the cursor back to the previous position and continue reading from there.
<?php
$handle = fopen('text.txt', 'r');
$lastpos = 0;
while (true) {
    if (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
        $lastpos = ftell($handle);
    } else {
        fseek($handle, $lastpos);
    }
}
?>
It still consumes pretty much a whole CPU though; I don't know how to solve that.
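One small tweak that should cut the CPU usage of the loop above: sleep briefly whenever no new data is available before seeking back (the 200 ms interval is an arbitrary choice; fseek() also clears the EOF flag so the next read retries):

```php
<?php
$handle = fopen('text.txt', 'r');
$lastpos = 0;
while (true) {
    if (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
        $lastpos = ftell($handle);
    } else {
        usleep(200000); // wait 0.2 s instead of busy-looping
        fseek($handle, $lastpos); // clears EOF and retries from the last position
    }
}
?>
```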
You could also use filemtime(): get the latest modification timestamp, send the output, and at the end compare the stored filemtime with the current one.
Either way, if you want the script to keep pace with the browser (or client), you should send the output in chunks (fread(), flush()), then check for changes at the end. If there are any changes, re-open the file and read from the latest position (you can get the position outside the while (!feof()) loop).