PHP can't count (binary-hex weirdness)?

In Perl I have code that is working correctly:
print unpack('B*','10071C2');
returns 00110001001100000011000000110111001100010100001100110010
The code ported to PHP using GMP:
function gmp_convert($num, $base_a, $base_b)
{
    return gmp_strval(gmp_init($num, $base_a), $base_b);
}
$test = "10071C2";
$testb=gmp_convert($test, 16, 2);
produces 10000000110110001110000101001101111110110001101110000111
I thought it might be byte order; however, if I use b* instead in Perl, it still produces something else:
PHP---10000000110110001110000101001101111110110001101110000111
PERL--10001100000011000000110011101100100011001100001001001100
I simply do not understand this, can anyone help?

Your Perl and PHP implementations are doing two entirely different things.
The Perl code converts each character of the input string into the binary representation of that character's ASCII code. For instance, the first character ("1") becomes "00110001", which is decimal 49, the ASCII code for the character "1".
Your PHP code successfully converts the hex number represented in string form into an equivalent binary representation in string form.
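To make the difference concrete, here is a minimal PHP sketch of both interpretations, assuming the same "10071C2" input; the first half reproduces Perl's unpack('B*', ...) by converting each character's ASCII code to 8 bits:
<?php
$input = "10071C2";
// Perl's unpack('B*', ...): the binary of each character's ASCII code, 8 bits per character.
$ascii_bits = '';
foreach (str_split($input) as $char) {
    $ascii_bits .= str_pad(decbin(ord($char)), 8, '0', STR_PAD_LEFT);
}
echo $ascii_bits, "\n"; // 00110001001100000011000000110111001100010100001100110010
// The GMP version: treat the string as a hexadecimal number and print that number in base 2.
echo gmp_strval(gmp_init($input, 16), 2), "\n"; // 1000000001110001110000010
?>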

Related

BIN2HEX & HEX2BIN Conversion in PHP

I want to convert 48 bit binary number (in string format) to HEX (12 char long). And same thing in reverse manner.
E.g.
binary '000000000000000000000000000000000000000000000000' into hex '000000000000'
binary '111111111111111111111111111111111111111111111111' into hex 'FFFFFFFFFFFF'
hex 'FFFFFFFFFFFF' into binary '111111111111111111111111111111111111111111111111'
hex '000000000000' into binary '000000000000000000000000000000000000000000000000'
Tried the default bin2hex(), dechex(bindec($binary)), etc.
I am a newbie so please explain in detail.
dechex(bindec($binary)) should have worked OK, but you don't say what the problem was.
Fundamentally, though, hex2bin() deals with raw binary data, not with a "binary string" of '0' and '1' characters. For simplicity's sake, you might want to stick to the built-in base_convert function, e.g.
echo base_convert('FFFFFFFFFFFF', 16, 2);
// 111111111111111111111111111111111111111111111111
echo base_convert('111111111111111111111111111111111111111111111111', 2, 16);
// ffffffffffff
Note that for your "zero" examples, you'll just get back a single zero. There's no real concept of length with the number zero in any base system that I'm aware of.
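If you need the fixed widths from the question, a minimal sketch is to pad base_convert()'s output back out with str_pad(); the helper names and the widths 48 and 12 are just assumptions taken from the stated requirement:
<?php
function bin48_to_hex($binary) {
    // Pad the result back out to 12 hex digits so the all-zero case keeps its length.
    return str_pad(base_convert($binary, 2, 16), 12, '0', STR_PAD_LEFT);
}
function hex_to_bin48($hex) {
    // Pad the result back out to 48 binary digits.
    return str_pad(base_convert($hex, 16, 2), 48, '0', STR_PAD_LEFT);
}
echo bin48_to_hex(str_repeat('0', 48)), "\n"; // 000000000000
echo hex_to_bin48('FFFFFFFFFFFF'), "\n";      // 48 ones
?>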

Why do PHP and Obj-C encode strings differently?

I'm trying to convert a string to UTF-8 in both Obj-C and PHP.
I get different results:
"\xd7\x91\xd7\x93\xd7\x99\xd7\xa7\xd7\x94" //Obj-C
"\u05d1\u05d3\u05d9\u05e7\u05d4" //PHP
Obj-C code:
const char *cData = [@"בדיקה" cStringUsingEncoding:NSUTF8StringEncoding];
PHP code:
utf8_encode('בדיקה')
This difference breaks my hash algorithm that follows.
How can I make the two strings encode the same way? Should I change the Obj-C or the PHP?
Go to http://www.utf8-chartable.de/unicode-utf8-table.pl
In the combo box switch to “U+0590 … U+05FF Hebrew”
Scroll down to “U+05D1” which is the rightmost character of your input string.
The third column shows the two UTF-8 bytes: “d7 91”
If you keep looking you will see that the PHP and the Objective-C strings are actually the same. The “problem” you are seeing is that while PHP uses a Unicode escape (\u), Objective-C uses direct hexadecimal byte escapes (\x). Those are only visual representations of the strings; the bytes in memory are actually the same.
If your hash algorithm deals with bytes correctly, you should not see differences.
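As a quick check, here is a minimal PHP sketch (assuming the source file is saved as UTF-8) that dumps the raw bytes PHP actually sees; they match the Obj-C \xd7\x91... escape:
<?php
$str = "בדיקה";
echo bin2hex($str), "\n"; // d791d793d799d7a7d794 -- the same d7 91 ... bytes as the Obj-C output
?>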
What are you using to do the encoding on PHP? It looks like you're generating a UTF-16 string.
Try utf8_encode() and see if that gives better results.

Get Ruby's OpenSSL::HMAC.hexdigest() to output the same as PHP's hash_hmac()

I'm trying to use the API of a web service provider. They don't have an example in Ruby, but they do have one for PHP, and I'm trying to interpret between the two. The API examples always use "true" on PHP's hash_hmac() call, which produces a binary output. The difference seems to be that Ruby's OpenSSL::HMAC.hexdigest() function always returns text. (If I change the PHP call to "false" they return the same value.) Does anyone know of a way to "encode" the text returned from OpenSSL::HMAC.hexdigest() to get the same thing as returned from a hash_hmac('sha256', $text, $key, true)?
Use OpenSSL::HMAC.digest to get the binary output.
You'll need to convert each pair of hex digits into a byte with the same value. I don't know any Ruby, but this is similar to how it would be handled in PHP.
First, take your string of hex digits and split them into an array. Each element in the array should be two characters long. Convert each element from a string of two hex digits to an integer. It looks like you can do this by calling the hex method on each string.
Next, call pack on the converted array using the parameter c*, to convert each integer into a one-byte character. You should get the correct string of bytes as the result.
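In PHP terms, the conversion described above is what pack('H*', ...) does; here is a minimal sketch (the key and message values are made up for illustration):
<?php
$key  = 'secret';
$text = 'message';
$hex = hash_hmac('sha256', $text, $key, false); // hex digest, like hexdigest()
$raw = hash_hmac('sha256', $text, $key, true);  // raw binary output
var_dump(pack('H*', $hex) === $raw);            // bool(true) -- converting the hex digest gives the raw output
?>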

SHA1 hmac PHP vs Javascript - Different Results?

Hello
I am using a class in JavaScript to hash a string:
https://ssl.bsk.com.pl/mobi/js/sha1.js
hex_hmac_sha1("927545161", "asdasdasdasdś");
Result is:
5db0194c834d419fc5d68b72c88af1ac8ee749d6
In PHP I'm hashing:
echo hash_hmac('sha1', "asdasdasdasdś", '927545161');
but result is:
0b115775a20bed9922b6a9cc934cb5328fe71ade
Where is the error?
5db0194c834d419fc5d68b72c88af1ac8ee749d6 != 0b115775a20bed9922b6a9cc934cb5328fe71ade
PHP interprets the UTF-8 string as a sequence of 8-bit chars, whereas in JavaScript each character can resolve to a Unicode code point.
Your compacted and totally unreadable JavaScript implementation uses .charCodeAt() to transform the string into a hex string. I didn't bother to investigate it completely, but it's most likely that "ś".charCodeAt(0) simply resolves to 347, while the remainder of the conversion expects a value in the 8-bit range 0 to 255.
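A minimal PHP sketch of that point (assuming the source file is UTF-8): "ś" contributes two bytes, c5 9b, on the PHP side, while charCodeAt() reports the single code point 347:
<?php
echo bin2hex("ś"), "\n"; // c59b -- two UTF-8 bytes for code point U+015B (347)
echo hash_hmac('sha1', "asdasdasdasdś", '927545161'); // PHP hashes those raw bytes
?>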

In PHP, what does it mean for a function to be binary-safe?

In PHP, what does it mean for a function to be binary-safe?
What makes them special, and where are they typically used?
It means the function will work correctly when you pass it arbitrary binary data (i.e. strings containing non-ASCII bytes and/or null bytes).
For example, a non-binary-safe function might be based on a C function which expects null-terminated strings, so if the string contains a null character, the function would ignore anything after it.
This is relevant because PHP does not cleanly separate string and binary data.
The other users have already mentioned what binary-safe means in general.
In PHP, the meaning is more specific, referring only to what Michael gives as an example.
All strings in PHP have an associated length, which is the number of bytes that compose them. When a function manipulates a string, it can either:
1. Rely on that length metadata.
2. Rely on the string being null-terminated, i.e., assume that after the data that is actually part of the string, a byte with value 0 will appear.
All PHP string variables manipulated by the engine are, in fact, also null-terminated. The problem with functions that rely on option 2 is that, if the string itself contains a byte with value 0, the function manipulating it will think the string has ended at that point and will ignore everything after it.
For instance, if PHP's strlen function worked like C standard library strlen, the result here would be wrong:
$str = "abc\x00abc";
echo strlen($str); //gives 7, not 3!
More examples:
<?php
$string1 = "Hello";
$string2 = "Hello\x00World";
// This function is NOT binary-safe
echo strcoll($string1, $string2); // gives 0, strings are equal.
// This function IS binary-safe
echo strcmp($string1, $string2); // gives <0, $string1 is less than $string2.
?>
\x indicates hexadecimal notation. See: PHP strings.
0x00 = NUL (null)
0x04 = EOT (end of transmission)
See an ASCII table for the full character list.
