Node.js: how to implement OpenSSL AES-CBC encryption (from PHP)?

I am translating an encryption routine from PHP to TypeScript for a very specific API that requires the posted data to be encrypted with the API key and secret. Here is the provided example of how to correctly encrypt data in PHP for use with the API (the way the key and IV are derived cannot be changed):
$iv = substr(hash("SHA256", $this->ApiKey, true), 0, 16);
$key = md5($this->ApiSecret);
$output = openssl_encrypt($Data, "AES-256-CBC", $key, OPENSSL_RAW_DATA, $iv);
$completedEncryption = $this->base64Url_Encode($output);
return $completedEncryption;
In the above code, the only thing the base64Url_Encode function does is convert the binary data to a valid Base64URL string.
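For reference, assuming base64Url_Encode follows the usual convention (standard Base64 with the URL-safe alphabet and the trailing padding stripped), it corresponds to Node's built-in 'base64url' encoding. A minimal sketch of that mapping:

// Standard base64 with '+' -> '-', '/' -> '_' and trailing '=' removed;
// buf.toString('base64url') produces the same string in Node.
const base64UrlEncode = (buf: Buffer): string =>
  buf.toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '')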
And now the code as I have implemented it inside Typescript:
import { createHash, createCipheriv } from 'node:crypto'
const secretIV = createHash('sha256').update(this.ApiKey).digest().subarray(0, 16)
// Generate key
/*
Because the OpenSSL function in PHP automatically pads the key with NUL (\0) chars,
do the same inside Node.js, so that createCipheriv accepts it as a 32-byte key,
instead of a 16-byte one.
*/
const md5 = createHash('md5').update(this.ApiSecret).digest()
const key = Buffer.alloc(32)
key.set(md5, 0)
// Create Cipher
const cipher = createCipheriv('aes-256-cbc', key, secretIV)
let encrypted = cipher.update(data, 'utf8', 'binary');
encrypted += cipher.final('binary');
// Return base64URL string
return Buffer.from(encrypted).toString('base64url');
The above TypeScript code does NOT give the same output as the PHP code given earlier. I have looked into the original OpenSSL code, made sure the padding algorithms match (PKCS#5/PKCS#7), and checked that every input Buffer has the same byte length as the corresponding input in PHP. I am starting to suspect some kind of binary mangling is causing the data to change inside JavaScript.
I hope some expert can help me out with this question. Maybe I have overlooked something. Thanks in advance.

The stupidity is in the md5 function in PHP, which defaults to hexadecimal output instead of binary output:
md5(string $string, bool $binary = false): string
This is also why the code doesn't complain about the key (constructed from the MD5 hash) being too small: it is fed 32 bytes after ASCII/UTF-8 encoding of the hex string, instead of the 16 raw bytes you'd use for AES-128.
Apparently PHP uses lowercase hex encoding, although not even that has been specified. You can indicate the encoding in Node.js as well; see the documentation of the digest method. It also seems to produce lowercase, although I cannot find the exact specification of that encoding either.
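Concretely, passing the hex digest as the key should reproduce the PHP output. A minimal sketch, wrapped in a hypothetical helper and assuming the same ApiKey/ApiSecret/data inputs as in the question:

import { createHash, createCipheriv } from 'node:crypto'

function encryptForApi(data: string, apiKey: string, apiSecret: string): string {
  // IV: first 16 bytes of SHA-256(apiKey), exactly as in the PHP snippet.
  const iv = createHash('sha256').update(apiKey).digest().subarray(0, 16)
  // Key: PHP's md5() returns a 32-character lowercase hex *string*; those 32
  // ASCII bytes are what openssl_encrypt() receives as the AES-256 key.
  const key = createHash('md5').update(apiSecret).digest('hex')
  const cipher = createCipheriv('aes-256-cbc', key, iv)
  // Stay in Buffers throughout to avoid lossy 'binary'/'utf8' string round-trips.
  const encrypted = Buffer.concat([cipher.update(data, 'utf8'), cipher.final()])
  return encrypted.toString('base64url')
}

Working with Buffers also sidesteps the cipher.update(..., 'binary') plus Buffer.from(encrypted) combination in the question, which re-interprets the latin1 string as UTF-8 and can corrupt bytes above 0x7F.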
Once you have completed your assignment, please try and remove the code ASAP, as you should never calculate the IV from the key; the key and IV combination should be unique, so the above code is not IND-CPA secure if the key is reused.
In case you are wondering why it is so stupid: the output of MD5 is specified in standards, and it is binary. Furthermore, you cannot tell from the function call what it is doing; you have to look up the documentation. It also works badly if you're doing a comparison: even when comparing strings it is easy to use uppercase instead of lowercase hex (both are equally valid, and uppercase hex is actually easier for humans to read, as we focus on the top part of letters for some reason or other).
Basically it takes the principle of least surprise and tosses it out of the window. The encoding of the output could have been made optional instead; the Node.js implementation gets this right.

Related

esp32 and php XXTEA strings encryption

I'm using esp32 (Arduino platform not esp-idf) with the "HTTPClient.h" library to send get requests with parameters to my PHP server.
I want to encrypt the parameter values and decrypt them in my PHP code, and vice versa (my server sends back JSON data to my esp32).
I tried using the XXTEA protocol with these libraries for PHP, and for esp32.
But the encrypted string won't decrypt properly on PHP.
Example:
When I encrypt "HELLO WORLD" on my esp32 with the key "ENCRYPTION KEY" I get this:
35bd3126715874f741518f4d
And when I decrypt it on PHP it returns blank.
Moreover, when I encrypt it on my PHP server I get this:
T1YNYC4P4R2Y5eCxUqtjuw==
My esp32 sketch looks like this:
#include <xxtea-iot-crypt.h>
void setup() {
  Serial.begin(115200);
}

void loop() {
  String plaintext = "HELLO WORLD";
  // Set the Password
  xxtea.setKey("ENCRYPTION KEY");
  // Perform Encryption on the Data
  Serial.print(F(" Encrypted Data: "));
  String result = xxtea.encrypt(plaintext);
  Serial.println(result);
  // Perform Decryption
  Serial.print(F(" Decrypted Data: "));
  Serial.println(xxtea.decrypt(result));
  delay(2000);
}
My PHP code looks like this:
require_once('xxtea.php');
$str = "HELLO WORLD"
$key = "ENCRYPTION KEY";
$encrypt_data = xxtea_encrypt($str, $key);
error_log($encrypt_data);
Is there a way to have encrypted string communication between PHP and esp32?
Thanks in advance.
This problem may result from inputs being of different data type, since no current XXTEA implementation seems to do any type or range checking.
Or it could be due to different endian behavior of the two computers involved, since binary is typically stored as an array of words constructed from bytes.
Or it could be due to lack of official or standard reference examples for correct encryption of a specific string and key. In the absence of reference examples (using either hexadecimal or base64 conversion of the binary encryption result) there is no way to tell whether an implementation of encryption is correct, even if its results decrypt correctly using a corresponding decryption implementation.
ADDED:
I think I've found one compatibility problem in the published code for XXTEA. It may be worth taking some space here to discuss it.
Specifically, the problem is that different implementations create different results for encrypting the same plaintext and key.
Discussion:
This problem results from the addition of the length of the plaintext as the last element of the array of longs. While this solves the problem of plaintext that has a length that is not a multiple of 4, it generates a different encrypted value than is generated by the JavaScript implementation.
If you insert "$w=false;" at the start of the long2str and str2long functions, the encrypted value for the PHP implementation becomes the same as the JavaScript implementation, but the decrypted value has garbage at the end.
Here are some test case results with this change:
PHP:
text: >This is an example. !@#$%^&*(){}[]:;<
Base64: PlRoaXMgaXMgYW4gZXhhbXBsZS4gIUAjJCVeJiooKXt9W106Ozw=
key: 8GmZWww5T97jb39W
encrypt: sIubYrII6jVXvMikX1oQivyOXC07bV1CoC81ZswcCV4tkg5CnrTtqQ==
decrypt: >This is an example. !@#$%^&*(){}[]:;<��
Note: there are two UTF-8 question-mark characters at the end of the "decrypt" line.
JavaScript:
text: >This is an example. !@#$%^&*(){}[]:;<
Base64: PlRoaXMgaXMgYW4gZXhhbXBsZS4gIUAjJCVeJiooKXt9W106Ozw=
key: 8GmZWww5T97jb39W
encrypt: sIubYrII6jVXvMikX1oQivyOXC07bV1CoC81ZswcCV4tkg5CnrTtqQ==
decrypt: >This is an example. !@#$%^&*(){}[]:;<
The reason there is no garbage in the JavaScript implementation even though it does not save the length of the plaintext is given in a comment there: "note running off the end of the string generates nulls since bitwise operators treat NaN as 0". In other words, the generated string is padded with NULs that are never seen, even though JavaScript, like PHP, can include NULs in strings because it stores the length separately.
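To make the difference concrete, here is a small sketch (TypeScript, hypothetical function name) of the byte-to-word conversion step with and without the PHP-style length word; the two variants feed different word arrays into XXTEA and therefore produce different ciphertexts:

// Convert bytes to little-endian 32-bit words, as XXTEA implementations do.
// PHP's str2long with $w = true appends the byte length as a final word;
// the JavaScript version omits it and relies on implicit zero padding instead.
function strToLongs(bytes: Uint8Array, appendLength: boolean): number[] {
  const words = new Array<number>(Math.ceil(bytes.length / 4)).fill(0)
  for (let i = 0; i < bytes.length; i++) {
    words[i >>> 2] |= bytes[i] << ((i & 3) * 8)
  }
  if (appendLength) words.push(bytes.length)
  return words
}

console.log(strToLongs(new TextEncoder().encode('HELLO WORLD'), true))  // PHP-style: 4 words
console.log(strToLongs(new TextEncoder().encode('HELLO WORLD'), false)) // JS-style: 3 words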
I don't have an opinion about which approach is best, but one should be chosen for all implementations.
The reason that there should be a standard for the result of encryption (regardless of whether the binary is converted to hex or to base64 for safe transit) is that one might want to use, say, PHP for encoding but JavaScript for decoding, depending on which languages are natural to use at two locations. After all, encryption is most often used to communicate between two locations, and the language used at the target location might not even be known.
Why not use the WiFiClientSecure library? It works great on the esp32.

IV too long in PHP mcrypt_generic_init

I am working on a project where all the data from the web services is being encrypted using Triple DES Encryption. In my specific case, I am receiving a query string from a URL that has been encrypted. The web service provider has given me two values for decryption: the encryption Key_192 and the initialization vector IV_192. Both these keys are 24 characters long.
When I attempt to decrypt the query string I have received in PHP, I am using the mcrypt library. When initializing the generic decrypt methods, part of my function is:
$key = "XXXXXXXXXXXXXXXXXXXXXXXX";
$iv = "YYYYYYYYYYYYYYYYYYYYYYYY";
$cipher = mcrypt_module_open(MCRYPT_3DES, '', 'cbc', '');
mcrypt_generic_init($cipher, $key, $iv);
$result = rtrim(mdecrypt_generic($cipher, $this->hex2bin($buffer)), "\0");
mcrypt_generic_deinit($cipher);
return $result;
However, when I execute that portion of my code, I receive the following message:
mcrypt_generic_init(): Iv size incorrect; supplied length: 24, needed: 8
The web services provider was not able to provide any guidance on the error, instead directing me to their VB.NET implementation which has a line like:
Dim cs As CryptoStream = New CryptoStream(ms, cryptoProvider.CreateDecryptor(KEY_192, IV_192), CryptoStreamMode.Read)
where they pass the two keys in directly, similar to the mcrypt_generic_init() function.
I understand that the IV size is dependent upon the cypher method (Triple DES), but am confused as to why I have an IV longer than the function appears to support. How could that be? My experience with this kind of encryption is limited, and I have been unable to decrypt the query string into anything that doesn't look like a field of random characters.
Run mcrypt_enc_get_iv_size() to figure out the required IV size. For Triple DES, it will be 8. mcrypt_generic_init() requires a string of exactly the correct length, so you should use a shorter string (or, to do it on the fly, use substr()).
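Node's crypto layer enforces the same 8-byte IV rule for Triple DES, which can serve as a sanity check. A minimal sketch in TypeScript, with placeholder key/IV strings standing in for the real KEY_192/IV_192 values:

import { createDecipheriv } from 'node:crypto'

// Placeholder 24-character strings standing in for KEY_192 / IV_192.
const key = Buffer.from('XXXXXXXXXXXXXXXXXXXXXXXX', 'ascii')               // 24-byte 3DES key
const iv = Buffer.from('YYYYYYYYYYYYYYYYYYYYYYYY', 'ascii').subarray(0, 8) // only 8 bytes are used

// 'des-ede3-cbc' is OpenSSL's name for Triple DES in CBC mode; passing the
// full 24-byte string as the IV would be rejected, since DES-based ciphers
// use an 8-byte block and therefore an 8-byte IV.
const decipher = createDecipheriv('des-ede3-cbc', key, iv)
// decipher.update(ciphertext) and decipher.final() would follow as usual.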
It turns out that my issue in decrypting was caused by differences in how the mcrypt PHP library and the VB.NET libraries pad strings when encrypting. Also, with regard to the original question, only the first 8 characters of the IV are actually used; the others are discarded. A better description is located here:
http://mishu666.wordpress.com/2007/08/20/problem-and-solve-of-3des-incompatibilities-with-nets-tripledescryptoserviceprovider/

Node.js: how to decipher text ciphered in php?

My PHP ciphering looks like this:
<?
$salt = '…';
$data = '…';
$iv = mcrypt_create_iv(mcrypt_get_iv_size(MCRYPT_RIJNDAEL_256, MCRYPT_MODE_CFB), MCRYPT_RAND);
$ciphered = trim(base64_encode(mcrypt_encrypt(MCRYPT_RIJNDAEL_256, $salt, $data, MCRYPT_MODE_ECB,$iv)));
I'm trying to decipher the result of the code above with:
ciphered = '…';
crypto = require('crypto');
salt = crypto.createHash('md5').update('…').digest('hex');
iv = '0123456789123455';
decipher = crypto.createDecipheriv('aes-256-cbc', salt, iv);
deciphered = decipher.update(ciphered, 'base64');
deciphered += decipher.final('utf-8');
This code results in: TypeError: DecipherFinal fail
A couple of problems I see:
Mismatch of operating modes: you generate an IV for the CFB (Cipher Feedback) operating mode, you use ECB (Electronic Code Book, not recommended; just check out the image in that wiki article for why) as your mode when you actually encrypt, and then you try to decrypt using CBC (Cipher Block Chaining) mode. You should stick to one mode (probably CBC). To do this, keep the decryption side aes-256-cbc and make the encryption side MCRYPT_MODE_CBC.
You pass $salt (which is actually your key) into mcrypt_encrypt without hashing it, but you do hash it on the decryption side, and you return a hex string where crypto.createDecipheriv expects a binary encoded string, per its documentation. Both keys need to be the same, and need to follow the proper encoding so that they remain the same when passed into the functions.
It looks like you generate an IV on the encryption side, then use a fixed string for an IV on the decryption side. The IV (initialization vector) needs to be communicated with the ciphertext to the decryption side (and it is okay for it to be transmitted in the clear along with the ciphertext).
the update method on your decipher object does not accept base64 as an encoding, per its documentation. You will need to convert your base64 text to something else (probably a binary encoding) and then pass it into the update method with the proper encoding.
PHP's default charset is ISO-8859-1, but you are trying to decrypt your ciphertext as a UTF-8 string. This may cause problems, especially if you use characters beyond standard ASCII. You either need to make sure that your PHP side is operating in UTF-8 mode (check out this SO answer on how to do that), or ensure that your input only uses ASCII characters (ISO-8859-1 is a superset of ASCII) and use the 'ascii' output encoding.
Most of your problems boil down to encoding issues. I don't know a lot about the various types of encoding in Node.js, so you will need to research that on your own, but the issues with the cryptographic primitives should be easy to fix; see the sketch below for what a matched pair looks like. Make sure to read the documentation I linked and the mcrypt_encrypt documentation as well.
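A minimal sketch of a matched AES-256-CBC encrypt/decrypt pair in Node, illustrating the points above: one mode everywhere, a raw binary key, the IV carried alongside the ciphertext, and base64 handled via Buffers. The passphrase-to-key derivation here is an assumption for the example, not part of the original code:

import { createCipheriv, createDecipheriv, createHash, randomBytes } from 'node:crypto'

const key = createHash('sha256').update('my passphrase').digest() // 32 raw bytes for AES-256
const iv = randomBytes(16)                                        // fresh IV per message

// Encrypt and prepend the IV so the receiver can use it.
const cipher = createCipheriv('aes-256-cbc', key, iv)
const ciphertext = Buffer.concat([cipher.update('hello world', 'utf8'), cipher.final()])
const payload = Buffer.concat([iv, ciphertext]).toString('base64')

// Decrypt: decode the base64 with a Buffer, split the IV back off, then decipher.
const raw = Buffer.from(payload, 'base64')
const decipher = createDecipheriv('aes-256-cbc', key, raw.subarray(0, 16))
const plaintext = Buffer.concat([decipher.update(raw.subarray(16)), decipher.final()]).toString('utf8')
console.log(plaintext) // "hello world"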

Decrypting the .ASPXAUTH Cookie WITH protection=validation

For quite some time I've been trying to decipher the ASP .ASPXAUTH cookie and decrypt it using PHP. My reasons are huge and I need to do this; there is no alternative. In PHP so far I have successfully managed to read the data from this cookie, but I cannot seem to do it while it is encrypted. Anyway, here it goes...
First you need to alter your server's Web.config file (protection needs to be set to Validation):
<authentication mode="None">
<forms name=".ASPXAUTH" protection="Validation" cookieless="UseCookies" timeout="10080" enableCrossAppRedirects="true"/>
</authentication>
Then in a PHP script on the same domain, you can do the following to read the data, this is a very basic example, but is proof:
$authCookie = $_COOKIE['_ASPXAUTH'];
echo 'ASPXAUTH: '.$authCookie.'<br />'."\n";//This outputs your plaintext hex cookie
$packed = pack("H*",$authCookie);
$packed_exp = explode("\0",$packed);//This will separate your data using NULL
$random_bytes = array_shift($packed_exp);//This will shift off the random bytes
echo print_r($packed_exp,TRUE); //This will return your cookies data without the random bytes
This breaks down the cookie, or at least the unencrypted data.
Now that I know I can get the data, I removed the 'protection="validation"' string from my Web.config and I tried to decrypt it using PHP mcrypt. I have tried countless methods, but here is a promising example (which fails)...
define('ASP_DECRYPT_KEY','0BC95D748C57F6162519C165E0C5DEB69EA1145676F453AB93DA9645B067DFB8');//This is a decryption key found in my Machine.config file (please note this is forged for example)
$iv = mcrypt_create_iv(mcrypt_get_iv_size(MCRYPT_RIJNDAEL_256, MCRYPT_MODE_CBC), MCRYPT_RAND);
$decrypted = mcrypt_decrypt(MCRYPT_RIJNDAEL_128, ASP_DECRYPT_KEY, $authCookie, MCRYPT_MODE_CBC, $iv);//$authCookie is the pack()'d cookie data
This however fails. I've tried variations of the IV with all zeros at 16 bytes. I've tried different Rijndael sizes (128 vs 256). I've tried base64_decode()ing; nothing seems to work. I've found this stackoverflow post here and started using variations of the key/IV that are made using sha256, but that isn't really working either.
Anybody have a clue what I should do?
I don't know how encryption is done in .NET AuthCookies, but I can try to answer.
Assuming the encryption occurs in AES CBC-IV mode, with randomly generated IVs, you need to first find out where the IV is.
The code snippet you show cannot work, as you are generating a random IV (which will be incorrect). That being said, even if you get the IV wrong, in CBC mode you will only have the first 16 bytes of your decrypted ciphertext "garbled" and the rest will decrypt properly; you can use this as a test to know if you're doing the rest correctly. In practice, when using random IVs, it's very likely that the IV is prepended to the ciphertext. To check whether this is correct, you can test if len(ciphertext) = len(plaintext) + 16. That would mean that most likely the first 16 bytes are your IV (and therefore they should be removed from the ciphertext before attempting to decrypt it).
Also on your code snippet, it seems you are using the key as an ascii-string, whereas it should be a byte array. Try:
define('ASP_DECRYPT_KEY',hex2bin('0BC95D748C57F6162519C165E0C5DEB69EA1145676F453AB93DA9645B067DFB8'));
Also, this seems to be a 32-byte key, so you need to use AES-256. I don't know what the auth cookie looks like, but if it's base64 encoded, you obviously also need to decode it first.
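A quick illustration of the size difference in Node terms, using the forged example key from the question:

// The machine key is hex text; it has to be decoded to raw bytes before use.
const hexKey = '0BC95D748C57F6162519C165E0C5DEB69EA1145676F453AB93DA9645B067DFB8'
const asAscii = Buffer.from(hexKey, 'ascii') // 64 bytes: treating the text itself as the key
const asBytes = Buffer.from(hexKey, 'hex')   // 32 bytes: what AES-256 actually expects
console.log(asAscii.length, asBytes.length)  // 64 32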
Hope this helps!
Note: I don't recommend doing this for important production code, however, because there are many things that can go wrong if you try to implement your own decryption routine as you are doing here. In particular, I would guess there should be a MAC tag somewhere that you have to check before attempting decryption, but there are many other things that can go wrong when implementing your own crypto.
I understand this may not have been possible for the OP, but for other people heading down this route, here is a simple alternative.
Create a .net web service with a method like:
public FormsAuthenticationTicket DecryptFormsAuthCookie(string ticket)
{
return FormsAuthentication.Decrypt(ticket);
}
Pass cookie to web service from PHP:
$authCookie = $_COOKIE['.ASPXAUTH'];
$soapClient = new SoapClient("http://localhost/Service1.svc?wsdl");
$params= array(
"ticket" => $authCookie
);
$result = $soapClient->DecryptFormsAuthCookie($params);
I know what a pain it is to decrypt in PHP something encrypted in .NET and vice versa.
I ended up coding the Rijndael algorithm myself (translated from another language).
Here is the link to the source code of the algorithm: http://pastebin.com/EnCJBLSY
At the end of the source code there is some usage example.
But on .NET, you should use zero padding when encrypting. Also, test it with ECB mode; I'm not sure if CBC works.
Good luck and hope it helps
Edit: the algorithm returns a hexadecimal string when encrypting, and also expects a hexadecimal string when decrypting.

Which Blowfish algorithm is the most 'correct'?

I'm running Windows Server 2k8 (maybe that's half the problem?) Anyway, I'm getting different values out of different Blowfish modules in various languages. Is there one which can be relied upon as the standard?
For the following examples, assume the key is password and the plaintext 12345678.
a. The Online Encrypt Tool with Algorithm set to Blowfish, mode ECB, and "Base64 Encode the output" checked gives 2mADZkZR0VM=. I've been using this as my reference point, wise or otherwise.
b. The following Perl code uses Crypt::ECB and MIME::Base64
use MIME::Base64;
use Crypt::ECB;
$crypt = Crypt::ECB->new;
$crypt->padding(PADDING_NONE);
$crypt->cipher('Blowfish') || die $crypt->errstring;
$crypt->key('password');
$enc = $crypt->encrypt("12345678");
print encode_base64($enc);
This outputs 2mADZkZR0VM= with PADDING_NONE (which compares well with 'a.' above). However, when padding is set to PADDING_AUTO it outputs 2mADZkZR0VOZ5o+S6D3OZw== which is, in my mind at least, a bug as the plaintext is 8 characters long and in no need of padding.
c. If I use Crypt::Blowfish as below
#! c:\perl\bin
use Crypt::Blowfish;
use MIME::Base64;
my $key;
my $plaintext;
$key = "password";
$plaintext = "12345678";
my $cipher = new Crypt::Blowfish $key;
my $ciphertext = $cipher->encrypt($plaintext);
my $encoded = encode_base64( $ciphertext );
print $encoded;
then I get 2mADZkZR0VM=, which matches 'a.' above. The trouble with this module, though, is that one has to segment things into 8-byte chunks for encoding; it has no chunker of its own.
d. If I use the source at http://linux.die.net/man/3/bf_ecb_encrypt (which I did for a recent PHP ext project) then I get the same answer as 'a.'. I'm inclined to trust this code the most as it's in use in SSLeay and OpenSSL.
e. The BlowfishEx.EXE in DI Management's Blowfish: a Visual Basic version with PKCS#5 padding gives 2mADZkZR0VOZ5o+S6D3OZw== which is the same as the Crypt::ECB results with PADDING_AUTO. With Padding set to None I get 2mADZkZR0VM= which matches 'a.'
I've partly answered my own question writing this: looks like I just have to modify DI Management's code for the VB6 project. And maybe suggest the same to the writer of Crypt::ECB.
But the question remains: is there a trustworthy blowfish reference platform?
Your problem doesn't seem to be with whether they implement the Blowfish algorithm correctly. In the "no padding" variant, they all seem to produce identical results.
That leaves a question of whether an 8-byte input should be padded at all. The answer to this is pretty simple: PKCS #7 requires that there is always at least one byte of padding added to the input. The reason for this is simple: on the receiving end, there needs to be an unambiguous way to remove the padding. In the case of PKCS#7, the value of the padding bytes is always the number of bytes of padding that needs to be stripped off by the recipient.
So, when you receive material that's been padded with PKCS7, you decrypt the block, then look at the last byte in the decrypted output, and remove that many bytes from the block. Without a block of padding on the end, the receiver would normally look at the last byte of actual data, and whatever value that byte happened to contain, strip off that many bytes (though it should probably tell you there's a problem if the number is larger than 8). In any case, to assure correct operation, there must always be at least one byte of padding. In your case, that means the input gets expanded out to 16 bytes instead of 8.
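A small sketch of the PKCS#7 rule (TypeScript, hypothetical helper names), showing why an 8-byte input grows to 16 bytes with an 8-byte block cipher such as Blowfish:

// Pad length is always 1..blockSize; an exact multiple gets a full extra block.
function pkcs7Pad(data: Buffer, blockSize = 8): Buffer {
  const padLen = blockSize - (data.length % blockSize)
  return Buffer.concat([data, Buffer.alloc(padLen, padLen)])
}

// Unpadding reads the last byte and strips that many bytes.
function pkcs7Unpad(padded: Buffer): Buffer {
  const padLen = padded[padded.length - 1]
  return padded.subarray(0, padded.length - padLen)
}

console.log(pkcs7Pad(Buffer.from('12345678')).length)                 // 16, not 8
console.log(pkcs7Unpad(pkcs7Pad(Buffer.from('12345678'))).toString()) // "12345678"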
