Hi list,

when I use OpenSSL, I suspect something is wrong with the HMAC computation of the "openssl dgst" command line tool. Consider:

$ echo -n foobar | openssl dgst -sha256 -hex -hmac aabbcc
(stdin)= 6e74cdc3b72b8b66535b914357c7d656a22acbb1700b4e6de688fd5c091d305c

But this C program:

#include <stdio.h>
#include <stdint.h>
#include <openssl/hmac.h>

int main(void)
{
    uint8_t digest[32];
    unsigned int length;
    HMAC_CTX hmacCtx;

    HMAC_CTX_init(&hmacCtx);
    HMAC_Init_ex(&hmacCtx, "\xaa\xbb\xcc", 3, EVP_sha256(), NULL);
    HMAC_Update(&hmacCtx, (const unsigned char *)"foobar", 6);
    HMAC_Final(&hmacCtx, digest, &length);
    HMAC_CTX_cleanup(&hmacCtx);

    for (unsigned int i = 0; i < sizeof(digest); i++)
        printf("%02x", digest[i]);
    printf("\n");
    return 0;
}

yields

985343745ee86b452c7c0b327171829c77e1a022f423d95156b52fa22083db8e

Also, this Python script:

#!/usr/bin/python3
import Crypto.Hash.HMAC
import Crypto.Hash.SHA256

key = b"\xaa\xbb\xcc"
data = b"foobar"
hmac = Crypto.Hash.HMAC.new(digestmod = Crypto.Hash.SHA256, key = key)
hmac.update(data)
result = hmac.digest()
print("".join("%02x" % (c) for c in result))

yields the same digest:

985343745ee86b452c7c0b327171829c77e1a022f423d95156b52fa22083db8e

Am I using "openssl dgst" wrong, or is it just plain broken?

Regards,
Johannes
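P.S.: To narrow down where the two results diverge, here is a stdlib-only Python sketch that computes HMAC-SHA256 of "foobar" twice: once with the three raw bytes aa bb cc (what my C and PyCrypto code uses), and once with the six literal ASCII characters "aabbcc" (a guess at how the command line argument might be interpreted):

```python
#!/usr/bin/python3
# Compare HMAC-SHA256("foobar") under two interpretations of the key:
# the three raw bytes aa bb cc vs. the six ASCII characters "aabbcc".
import hmac
import hashlib

data = b"foobar"
keys = [
    ("raw bytes", b"\xaa\xbb\xcc"),   # key as raw bytes
    ("ascii",     b"aabbcc"),         # key as literal characters (guess)
]

for name, key in keys:
    digest = hmac.new(key, data, hashlib.sha256).hexdigest()
    print("%-9s %s" % (name, digest))
```

The "raw bytes" line reproduces the 985343... digest from the C and PyCrypto programs; whichever line matches the 6e74cd... output would show what "openssl dgst -hmac" actually did with the key.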