(resending, with my subscription to <openssl-users@xxxxxxxxxxx> completed)

Hi OpenSSL Developers,

(cross-posting <openssl-users@xxxxxxxxxxx> and <devel@xxxxxxxxxxxxxx>)

OpenSSL commit [1] changed the representation of the "entropy amount"
-- later renamed to "randomness" in [2] -- from "int" to "double". I've
read the commit message:

  commit 853f757ecea74a271a7c5cdee3f3b5fe0d3ae863
  Author: Bodo Möller <bodo@xxxxxxxxxxx>
  Date:   Sat Feb 19 15:22:53 2000 +0000

      Allow for higher granularity of entropy estimates by using 'double'
      instead of 'unsigned' counters.
      Seed PRNG in MacOS/GetHTTPS.src/GetHTTPS.cpp.
      Partially submitted by Yoram Meroz <yoram@xxxxxxxxxxxxxxx>.

and I've also checked "MacOS/GetHTTPS.src/GetHTTPS.cpp" at the same
commit. But I'm none the wiser.

Can someone please explain what is gained by using a floating point
type here? Is it really a relevant use case that entropy is fed to
OpenSSL from an external source such that truncating the amount to a
whole number of bits would cause significant loss? (Admittedly, it
could be relevant if the individual randomness contributions fell in
the (0, 1) bit interval, both boundaries exclusive; see the sketch
after the footnotes.)

Using floating point to represent randomness is a problem for
environments that prefer to avoid floating point altogether, such as
edk2 ("UEFI") firmware.

Thanks,
Laszlo

[1] https://github.com/openssl/openssl/commit/853f757ecea7
[2] https://github.com/openssl/openssl/commit/f367ac2b2664
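
P.S. For concreteness, here is a minimal sketch (my own illustration,
not taken from the OpenSSL tree) of the fractional-bits case I mean
above: a caller feeding many low-quality samples through RAND_add(),
where each sample is credited with less than one bit of randomness.
The 0.25-bit figure and the sample source are assumptions for the
sake of the example.

  /* Minimal sketch, assuming each timestamp sample carries an
   * estimated 0.25 bits of unpredictability.  With a 'double'
   * randomness parameter, RAND_add() can accumulate the fractional
   * estimates (64 * 0.25 = 16 bits in total); an integer parameter
   * would force the caller to round each per-sample estimate to
   * 0 or 1 bit. */
  #include <openssl/rand.h>
  #include <time.h>

  void seed_from_timestamps(void)
  {
      for (int i = 0; i < 64; i++) {
          struct timespec ts;

          clock_gettime(CLOCK_MONOTONIC, &ts);
          RAND_add(&ts, sizeof ts, 0.25);
      }
  }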