Is it because they are hex characters, which need only 4 bits each to store, but we display each of those 4-bit values as a hex character (0 to 9 and a-f), all of which require a full byte in ASCII and UTF-8? Hence the length of 32 (or 36 with dashes)?
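A quick check with Python's standard-library uuid module bears that arithmetic out (a minimal sketch; the variable name is just for illustration):

    import uuid

    u = uuid.uuid4()

    print(len(u.bytes))   # 16 -- the raw 128-bit value, stored as 16 bytes
    print(len(u.hex))     # 32 -- two hex characters per byte
    print(len(str(u)))    # 36 -- the 32 hex characters plus 4 dashes

    # As text, each hex character costs a full byte in ASCII/UTF-8:
    print(len(u.hex.encode("ascii")))  # 32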
On Wed, Oct 7, 2020 at 8:10 PM Hemil Ruparel <hemilruparel2002@xxxxxxxxx> wrote:
Sorry if this is silly, but if it is a 128 bit number, why do we need 32 characters to represent it? Isn't 8 bits one byte?

On Wed, Oct 7, 2020 at 8:08 PM Thomas Kellerer <shammat@xxxxxxx> wrote:

Hemil Ruparel wrote on 07.10.2020 at 16:21:
> it is declared as uuid. But how does it occupy only 16 bytes?
Because a UUID is internally simply a 128-bit number - the dashes you see are just formatting.
But if you can only send the text representation, then yes, 32 characters aren't enough.
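To make the trade-off concrete, here is a minimal Python sketch (standard-library uuid only) of binary versus text size, and of the fact that both forms carry the same 128 bits:

    import uuid

    u = uuid.uuid4()

    # The value itself fits in 16 bytes; the text form needs 32 (or 36) bytes.
    assert len(u.bytes) == 16
    assert len(str(u)) == 36

    # Either representation round-trips to the same UUID:
    assert uuid.UUID(bytes=u.bytes) == u
    assert uuid.UUID(hex=u.hex) == u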