On Mon, 1 Jul 2024, Linus Torvalds wrote:

> The architecture was wrong 30 years ago. It's not that it "became"
> wrong in hindsight. It was wrong originally, and it's just that people
> hadn't thought things through enough to realize how wrong it was.
>
> The only way it's not wrong is if you say "byte accesses do not
> matter". That's a very Cray way of looking at things - Cray 1 had a
> 64-bit "char" in C, because there were no byte accesses.
>
> That's fine if your only goal in life is to do HPC.
>
> So if you simply don't care about bytes, and you *only* work with
> words and quad-words, then alpha looks ok.
>
> But honestly, that's basically saying "in a different universe, alpha
> is not a mis-design".

Precisely my point!  We got so used to thinking in multiples of 8 bits 
that other approaches seem ridiculous.  The PDP-10 operated on 36-bit 
quantities, and strings were essentially six 6-bit characters packed 
into a single word (which is also allegedly where the C language's 
original limit of six significant characters in identifiers came from 
-- so that the PDP-10 could compare a pair of identifiers with a single 
machine instruction).

So there was already a legacy of doing things this way at DEC back in 
~1990, and I can imagine engineers there actually thought that a 
machine that in C terms has 32-bit shorts and ints, 64-bit longs and 
pointers, and strings as 8-bit characters packed four or eight to a 
word was not at all unreasonable.  Or maybe just plain 32-bit 
characters.  After all, you don't absolutely *have* to use data types 
exactly 8 or 16 bits wide for anything, do you?  NB for strings 
nowadays we have Unicode, and we could just use UTF-32 were it not a 
waste of memory.

And even now ISO C is very flexible about data type widths: it only 
requires the character data type to be at least 8 bits wide, and 16-bit 
and 24-bit examples are actually given in the standard itself.
Yes, POSIX now requires the character data type to be exactly 8 bits 
wide, but POSIX.1-1988 deferred to ANSI C AFAICT.

  Maciej