Bernd Petrovitsch wrote:
> It is done IMHO with the false assumption that "sizeof(int) == 4 *
> sizeof(char)".
No; it has correctly been told that ints are 32 bits wide, but it then
converts that width to *bytes* (not chars) by dividing by the hard-coded
constant 8.
(By 'byte' I'm referring to the smallest addressable unit of the
underlying machine architecture; in other words, the unit in which
offsets to load and store instructions are measured. This isn't
something that's exposed to C except on architectures where
sizeof(byte) == sizeof(char), i.e., all sensible ones.)
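
For illustration, here's a minimal sketch of that failure mode and the
portable fix; the function name and shape are hypothetical, not the
actual code under discussion:

    #include <limits.h>

    /* Convert a type's width in bits to its size in chars. */
    static unsigned bits_to_chars(unsigned bits)
    {
            /* Buggy version: hard-codes the assumption that a
             * char is 8 bits wide:
             *     return bits / 8;
             * Portable version: CHAR_BIT (from <limits.h>) is the
             * number of bits in a char, which the C standard
             * guarantees is at least 8. */
            return bits / CHAR_BIT;
    }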
[...]
> ACK. Therefore "sizeof(char) == 1" must always hold.
Yes; but that is only true from C's perspective.
We're dealing with things from the machine code perspective, where
sizeof(byte) == 1, and sizeof(char) is not necessarily the same as
sizeof(byte).
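
To make the C-side half of that concrete, here is a small sketch (C11,
for _Static_assert) of what the language actually promises:

    #include <limits.h>

    /* sizeof() counts in units of char, and sizeof(char) is 1 by
     * definition, whatever the machine's addressable unit is. On a
     * word-addressed machine CHAR_BIT may be 16 or 32, but
     * sizeof(char) is still 1 there. */
    _Static_assert(sizeof(char) == 1, "guaranteed by the C standard");
    _Static_assert(CHAR_BIT >= 8, "the standard's minimum; may be larger");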
I find it really helps with this stuff if you are capable of holding two
completely contradictory beliefs at the same time. A certain level of
insanity can help too...
--
David Given
dg@xxxxxxxxxxx