Re: 128-bit integer - nonsensical documentation?


 



I sense there is a consensus that
1) the 128-bit integer is emulated on 64-bit platforms, not available on 32-bit platforms, and is not native anywhere
2) long long int is 64 bits everywhere, so you can *NEVER* do what the document seems to suggest one *MIGHT* be able to do, namely write a 128-bit constant directly (see the sketch below)
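For what it's worth, here is a minimal sketch of what I mean, with made-up values: since an integer literal cannot be wider than (unsigned) long long, a 128-bit constant has to be assembled from two 64-bit halves.

    /* A 128-bit constant cannot be written as a single literal,
       because integer constants are limited to the width of
       (unsigned) long long.  It can be assembled from two
       64-bit halves instead (illustrative values): */
    #include <stdio.h>

    int main(void)
    {
        unsigned long long hi = 0x0123456789ABCDEFULL;  /* upper 64 bits */
        unsigned long long lo = 0xFEDCBA9876543210ULL;  /* lower 64 bits */

        unsigned __int128 x = ((unsigned __int128)hi << 64) | lo;

        /* There is no printf conversion for __int128 either,
           so print it as two 64-bit halves: */
        printf("%016llx%016llx\n",
               (unsigned long long)(x >> 64),
               (unsigned long long)x);
        return 0;
    }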

To me, this would justify rewriting the documentation.

My personal lament is that I still cannot find out anywhere whether it is available on all 64-bit platforms or on Intel only.
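The closest thing I have found to an answer is to probe for it at compile time: GCC predefines __SIZEOF_INT128__ on targets where __int128 is supported, so a small test like the one below can at least tell you about a given target, even if the documentation does not list them.

    /* Compile-time probe: GCC defines __SIZEOF_INT128__ on targets
       where __int128 is available, so this prints the size only
       where the type exists. */
    #include <stdio.h>

    int main(void)
    {
    #ifdef __SIZEOF_INT128__
        printf("__int128 available, %d bytes\n", (int)sizeof(__int128));
    #else
        printf("__int128 not available on this target\n");
    #endif
        return 0;
    }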

KS

> On Aug 26, 2015, at 3:22 PM, Jonathan Wakely <jwakely.gcc@xxxxxxxxx> wrote:
> 
> On 26 August 2015 at 12:04, Kostas Savvidis wrote:
>> The online documentation contains the attached passage as part of the "C-Extensions" chapter. There are no actual machines which have an "integer mode wide enough to hold 128 bits" as the document puts it.
> 
> It's not talking about machine integers, it's talking about GCC
> integer modes. Several targets support that.
> 
>> This would be a harmless confusion if it didn't go on to say "... long long integer less than 128 bits wide" (???!!!) Whereas in reality "long long int" is 64 bits everywhere I have seen.
> 
> 
> Read it more carefully, it says you can't express an integer constant
> of type __int128 on such platforms.
> 
> So you can't write __int128 i =
> 999999999999999999999999999999999999999999999999999999999999;




