Re: Getting greatest decimal accuracy out of G_PI

On Fri, Feb 02, 2007 at 04:30:17PM -0500, zentara wrote:
> Please pardon my basic questions, but I am self taught
> and run into questions I can't solve thru googling.

The problem with this question is rather that it's
completely unrelated to Gtk+.

> I see that G_PI is defined in the header to
> 3.14159265358979323846

I wonder where you found this.  gtypes.h defines G_PI:

  #define G_PI    3.1415926535897932384626433832795028841971693993751

> So on my 32 bit Athlon machine, I get limited precision
> to 15 decimal places, do I need a 64 bit machine for better
> precision? Or is there a different format to use?

The standard IEEE 754 double precision floating point type
has 64 bits, of which 52 are mantissa (53 counting the
implicit leading bit); that corresponds to roughly 15-16
decimal digits.  The standard type obviously has the same
precision everywhere, because if it differed it would no
longer be the standard type.
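
A quick way to see the limit is to print G_PI with more
digits than a double can hold -- a minimal sketch, assuming
glib.h is available; the digits past DBL_DIG are noise:

  #include <glib.h>
  #include <stdio.h>
  #include <float.h>

  int
  main(void)
  {
      double pi = G_PI;  /* literal rounded to the 53-bit mantissa */

      printf("DBL_DIG = %d\n", DBL_DIG);  /* typically 15 */
      printf("%.30f\n", pi);              /* only ~16 digits meaningful */
      return 0;
  }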

Both i386 (or rather its 387 math coprocessor) and x86_64
are capable of calculating with 80-bit floating point
numbers, whose mantissa is 64 bits, i.e. roughly 19-20
decimal digits.  These numbers are available as the `long
double' type.  Note that long double is
implementation-defined; it can look different on other
platforms.
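
For illustration, a similar sketch with long double (the
80-bit format and an LDBL_DIG of 18 are assumptions about
x86, not guarantees of the C standard):

  #include <stdio.h>
  #include <float.h>

  int
  main(void)
  {
      /* the L suffix keeps the literal in long double precision */
      long double pi = 3.1415926535897932384626433832795028841971693993751L;

      printf("LDBL_DIG = %d\n", LDBL_DIG);  /* 18 with x86 80-bit long double */
      printf("%.25Lf\n", pi);               /* roughly 19-20 correct digits */
      return 0;
  }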

Yeti


--
Whatever.
_______________________________________________
gtk-list mailing list
gtk-list@xxxxxxxxx
http://mail.gnome.org/mailman/listinfo/gtk-list
