Re: Integer arithmetic vs double precision arithmetic

Shriramana Sharma wrote:

> > E.g. The Unix API uses either seconds since midnight UTC, Jan 1st 1970
> > (time(), typically 32 bits) or seconds and microseconds since that
> > time (gettimeofday(), typically 32 bits for each component), while
> > Java uses milliseconds since that date (64 bits).
> 
> Oh! I did not know that Java can return system time to millisecond
> precision. 

To be precise, it returns the time at millisecond *granularity*; the
precision is limited by the OS.
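One way to see what the OS actually delivers is to sample the clock in a
tight loop until the reported value changes; the smallest step you observe
is roughly the real resolution. A rough sketch (not from the original post)
using the POSIX gettimeofday() call discussed below:

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval a, b;

    /* Busy-wait until the reported time changes, then print the step. */
    gettimeofday(&a, NULL);
    do {
        gettimeofday(&b, NULL);
    } while (a.tv_sec == b.tv_sec && a.tv_usec == b.tv_usec);

    long step = (b.tv_sec - a.tv_sec) * 1000000L + (b.tv_usec - a.tv_usec);
    printf("smallest observed step: %ld us\n", step);
    return 0;
}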

> Is there a C or C++ function that can do that?

gettimeofday() returns the current time at microsecond granularity.
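A minimal usage sketch (again not from the original post); the final
conversion packs the result into a 64-bit millisecond count comparable to
what Java reports:

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval tv;

    if (gettimeofday(&tv, NULL) == -1) {
        perror("gettimeofday");
        return 1;
    }

    /* Seconds and microseconds since 1970-01-01 00:00:00 UTC. */
    printf("%ld.%06ld seconds since the epoch\n",
           (long) tv.tv_sec, (long) tv.tv_usec);

    /* 64-bit milliseconds since the epoch, the unit Java uses. */
    long long msec = (long long) tv.tv_sec * 1000 + tv.tv_usec / 1000;
    printf("%lld ms since the epoch\n", msec);

    return 0;
}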

-- 
Glynn Clements <glynn@xxxxxxxxxxxxxxxxxx>
