Re: Writing past the 2GB file size boundary on 32-bit systems

"D. R. Evans" <doc.evans@xxxxxxxxx> wrote:
>
> Is there a clear description anywhere of how to use C++ streams and
> ordinary C FILE* functions so that they don't fail when an attempt to write
> to a file goes past the 2GB boundary?

No.

I can't begin to describe how messed up this area is.  I could start
with K&R C and AT&T Unix (which had excuses), ISO C and POSIX (which
didn't) and carry on from there.  The one thing that I can say is
that the compiler (i.e. gcc, sensu stricto) has nothing to do with
the matter.

This is entirely a library and operating system issue and, by the
sound of it, you are knackered.  You can try creating such a file
using the underlying POSIX calls and see if that works.  You may
need to specify O_LARGEFILE in the open call.


Regards,
Nick Maclaren,
University of Cambridge Computing Service,
New Museums Site, Pembroke Street, Cambridge CB2 3QH, England.
Email:  nmm1@xxxxxxxxx
Tel.:  +44 1223 334761    Fax:  +44 1223 334679
