Writing past the 2GB file size boundary on 32-bit systems

Sorry... originally posted from an unsubscribed account, which seems to
mean that some people saw it but it wasn't posted to the entire reflector.

I've also added some more detail.

----

I'm using gcc/g++ 3.3.1 on an old 32-bit Mandrake 9.2.1 production system
(which means I can't update to anything newer).

Is there a clear description anywhere of how to use C++ streams and
ordinary C FILE* functions so that they don't fail when an attempt to write
to a file goes past the 2GB boundary?

I have found vague comments and a few suggestions scattered around various
places, but nothing I've tried so far has worked. After a day and a half of
experimenting, I figured it was time to ask in a place where people probably
know how to get this working :-)

If anyone cares about the part of the problem I'm trying to solve first: I
have attached cout to a file using the mechanism in Josuttis pp. 641-642. It
works fine until the file hits 2GB, at which point the program reports that
the file size limit has been exceeded and terminates. This happens no
matter what macros I define (in particular -D_LARGEFILE_SOURCE and
-D_FILE_OFFSET_BITS=64).

  Doc Evans






