Isn't new[] supposed to throw std::bad_alloc in this situation?

Hi everyone,

Note:  32-bit application.  ILP32, LL64.

Isn't new[] supposed to throw std::bad_alloc in this situation?

- - - - - - - - - - - - - - -
int main()
{
  int size = 0x4000000A;
  int* p = new int[size];
  for(int i = 0; i < size; ++i)
  {
    p[i] = i; // Crashes here.
  }
}
- - - - - - - - - - - - - - -

The above loop crashes sometime after p[9].  (On my system it crashes at p[0x3FFAC], which is well after p[9] but long before p[0x40000009].)

My assumption is that the problem, for this 32-bit toy app where sizeof(int) is 4, is that 0x4000000A * 4 ==> 0x100000028, which gets truncated (0x100000028 & 0xFFFFFFFF) to 0x28.

But, I presume, the size_t memory allocation request is being truncated to 0x28 before the new routine gets a chance to notice that size * sizeof(type) is just too darn big.  I also presume that the p[0x3FFAC] crash is just happenstance due to how the heap is allocated from the OS, and that the code is trashing vast tracts of the heap before running off the end into SEGV land.
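
To spell out the arithmetic I think is happening (a minimal sketch assuming an ILP32 target, not the actual libstdc++ code path):

- - - - - - - - - - - - - - -
#include <cstddef>
#include <cstdio>

int main()
{
  std::size_t size  = 0x4000000A;          // element count from the test case
  std::size_t bytes = size * sizeof(int);  // 0x100000028 wraps modulo 2^32 on ILP32
  std::printf("requested bytes: 0x%lX\n",
              (unsigned long)bytes);       // prints 0x28 here
  return 0;
}
- - - - - - - - - - - - - - -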

My (naive?) expectation is that new[] will throw a std::bad_alloc.

Is this code undefined behavior, or working as expected, or a bug in GCC's C++ compiler (4.0.1, in this case), or a PICNIC error?

PICNIC - Problem In Chair, Not In Computer
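
In case it helps frame the question, here is the kind of pre-check I would have expected somewhere along the new[] path (just a caller-side sketch, not a claim about what libstdc++ actually does; checked_new_ints is a name I made up for this message):

- - - - - - - - - - - - - - -
#include <cstddef>
#include <limits>
#include <new>

// Refuse element counts whose byte total would not fit in size_t,
// instead of letting the multiplication silently wrap.
int* checked_new_ints(std::size_t count)
{
  if (count > std::numeric_limits<std::size_t>::max() / sizeof(int))
    throw std::bad_alloc();
  return new int[count];
}

int main()
{
  try
  {
    int* p = checked_new_ints(0x4000000A);  // 0x4000000A > SIZE_MAX / 4 on ILP32
    delete[] p;
  }
  catch (std::bad_alloc&)
  {
    // Reached on ILP32: 0x4000000A ints would need 0x100000028 bytes.
  }
}
- - - - - - - - - - - - - - -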

If it is a bug in the GCC C++ compiler (4.0.1), has it since been fixed in a more recent version?

Thanks,
--Eljay

# g++ crash.cpp

# ./a.out
Segmentation fault

# g++ --version
i686-apple-darwin9-g++-4.0.1 (GCC) 4.0.1 (Apple Inc. build 5490)

