Pavel Saviankou wrote:
Hi, Michael
On Thursday, 08 February 2007, you wrote:
Pavel Saviankou wrote:
Hi,
I would like to compile a program which contains a static array with a
large number of structures (each containing 9 doubles) as elements. I
have tried it in two ways.
and an array of pointers to them:
mystruct *array[823543] = { &struct1, &struct2, ... };
Now I get "virtual memory exhausted: Out of memory" with both
versions of the compiler and with both optimisation settings.
There's nothing for the compiler to optimize. No code is generated
when you initialize a large array. All the compiler does is create
a huge assembly file which contains the values, which in turn creates
a huge object file.
I also have a routine which interpolates the data; that is what has to
be optimized. The array only has to be placed in memory, and I would
like to give that part of the job to the compiler. Why not? Then I
don't have to take care of the consistency of the data myself: any
C++ source has (a weak one, but still) a kind of CRC check by its
structure. And it is easy to use. Just #include "routine.h" and
go on!
Huh? Structures do not have CRC checks. If you have data consistency
problems, that's nothing that a compiler will help you with.
Compilers compile. They don't manage large data sets.
I would usually advise that you dynamically allocate memory using malloc
and read the data into this array.
I have reasons to do that only if I have no other possibility.
One of them, see above.
But first, consider how much data you have. Each of the elements in
your struct is 9 doubles, or 72 bytes. When you attempt to create
100,000,000 elements each of which is 72 bytes, you will need 7.2 GB
of memory to hold the array. (Ignoring that the intermediate assembly
and object files will be larger than this.) Do you really have this
much memory? Does your operating system give this much memory to any
single process?
64-bit, 100 GB swap + 4 GB RAM?
Sorry, it doesn't work that way.
--
Michael Eager eager@xxxxxxxxxxxx
1960 Park Blvd., Palo Alto, CA 94306 650-325-8077