Digz:
Stack size is set by the OS, and each OS typically provides a way to
adjust it. This is the part where we'll get berated for being
off topic, as this isn't a GCC question, but rather one appropriate
for a general programming list, or for one about your OS.
The default stack size on some OSes (like Mac OS X) is 8 megabytes.
Ten million integers at 4 bytes each take 40 million bytes, roughly 40
megabytes, which overflows the stack.
If you're interested in learning more, I suggest doing Google
searches for stack size, stack allocation, etc. However, I think the
preferred technique is to malloc large arrays, in case your
code is ever moved from a machine/OS with a large stack to one with
an unacceptably small stack.
Blake
"When the going gets tough, the tough get aeronautical."
-Howling Mad Murdoch
On Dec 3, 2006, at 1:18 PM, Digvijoy Chatterjee wrote:
Thanks,
the heap allocation code (malloc) ran without a problem.
So does a stack overflow mean that the stack can only grow to a
maximum size in memory? Who sets this, and how does one change it?
10,000,000 integers seems like very little to overflow the stack.
-Digz
On 12/3/06, Blake Huff <stangmechanic@xxxxxxxxx> wrote:
I believe this is what people refer to as stack overflow. Take a
look at
allocating the memory with something like malloc, i.e.,
int *i = (int *) malloc(N * sizeof(int));
Blake Huff
stangmechanic@xxxxxxxxx
"I'd like to take this opportunity to thank my cat for letting me
live here."
On Dec 3, 2006, at 12:56 PM, Digvijoy Chatterjee wrote:
Hi,
I was trying to test algorithms with huge int arrays when I started
getting segfaults. Here is a simple program which emulates this
behaviour.
Each time I run this program, it segfaults. This might be very naive,
but can anyone help me with what I am missing here?
static const int N = 10000000;

int main()
{
    int c, i[N];
    for (c = 0; c < N; c++)
        i[c] = c;
}
Thanks
Digz