On Fri, Aug 05, 2005 at 09:22:55AM +0800, Ian Kent wrote:
> I also find it hard to understand why it is such a problem having a larger
> stack. As you point out, as software evolves it ultimately becomes more
> complex. If the developers' design needs it and the software is reliable
> and efficient (aka performs well) then why not.
>
> A quick calculation:
>
> 2000*4k is about 8M in say 1G at least.
>
> Not a large percentage overhead, I think.

Now try finding 2000 _contiguous_ pairs of pages after the machine has been
up for a while, under load. Memory fragmentation makes this a really nasty
problem, and the VM eats its own head after repeatedly scanning every page
in the system.

	Dave

--
fedora-devel-list mailing list
fedora-devel-list@xxxxxxxxxx
http://www.redhat.com/mailman/listinfo/fedora-devel-list
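[Editor's note: Dave's objection — that free memory and free *contiguous pairs* of pages are very different resources — can be illustrated with a toy model. This is a hypothetical sketch, not kernel code: page count, allocation fraction, and the random alloc/free pattern are all assumptions chosen for illustration. Real fragmentation after sustained load is typically worse than this independent-random model suggests.]

```python
import random

# Toy model (assumption: not real kernel behavior): 1 GiB of 4 KiB pages.
# An 8 KiB kernel stack needs an order-1 block: two contiguous, aligned
# free pages. Count how many such pairs survive scattered single-page churn.

random.seed(0)
NUM_PAGES = 262144                      # 1 GiB / 4 KiB

free = [True] * NUM_PAGES

# Simulate a loaded system: allocate 75% of pages one at a time,
# then free half of those at random, leaving scattered holes.
allocated = random.sample(range(NUM_PAGES), int(NUM_PAGES * 0.75))
for p in allocated:
    free[p] = False
for p in random.sample(allocated, len(allocated) // 2):
    free[p] = True

free_pages = sum(free)
# Aligned free pairs = candidate order-1 blocks for 8 KiB stacks.
free_pairs = sum(1 for i in range(0, NUM_PAGES, 2) if free[i] and free[i + 1])

print(f"free pages: {free_pages} ({100 * free_pages / NUM_PAGES:.0f}% of RAM)")
print(f"aligned free pairs: {free_pairs}")
print(f"free memory usable as 8 KiB blocks: "
      f"{100 * 2 * free_pairs / free_pages:.0f}%")
```

Even this mild random churn strands a sizeable fraction of free memory in unpaired single pages; 2000 order-1 allocations then force the VM into the repeated full-memory scans Dave describes.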