Re: memory consumption

Jay K <jay.krell@xxxxxxxxxxx> writes:

> I hit similar problems building gcc in virtual machines that I think had 256MB. I increased them to 384,
>
>
> Maybe gcc should monitor its maximum memory? And add a switch
> -Werror-max-memory=64MB, and use that when compiling itself, at
> least in bootstrap with optimizations and possibly debugging
> disabled? Or somesuch?

A --param setting the amount of memory required is a good idea for
testing purposes.  However, frankly, it is very unlikely that we
would set it to a number as low as 64MB.  New computers these days
routinely ship with 1G of RAM.  Naturally gcc should continue to run
on old computers, but gcc is always going to require virtual memory,
and on a virtual memory system I really don't think 512MB or 1G of
virtual memory is unreasonable.
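
In the meantime, the existing garbage collector knobs get you part of
the way there.  As a rough sketch (the particular values here are
illustrative, not recommendations), you can make gcc collect more
aggressively to lower its peak memory, and use the shell's ulimit to
test whether a compile fits in a given amount of virtual memory:

    # Collect aggressively: ggc-min-heapsize is in kB,
    # ggc-min-expand is a percentage of heap growth.
    gcc -O2 --param ggc-min-expand=0 --param ggc-min-heapsize=8192 -c big.c

    # Cap this shell's virtual memory at 256MB (ulimit -v takes kB),
    # then see whether the build still succeeds.
    ulimit -v 262144
    make

Trading GC frequency for peak memory this way will slow the compiler
down, of course.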

It would be folly to let old computers constrain gcc's ability to
optimize on modern computers.  A better approach is to use gcc's
well-tested ability to cross-compile from a modern computer to your
old computer.
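
A minimal sketch of that setup, assuming the old machine is, say,
i586-pc-linux-gnu (substitute your real target triplet), and glossing
over the cross binutils and target C library you would need to build
first:

    mkdir objdir && cd objdir
    ../gcc-4.3.5/configure --target=i586-pc-linux-gnu \
        --prefix=/opt/cross --enable-languages=c
    make && make install

Then /opt/cross/bin/i586-pc-linux-gnu-gcc produces code for the old
machine while running on the new one.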


> I guess I can just make do with 4.3.5 built with host cc.
> Maybe I'll try splitting up some of the files. Is that viable to be applied for real?

I think you will make better progress by using -O1 when you compile.
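For a bootstrap, you can pass that through the stage flags on the make
command line, something like:

    make BOOT_CFLAGS='-g -O1'

which replaces the default '-g -O2' used to compile the later stages.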

Ian

