Re: compiling large functions


Martin Wiebusch wrote:
Hi,

I'm having trouble compiling generated code that results in very large function bodies. The structure of the function is quite simple: just a long list of variable = expression assignments, where each expression is at most 30 lines or so. The entire source file can be up to 10 MB. (Yes, it's a beast of a calculation.) Even without optimization, gcc uses all 2 GB of my memory and eventually quits with a "cc1plus: out of memory" message.

I have had limited success with splitting the code up into several smaller functions. What puzzles me is that if I put all the functions into the same source file, gcc still uses much more memory than when I put them in separate files (and it eventually dies on the 10 MB source).

Basically, I'm wondering whether there is a way to tell gcc to compile (and maybe even optimize?) one assignment at a time rather than keeping the entire function in memory (assuming that's what it does now).

Try -fno-unit-at-a-time? It tells gcc to process each function as soon as it has been parsed instead of deferring the whole translation unit, which can cut peak memory use on files like this.

Alternatively, just put the different functions in different files (e.g. func1.c, func2.c, func3.c, etc), and generate your makefile automatically so you don't need to keep track of the names.
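A minimal sketch of that second suggestion, assuming the code generator can be pointed at per-chunk output files; the names (split/, funcN.c, calc) are placeholders, not anything from the original setup:

```shell
#!/bin/sh
# Sketch: write one .c file per generated chunk, then generate a Makefile
# listing them, so the chunk names never need to be tracked by hand.
set -e
mkdir -p split

# Stand-in for the real code generator: emit three small chunk files.
for i in 1 2 3; do
  cat > "split/func$i.c" <<EOF
/* generated assignments for chunk $i would go here */
double func$i(void) { return $i.0; }
EOF
done

# Build the OBJS list from whatever chunk files actually exist.
{
  printf 'OBJS ='
  for f in split/func*.c; do
    printf ' %s' "${f%.c}.o"
  done
  printf '\n\ncalc: $(OBJS)\n\t$(CC) -o $@ $(OBJS)\n'
} > split/Makefile
```

Each funcN.c then compiles in its own cc1plus invocation, so peak memory is bounded by the largest chunk rather than the whole 10 MB file.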

Tom

