Zed wrote:

> I've tried to make my C++ source files small in my current project.
> Compiling my approx. 20 files takes a very long time. I created a new
> source file that just includes all my original ones with #include
> statements, and compiled it. This method was much quicker. Compiling
> all the files included in one takes approx. double the time compared
> to compiling just one of the small source files. So if I make a change
> to some header that is included by many of the source files, the
> last-described method that includes the files in one is much faster.

This is essentially what the -combine switch does, except that switch
only supports C.

> It seems there is a lot of compilation-time overhead if the code is
> compiled in many small parts - and this is without taking linking into
> account. Are there any settings or some trick that can reduce this
> overhead, apart from my ugly inclusion method?

Yes, there's the startup/cleanup overhead of the compiler itself, plus
the overhead of parsing all the various headers 20 times instead of
once. You can try using a precompiled header to reduce the cost of the
latter; see <http://gcc.gnu.org/onlinedocs/gcc/Precompiled-Headers.html>.
Note that this isn't something you can just switch on; you have to
think a little about how to implement it, but the manual gives some
good suggestions about how to do that.
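Very roughly (untested, and the file name and includes below are just
for illustration, not taken from your project), a setup with g++ could
look like this:

    // pch.h -- hypothetical common header: gather the expensive,
    // rarely-changing includes here (standard library, third-party
    // headers), not the project headers you edit all the time.
    //
    // Build it once, with the same flags you use for the rest of the
    // build, e.g.:
    //
    //     g++ -x c++-header -O2 pch.h      (produces pch.h.gch)
    //
    // Then make  #include "pch.h"  the first include in every .cpp
    // file and compile as usual; g++ picks up pch.h.gch automatically
    // when it finds it next to pch.h.
    #include <string>
    #include <vector>
    #include <map>
    #include <iostream>

The main catch is that the precompiled header is only used when the
compile options match and it really is the first thing included, so it
pays to keep it to stable headers; otherwise every small edit forces a
rebuild of the .gch as well.

Brian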