Re: reduce compilation times?

Duft Markus wrote:
Hi!

This is where automated tools come in handy.  In my projects, I have
scripts that pick up source files and insert them into the
makefile(s).  So with very little fuss I can add new files (either new
functionality or newly split-up code).
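
(For what it's worth, GNU make can do much of that pickup itself, with
no external script.  A minimal sketch, assuming GNU make and a flat
src/ directory; the layout and program name are illustrative, not the
poster's actual setup.  Recipe lines must begin with a tab.)

    # Pick up every .c file under src/ automatically; adding a new
    # source file then requires no makefile edit at all.
    SRCS := $(wildcard src/*.c)
    OBJS := $(SRCS:.c=.o)

    prog: $(OBJS)
            $(CC) $(CFLAGS) -o $@ $^

    src/%.o: src/%.c
            $(CC) $(CFLAGS) -c -o $@ $<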

It really depends on the size of the class whether factoring makes
sense or not.  But if you have heaps of 3000+ line functions, I
suspect you spend enough time searching through them as it is.

When I was working on DB2 and scouring their often 6000+ line files
looking for why GCC 4.whatever.beta wasn't working as hot as it could,
it wasn't exactly a lot of fun.

I agree that such big files are no fun at all.  I've managed to keep a
structure where files don't get longer than, say, 500 lines.

Without even venturing into particularly good practice, we build with scripts which automatically split large source files (sometimes containing 300 functions) down to one function per file, compile them individually, and use a relocatable link to put together an object file corresponding to the original large source file. This is definitely a way to reduce compilation time, especially when used together with a Makefile that applies minimum-code-size options to most functions and maximum run-time performance options where needed.
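
(To illustrate the partial-link trick: a minimal sketch, assuming GNU
make, GCC, and GNU ld, and that some external script has already split
bigfile.c into split/*.c with one function per file.  All the file and
function names here are made up for the example.)

    # One object per split-out function, size-optimized by default.
    SPLIT_SRCS := $(wildcard split/*.c)
    SPLIT_OBJS := $(SPLIT_SRCS:.c=.o)

    OPTFLAGS := -Os
    # Target-specific override: a hot function gets full optimization.
    split/hot_inner_loop.o: OPTFLAGS := -O3

    split/%.o: split/%.c
            $(CC) $(CFLAGS) $(OPTFLAGS) -c -o $@ $<

    # Relocatable (partial) link: merge the per-function objects back
    # into a single bigfile.o, as if bigfile.c had been compiled whole.
    bigfile.o: $(SPLIT_OBJS)
            $(LD) -r -o $@ $^

A side benefit of per-object rules like these is that make only
recompiles the functions that actually changed, which is where the
compile-time saving comes from.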
