On May 23, 2012 9:14 AM, "Tedd Sperling" <tedd@xxxxxxxxxxxx> wrote:
>
> Hi gang:
>
> On May 21, 2012, at 8:32 PM, tamouse mailing lists wrote:
> > A rule of thumb is no more than 50 lines per
> > function, most much less. Back in the day when we didn't have nifty
> > GUI screens, only 24-line terminals (yay green on black!), if a
> > function exceeded one printed page, it was deemed too long and marked
> > for refactoring.
>
> You hit upon a theory of mine -- and that is that our functions grow in size up to our ability to view them in their totality. When our functions get beyond that limit, we tend to refactor and reduce.
>
> I know from the last several decades of programming that my functions have increased in number of lines. But they have reached a limit, and that limit is generally about the number of lines I can read in half of my monitor's height. This, of course, depends on monitor resolution, font size, and how far I am sitting from the monitor. But I think this is a natural and physical limit that we don't normally recognize. I can cite studies that support my theory.
>
> It would be an interesting survey to ask programmers to review their code and provide the average number of lines in their functions AND how many lines of code their monitors can display. In other words, look at your editor; count the number of lines your monitor can display; estimate the number of lines in your average function; and report the findings. For example, mine is about half -- my monitor can display 55 lines of code and my average function is around 25 lines. YMMV.
>
> Interesting, yes?
>
> Cheers,
>
> tedd
>
> _____________________
> tedd.sperling@xxxxxxxxx
> http://sperling.com
>

Yes, I think that is *exactly* the criterion -- not a mystery or an emergent thing, really; it was pretty explicit reasoning. Being able to see/scan the entire function on one page (or now in one screenful) makes it much easier to see what happens in the function and where blocks open/close, and it forces one to break up code into logical units.
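
If anyone wants to actually measure rather than estimate, here is a rough sketch of my own (not something Tedd posted) that walks a source tree and reports the average function length. It's in Python rather than PHP just because the standard ast module makes the counting trivial; it assumes Python 3.8+ (for end_lineno on AST nodes), only looks at .py files, and the script name and directory argument are placeholders.

# avg_func_len.py -- rough sketch: average function length, in lines,
# across all .py files under a directory. Assumes Python 3.8+.
import ast
import pathlib
import sys

def function_lengths(path):
    """Yield the line count of every function/method defined in one file."""
    tree = ast.parse(path.read_text(encoding="utf-8"), filename=str(path))
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # end_lineno/lineno give the span of the def, inclusive
            yield node.end_lineno - node.lineno + 1

def main(source_dir):
    lengths = []
    for path in pathlib.Path(source_dir).rglob("*.py"):
        try:
            lengths.extend(function_lengths(path))
        except (SyntaxError, UnicodeDecodeError):
            pass  # skip files that don't parse cleanly
    if lengths:
        print("functions: %d, average length: %.1f lines"
              % (len(lengths), sum(lengths) / len(lengths)))
    else:
        print("no functions found")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else ".")

Run it as "python avg_func_len.py /path/to/project" and compare the number it prints against how many lines your editor window actually shows -- that's the ratio Tedd is asking about.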