> > So if you use 5s interval in rrd it seems ok for me (it
> > is what i plan to do here).

> If you receive an update each second, you have the feeling it's realtime.
> It's slow enough to understand the data and it's fast enough to feel it as
> real-time. It's also fast enough that you get new data before you are
> tired of looking at the old data.

hmm :) it really depends on the angle of view .. From my experience (the
"btw" tool), 1 sec is too fast, because I see results like 30 kbit, 28, 10,
33, 15, 35 .... I can see every packet burst, and my brain is not good
enough to compute an average from it on the fly. So I use a 10 sec moving
average to get something meaningful.

On the other hand, there are two time variables:
- show rate
- EWMA time constant

I agree that you can have a time constant of 30 sec and a sampling/show
rate of 1 sec. Then you get smooth and fast updates :)

Still you can use rrd, because it is hierarchical - you can have the last
minute at seconds resolution, then the last hour at minutes resolution,
etc...

devik
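
[Editor's note: a minimal sketch of the idea above, a 1 sec sampling/show
rate smoothed by a 30 sec EWMA time constant, written in C. The names and
constants are illustrative, not taken from the "btw" tool.]

    /* Sketch only: EWMA rate estimator, sampled and displayed every second
     * but smoothed with a 30 s time constant. */
    #include <math.h>
    #include <stdio.h>

    #define SAMPLE_INTERVAL 1.0   /* seconds between samples / screen updates */
    #define TIME_CONSTANT  30.0   /* EWMA time constant in seconds */

    static double ewma_rate;      /* smoothed rate in bits/s */

    /* bytes_delta = bytes seen since the previous sample */
    void update_rate(unsigned long bytes_delta)
    {
        double inst_rate = bytes_delta * 8.0 / SAMPLE_INTERVAL;      /* bits/s */
        double alpha = 1.0 - exp(-SAMPLE_INTERVAL / TIME_CONSTANT);

        /* new = old + alpha * (sample - old) */
        ewma_rate += alpha * (inst_rate - ewma_rate);
        printf("%.1f kbit/s\n", ewma_rate / 1000.0);
    }

With SAMPLE_INTERVAL much smaller than TIME_CONSTANT, alpha is roughly
SAMPLE_INTERVAL / TIME_CONSTANT, so each displayed value averages over
roughly the last 30 s while the display still refreshes every second.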
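
[Editor's note: and a sketch of the hierarchical rrd idea reduced to two
tiers, the last minute at 1 s resolution and the last hour at 1 min
resolution. This only shows the consolidation principle, not rrdtool's
actual API.]

    #include <stddef.h>

    #define SECS 60
    #define MINS 60

    static double sec_ring[SECS];    /* last minute, 1 s resolution   */
    static double min_ring[MINS];    /* last hour,   1 min resolution */
    static unsigned long nsamples;   /* total 1 s samples stored      */
    static size_t min_pos;           /* next slot in the minute ring  */

    /* Called once per second with the current rate sample. */
    void store_sample(double rate)
    {
        sec_ring[nsamples % SECS] = rate;      /* overwrite oldest second */
        nsamples++;

        if (nsamples % SECS == 0) {            /* a full minute collected */
            double sum = 0.0;
            for (size_t i = 0; i < SECS; i++)
                sum += sec_ring[i];
            min_ring[min_pos] = sum / SECS;    /* consolidate to 1 min avg */
            min_pos = (min_pos + 1) % MINS;
        }
    }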