On Wednesday 12 February 2003 09:29, devik wrote:
> > > So if you use 5s interval in rrd it seems ok for me (it
> > > is what i plan to do here).
> >
> > If you receive an update each second, you have the feeling it's realtime.
> > It's slow enough to understand the data and it's fast enough to feel it
> > as real-time. It's also fast enough that you get new data before you
> > are tired of looking at the old data.
>
> hmm :) really depends on angle of view .. From my experience
> ("btw" tool) 1sec is too fast because I see results like:
> 30kbit, 28, 10, 33, 15, 35 .... I can see every packet burst
> and thus my brain is not good enough to compute the average from it
> on the fly. So I use a 10sec moving average to have something sensible.
>
> On the other side, there are two time variables:
> - show rate
> - EWMA time constant
>
> I agree that you can have a time constant of 30sec and
> a sampling/show rate of 1sec. Then you get smooth and fast
> updates :)
> Still you can use rrd because it is hierarchical - you
> can have the last minute in seconds resolution, then the last hour
> in minutes resolution etc...

Storing the value is no problem, but showing it is.  I don't think it's such
a good idea to refresh a webpage and reload (and redraw) the graphs on it
each second.  So rrd for the long-term overview, java (or VB) for the
real-time overview.

Stef

-- 
stef.coene@docum.org
 "Using Linux as bandwidth manager"
 http://www.docum.org/
 #lartc @ irc.oftc.net
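
[Editor's note: a minimal sketch, not from the thread itself, of the idea devik
describes above: sample a counter once per second (the show rate) but smooth it
with a roughly 30-second EWMA time constant. It assumes a Linux box, reads the
RX byte counter from /proc/net/dev, and the interface name "eth0" and the
constants are illustrative only.]

import math
import time

TAU = 30.0        # assumed EWMA time constant, seconds
INTERVAL = 1.0    # assumed sampling/show interval, seconds

def read_bytes(iface="eth0"):
    # Read the RX byte counter for `iface` from /proc/net/dev (Linux only).
    with open("/proc/net/dev") as f:
        for line in f:
            name, _, rest = line.partition(":")
            if name.strip() == iface:
                return int(rest.split()[0])
    raise ValueError(f"interface {iface!r} not found")

def monitor():
    prev = read_bytes()
    avg_bps = 0.0  # starts at zero, so the display ramps up over ~TAU seconds
    while True:
        time.sleep(INTERVAL)
        cur = read_bytes()
        inst_bps = (cur - prev) * 8 / INTERVAL   # instantaneous bit rate
        prev = cur
        # EWMA update: weight new samples by 1 - exp(-dt/tau), so short bursts
        # are damped but the displayed value still refreshes every second.
        alpha = 1.0 - math.exp(-INTERVAL / TAU)
        avg_bps += alpha * (inst_bps - avg_bps)
        print(f"{avg_bps / 1000:.1f} kbit/s")

if __name__ == "__main__":
    monitor()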