09.02.2015 00:35, Georg Chini writes:
> On 08.02.2015 20:30, Georg Chini wrote:
>> On 08.02.2015 19:54, Alexander E. Patrakov wrote:
>>> 01.02.2015 03:43, Georg Chini wrote:
>>>> + /* Minimum number of adjust times + 1 needed to adjust at 0.75% deviation from base rate */
>>>> + min_cycles = (double)abs(latency_difference) / u->adjust_time / 0.0075 + 1;
>>>> +
>>>> + /* Rate calculation, maximum deviation from base rate will be less than 0.75% due to min_cycles */
>>>> + new_rate = base_rate * (1.0 + latency_difference / min_cycles / u->adjust_time) + 0.5;
>>>
>>> What's the aim here with min_cycles? Why not just clamp new_rate
>>> post-factum to the 0.75% vicinity of base_rate, as is done in the 2‰
>>> case?
>>>
>> Without min_cycles you would hit the 2‰ limit far more often, even when
>> you are approaching the base_rate. This seriously disturbs the regulation.
>> The goal was to go out to 0.75% as quickly as possible while approaching
>> the base rate cautiously (with a weak regulator when the latency is far
>> off). Also, without min_cycles you see the rate hopping up and down (due
>> to the 2‰ limitation); you do not get a (more or less) continuous rate
>> function.
>
> Doing what you suggest would give you 0.75% as long as the latency is more
> than one cycle off - but from a 0.75% rate deviation you need at least 4
> steps to get back to the base_rate with the 2‰ restriction, so you would
> seriously over-regulate.

Thanks for your answers. Now I consider the adjust_rate() function fully
reviewed.

So, now (or rather, tomorrow) I have to properly split this patch, taking
your explanations into account.

-- 
Alexander E. Patrakov
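
(For illustration, a minimal standalone sketch of the two approaches discussed
above: the patch's min_cycles-based calculation versus naive clamping of
new_rate to the 0.75% band. This is not the module-loopback code; the variable
names mirror the patch, but the 10-second adjust_time and the latency values
are made-up example numbers, and the separate 2‰-per-step restriction is not
modeled here.)

#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

int main(void) {
    const uint32_t base_rate = 48000;            /* nominal sample rate in Hz */
    const double adjust_time = 10.0 * 1000000.0; /* one adjust cycle: 10 s in usec (example) */
    /* Example deviations of the measured latency from the target, in usec. */
    const int64_t deviations[] = { 500000, 100000, 20000, 5000, 500 };

    for (size_t i = 0; i < sizeof(deviations) / sizeof(deviations[0]); i++) {
        int64_t latency_difference = deviations[i];

        /* Patch approach: spread the correction over enough cycles that the
         * per-cycle rate deviation stays below 0.75% of base_rate. */
        double min_cycles = (double) llabs(latency_difference) / adjust_time / 0.0075 + 1;
        unsigned new_rate = (unsigned) (base_rate * (1.0 + latency_difference / min_cycles / adjust_time) + 0.5);

        /* Naive alternative: apply the full correction at once, then clamp to
         * the 0.75% band. Far from the target this always sits at the clamp. */
        double clamped = base_rate * (1.0 + (double) latency_difference / adjust_time);
        if (clamped > base_rate * 1.0075)
            clamped = base_rate * 1.0075;
        else if (clamped < base_rate * 0.9925)
            clamped = base_rate * 0.9925;

        printf("latency off by %8lld usec: min_cycles = %6.2f, new_rate = %u Hz, clamped = %u Hz\n",
               (long long) latency_difference, min_cycles, new_rate, (unsigned) (clamped + 0.5));
    }

    return 0;
}

With these example numbers, the min_cycles variant always stays inside the
0.75% band and ramps down smoothly as the latency difference shrinks, while
the clamped variant sits at the band edge whenever the deviation is large,
which is the over-regulation behaviour described above.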