Hello. I'm working on a project using netem and have come across a situation that I'm not sure whether to classify as "tc" or "netem". I'm using netem as a qdisc on a leaf to delay packets. I want to be able to enable or disable this delay at runtime. It appears, however, that if I delete the netem leaf qdisc, I end up with lost packets. This isn't too surprising in retrospect. Do you agree this behavior occurs? For example, doing the following appears to cause loss:

while [ 1 ]
do
    echo "Adding delay"
    tc qdisc add dev eth1 parent 1:21 handle 20: netem delay 200ms 10ms distribution normal
    sleep 1
    echo "Removing delay"
    tc qdisc del dev eth1 parent 1:21 handle 20: netem delay 200ms 10ms distribution normal
    sleep 1
done

Is there a preferred/recommended way to enable and disable packet delay at runtime without packet loss?

Much thanks!

From kalayika007 at gmail.com  Fri Jul 18 12:14:21 2014
From: kalayika007 at gmail.com (Kalayika H)
Date: Fri, 18 Jul 2014 12:14:21 -0000
Subject: netem loss simulation increasing delay
Message-ID: <CAD4ar6uZxwF1XgYwzicY3O4OaygrZw55Ou8=5JZx2J12kDG2dg@xxxxxxxxxxxxxx>

Hi all,

I am facing a somewhat strange situation when simulating loss with netem. When I simulate loss using netem, it actually increases the RTT that I calculate in my module. When I simulate only delay without any loss, the RTT is fine and netem applies the delay values perfectly. Can somebody explain the reason for this? Or is that normal?

Thanks in advance.

Regards,
Kalayika
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.linuxfoundation.org/pipermail/netem/attachments/20140718/ad7de994/attachment.html>
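
One pattern sometimes used for the kind of runtime toggling described in the first question above is to keep the netem qdisc attached and change its parameters in place with "tc qdisc change", rather than adding and deleting the qdisc each time. The sketch below is only an illustration under assumptions: it reuses the device, parent, and handle from the loop above, treats "delay 0ms" as the "disabled" state, and whether it fully avoids the loss seen on delete would still need to be verified on the poster's setup.

    # Attach the netem qdisc once, initially with no added delay (assumed starting state)
    tc qdisc add dev eth1 parent 1:21 handle 20: netem delay 0ms

    while true
    do
        echo "Enabling delay"
        # Update the existing qdisc's parameters instead of deleting and re-adding it
        tc qdisc change dev eth1 parent 1:21 handle 20: netem delay 200ms 10ms distribution normal
        sleep 1
        echo "Disabling delay"
        tc qdisc change dev eth1 parent 1:21 handle 20: netem delay 0ms
        sleep 1
    done

The idea behind this sketch is that "change" leaves the qdisc, and any packets it is currently holding for delayed transmission, in place, whereas "del" tears the queue down, which would be consistent with the loss the poster observes.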