On Sun, 2018-09-02 at 22:16 -0600, Chris Murphy wrote:
> Fedora 28 Server
> Fedora 28 Workstation
> dd-wrt 802.11n Broadcom based router
> Connected wirelessly 5GHz, wired ethernet cable is physically
> disconnected from Server
>
> File transfer from Workstation to Server
>
> scp: scp itself reports ~620KB/s; whereas nload on the server reports
> ~4.9Mbit/s
> smb: GNOME reports 4.9MB/s; whereas nload on the server reports
> ~39.8Mbit/s
>
> Why? That's rather unexpected.
>
> Command is
> scp test.bin f28s.local:/srv/scratch
>
> Using nc, I get speeds slightly faster than smb. OK so encryption? If
> I connect wired, and then 'nmcli c down <ID>' to disconnect the
> wireless connection:
>
> scp: 12MB/s, nload ~101Mbit/s
> smb: nload ~96Mbit/s
>
> Sooooo, it's not encryption. Why would scp be this much slower only
> with a wireless connection? And using:
>
> rsync -avzhe ssh test.bin f28s.local:/srv/scratch
>
> Over wireless, this is just as bad as scp.

The SCP protocol is really slow, especially on networks with high
latency (such as wireless). The main reason is the buffer size, which
is very small: SCP waits for every chunk to be confirmed by the remote
host before it sends the next one. If you google "scp speed" you will
get a lot of answers, some of them wrongly accusing the encryption or
the compression, but the RTT and the buffers are really at fault, as I
explain here:

https://superuser.com/a/1101203/466930

SCP should really be used only as a quick hack for copying files on
fast local networks. For all other cases, use SFTP, or rsync if you
need something more complex.

Regards,
--
Jakub Jelen
Software Engineer
Security Technologies
Red Hat, Inc.
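To make the buffer/RTT point above concrete: a stop-and-wait transfer
can move at most one window of data per round trip, so its throughput
ceiling is window_size / RTT no matter how fast the link itself is. A
minimal shell sketch, assuming an illustrative 32 KiB window and 50 ms
round-trip time (neither value was measured in this thread):

    # Throughput ceiling of a stop-and-wait transfer: window / RTT.
    # window_bytes and rtt_ms are assumed, illustrative values.
    window_bytes=$((32 * 1024))
    rtt_ms=50
    echo "ceiling: $(( window_bytes * 1000 / rtt_ms / 1024 )) KiB/s"
    # prints: ceiling: 640 KiB/s

With those guesses the ceiling lands in the same ballpark as the
~620KB/s reported over wireless, while a wired RTT of around 1 ms
would raise the same window's ceiling to roughly 32 MiB/s, at which
point the link rather than the protocol becomes the bottleneck.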
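And a sketch of the alternatives mentioned above, with the hostname
and path taken from the thread (adjust to taste):

    # OpenSSH's sftp pipelines many outstanding requests, so it is
    # less sensitive to RTT than scp; rsync adds delta transfer and
    # resumable copies on top of plain file transfer.
    echo 'put test.bin' | sftp f28s.local:/srv/scratch
    rsync -avh test.bin f28s.local:/srv/scratch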