Hello,

I try to keep repositories routinely repacked and optimized for clones, in the hope that most operations needing lots of objects can send packs straight from disk. Every now and again, though, a client on a slow connection requests a large clone and then spends half a day downloading it, and gigabytes of RAM end up occupied by a temporary pack for the duration. Are there any strategies for reducing RAM usage in such cases, other than vm.swappiness (which I'm not sure would help, since it's not a sleeping process)? Is there a way to write large temporary packs somewhere to disk before sendfile'ing them?

-K
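P.S. For context, by "routinely repacked and optimized for clones" I mean a maintenance pass along these lines. This is only a sketch: the throwaway repository, the `demo` identity, and the single seed commit exist purely so the snippet is self-contained; in practice `$repo` would point at a real bare repository.

```shell
set -e
# Placeholder bare repository, for demonstration only.
repo=$(mktemp -d)
git init -q --bare "$repo"

# Seed a single empty commit so there is something to pack.
tree=$(git -C "$repo" mktree </dev/null)
commit=$(git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
    commit-tree -m init "$tree")
git -C "$repo" update-ref refs/heads/master "$commit"

# -a: put all objects into a single pack; -d: delete the now-redundant old
# packs and loose objects; -b (--write-bitmap-index): write a reachability
# bitmap so clones can largely reuse the on-disk pack instead of
# recomputing deltas per request.
git -C "$repo" repack -a -d -b

ls "$repo"/objects/pack/
```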