On Fri, 24 Feb 2006, Nicolas Pitre wrote:

> If blocks are hashed evenly, the cost of producing a delta is at most
> O(n+m), where n and m are the sizes of the reference and target files
> respectively. In other words, with a good data set the cost is linear.
>
> But if many blocks from the reference buffer hash to the same bucket,
> then each block in the target file has to be tested against many blocks
> from the reference buffer, making it tend towards O(n^m), which is
> pretty highly exponential.

Well, actually this is rather O(n*m), not O(n^m), but bad nevertheless.

Nicolas
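A minimal sketch of the effect being discussed, using a toy fixed-block hash index rather than git's actual rolling-hash delta code (the block size and counting helper below are illustrative assumptions): reference blocks are hashed into buckets, and for every target block all reference blocks in the matching bucket must be examined, so skewed buckets push the cost from O(n+m) towards O(n*m).

```python
# Toy illustration (NOT git's xdelta implementation): count how many
# candidate comparisons a bucket-based matcher performs.
from collections import defaultdict

BLOCK = 16  # hypothetical fixed block size

def index_reference(ref: bytes) -> dict:
    """Hash every BLOCK-sized chunk of the reference into buckets."""
    buckets = defaultdict(list)
    for i in range(0, len(ref) - BLOCK + 1, BLOCK):
        buckets[hash(ref[i:i + BLOCK])].append(i)
    return buckets

def count_comparisons(ref: bytes, target: bytes) -> int:
    """Count reference blocks tested while matching the target."""
    buckets = index_reference(ref)
    comparisons = 0
    for j in range(0, len(target) - BLOCK + 1, BLOCK):
        # Every reference block in this bucket is a candidate to test.
        comparisons += len(buckets.get(hash(target[j:j + BLOCK]), []))
    return comparisons

# Even case: 1024 distinct blocks, so each bucket is tiny and the
# total work is roughly one comparison per target block.
data = b"".join(i.to_bytes(BLOCK, "big") for i in range(1024))
even = count_comparisons(data, data)

# Skewed case: all 1024 blocks are identical, so one bucket holds
# everything and each target block scans the whole reference:
# (n/BLOCK) * (m/BLOCK) comparisons, i.e. O(n*m).
zeros = b"\0" * (1024 * BLOCK)
skewed = count_comparisons(zeros, zeros)

print(even, skewed)
```

With both inputs 16 KiB, the skewed case performs 1024 * 1024 comparisons versus roughly 1024 for the even case, which is the quadratic (not exponential) blow-up the correction above points out.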