Excerpts from Cédric Girard's message of 2011-08-19 18:06:53 +0200:
> On Fri, Aug 19, 2011 at 5:43 PM, Thomas Dziedzic <gostrc@xxxxxxxxx> wrote:
> >
> > I agree that your arguments have a valid point of view all the way up
> > to this point, where you lost me.
> > For me, "lack of quality" is in the same category as "lack of quality
> > impacts speed".
> > For example, take the same badly written algorithm, compiled once with
> > no optimization and once with -O999 ZOMG!!
> > It doesn't matter to me if one ruins your system faster; it will still
> > do the same thing.
> > This is why I think treating "lack of quality impacts speed" as
> > completely different from "lack of quality" is invalid.
>
> I will try to explain my point with an example. Take a bash script which
> needs to find a string in a file.
> Let's do it the ugly way:
>
>     echo $(cat $file) | grep -q "%PROVIDES%.*$1"
>
> Let's do it the correct way:
>
>     grep -q "%PROVIDES%.*$1" "$file"
>
> If both take the same resources to execute, you may say: OK, the first
> one is ugly, but I don't really care, because both give the same result
> and there is no performance impact.
> Now, if the first one turns out to be much slower than the second, the
> situation is different, because it not only impacts the developer
> (complex code that is hard to understand and maintain) but also the end
> user (who has to wait longer than necessary).
>
> This was a real example, taken from yaourt as it stood in January 2010.
> It was not ugly in a way that would fail or break your system; it was
> just ugly and slow code.

Well, it was also buggy: it couldn't handle some files, such as packages
with '+' in their names (example: lv2-c++-tools). I'm not sure whether
this has been fixed in the meantime.

I have been using slurpy for a long time, and it works with every package
(it only does search, download, and upload; no building included).
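
For anyone curious about the size of the gap, here is a minimal timing
sketch of the two forms quoted above. The test file, its contents, and the
iteration count are arbitrary choices for illustration, not taken from
yaourt:

    # Generate a throwaway test file of %PROVIDES% lines.
    file=./provides-test.txt
    seq 1 50000 | sed 's/^/%PROVIDES% pkg/' > "$file"

    # Ugly form: a subshell runs cat, the whole file is word-split and
    # re-joined into a single line, and only then handed to grep.
    time for i in $(seq 1 50); do
        echo $(cat $file) | grep -q "%PROVIDES%.*pkg42"
    done

    # Correct form: grep reads the file directly and stops at the first match.
    time for i in $(seq 1 50); do
        grep -q "%PROVIDES%.*pkg42" "$file"
    done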
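
On the '+' issue: I don't know exactly how yaourt broke on those names, but
one common cause of that kind of bug is interpolating a package name
straight into a grep pattern, where '+' can be taken as regex syntax
(depending on the regex flavor in use). A sketch of two possible fixes;
$file and $pkg are illustrative names here, not yaourt's actual variables:

    file=./desc           # illustrative path
    pkg='lv2-c++-tools'

    # Escape ERE metacharacters so the name is matched literally.
    escaped=$(printf '%s' "$pkg" | sed 's/[][\.*^$+?(){}|]/\\&/g')
    grep -Eq "%PROVIDES%.*$escaped" "$file"

    # Or sidestep regex entirely: filter for the key line, then do a
    # fixed-string match on the name ('--' guards against leading dashes).
    grep '%PROVIDES%' "$file" | grep -qF -- "$pkg"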