Hi,

On Mon, 2003-02-24 at 20:39, Cecchi, Gianluca wrote:
> find /foofs -user foouser -maxdepth 3 -xdev -type f -exec rm {} \;
>
> In this way you execute the rm commands one by one and don't
> encounter the "arg list too long" message.
> If instead you use a command like
>
> find path expression | xargs rm
>
> you encounter the same limit problem.

No, on Linux at least, xargs will not show a limit problem --- it will
just submit the command with a valid-length partial list of arguments
as many times as necessary to deal with each file. For instance:

$ seq 10000 | xargs echo hello | grep -c hello
10

shows that xargs took the 10000 arguments and ran the command "echo
hello <arglist>" 10 times before exhausting the arguments.

Piping to xargs does pose a few traps if you're using "find" on
filenames which contain odd characters like newlines and spaces, but
there's an easy way around that --- "find -print0" will print the
filenames null-terminated rather than separated by newlines, and
"xargs -0" will accept null-terminated input, making the filename
delimiting unambiguous. So

$ find <path> <find-args> -print0 | xargs -0 <command>

is my preferred way of iterating a command over multiple files.

Cheers,
 Stephen

_______________________________________________
Ext3-users@redhat.com
https://listman.redhat.com/mailman/listinfo/ext3-users
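
To make the whitespace trap concrete, here is a small illustrative
session (a sketch, not from the original mail; the scratch directory,
the file names, and the exact rm error wording are assumptions and may
vary by system):

$ cd "$(mktemp -d)"                      # throwaway directory for the demo
$ touch plain 'name with spaces'
$ find . -type f | xargs rm              # xargs splits on whitespace...
rm: cannot remove './name': No such file or directory
rm: cannot remove 'with': No such file or directory
rm: cannot remove 'spaces': No such file or directory
$ find . -type f -print0 | xargs -0 rm   # ...null delimiting keeps each name whole
$ ls                                     # both files are gone now

The first pipeline removes ./plain but breaks "./name with spaces"
into three bogus arguments; the -print0/-0 pair passes the name
through intact.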