Large number of files in single directory

On most flavors of Linux I've worked with, there seems to be a limit on how many files a single directory can hold before tools like tar, gzip, rm, mv, cp and others stop working properly. For example, some of my users have 2000+ files in a single directory (some as many as 10,000), and trying to tar those directories always fails with "Argument list too long."

Is there a way for tar and these other tools to "see" all these files and process them normally? I recall once having to resort to something like "find . -print | xargs rm -fr" to remove thousands of files from a single directory. Is replacing "rm" with "tar" in a similar pipeline the only way to make this work, or does tar have some command-line switch (I couldn't find one) for working with extremely long argument lists?
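For what it's worth, the error usually comes from the shell, not the filesystem: a glob like * expands to one huge argument list and exceeds the kernel's ARG_MAX limit on exec. Piping names instead of passing them as arguments avoids the limit entirely. A sketch of both patterns (the directory and output paths here are made up for the demonstration):

```shell
# Make a scratch directory with a couple thousand files to demonstrate.
dir=$(mktemp -d)
for i in $(seq 1 2000); do touch "$dir/file$i"; done

# Archiving: the shell never expands a glob, so ARG_MAX is irrelevant.
# GNU tar reads the file list from stdin with -T -; --null pairs with
# find's -print0 so names containing spaces or newlines survive intact.
out="$dir.tar.gz"
find "$dir" -type f -print0 | tar --null -T - -czf "$out"

# Deleting: xargs batches the names into argument lists under the limit.
find "$dir" -type f -print0 | xargs -0 rm -f
```

The -print0/-0/--null variants are safer than the plain "find . -print | xargs rm -fr" form, which breaks on filenames containing whitespace.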

Chris


--
redhat-list mailing list
unsubscribe mailto:redhat-list-request@xxxxxxxxxx?subject=unsubscribe
https://www.redhat.com/mailman/listinfo/redhat-list
