Currently, we use "ls ... | sort -R | head -n1" (or tail) to choose a random file in a directory.It sorts the files with "ls", sort it randomly and pick the first line, which wastes the "ls" sort. Also, using "sort -R | head -n1" is inefficient. For example, in a directory with 1000000 files, it takes more than 15 seconds to pick a file. $ time bash -c "ls -U | sort -R | head -n 1 >/dev/null" bash -c "ls -U | sort -R | head -n 1 >/dev/null" 15.38s user 0.14s system 99% cpu 15.536 total $ time bash -c "ls -U | shuf -n 1 >/dev/null" bash -c "ls -U | shuf -n 1 >/dev/null" 0.30s user 0.12s system 138% cpu 0.306 total So, we should just use "ls -U" and "shuf -n 1" to choose a random file. Introduce _random_file() helper to do it properly. Signed-off-by: Naohiro Aota <naohiro.aota@xxxxxxx> --- common/rc | 7 +++++++ 1 file changed, 7 insertions(+) diff --git a/common/rc b/common/rc index 5c4429ed0425..4d414955f6d9 100644 --- a/common/rc +++ b/common/rc @@ -5224,6 +5224,13 @@ _soak_loop_running() { return 0 } +# Return a random file in a directory. A directory is *not* followed +# recursively. +_random_file() { + local basedir=$1 + echo "$basedir/$(ls -U $basedir | shuf -n 1)" +} + init_rc ################################################################################ -- 2.41.0
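
For illustration, a test could call the new helper as in this minimal
sketch (the $TEST_DIR/files path and the $victim variable name are
hypothetical, not part of this patch):

	# Pick one path at random from a non-recursive listing of the
	# directory; _random_file prefixes the result with the basedir,
	# so $victim is a usable full path.
	victim=$(_random_file "$TEST_DIR/files")
	rm -f "$victim"

Note that the helper assumes a non-empty directory: with no entries,
"shuf -n 1" prints nothing and the helper would echo just "$basedir/".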