Hi,

A look through https://github.com/axboe/fio/blob/master/configure
suggests the following for things related to HDFS:

* Have you made sure the environment variables JAVA_HOME,
  FIO_LIBHDFS_INCLUDE and FIO_LIBHDFS_LIB are set correctly before
  running configure?
* Have you added the --enable-libhdfs parameter to ./configure?

(A rough example of the full sequence is in the P.S. at the bottom of
this message.)

On 2 February 2017 at 22:22, ankit patel <ankitpatel.edu@xxxxxxxxx> wrote:
> Hello All
>
> I am trying to install fio on my Hadoop cluster, aiming to measure
> storage bandwidth and latency through HDFS:
> https://github.com/axboe/fio/issues/301
>
> During the ./configure step I see
> "HDFS engine no"
>
> followed by an error when I try to use libhdfs with fio:
>
> ./fio --name=random-writers --ioengine=libhdfs --iodepth=4 \
>       --rw=randwrite --bs=32k --direct=0 --size=64m --numjobs=4
> fio: engine libhdfs not loadable
> fio: failed to load engine libhdfs
> fio: file:ioengines.c:91, func=dlopen, error=libhdfs: cannot open
> shared object file: No such file or directory
>
> Can you help me with how I can enable fio via HDFS?

--
Sitsofe | http://sucs.org/~sits/
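
P.S. In case a concrete example helps, here is a rough sketch of the
whole sequence. The paths below are only placeholders, not values I
know about your cluster; point them at wherever your JDK and the
Hadoop native libhdfs headers and libraries actually live:

    # Placeholder locations, adjust to your own JDK and Hadoop install.
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
    export FIO_LIBHDFS_INCLUDE=/usr/local/hadoop/include
    export FIO_LIBHDFS_LIB=/usr/local/hadoop/lib/native

    # --enable-libhdfs switches the HDFS engine on; check that the
    # configure summary now reports the HDFS engine as enabled
    # (rather than "no") before building.
    ./configure --enable-libhdfs
    make

Once configure reports the HDFS engine as enabled and the build
succeeds, the resulting fio binary should accept --ioengine=libhdfs
directly instead of trying to dlopen an external "libhdfs" engine,
which is what produced the error you quoted.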