I have most of my targets exported to Windows desktops at the moment. My first goal was to centralize big games onto a central file server with an SSD cache (check). A few desktops run Emulex cards, all managed through OneCommand Manager. On the server side, most volumes are snapshot+clone images of a single shared master, so updates take up space but the original install is shared (right now circa 1GB per machine, with 250GB shared). This works well, but I've run into some strange problems on the management side, and I'd like to know where the fault lies: LIO in general, my LIO config, OneCommand Manager from Emulex, or possibly the Windows Emulex drivers. If it's a client bug with no easy server-side workaround I'll contact Emulex, but I'm hopeful someone here knows exactly what's wrong and can tell me how to fix it.

The problem is twofold:

1. Any LUN exported with a 4K block size shows "n/a" in every field of OCM's LUN Information tab. It's as if OCM acknowledges that the LUN exists but knows nothing about it. Beyond that, it functions fine at the OS level: I can read and write data to the drive.

2. If LUN 0 is a 4K drive, OCM shows no information about any LUN at all, and goes so far as to report "No LUNs are available" while I happily run Diablo 3 from the SAN. I assume this second facet is an OCM bug rather than a LIO or config bug, but I'm hoping that fixing the first will resolve it as well.

Finally, worth noting: the 4K volumes deliver more than 4x the random IOPS and sequential read throughput of the 512-byte ones, in large part because of the backing ZFS filesystem and its copy-on-write behavior on a wide raidz pool. An even larger block size would perform better still, but alas, it would introduce other issues on the Windows side.

Thank you for any and all assistance. I'll probably have one more request for help after this one is resolved. I wish there were something like an issue tracker to make this easier.

Below is an inline copy of my config file, with some superfluous snippets removed. All attributes and parameters are preserved so that nothing important is omitted.
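But first, since the shared-master layout is relevant context, here's a rough sketch of how each machine's volume is produced. This is illustrative only (the pool and dataset names are made up; mine differ), but it's plain ZFS snapshot+clone:

  # one dataset holds the shared master image (the ~250GB install)
  zfs snapshot tank/vdisk/master@golden
  # each machine gets a clone of that snapshot; the clone shares all of
  # its blocks with the master and only stores what diverges from it
  # (hence the ~1GB of unique data per machine so far)
  zfs clone tank/vdisk/master@golden tank/vdisk/izanami

And the config itself: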
storage fileio {
    disk common.empty {
        buffered no
        path /vdisk/common/img
        size 1.0GB
        wwn 31b83880-1ea3-40d6-a72e-3547361a6e15
        attribute {
            block_size 512
            emulate_3pc yes
            emulate_caw yes
            emulate_dpo yes
            emulate_fua_read yes
            emulate_fua_write yes
            emulate_model_alias yes
            emulate_rest_reord no
            emulate_tas yes
            emulate_tpu no
            emulate_tpws no
            emulate_ua_intlck_ctrl no
            emulate_write_cache no
            enforce_pr_isids yes
            fabric_max_sectors 8192
            is_nonrot yes
            max_unmap_block_desc_count 1
            max_unmap_lba_count 8192
            max_write_same_len 4096
            optimal_sectors 16384
            queue_depth 128
            unmap_granularity 1
            unmap_granularity_alignment 0
        }
    }
    disk izanami.games {
        buffered no
        path /vdisk/izanami/games/img
        size 256.0GB
        wwn 4aba9819-afa7-4746-b03b-d6456d4fb802
        attribute {
            block_size 4096
            emulate_3pc yes
            emulate_caw yes
            emulate_dpo yes
            emulate_fua_read yes
            emulate_fua_write yes
            emulate_model_alias yes
            emulate_rest_reord no
            emulate_tas yes
            emulate_tpu no
            emulate_tpws no
            emulate_ua_intlck_ctrl no
            emulate_write_cache no
            enforce_pr_isids yes
            fabric_max_sectors 8192
            is_nonrot yes
            max_unmap_block_desc_count 1
            max_unmap_lba_count 8192
            max_write_same_len 4096
            optimal_sectors 2048
            queue_depth 128
            unmap_granularity 1
            unmap_granularity_alignment 0
        }
    }
}

fabric qla2xxx {
    target 21:00:00:1b:32:9d:12:dd {
        enable 1
        attribute {
            authentication no
            cache_dynamic_acls yes
            default_cmdsn_depth 16
            default_erl 0
            demo_mode_discovery yes
            demo_mode_login_only 1
            demo_mode_write_protect yes
            generate_node_acls yes
            login_timeout 15
            netif_timeout 2
            prod_mode_write_protect no
        }
        auth {
            password ""
            password_mutual ""
            userid ""
            userid_mutual ""
        }
        parameter {
            AuthMethod CHAP
            DataDigest "CRC32C,None"
            DataPDUInOrder yes
            DataSequenceInOrder yes
            DefaultTime2Retain 20
            DefaultTime2Wait 2
            ErrorRecoveryLevel no
            FirstBurstLength 65536
            HeaderDigest "CRC32C,None"
            IFMarkInt "2048~65535"
            IFMarker no
            ImmediateData yes
            InitialR2T yes
            MaxBurstLength 262144
            MaxConnections 1
            MaxOutstandingR2T 1
            MaxRecvDataSegmentLength 8192
            MaxXmitDataSegmentLength 262144
            OFMarkInt "2048~65535"
            OFMarker no
            TargetAlias "LIO Target for Izanami"
        }
        lun 0 backend fileio:common.empty
        lun 1 backend fileio:izanami.games
        acl 10:00:00:00:c9:c0:c9:f4 {
            attribute {
                dataout_timeout 3
                dataout_timeout_retries 5
                default_erl 0
                nopin_response_timeout 30
                nopin_timeout 15
                random_datain_pdu_offsets no
                random_datain_seq_offsets no
                random_r2t_offsets no
            }
            auth {
                password ""
                password_mutual ""
                userid ""
                userid_mutual ""
            }
            mapped_lun 0 {
                target_lun 0
                write_protect yes
            }
            mapped_lun 1 {
                target_lun 1
                write_protect no
            }
        }
        acl 10:00:00:00:c9:c0:c9:f5 {
            attribute {
                dataout_timeout 3
                dataout_timeout_retries 5
                default_erl 0
                nopin_response_timeout 30
                nopin_timeout 15
                random_datain_pdu_offsets no
                random_datain_seq_offsets no
                random_r2t_offsets no
            }
            auth {
                password ""
                password_mutual ""
                userid ""
                userid_mutual ""
            }
            mapped_lun 0 {
                target_lun 0
                write_protect yes
            }
            mapped_lun 1 {
                target_lun 1
                write_protect no
            }
        }
    }
}
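P.S. For anyone who wants to compare notes: a minimal way to rule out the target side is to check what the 4K LUN actually reports over SCSI from a spare Linux initiator that can see the target. This assumes sg3_utils, with /dev/sdX standing in for whatever the LUN enumerates as:

  # READ CAPACITY(16) should report "Logical block length=4096 bytes"
  sg_readcap -16 /dev/sdX
  # standard INQUIRY data, to compare against what OCM displays
  sg_inq /dev/sdX
  # Block Limits VPD page, in case OCM's LUN Information tab reads it
  sg_vpd -p bl /dev/sdX

If those all come back sane for the 4K LUN, that would point at OCM or the Windows Emulex driver rather than LIO.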