ESXi 5.5 iSCSI to FreeNAS - no devices found

Status
Not open for further replies.

cmdematos

Dabbler
Joined
Dec 24, 2014
Messages
12
I have ESXi 5.5 and have installed both FreeNAS 9.3 (latest current) and 9.2.1.9.

I have followed instructions from blogs showing how to get 9.2.1.9 to serve iSCSI to ESXi, but try as I might, I cannot get ESXi to see any devices or paths. The initiator connects to FreeNAS successfully, I have turned authentication off on both sides, and I have LUC turned on for 9.2.x (I don't see any such option on 9.3), but nothing is ever discovered.

I have pored over the properties but cannot discover what the issue is. I assume FreeNAS plays nicely with ESXi 5.5, and I have burned over 8 hours on this. I don't know where else to look - please help in any way you can.
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
Haven't tried 9.3.

Generally speaking, though, try these steps:

1) Look in the VMware logs for any clues

2) Look in the FreeNAS logs for any clues

3) Carefully walk through every configuration screen and make sure you haven't missed something that will have you saying "well, that was dumb" (it's happened to us all; iSCSI configuration is always too complex)

4) Try validating the iSCSI configuration by using something besides ESXi to verify your FreeNAS iSCSI disk works
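For step 4, a Linux box with open-iscsi makes a quick second initiator. A minimal sketch - the portal IP and target IQN below are placeholders for your own values:

```shell
# Discover targets advertised by the FreeNAS portal (placeholder IP)
iscsiadm -m discovery -t sendtargets -p 192.168.1.50:3260

# Log in to one of the discovered targets (placeholder IQN)
iscsiadm -m node -T iqn.2011-03.example.istgt:target0 \
    -p 192.168.1.50:3260 --login

# A new block device (e.g. /dev/sdb) should now appear
dmesg | tail
lsblk
```

If discovery returns nothing here either, the problem is on the FreeNAS side (portal bind address, initiator ACLs, target/extent association); if this works, you can focus on the ESXi configuration.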
 

42andrising

Cadet
Joined
Mar 19, 2013
Messages
1
Also check your VMware networking setup and the FreeNAS networking setup (VLANs, any link aggregation, IPs, etc.). Make sure the vmkernel adapters are using the NICs you'd expect and have the correct settings.

You've probably solved this by now.
 

depasseg

FreeNAS Replicant
Joined
Sep 16, 2014
Messages
2,874
I have ESXi 5.5 and FreeNAS 9.3 working. Can you post screenshots of your iSCSI share setup tabs (at least Portals, Targets, Extents, and Associated Targets)?
 
Joined
Aug 25, 2014
Messages
89
I am new to FreeNAS but not new to ESXi, and I am having a hard time getting an iSCSI share recognized by an ESXi 5.5 host. (I have a brand new SuperMicro 4U JBOD for FreeNAS testing and eventual use.) Since the ESXi host won't recognize the iSCSI LUN, I can't format it or use it.

I am still in test mode and my iSCSI share has two SSDs (one for cache & one for logs) and three 1TB SAS drives and I have 128GB ECC memory installed. No RAID cards.

I can see, mount and format the iSCSI share from Win-7 Enterprise Edition, Server 2008R2 and my trusty MacBook Pro (MacOS 10.10.2) so I know the shared pool is there.

In the ESXi 5.5 console I can see the share inside Configuration/Storage Adapters under the iSCSI Software Adapter.

I can see my iSCSI disk inside Configuration/Storage when I click Add Storage, but when I click on the FreeBSD iSCSI disk name and then click Next, I get the same failure whether I choose VMFS-5 or VMFS-3: "The hard disk is blank". A popup error window also appears: Call "HostDatastoreSystem.QueryVmfsDatastoreCreateOptions" for object "ha-datastoresystem" on ESXi "IP Address" failed.

I have tried not partitioning or formatting and get nowhere, and when I do partition and format, I get an ESXi failure.

I have tried partitioning the share with a GUID partition table from the MacBook Pro and giving it an MS-DOS (FAT) format, but that won't let the ESXi host see, re-partition, or format the share either.
May I ask what people are doing to have your ESXi hosts see your iSCSI share(s)?

By the way, the Server 2008R2 I mentioned that can see and use the iSCSI share is running on the ESXi host.

Thanks in advance for any help.
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
There is likely to be additional information available in the logs on the ESXi host. ESXi can get a bug up its butt for numerous reasons, and there are usually remediations available if you can identify the specific error, so search Google for it. Oftentimes the error has nothing to do with iSCSI in particular and may require some bit twiddling, CLI work, etc.
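On ESXi 5.x, the iSCSI and datastore-creation errors usually land in vmkernel.log; a quick way to pull them out over SSH (log paths are standard for this release):

```shell
# On the ESXi host (SSH / ESXi Shell enabled)
grep -i iscsi /var/log/vmkernel.log | tail -n 50

# Storage/datastore-level observations land here
tail -n 50 /var/log/vobd.log

# Rescan all storage adapters after any config change
esxcli storage core adapter rescan --all
```

Searching Google for the exact line from vmkernel.log that appears at the moment the "Add Storage" wizard fails is usually far more productive than searching for the generic popup text.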
 
Joined
Aug 25, 2014
Messages
89
Thanks, I will check the ESXi logs but I am suspecting FreeNAS as I just brought in one of my bigger test servers (IBM System X3650 M2) and I got the exact same Blank Disk message.
 

Tancients

Dabbler
Joined
May 25, 2015
Messages
23
Thanks, I will check the ESXi logs but I am suspecting FreeNAS as I just brought in one of my bigger test servers (IBM System X3650 M2) and I got the exact same Blank Disk message.
You've probably since moved on as well, but since I came across this thread in my search, I figured I'd chime in with my current solution, and I have a follow-up question to see if jgreco or someone else can say whether this is an alignment issue or not.

Anyway, the solution I found that worked is mentioned here: https://communities.vmware.com/message/2313021#2313021

I was originally running the iSCSI zvol at a 4096-byte logical block size, which seemed to cause ESXi to get all sorts of hung up. Once I bumped it down to 512, everything flew along fine. Which makes me wonder: should I go into the command line and align my zvol correctly so I can bump the logical block size back up, or is that not something I should be too concerned about?
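One way to confirm what logical block size the extent actually reports is to ask a Linux initiator that has logged in to the target (the device name below is a placeholder):

```shell
# Logical sector size the iSCSI disk reports (512 vs 4096 bytes)
blockdev --getss /dev/sdb

# Physical sector size, for comparison
blockdev --getpbsz /dev/sdb

# Same information via sysfs
cat /sys/block/sdb/queue/logical_block_size
```

This matches the behavior described above: ESXi of this era expects devices with 512-byte logical sectors, so an extent exposing 4096-byte logical blocks is a plausible reason for VMFS creation to fail.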
 
Joined
Aug 25, 2014
Messages
89
I found that the answer is ESXi 5.5 works well with 512-byte blocks as defined in the extent. In fact, 512 bytes is the standard logical block size for ESXi.

For speed, when I am setting up a new volume in FreeNAS, I choose 64K blocks, where 16K is the default.
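For anyone doing this from the shell instead of the GUI: the GUI's block-size choice corresponds to the ZFS volblocksize property, which is fixed at creation time. A sketch with placeholder pool/zvol names and sizes (the GUI default mentioned above may vary by FreeNAS version):

```shell
# Create a sparse 500G zvol with 64K volume blocks for the iSCSI extent
zfs create -s -V 500G -o volblocksize=64K tank/esxi-lun0

# Verify -- volblocksize cannot be changed after creation
zfs get volblocksize tank/esxi-lun0
```

Note that volblocksize (how ZFS stores the volume) is a separate knob from the 512-byte logical block size the iSCSI extent presents to the initiator.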

I did a ton of testing on different RAID types and finally settled on striped mirrors (mirror & stripe) as close to the fastest, with very high redundancy.

I did most of my testing on ten 4TB SAS drives, but now I am running twenty 4TB SAS drives. I did discover I could not create a twenty-drive volume in one pass. I had to create a ten-drive volume, complete with a big mirrored pair of very fast SSDs for logs and a giant set of very fast SSDs for cache. Then I went back, added ten more mirrored & striped HDDs to my volume, and all is good.

I was looking at the 4x cache-to-RAM rule, then I saw the 5x rule, so I went ahead and used a go-big 8x rule, and so far it is the fastest SAN I have ever had in my possession.

PS: I have 128GB ECC RAM, so my go-big cache is four striped 240GB (eMLC) SSDs.
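The rule-of-thumb arithmetic above is simple enough to check; the multipliers are the ones mentioned in this thread, not official sizing guidance:

```shell
# L2ARC sizing rules of thumb: cache capacity = multiplier x RAM
ram_gb=128
for m in 4 5 8; do
  echo "${m}x rule -> $((ram_gb * m)) GB of L2ARC"
done

# Four striped 240 GB SSDs, as in this build:
echo "$((4 * 240)) GB actually installed"   # 960 GB, just under the 8x figure
```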

I would like to give a BIG thank-you to everyone who has helped with my FreeNAS learning curve!

Now in a month or two I will start the process of upgrading our Openfiler SANs to FreeNAS SANs.
 

Tancients

Dabbler
Joined
May 25, 2015
Messages
23
I'll have to check and see what block size I set the zvols to. I've been following the "if it can be done in the GUI, it should be done in the GUI" way of thinking, so it's still taking me a bit longer than if I were just throwing around ZFS commands.

Mine is currently a 12-drive striped mirror (RAID 10), but using 7200rpm drives instead of some zippy SAS. I'm hitting close to the transfer limit of gigabit from what I tested last night on the ESXi host, so it'll be interesting to see how saturated it gets when I throw in an L2ARC. Only 72GB ECC here, so my system is a child compared to yours!
 