Hi guys,
We're running an HP C7000 Blade system with BL460c Blade servers as our ESXi cluster and FreeNAS server.
The problem we're having is that, apart from the initial 2 ESXi servers we connected to the FreeNAS server, every other ESXi host that tries to connect to the FreeNAS times out. Nothing in the FreeNAS logs points to a cause; they just show the hosts connecting.
Our FreeNAS Spec:
HP BL460c
2x Quad Core X5345
16GB RAM
HP P700m RAID Controller
Dual Broadcom NIC (Onboard) - Used for iSCSI Connections
Dual Intel 373i Mezzanine Cards - Used for management
OS HDD: HW RAID1 2x 300GB 10K SAS drives
Storage HDD: HW RAID5 25x 300GB 10K SAS drives
Because we're using hardware RAID, we've configured the system to use UFS storage rather than ZFS. Extents are configured as files.
Each ESXi Host Spec:
HP BL460c
2x Quad Core X5450
20GB RAM
Dual Broadcom NIC (Onboard) - Used for iSCSI
Dual Intel 373i Mezzanine Card - Used for Management/Traffic
Each ESXi host is configured with the Software iSCSI HBA and has multipathing enabled, and the paths from each host to storage are set to round-robin.
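For reference, here's how we check and set the round-robin policy from the ESXi CLI. This is a sketch run on each host; the `naa.` device ID below is a placeholder, not our actual LUN ID (grab the real one from the device list first):

```shell
# List all devices claimed by NMP and show their current Path Selection Policy
esxcli storage nmp device list

# Set round-robin (VMW_PSP_RR) on one iSCSI device
# (the naa ID here is a placeholder - substitute the real device ID)
esxcli storage nmp device set --device naa.XXXXXXXXXXXXXXXX --psp VMW_PSP_RR
```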
Switches used are Cisco Catalyst CSW3020 blade switches, so there is NO cabling between blades; everything runs over the enclosure backplane.
We have 4 ESXi hosts, but only 1 is powered on and running most of our VMs. The others are powered off while we test whether we're overloading the NICs, but graphing of the NICs shows only minimal/average usage.
Each ESXi host can scan for and see the storage, but other than the first two, the hosts are VERY slow to do so.
We've tried modifying the Queue Length to 64 as someone suggested, but it just seems really strange that, other than the first 2 hosts that were connected to storage, the others cannot connect to the storage at all.
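In case it's relevant, this is where we made the queue change. A sketch of the relevant fragment of istgt.conf on the FreeNAS box (assuming FreeNAS is using the istgt target; the LUN section name and other values here are placeholders from our config, not defaults anyone should copy):

```
# /usr/local/etc/istgt/istgt.conf (fragment)
[LogicalUnit1]
  # Per-LUN command queue depth; we raised this from the default to 64
  QueueDepth 64
```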
New to FreeNAS, so any assistance would be GREATLY appreciated.