Weird VMware ESXi Performance Issue


DTakeshi

Cadet
Joined
Nov 18, 2013
Messages
3
Hi guys,

We're running an HP C7000 Blade system with BL460c Blade servers as our ESXi cluster and FreeNAS server.

The problem we're having: apart from the initial 2 ESXi hosts we connected to the FreeNAS server, every other ESXi host that tries to connect to the FreeNAS is timing out. Nothing in the FreeNAS logs points to a cause; they just show the hosts connecting.

Our FreeNAS Spec:
HP BL460c
2x Quad Core X5345
16GB RAM
HP P700m RAID Controller
Dual Broadcom NIC (Onboard) - Used for iSCSI Connections
Dual Intel 373i Mezzanine Cards - Used for management
OS HDD: HW RAID1 2x 300GB 10K SAS drives
Storage HDD: HW RAID5 25x 300GB 10K SAS drives

Because we're using hardware RAID, we've configured the system to use UFS storage rather than ZFS. Extents are configured as files.

Each ESXi Host Spec:
HP BL460c
2x Quad Core X5450
20GB RAM
Dual Broadcom NIC (Onboard) - Used for iSCSI
Dual Intel 373i Mezzanine Card - Used for Management/Traffic

Each ESXi host is configured with the Software iSCSI HBA and has multipathing enabled; each host's paths to storage are set to round-robin (commands we used are below).
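For reference, on ESXi 5.x the path selection policy can be checked and set from the host shell with esxcli; we did it roughly like this (the naa identifier below is a placeholder for the actual LUN):

    # List devices and their current path selection policy
    esxcli storage nmp device list

    # Set round-robin on a specific device (identifier is a placeholder)
    esxcli storage nmp device set --device naa.xxxxxxxxxxxxxxxx --psp VMW_PSP_RR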

Switches are Cisco Catalyst CSW3020 blade switches, so there is NO cabling between blades; everything runs over the enclosure backplane.

We have 4 ESXi hosts, but only 1 is powered on, running most of our VMs; the others are powered off while we test whether we're overloading the NICs. NIC graphing shows only minimal/average usage, though.

Each ESXi host can scan for and see the storage, but apart from the first two, the others are VERY slow.

We've tried changing the queue depth to 64, as someone suggested (see below), but it still seems really odd that apart from the first 2 hosts connected to storage, the others cannot connect to the storage at all.
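In case it matters, on ESXi 5.x the software iSCSI per-LUN queue depth is a module parameter, so we set it along these lines and then rebooted the host:

    # Set the software iSCSI adapter's per-LUN queue depth
    esxcli system module parameters set -m iscsi_vmk -p iscsivmk_LunQDepth=64

    # Confirm the new value
    esxcli system module parameters list -m iscsi_vmk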

New to FreeNAS, so any assistance would be GREATLY appreciated.
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
Test with iperf for network problems. Using blades and the built-in switch does not guarantee your network is healthy. You can boot the other nodes to a FreeBSD live CD to test, I think...
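Something along these lines, with the FreeNAS box's iSCSI-network address as a placeholder:

    # On the FreeNAS box: start an iperf server
    iperf -s

    # On the node under test (booted from the live CD): 30-second run
    iperf -c 10.0.0.10 -t 30

Anything far below the roughly 900+ Mbit/s a healthy gigabit link tests at means trouble.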
 

DTakeshi

Cadet
Joined
Nov 18, 2013
Messages
3
Okay, so I've run iperf between the storage host and some other servers, and it consistently reports transfers of 160+ Mbit/s...
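For scale: 160 Mbit/s is only about 20 MB/s (160 / 8), whereas a healthy gigabit link usually tests somewhere around 900+ Mbit/s, so we're seeing maybe a sixth of line rate...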
 

cyberjock

Inactive Account
Joined
Mar 25, 2012
Messages
19,526
So you have a network topology issue.
 

DTakeshi

Cadet
Joined
Nov 18, 2013
Messages
3
The ESXi hosts and the FreeNAS box are all on the same VLAN, with no routing between hosts; all NICs used for iSCSI are on the same LAN with no other traffic on it.
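In case it's useful, here's roughly what we can check on the 3020s from IOS (the interface name is a placeholder for a blade-facing port):

    # Negotiated speed/duplex per port
    show interfaces status

    # Error counters on a blade-facing port
    show interfaces GigabitEthernet0/1 counters errors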
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
Okay, great, but if your network is only good for 160Mbit/sec on what ought to be 1000Mbit/sec, YOU NEED TO FIX THAT PROBLEM FIRST. FreeNAS cannot magically make your network go faster than it can go. You need to identify and eliminate the bottleneck!
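On the FreeNAS side, at least confirm the iSCSI NICs actually negotiated gigabit full-duplex and aren't taking errors; something like this (the interface name is a guess for the onboard Broadcoms):

    # Link state and negotiated media; expect 1000baseT <full-duplex>
    ifconfig bce0

    # Per-interface error counters (Ierrs/Oerrs columns)
    netstat -i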
 