
40Gb Mellanox card setup


SimonFN

Member
Joined
Feb 15, 2015
Messages
33
I've got two Mellanox 40Gb cards working with FreeNAS 10: one in the server, one in a Windows 10 PC. I'm getting between 400 MB/s and 700 MB/s transfer rates.

Hardware:
2 x MHQH19B-XTR Mellanox InfiniBand QSFP Single Port 40Gbps PCI-E, from eBay for $70.
1 x Mellanox MC2210130-001 Passive Copper Cable ETH 40GbE 40Gb/s QSFP 1m for $52.

I had to set both to Ethernet mode, as I couldn't get Infiniband working; FreeNAS 10 has a system Tunable for sys.device.mlx4_core0.mlx4_port0 = eth.
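For anyone doing this from the shell instead of the GUI, the same setting can be sketched with sysctl. The OID name below is the one from my tunable; it may differ between releases (some builds number the port _port1 instead):

```shell
# Sketch: check the current mlx4 port protocol, then switch it to Ethernet mode.
# OID name taken from the tunable above; it may vary by FreeNAS/FreeBSD release.
sysctl sys.device.mlx4_core0.mlx4_port0
sysctl sys.device.mlx4_core0.mlx4_port0=eth
```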



That's it! Thought it might be useful for anybody else investigating these cards.
 

brando56894

Dedicated Sage
Joined
Feb 15, 2014
Messages
1,506
3.5 Gbps? Not bad for $122! I just recently saw this on the LinusTechTips YouTube channel about how to get a ridiculously fast connection between two clients. Sucks you can't get InfiniBand to work, maybe one of the devs could help you.
 

SimonFN

Member
Joined
Feb 15, 2015
Messages
33
3.5 Gbps? Not bad for $122! I just recently saw this on the LinusTechTips YouTube channel about how to get a ridiculously fast connection between two clients. Sucks you can't get InfiniBand to work, maybe one of the devs could help you.
Yes, the LinusTechTips video was the reason for me trying it. I'm sure there's some more tweaking I can do with the settings to get more performance out of them. I got fed up with seeing "cable unplugged" in Windows, so gave up on the InfiniBand setup. I'll try again at some point.
 

brando56894

Dedicated Sage
Joined
Feb 15, 2014
Messages
1,506
Yes, the LinusTechTips video was the reason for me trying it.
Hahaha nice! I was tempted to try it as well, but then remembered that both my desktop and server only have one PCI-E slot each: the server's is occupied by my M1015 HBA and the desktop's by my GTX 1070. I've also contemplated getting a managed 16-port gigabit switch with 2x 5 Gbps links over Ethernet/Cat 6; they're about $250, I think. That way I can manage my network traffic more effectively and don't have to change my cabling or give up my PCI-E slots.
 

StoreMyBytes

Junior Member
Joined
Aug 1, 2017
Messages
13
Brando, what switch or switch models were you considering? That's a pretty reasonable price for 2x ports greater than gigabit.
 

yurelle

Junior Member
Joined
Mar 23, 2018
Messages
13
I got fed up of seeing "cable unplugged" in Windows
When I saw this I got both excited & sad at the same time. I've been struggling with this for a week, and your setup is exactly what I've been trying to do. But unfortunately, by the time I found this thread, your image had died. I'm using the exact cards you are [MHQH19B-XTR], with known-good cards & cable: I tested them in loopback with both cards in a single Windows box (with openSM running) and they worked fine, but trying to connect from Windows to FreeNAS bombs. Could you give some more information on where you set this:
FreeNAS 10 has a system Tunable for sys.device.mlx4_core0.mlx4_port0 = eth
I tried setting it in the WebGUI > System > Tunables, but when you add a Tunable there, it also asks for a Type: Loader, rc.conf, or Sysctl. I've tried all of them, and none seem to work.

Also, I noticed that your snippet ends in "_port0". Watching the console during boot, my system does mention "mlx4_core0", but then mentions "Starting Port 1" instead of zero (the cards only have 1 port each). Is this a translation thing, zero-based in the settings but 1-based when printing to the console? I also tried modifying your snippet to refer to "_port1", but no luck.

I've tried running openSM on the Windows box, and not running it (trying to let FreeNAS auto-magically run its own subnet manager), but neither seems to work. I can't even get the green "physical link" light to turn on to acknowledge that I've plugged something in.
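For reference, here's my rough understanding of what the three Tunable types do with a value like this (a sketch; I may be wrong about the details):

```shell
# Type "Loader":  appended to /boot/loader.conf and applied at boot, e.g.
#   sys.device.mlx4_core0.mlx4_port0="eth"
# Type "Sysctl":  applied to the running kernel, equivalent to running
#   sysctl sys.device.mlx4_core0.mlx4_port0=eth
# Type "rc.conf": written to rc.conf and read by the rc(8) startup scripts,
#   so it only affects settings that the rc scripts actually consume
```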

I also tried following this: https://wiki.freebsd.org/InfiniBand but apparently FreeNAS doesn't include those kernel modules.

I know this is an old thread, but any help you could give me would be greatly appreciated; I don't understand what I'm doing wrong. I'm also a FreeBSD noob, so I could also be making a stupid mistake.

thanks,
-Yurelle
 
Last edited:

SimonFN

Member
Joined
Feb 15, 2015
Messages
33
Hi,
Since 10 was abandoned, the tunable is no longer required with the current version (11).

The card was detected fine, all that I had to do was set its static IP address in FreeNAS.



Then in Windows 10, set the other card to a static IP (10.0.0.6 in my example) and the subnet mask to 255.255.255.0. The Mellanox drivers in Windows are version 5.35.12970.0.
Both cards are set to MTU of 1500.
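If it helps, both ends can also be set from a shell. A sketch, assuming the FreeNAS interface shows up as mlxen0 and the Windows adapter is named "Ethernet 2" (both names are guesses, yours may differ), with 10.0.0.5 as a hypothetical address for the FreeNAS side:

```shell
# FreeNAS/FreeBSD side (hypothetical address; interface name assumed).
# Note this is not persistent across reboots -- use the GUI for that.
ifconfig mlxen0 inet 10.0.0.5 netmask 255.255.255.0 up

# Windows 10 side, from an elevated command prompt (adapter name assumed)
netsh interface ip set address name="Ethernet 2" static 10.0.0.6 255.255.255.0
```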

PM me if you need to know any other settings.
 

yurelle

Junior Member
Joined
Mar 23, 2018
Messages
13
Like this? I've tried restarting both boxes after these changes, and it still doesn't work.
And should I run OpenSM on the Windows box, or do nothing and let FreeNAS run its own? I'm running FreeNAS 11.1-U3.

-Yurelle

Edit: Shrunk down the images. I forgot I was on a 4K monitor. Sorry about that.







 
Last edited:

yurelle

Junior Member
Joined
Mar 23, 2018
Messages
13
Ok, so I just tested, and everything works with both cards in loopback on the FreeNAS box. I installed both cards in the FreeNAS box, cabled them together, and when it booted up, I got green "Physical Link" lights, and even yellow activity lights. So, apparently, the issue is not FreeNAS recognizing the Infiniband cards, nor failing to start up openSM. All that is apparently working.

I already tested the Windows box in loopback, and it worked too. So, that means that both cards work, the cable works, the Windows drivers are good, the FreeNAS drivers are good, and openSM is working. So, the problem has to be something wrong with Windows trying to talk to FreeBSD/FreeNAS. When I connect from FreeNAS to Windows, I can't even get the green "Physical Link" light. Everything acts as though it's unplugged: [Windows] "Network cable unplugged"; [FreeNAS] "No Carrier" / "Giving up" / "Status Changed to DOWN".

You mentioned that your Windows driver was version 5.35.12970.0; mine is 5.35.12978.0, virtually identical, so I would wager that that is not the problem.

On my FreeNAS box, pressing 9 on the console menu to go into the shell, running:
Code:
root@freenas:~ # dmesg | grep Mellanox

shows that the driver version is:
Code:
mlx4_core: Initializing mlx4_core: Mellanox ConnectX VPI driver v2.1.6 (Mar 14 2018)

Could you please check what version your FreeNAS box is using?

As for the firmware:
Code:
Windows 10 > Device Manager > Network Adapters Mellanox ConnectX-2 IPoIB Adapter > [right click] > properties
Mellanox Properties Dialog > Information Tab

This shows my firmware version as 2.9.1200. If I recall correctly, the firmware version on the other card is 2.9.1000, just slightly lower. I would say that that could be a problem, except for the fact that both loopback configs work, so that would seem to indicate that the firmware combination is fine.

P.S. As I said, I'm a FreeNAS noob, but from the console menu that FreeNAS boots into, pressing 9 to get to the shell, that's a normal root terminal, right? Or is there something special/restricted about it? Also, it doesn't say anywhere on those screens how to return from the shell back to the menu, but I noticed that typing the command "exit" works. Just want to make sure that that's the proper way, and I'm not accidentally doing a force close or something.

P.P.S. Forgive me if I'm being overly simplistic about explaining my steps, but (1) I want to make sure that I'm doing them correctly, and not making any stupid assumptions, and (2) I'm sure I won't be the last person to have this issue, and if someone else should find this post, I'm hoping they can use my troubleshooting as a guide, if we ultimately get this working.
 
Last edited:

c32767a

Senior Member
Joined
Dec 13, 2012
Messages
362
As for the firmware:
Code:
Windows 10 > Device Manager > Network Adapters Mellanox ConnectX-2 IPoIB Adapter > [right click] > properties
Mellanox Properties Dialog > Information Tab

This shows my firmware version as 2.9.1200. If I recall correctly, the firmware version on the other card is 2.9.1000, just slightly lower. I would say that that could be a problem, except for the fact that both loopback configs work, so that would seem to indicate that the firmware combination is fine.
This is your problem:

Mellanox ConnectX-2 IPoIB Adapter


Windows is trying to operate the Mellanox card in IP over Infiniband mode. You need it to run in ethernet mode. Unless something has changed recently, the driver in FreeNAS only supports the card when it operates in ethernet mode.

That's why both cards work in the same machine, but not when you try to use them between the two.

I don't know Mellanox on Windows, so I don't know how to make the driver settings necessary to run in ethernet mode, but I'm sure it's documented somewhere.
 
Last edited:

yurelle

Junior Member
Joined
Mar 23, 2018
Messages
13
Thank you, that makes sense. That does look like the problem. But the Mellanox Windows config utility used to switch them doesn't seem to support these old cards. Some docs reference a tab in the card's properties (Device Manager) where you can switch it, but mine doesn't show that tab.

I tried compiling the Mellanox IB driver into FreeNAS using these instructions (Section "2 Installation"): https://www.mellanox.com/related-docs/prod_software/Mellanox_FreeBSD_User_Manual_v2.1.5.pdf
but I'm not sure that will work in FreeNAS. The provided scripts reference a shell variable $HEAD, which doesn't seem to be defined in FreeNAS, and I'm too much of a noob to know what it's supposed to be. I tried googling it, but can't find anything that seems relevant.

Honestly, I'm not sure where to go from here. I tried running a Fedora live USB iso, and it ran these cards immediately & linked up with windows, with no problems. However, I also looked into getting Fedora to run natively on ZFS, and that looks like it'll be even more difficult than this.

Thanks for your help,
-Yurelle
 
Last edited:

c32767a

Senior Member
Joined
Dec 13, 2012
Messages
362
Honestly, I'm not sure where to go from here. I tried running a Fedora live USB iso, and it ran these cards immediately & linked up with windows, with no problems. However, I also looked into getting Fedora to run natively on ZFS, and that looks like it'll be even more difficult than this.

Thanks for your help,
-Yurelle
Hate to say it, but probably the best thing to do is list your cards on eBay and find some supported 40G cards..

Or just use plain FreeBSD and forego the FreeNAS part.. You can do everything by hand in FreeBSD that the FreeNAS middleware does.. but you have to do it by hand.. Or use another OS that supports your cards..
 

JustinClift

Senior Member
Joined
Apr 24, 2016
Messages
287
Hate to say it, but probably the best thing to do is list your cards on eBay and find some supported 40G cards..
That's probably premature. ;)

@yurelle, it looks like you have the FreeNAS side of things set up ok with the card. As @c32767a mentioned, the problem on the Windows side is that the card is not in native Ethernet mode.

As you also said, there's an option in the device's properties panel which will switch it to Ethernet mode. It's definitely there, even for the older ConnectX series one and two cards. I know that for sure, as I only have series 1 and 2 cards, and I've had it working before.

I don't have mine in a Windows box at the moment, but I can take some time over the next few days to juggle things about and get one into a Win 7 box to investigate the exact driver setting needed. Would that help?

---

On a side note, unless you're wanting a loooong learning experience and have a lot of time and patience, don't bother trying to get the Infiniband drivers compiled into FreeNAS. It can be made to work, but the benefits you're looking for (speed, everything "just working") just aren't there compared to getting the cards working in native Ethernet mode. :)
 

nikey22

Newbie
Joined
Mar 30, 2018
Messages
1
I'm not following something here. You have a 40GbE card and you are achieving between 400 MB/s and 700 MB/s? That's not anywhere near saturating the connection.
Those speeds would be somewhat reasonable if you had a 10GbE connection, but still subpar.

There is no technology available now in storage media to saturate this speed.
Remember, the TCP/IP protocol at its fastest will do about 22-24 Gb/s. You would need some way of grabbing the content from RAM on the server, sending it across the NIC, and dumping it into the RAM of the client computer, completely bypassing the slow TCP/IP layer.

Maybe in the near future an SSD-SAS drive? They should achieve about 22 Gb/s; I believe that interface is now available. That at least would give you HALF of your line speed while still running the TCP/IP protocol.
 

yurelle

Junior Member
Joined
Mar 23, 2018
Messages
13
I'm not following something here. You have a 40GbE card and you are achieving between 400 MB/s and 700 MB/s? That's not anywhere near saturating the connection.
Those speeds would be somewhat reasonable if you had a 10GbE connection, but still subpar.
It's not a 40GbE card, it's a 40Gb InfiniBand card that can also run in Ethernet mode. The Mellanox ConnectX-2 cards [MHQH19B-XTR] operate at 40Gb/s in InfiniBand mode, but only at 10Gb/s in Ethernet mode. I would assume he's reading & writing from a RAID array or SSD, and the drive IO is the bottleneck; he was probably just mentioning the transfer speeds to show that the network was NOT acting as a bottleneck. You would have to be running off a RAM disk to really saturate it, unless you're doing active clustered computing.
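To put numbers on that, a quick conversion sketch (decimal units, ignoring protocol overhead):

```python
def mbytes_to_gbits(mb_per_s: float) -> float:
    """Convert MB/s (decimal megabytes) to Gb/s (decimal gigabits)."""
    return mb_per_s * 8 / 1000

def gbits_to_mbytes(gb_per_s: float) -> float:
    """Convert Gb/s (decimal gigabits) to MB/s (decimal megabytes)."""
    return gb_per_s * 1000 / 8

# The 400-700 MB/s reported at the top of the thread:
print(mbytes_to_gbits(400))   # 3.2 (Gb/s)
print(mbytes_to_gbits(700))   # 5.6 (Gb/s)

# Raw line rates for comparison:
print(gbits_to_mbytes(10))    # 1250.0 (MB/s) -- 10Gb Ethernet mode
print(gbits_to_mbytes(40))    # 5000.0 (MB/s) -- 40Gb InfiniBand mode
```

So 700 MB/s sits well under even the 10 Gb/s Ethernet line rate, which is consistent with the disks being the bottleneck rather than the network.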

I don't have mine in a Windows box at the moment, but I can take some time over the next few days to juggle things about and get one into a Win 7 box to investigate the exact driver setting needed. Would that help?
I don't want to cause too much trouble, but I would appreciate it. It's not urgent: I did get Fedora working with a non-native, external ZFS pool, and I've transferred all my files to it. So, as of right now, I have a working ZFS data store with IB support, but it's kind of finicky, especially after reboots, and I'd like to eventually switch to a native-ZFS OS like FreeBSD or FreeNAS. I really wanted to try SmartOS on illumos, but apparently it doesn't support the new Ryzen Threadripper, which is what I'm running, and it bombs weirdly; I couldn't figure that one out.

As you also said, there's an option in the device's properties panel which will switch it to Ethernet mode. It's definitely there, even for the older ConnectX series one and two cards. I know that for sure, as I only have series 1 and 2 cards, and I've had it working before.
I'm on Windows 10, not sure if that is causing an issue.

This forum https://community.mellanox.com/thread/4057 has this picture, but also indicates that
Code:
Mellanox WinOF VPI (v5.35) provides modules that are containing IB & ETH drivers supporting ConnectX-3 & ConnectX-3 pro adapters only



My Windows 10 instance doesn't show that tab. However, I do have a lot more tabs, so maybe I'm in the wrong window?
-Yurelle

I'm opening up from the Device Manager > [Right click] Properties:


This is what I get:
 
Last edited:

JustinClift

Senior Member
Joined
Apr 24, 2016
Messages
287
I don't want to cause too much trouble, but I would appreciate it.
No worries, I'll get it done. When thinking about it earlier today, I realised the Win7 workstation has a free PCIe slot and could do with a direct connection to the FreeNAS box anyway. And the FreeNAS box has a dual port Mellanox adapter in it already, with a port still unused. So, probably a good test environment for double checking.

With the screenshots, those do help. Would you be OK taking them for the rest of the "Advanced" tab? You only captured the names of the first few settings... there's still a bunch more if you scroll down.

... and I have a feeling the one you want is in there. :D
 

JustinClift

Senior Member
Joined
Apr 24, 2016
Messages
287
Hmmm, this is weird. I've just installed a MHQH19B-XTR (same model as yours) in the Win7 box here, then added in the Mellanox WinOF 4.80 drivers, to match the version shown in your screenshots.

For me too, there's no tab showing up to switch between native ethernet and infiniband mode. There really should be, as I've had this exact card working in native ethernet mode before.

So, something strange seems to be going on. I'll investigate a bit more, and when I've figured it out (probably tomorrow) hopefully we'll have a solution. :)
 