Supported 40GbE cards?

Status
Not open for further replies.

friolator

Explorer
Joined
Jun 9, 2016
Messages
80
We work with very large file sets. Very large. Mostly image sequences of feature films at 4k resolution, so somewhere on the order of 120,000 files per movie, topping out at about 5TB. Currently we've got 12TB or larger RAIDs in each of our film scanners, in our color correction systems and in our restoration system. We sneakernet files around, using 6TB bare drives and hot-swap drive docks. It works, but it's inefficient and slow.
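
(As a rough sanity check, and these numbers are only assumptions: a 4K 10-bit DPX frame runs somewhere around 35-50 MB depending on the scan aperture, so 120,000 frames lands in the 4-6 TB range, which lines up with that ~5TB per movie figure.)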

We have a 16-drive FreeNAS system set up now, which is connected over 1GbE, but we can only really use it for smaller projects. It acts as a long term parking location for files before they go to more permanent LTO tape backups. I like FreeNAS though, so I'd like to use it for our SAN.

I just bought a 16-port 40GbE switch. The thinking is we'll scan a film directly to the SAN (iSCSI), unmount the iSCSI target, mount it from the grading system, and write the output back over the network to another iSCSI target. That then gets mounted on the restoration system, if necessary, and we do the same thing. So the plan is to have maybe 60-70TB of storage broken up into 6TB chunks. That should allow us to work on at least 3 jobs at once without having to worry about shuffling files around to make space.
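
For what it's worth, here's a rough sketch of how those 6TB chunks could be carved out as zvols on the FreeNAS side. The pool name and job names are just placeholders, and in practice this would normally be done through the FreeNAS GUI, with each zvol then attached to its own iSCSI target/extent:

    # one sparse (thin-provisioned) 6TB zvol per job; -s = sparse, -V = volume size
    zfs create -s -V 6T tank/iscsi-job01
    zfs create -s -V 6T tank/iscsi-job02
    zfs create -s -V 6T tank/iscsi-job03
    # each zvol is exposed as its own iSCSI extent/target, so only one
    # workstation mounts a given chunk at a time
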

I have an empty 20-bay server enclosure in addition to our existing 16-bay, and a third enclosure with another 8 bays. The plan is to use the 20-bay machine with three cards installed:

1) 40GbE NIC to connect to the switch (dual-port?)
2) 24-port LSI SAS card for the 20 drives in the enclosure
3) 24-port LSI SAS card for the two other enclosures, which we'll add to as we need to expand.

So my initial question is: What's FreeNAS support for 40GbE like? Are there specific cards we should be looking at?
 

friolator

Explorer
Joined
Jun 9, 2016
Messages
80
Anyone? Does FreeNAS support any 40GbE cards?
 

Mirfster

Doesn't know what he's talking about
Joined
Oct 2, 2015
Messages
3,215

friolator

Explorer
Joined
Jun 9, 2016
Messages
80
Took some digging, but I found some stuff buried in one of those threads. Looks like the Chelsio T580 is what we'll get. Thanks!
 

friolator

Explorer
Joined
Jun 9, 2016
Messages
80
Thanks. That's encouraging!
 

Mlovelace

Guru
Joined
Aug 19, 2014
Messages
1,111
Just as an aside, I know iXsystems ships TrueNAS servers with the "SO-CR" version of the Chelsio cards (T420-SO-CR/T520-SO-CR), as TrueNAS/FreeNAS does not leverage all the capabilities of the higher-priced CR cards. You might look into getting the T580-SO-CR instead to save some money ($375 for the SO-CR vs. $750 for the CR).
 

friolator

Explorer
Joined
Jun 9, 2016
Messages
80
Yeah, that's what I meant to type. I just ordered one of the T580-SO-CR cards.
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
I will come and steal your cards if you do not report back on how this works out. Be warned. :smile:
 

cookiesowns

Dabbler
Joined
Jun 8, 2014
Messages
31
I will come and steal your cards if you do not report back on how this works out. Be warned. :)

We've recently scored some SFN7142Q 2x40GbE cards along with a T580-LP-CR. You won't believe the price I got on eBay for the 7142Q card... it comes with both brackets and is practically brand new. The deal on the T580-LP wasn't as good, and I probably could have been fine with the SO, but these cards can also live in our VM hosts to drive some NVMe storage (4x P3700s), so I figured why not.

Will chime in with performance soon. I figure our drives/clients will eventually be the bottleneck, and there will definitely need to be quite a bit of tuning involved at all layers.

Pool layout: 4 RAID-Z2 vdevs of 6 drives each, 8TB Seagate SAS3 dual-ported drives. E5-1650 v3, 128GB DDR4, a 400GB P3500 as L2ARC for now... the hope is to eventually upgrade to a host that supports 4x NVMe drives.
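
For reference, that pool layout would look roughly like this from the command line (device names are placeholders, and FreeNAS would normally build this through the GUI):

    # 4 RAID-Z2 vdevs of 6 drives each, plus the P3500 as an L2ARC cache device
    zpool create tank \
        raidz2 da0  da1  da2  da3  da4  da5  \
        raidz2 da6  da7  da8  da9  da10 da11 \
        raidz2 da12 da13 da14 da15 da16 da17 \
        raidz2 da18 da19 da20 da21 da22 da23 \
        cache nvd0
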
 

JustinClift

Patron
Joined
Apr 24, 2016
Messages
287
If it's useful, the Mellanox 40GbE cards work. The driver is included in FreeNAS 9.10 (and 10) now, and various people have reported success with them.

There was a report of trouble when using a ConnectX-3 Pro adapter in 40GbE mode with NFS, though.

If you're using SMB/CIFS or any other non-NFS protocol, that (in theory) shouldn't affect you.
 

JustinClift

Patron
Joined
Apr 24, 2016
Messages
287
Ugh. Just noticed the date on the original post. Sorry everyone. :(

@friolator What did you end up going with? By the way, why iSCSI instead of a shared storage protocol? CIFS kind of sounds like it would suit the workflow better. :)
 

cookiesowns

Dabbler
Joined
Jun 8, 2014
Messages
31
So...

Out-of-the-box iperf performance between two nodes, one with the Chelsio T580 and the other with the Solarflare SFN7142Q: with the default MTU and SR4 optics connected to a Trident II-based switch, I get roughly 21 Gbit/s. Not bad if you ask me.
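
For anyone trying to reproduce that kind of test: a single default-MTU TCP stream usually won't fill a 40GbE link, so the usual next steps are jumbo frames and parallel streams. A rough sketch (interface name, addresses, and the use of iperf3 are just assumptions here):

    # on the FreeBSD/FreeNAS side, enable jumbo frames on the Chelsio port
    # (T5 cards show up as cxl0/cxl1 with the cxgbe driver)
    ifconfig cxl0 mtu 9000
    # server side
    iperf3 -s
    # client side: 4 parallel streams for 30 seconds
    iperf3 -c 10.0.0.2 -P 4 -t 30
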
 

friolator

Explorer
Joined
Jun 9, 2016
Messages
80
Ugh. Just noticed the date on the original post. Sorry everyone. :(

@friolator What did you end up going with? By the way, why iSCSI instead of a shared storage protocol? CIFS kind of sounds like it would suit the workflow better. :)

The whole thing was put on hold for a bit because of other things we had to deal with, but I'm back at it now. The server is built and has been running with just 4 drives (no volumes yet, though) for about a month now. All seems good in terms of burn-in, so I'm starting to figure out the storage configuration for this setup and getting ready to order some drives.

We will use a combination of shared storage and iSCSI. The shared storage will be for things accessed by machines that only live on the gigabit network: backup files for things like QuickBooks, shared installer images, project files for ongoing work, etc. The iSCSI volumes will be for anything that requires high performance. Because of the nature of the workflow, it's not really a problem to compartmentalize things this way, and if we can squeeze a bit more performance out of iSCSI by having volumes formatted in the native file system of the machines connecting to them (NTFS in most cases), the trade-off of not being able to access those volumes concurrently will be worth it.
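
If it helps, the usual ZFS-side knobs for an NTFS-formatted zvol served over iSCSI are the volume block size and compression; values here are only illustrative starting points, not recommendations, e.g. amending the zfs create lines from the earlier sketch:

    # volblocksize can only be set at creation time, so it goes on the create line
    # (a larger block size like 64K is one option for big sequential image I/O)
    zfs create -s -V 6T -o volblocksize=64K tank/iscsi-job01
    # lz4 costs very little CPU and is commonly left on even for image data
    zfs set compression=lz4 tank/iscsi-job01
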

My plan is to experiment with both first, though. I'll report back here or in another thread when we've got drives in the FreeNAS box and have started benchmarking things.
 

friolator

Explorer
Joined
Jun 9, 2016
Messages
80
If it's useful, the Mellanox 40GbE cards work. The driver is included in FreeNAS 9.10 (and 10) now, and various people have reported success with them.

We're using ConnectX-3 cards in our workstations, but we're having some issues getting them to see the switch. In the case of the FreeNAS box (which has the Chelsio T580-SO-CR), the switch sees it, no problem. We weren't able to get the latest Mellanox drivers to work with Windows 7 on the workstations, but the previous version did at least install and let the OS see the card. The issue now is getting the card and the switch to talk, which I think is a Mellanox thing. Still digging on that. I'm hoping we can make it work, because they were pretty cheap cards.
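
One thing worth checking (just a guess, since the symptoms fit): VPI ConnectX-3 ports often come up in InfiniBand or auto-sense mode rather than Ethernet, and the Mellanox MFT tools can force them to Ethernet. Something like the following, where the device name is only an example (it comes from 'mst status' and differs between Linux and Windows):

    # query the current port configuration
    mlxconfig -d /dev/mst/mt4099_pci_cr0 query
    # force both ports to Ethernet (2 = ETH), then reboot
    mlxconfig -d /dev/mst/mt4099_pci_cr0 set LINK_TYPE_P1=2 LINK_TYPE_P2=2
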
 

JustinClift

Patron
Joined
Apr 24, 2016
Messages
287
Out of curiosity, have you pinged the Mellanox guys directly about the switch problems?

ConnectX-3 cards should still be on their support list, so direct support (in theory :)) should be workable. If for some reason it's not, their technical staff are pretty helpful on their forums (for current release cards, not so much for older stuff :mad:).

As a potentially-out-of-left-field idea, the ServeTheHome networking forums are extremely active and have a lot of people with practical problem-solving skills for 10/40GbE/InfiniBand (especially compatibility stuff).

No idea if that helps, but hopefully... :)
 

friolator

Explorer
Joined
Jun 9, 2016
Messages
80
I did ask on the Mellanox forums about a month ago and got some tips, but haven't had the time to try it out. August turned out to be a crazy month here, then I was on vacation, so it's taking more time than I'd have hoped. My plan is to get that working very soon, because I'm about to order drives for the first vdev in our FreeNAS system, with the intention of performance testing it with our massive file sets.

I'll post back here with the results.
 