
Menhery · Cadet · Joined Oct 3, 2019 · Messages: 2
Hello, I have decided to build a FreeNAS box in lieu of replacing my two small Synology NAS boxes. I would like to use a ZFS array to host NAS files, and also run the box as a virtualization server with oVirt. On the FreeNAS side I have several hundred thousand photographs from my photography hobby; on the other side I will be virtualizing some significant data science workflows for my development efforts. So far, here is my build:

CHASSIS: SUPERMICRO CSE-826 (https://www.supermicro.com/en/products/chassis/2U/826/SC826E26-R1200LPB)
MOTHERBOARD: X9DRI-LN4F+ REV 1.20
CPU: Intel Xeon E5-2695 v2, 12-core, 2.4GHz, 30MB L3 cache, 8GT/s QPI, socket FCLGA2011, 22nm, 115W (Qty 2)
RAM: Crucial 32GB kit (16GBx2) DDR3/DDR3L-1600 MT/s (PC3-12800) DR x4 RDIMM server memory, CT2K16G3ERSLD4160B / CT2C16G3ERSLD4160B (Qty 4)
BACKPLANE: SUPERMICRO BPN-SAS-826A backplane (3 IPASS connections)
SAS CABLE: Supermicro 75cm IPASS-to-IPASS backplane cable (CBL-0281L)
HDD: 10TB Western Digital (white label), 3.3V pin mod (storage array) (Qty 12 = 120TB raw)
SSD: Samsung EVO 500GB (boot drive), onboard SATA
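As a sanity check on the 12 x 10TB layout above (not from the original post), here is a quick sketch of roughly how much usable space a few common ZFS layouts would give; it ignores ZFS metadata/slop overhead and the TB-vs-TiB difference, so real figures will land somewhat lower:

```python
# Rough usable-capacity estimate for 12 x 10 TB drives under a few
# common ZFS layouts. Ignores metadata/slop overhead and TB-vs-TiB
# conversion, so real numbers will be somewhat lower.

DRIVE_TB = 10
DRIVES = 12

def raidz_usable(total_drives, vdev_width, parity):
    """Usable TB when drives are split into equal-width RAIDZ vdevs."""
    vdevs = total_drives // vdev_width
    data_per_vdev = vdev_width - parity
    return vdevs * data_per_vdev * DRIVE_TB

layouts = {
    "2 x 6-wide RAIDZ2": raidz_usable(DRIVES, 6, 2),    # 80 TB
    "1 x 12-wide RAIDZ2": raidz_usable(DRIVES, 12, 2),  # 100 TB
    "6 x 2-way mirrors": DRIVES // 2 * DRIVE_TB,        # 60 TB
}

for name, tb in layouts.items():
    print(f"{name}: ~{tb} TB usable")
```

Worth deciding the vdev layout before filling the pool, since vdevs cannot be reshaped after creation (narrower vdevs and mirrors cost capacity but resilver faster and take iSCSI/VM workloads better).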

I'm really lost on which HBA to use. I am hoping that the following HBA is the right choice, with one SAS cable connected directly to the MoBo.
LSI LOGIC SAS9207-8I 6GB/S 8PORT INT PCI-E 3.0 SATA SAS HOST BUS ADAPTER.

I really don't want to source the wrong card. I already searched the forum and read the thread on the confusion around selecting an LSI SAS card and I am still confused. Further, I want to select the best option given the total price of my build so far. So, if I have to spend some extra money, then so be it.
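For what it's worth, the 9207-8i is a SAS2308 card that normally ships with IT firmware already. Once it is installed you can confirm the firmware mode from a Linux live environment with LSI/Broadcom's `sas2flash` utility (the commands below are from that tool, not from the original post; a sketch, assuming the card is controller 0):

```shell
# List all LSI SAS2 controllers and their firmware versions; the
# firmware product line indicates IT vs IR mode.
sas2flash -listall

# Detailed information for controller 0, including firmware and BIOS.
sas2flash -c 0 -list
```

If the output shows IR firmware instead, the standard advice on this forum is to flash the matching IT firmware before putting ZFS on it, so FreeNAS sees raw disks rather than a RAID layer.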

The front of the BPN-SAS-826A backplane has the following connections:
Front Connectors
1. ACT_IN: JP26 and JP47
2. Chips: MG9071 and MG9072
3. I2C Connector #1: JP37
4. I2C Connector #2: JP95
5. I2C Connector #3: JP52
6. Power Connectors (4-pin): JP10, JP13, and JP46
7. SAS IN #1: JSM1
8. SAS IN #2: JSM2
9. SAS IN #3: JSM3
10. Upgrade Connectors, JP69 and JP78

Currently, the CPUs, RAM, and SSD are installed, and the system POSTs and boots CentOS as an initial test. The backplane is not yet cabled to a controller, but the drives are mounted and registering on the backplane.

Help, and thank you!
 

Menhery · Cadet · Joined Oct 3, 2019 · Messages: 2
I ordered the LSI LOGIC SAS9207-8I, and based on my reading I need to run it in IT mode. My next step is to install Mellanox 40GbE cards to get significant IOzone bandwidth out of the box.
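When that benchmarking step comes, an IOzone run along these lines is a reasonable starting point (a sketch, not from the original post; the mount path is a placeholder, and the file size should exceed the box's RAM so ARC caching doesn't inflate the numbers):

```shell
# Auto-mode sweep of sequential write (-i 0) and read (-i 1) up to a
# 16 GB file, against a file on the exported/mounted dataset.
# -g sets the maximum file size; raise it above installed RAM (128 GB
# here) for honest numbers, at the cost of a much longer run.
iozone -a -g 16G -i 0 -i 1 -f /mnt/tank/iozone.tmp
```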

The big test will be virtualizing a small Hadoop cluster plus two gateway VMs with RStudio and Anaconda, running on the KVM machine connected to the ZFS pool as an iSCSI LUN.
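On the KVM host side, attaching that LUN with the open-iscsi initiator looks roughly like this (a sketch; the IP and IQN below are placeholders for whatever the FreeNAS iSCSI target is configured to export):

```shell
# Discover targets exported by the FreeNAS box (address is a placeholder).
iscsiadm -m discovery -t sendtargets -p 192.168.1.100:3260

# Log in to the reported target IQN (placeholder shown).
iscsiadm -m node -T iqn.2005-10.org.freenas.ctl:tank0 -p 192.168.1.100 --login

# The LUN then shows up as a local block device (e.g. /dev/sdX).
lsblk
```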
 