Thoughts on first Scale build / pool layout

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
*Edit: More concrete system idea now below

Hello everyone :smile:

After lots of reading and researching around here I have arrived at some ideas of how to handle my requirements, but am unsure about which pool(s) make sense for me (or whether to use Scale at all?).

Background
Currently, all important data sits on an 8x1TB RAID5 managed by a Promise EX8650 in my desktop computer. The system and the associated RAID have been running (not 24/7, but most days) since 2009. The disks in the array have accumulated over the years, but it’s been at the current configuration for 10 years or so.
This is to be replaced. The Desktop should become much slimmer (think mini ITX, single SSD, no GPU), while the storage should be moved to a NAS.

Requirements
Most of my crucial data is photos, music and then some documents, totaling 4.5TB as of now. It will grow, but probably no more than a few hundred GB per year.

Besides this data, I have another 3TB or so of media which I could lose but also need storage for.

Finally, I have 4 TB of media seeding in Deluge (currently on an external hard disk managed by an old Thinkpad T400 running 24/7). I don’t care about losing this at all. Having more storage here would be convenient though. Say, 8TB.

Besides all of this storage, I want the new machine to also take over running
  • Deluge (or another torrent client)
  • Jellyfin (1 video stream transcoding at a time max, mostly direct play)
  • Owncloud
  • A yet to be determined VPN
  • Maybe Home Assistant, or generally have room for a few more docker containers if something comes up

Pool Setup ideas
Given the above, for the crucial stuff I was thinking of
  • Either a 4-wide Z1 without spares, consisting of 12+TB non-NAS disks. This is way more than I need, especially given that at the current price/TB ratios it’s likely I’ll get disks more along the 18TB size. For example I’ve looked at Toshiba MG09/10.
  • Or a 5-wide Z1 without spares of 6-8TB NAS drives, such as the Seagate Ironwolf or WD Red, if there is actual value in that. WD REDs for example don’t seem to draw less power despite the lower rpm and the noise is not that important since the system will be sitting in the storage room behind a sliding door.
It would be nice if this pool were also fast-ish, so I could directly work with the photos in Lightroom, or generally benefit from 10GbE (the apartment I’m renting has Cat 7 wiring, so needs to be copper). I know the configuration above wouldn’t max out 10GbE, that’s fine. I would expect it to be faster than a single disk though.
I picked Z1 since I lived with Raid5 for so long and slept rather well, but tell me if this is a terrible idea given modern disk sizes or for any other reason. I suppose the direct alternative is a 5-wide Z2.

I am at a loss on how to handle the rest of the storage. I don’t care about losing the Deluge seed data, and I don’t want to lump it in with the above pool because it would create I/O on the disks for data I don’t care about. I could stripe 3x4TB older disks I still have lying around. What I dislike about this:
  • I'm going to lose all of the data if just one drive fails, and I have zero use for the speed benefit of the striping here. Can I just use them raw but have them appear as one volume to Deluge for convenience? I know I could set them up as their own pool each.
  • I have three additional disks running, taking up room, creating noise and consuming power
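To make the two options concrete, here is a sketch of what they would look like at the ZFS level. The pool and device names are purely illustrative placeholders; on TrueNAS you would normally do this through the GUI rather than the shell.

```shell
# Option A: one non-redundant pool striping the three 4TB disks.
# Any single disk failure loses the whole pool -- acceptable only
# because the seed data is expendable.
zpool create scratch /dev/sdf /dev/sdg /dev/sdh

# Option B: three independent single-disk pools, so a failed disk
# only takes its own contents with it (but Deluge then sees three
# separate volumes instead of one).
zpool create seed1 /dev/sdf
zpool create seed2 /dev/sdg
zpool create seed3 /dev/sdh
```

Either way the data has no redundancy; the difference is only the blast radius of a single disk failure and whether the space appears as one volume.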
From my understanding neither L2ARC nor SLOG make sense in my configuration. I have various older SSDs lying around though which I could make use of if it does.

Final notes
  • I am not dead set on TrueNAS Scale/ZFS but compared to Unraid at least it seemed the better option. Please let me know if you disagree or there is something else I should consider.
  • I am aware of this thread that seems to have somewhat similar requirements
  • The initial cost of acquiring the system is not a major consideration, I care more about it not drawing unnecessarily large amounts of power in the hopefully 5-10 years it will run.
Which hardware to get is also still unclear, but before thinking about the specifics of that, it seemed like a good idea to first decide on the number of disks that need to be supported.
 
Last edited:

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
Unfortunately, it seems I can't edit anymore.
Pool Setup ideas
Given the above, for the crucial stuff I was thinking of
  • Either a 4-wide Z1 without spares, consisting of 12+TB non-NAS disks. This is way more than I need, especially given that at the current price/TB ratios it’s likely I’ll get disks more along the 18TB size. For example I’ve looked at Toshiba MG09/10.
I have since seen here that Z1 with drives in the 18TB range is likely a bad idea. Still not sure what the best approach would be in my case though.
 

sretalla

Powered by Neutrality
Moderator
Joined
Jan 1, 2016
Messages
9,703
You may do well with a 2 or 3 way mirror of 18TB disks in terms of meeting the storage needs and keeping the number of disks low.

You are probably right that SLOG and L2ARC aren't needed.

You would do well to get a couple of cheap SSDs to mirror in combination with your spinning rust to run apps and store smaller files and logs, etc to avoid additional load on the disks when it's not needed.

Watch out if you're buying WD RED that you get CMR (and certainly not SMR) disks. (although at 18TB, I think you're fairly safe as they have no product at that size which is SMR AFAIK)
 

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
You may do well with a 2 or 3 way mirror of 18TB disks in terms of meeting the storage needs and keeping the number of disks low.
Thanks, good idea, I hadn't really considered that option. Just to confirm my understanding: I could later extend this storage pool with another vdev consisting of a second 2-way mirror, to double my usable capacity (and improve performance), correct?
Or alternatively destroy the pool and switch to a Z1/Z2 configuration if that then seems like the better option.
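To spell out what that extension would look like at the ZFS level (pool and device names here are hypothetical placeholders; on TrueNAS this would normally be done through the GUI rather than the shell):

```shell
# Initial pool: one 2-way mirror of two 18TB disks
zpool create tank mirror /dev/sda /dev/sdb

# Later: double usable capacity (and add IOPS) by adding
# a second 2-way mirror vdev to the same pool
zpool add tank mirror /dev/sdc /dev/sdd

# A 2-way mirror can also be upgraded to a 3-way mirror in
# place by attaching a third disk to the existing vdev
zpool attach tank /dev/sda /dev/sde

# Verify the resulting layout
zpool status tank
```

Note that `zpool add` is one-way: a vdev cannot simply be removed again later, so switching to Z1/Z2 would indeed mean destroying and recreating the pool.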
You would do well to get a couple of cheap SSDs to mirror in combination with your spinning rust to run apps and store smaller files and logs, etc to avoid additional load on the disks when it's not needed.
I like the idea of keeping the I/O on the large pool lower by doing this. The older SSDs are nice because I get them for free (or cheap, if I have to get more), they won't contribute noise or take up much space in the case, and their power consumption should be negligible.

I'm not sure how you are suggesting to combine this with my old 4TB drives though, assuming that's the spinning rust you are referring to? Or did you mean a 2-way mirror of SSDs together with the 2-way mirror of 18TB disks? Probably makes more sense the more I think about it ;)
Watch out if you're buying WD RED that you get CMR (and certainly not SMR) disks. (although at 18TB, I think you're fairly safe as they have no product at that size which is SMR AFAIK)
Thanks for the warning :) I've heard WD seems to be particularly sneaky about this.
 

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
Some updates after lots more research

Storage
I have since ordered three Toshiba MG08 16TB (MG08ACA16TE, ~13.5 Euro/TB) which should arrive over Christmas. I'll be [thoroughly testing](https://www.truenas.com/community/resources/hard-drive-troubleshooting-guide-all-versions-of-freenas.17/) them in my current desktop PC and plan to then trial them in the NAS as a three-way mirror as suggested. If that doesn't work out, I guess I'll have to buy one or two more for a RAID-Z2.
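For reference, a minimal burn-in along the lines of the linked guide looks roughly like this. `/dev/sdX` is a placeholder for each new disk; adjust before running, and note that the `badblocks` step wipes the disk.

```shell
# 1. Baseline: identify the disk and record its SMART attributes
smartctl -i /dev/sdX
smartctl -A /dev/sdX

# 2. SMART self-tests (the long test takes many hours on a 16TB disk;
#    check progress/results with `smartctl -a /dev/sdX` afterwards)
smartctl -t short /dev/sdX
smartctl -t long /dev/sdX

# 3. Destructive full-surface write+read pass (DESTROYS all data!)
badblocks -b 4096 -ws /dev/sdX

# 4. Re-check SMART afterwards; reallocated or pending sectors on a
#    brand-new disk are grounds for an RMA
smartctl -A /dev/sdX | grep -Ei 'realloc|pending|uncorrect'
```

The full guide adds more repetitions and logging, but this covers the essential write/read stress plus before/after SMART comparison.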

Mobo + CPU
Among the C24x-chipset Supermicro boards, the X11SCL-F is one of the few that I can actually get (new, for 270 Euros), so that is the current plan. Used options hardly seem to exist in Germany or are unsuited (e.g. the widely available X10SLM, which supports just 32GB RAM).

I am still highly unsure about 'how much CPU' I need for my use case. The highest load I could imagine would be transcoding 4K content (which will be rare). I tried Jellyfin on my Laptop and the i5-10310U in there could not quite handle it. If I take that as a reference in Passmark (for lack of something better) an E-2124 seems to be tight. Here someone used a more powerful E-2176G for the purpose. Do I need to go this high?
Getting used CPUs also doesn't seem easy. The E-2124 actually goes for around 115 Euros. i3s might be a bit easier to find, but even an i3-9300 isn't a whole lot more powerful, so no benefit there I guess.

Alternatively, I could also get an X10SDV-6C+-TLN4F with a Xeon D-1528 for 400 Euros used (+80 Euros for 2x32GB RDIMM RAM). That doesn't seem like the worst deal, has low power consumption, and the added benefit of having the 10GbE NIC I want anyway already onboard. But I don't really need the mini-ITX format, and again I am unsure if the CPU has enough power. If not, there's no upgrade path with this option. More modern Xeon-D options (X11SDV-, X12SDV-) seem really expensive / are not available used. Also, a Xeon D-2123IT isn't all that much more powerful either.
 

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
Researching more, I found some useful info on performance (of the D-1541 specifically) here, which sounds like it should be plenty powerful for my use case. I also learned that Jellyfin should handle multithreaded transcoding as well as ffmpeg does. On the other hand, it seems 4K transcoding will pretty much require an (i)GPU anyway, so I've given up on that for now. Most likely it will be more convenient to just ensure the client(s) can direct play the media.

Therefore I'm now on the hunt for used D-1541 or D-1528 systems with 64GB of RAM.
It's a bit frustrating that they're so expensive while I could get an E5-2697 v3/v4 that is insanely overpowered for my use case and uses way more power for just 70 Euros, or even whole CSE-829U X10DRU-i+ rack servers (without CPU) for just 350 Euros. I guess that's the price to pay for efficiency/small form factor ;)

I'm still open for any suggestions or tips, otherwise I'll just keep going here in case anyone finds my notes useful in the future :)
 

mangoon

Dabbler
Joined
Dec 18, 2022
Messages
13
You could have a look at HP MicroServers; the Gen8 are pretty cheap and the Gen10 are also good to get, but they are limited on RAM...

I really like the Node 304 case, but it only takes ITX boards (while needing a full ATX power supply)... There are pretty decent ASUS server boards... I recently got a used bluechip server with a 1240 v5 for just 300 bucks; that one would be really great too. But it's also limited to 32GB of RAM...

The question is, do you really need 64GB in your use case? If not, you have so many options of older DDR3 hardware.

Also, as you mentioned that you live in germany, I would give "ebay Kleinanzeigen" a go. I got my bluechip from there and it really was a steal...
 

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
You could have a look at HP MicroServers; the Gen8 are pretty cheap and the Gen10 are also good to get, but they are limited on RAM...
Thanks for the suggestions! The form factor is pretty appealing on these, although with at least 3 HDDs plus SSD(s) it would already start to get rather full in there. Most importantly though, I'm not convinced about the 32GB RAM max (see below).

I really like the Node 304 case, but it only takes ITX boards (while needing a full ATX power supply)... There are pretty decent ASUS server boards... I recently got a used bluechip server with a 1240 v5 for just 300 bucks; that one would be really great too. But it's also limited to 32GB of RAM...
I'm not too worried about the case and will base it on whatever mobo I get. If ITX I might indeed go for something like a Node 304 (or maybe rather 804 for better airflow), if (m)ATX I can reuse my current Lian Li A75X (for starters with the same PSU).

Never heard of Bluechip servers before, another saved search to set up :)
The question is, do you really need 64GB in your use case? If not, you have so many options of older DDR3 hardware.
No, I'm not sure I need the 64GB. But similar systems seem to tend towards this amount of RAM, and given that with ZFS more RAM always appears to be better, the additional Docker containers I want to run, and the rough 1GB-per-1TB guideline, I wouldn't be confident with just 32GB and no upgrade path.
Also, as you mentioned that you live in germany, I would give "ebay Kleinanzeigen" a go. I got my bluechip from there and it really was a steal...
Yes, I'm monitoring both ebay and ebay-kleinanzeigen as well as the Hardwareluxx forum.


Generally, I'm going back and forth on the system size. After re-reading the hardware guide on TDP, maybe I'm overthinking this part a bit too much and should be open to larger systems as long as they aren't too crazy.

Something else that counts against the Xeon D family is that apparently none of them support Quick Sync, which would be a nice plus for transcoding.
 

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
Happy (almost) new year everyone :)

Just a few hours after writing my last post, I (somewhat surprisingly) won the auction for a motherboard, which now makes my plans a lot more specific.

This is what I'm looking at (updated 2023-01-03):
  • Mobo: Supermicro X12SCZ-TLN4F. Good: 2x 10GbE from the onboard Intel X550. Not ideal: only 4 SATA ports.
  • CPU: Xeon W-1270E. Cheaper used than a W-1250, and I don't see anything that speaks against the E version. I'm aware this is a much more powerful CPU than what I initially looked at, but oh well: if the idle power consumption isn't too crazy, that's fine by me.
  • Cooler: Thermalright True Spirit 140 BW. Could reuse it from my current socket 1155 desktop system. It kept my overclocked i7-3770K plenty cool, so I see no issues here (if all of the spacings work out).
  • RAM: 4x Kingston Server Premier DIMM 16GB, DDR4-3200, CL22-22-22, ECC (KSM32ED8/16MR), or 4x Crucial DIMM 16GB, DDR4-2666, CL19-19-19 (CT16G4DFRA266; maybe the faster CT16G4DFRA32A). This should work, no? It's 16GB dual-rank ECC UDIMMs at 3200 MHz, all the things SM lists on their page. Kingston claims compatibility as well. Used RAM on eBay is more expensive than just buying these new.
  • HBA: LSI SAS3 9300-8i SGL. With just 4 SATA ports, this is pretty much a necessity if I also want to use some SSDs I have left over for apps.
  • Case: Lian Li A75X. Currently keeps my desktop with 12 HDDs + SSDs and more stuff cool easily.
  • PSU: be quiet! Dark Power Pro P7 650W ATX. Overkill, but I have it around.

Depending on how much the system draws while idling, I might opt for something smaller in the future after all.
 
Last edited:

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
Regarding the HBA:
I eventually realized that I might be able to reuse my Promise EX8650 after all?! Well, thanks to this thread I no longer have to try that out ;) Plus, while it has served me very well since 2009(!), I'm not sure I want to risk another 10 years of its life.
I should now have an LSI 9300-8i HBA secured; waiting on an answer from the seller.

CPU has also been ordered and is on its way. Found alternative, cheaper RAM as well.
Table above has been updated accordingly.

Once everything made its way here and I found time to play around with it, I'll report on what did (not) work out.
 

Davvo

MVP
Joined
Jul 12, 2022
Messages
3,222
Do note that your cooler is unlikely to fit since these motherboards usually come with a glued backplate with mounting threads.
Besides supermicro's suggested cooler, you can look at this list for alternatives.
 

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
Thanks for the heads-up @Davvo, I wasn't aware of that.

In the list you linked, does a 'YES/YES' for LGA115X (hole/backplate) mean that it does work with the fixed backplate or not?

I found that Noctua for example also has a list of coolers compatible with the board, but I'm not sure they take this into account.
 

Davvo

MVP
Joined
Jul 12, 2022
Messages
3,222
Thanks for the heads-up @Davvo, I wasn't aware of that.

In the list you linked, does a 'YES/YES' for LGA115X (hole/backplate) mean that it does work with the fixed backplate or not?

I found that Noctua for example also has a list of coolers compatible with the board, but I'm not sure they take this into account.
/YES means it works with the backplate.
Regarding Noctua, I don't think they are compatible.
 

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
Things took some time, but I could finally get started on this.

For easier temporary testing, I got an Arctic Alpine 12 CO as a cooler, as well as an FSP350-60HHN as a PSU. I also got a different HBA (AOC-S3008L-L8e), but it isn't plugged in yet. The mobo is running with these components and the above Crucial RAM on a cardboard box.

Unfortunately, I cannot get it to work. There is nothing on either of the two DisplayPort outputs.

Without any RAM I get the expected beeps: 5 short, 1 long
With RAM (1, 2 or 4 modules), I get two short beeps, a pause, and another short beep at a different pitch. Both of these sound different from the beeps when there's no RAM. I have found no 'translation' for this beep pattern anywhere.

The BMC heartbeat is blinking happily as it should, and the LAN port is also active. I can see the device in my router, but somehow only with an IPv6 address, and I cannot access the web interface on it. Not sure how I could reset IPMI either, without having any display output.

The fans are doing various things from idling along to going at full speed, to cycling around between those things. I have no idea what the board is trying to do.

For lack of a better idea I'm for now blaming the RAM and might need to get some alternatives to test out...
 
Last edited:

mangoon

Dabbler
Joined
Dec 18, 2022
Messages
13
Ahem, is the RAM you got ECC UDIMM? The one you linked is not actually ECC?

My X10 board won't boot without ECC RAM.

Also, does your board have a VGA plug? My Supermicro boards all only post BIOS over VGA...
 

Davvo

MVP
Joined
Jul 12, 2022
Messages
3,222
The fans are doing various things from idling along to going at full speed, to cycling around between those things. I have no idea what the board is trying to do.
Supermicro boards expect industrial fans, so using standard ones makes them believe they are stalling.
It's an easy fix, read the following resource.

Make sure you connect the ethernet into the IPMI port.

Your board should work with non-ECC RAM so there could be issues there.

Also, I guess your motherboard came without the backplate, since the Alpine 12 CO isn't compatible with it.
 
Last edited:

imvi

Dabbler
Joined
Dec 16, 2022
Messages
16
Thank you very much to both of you!
Ahem, is the RAM you got ECC UDIMM? The one you linked is not actually ECC?
Wow, it is indeed not. I have no idea how this happened. I guess I blindly trusted Crucial's recommendations (which actually list only non-ECC) and forgot the board will also take non-ECC RAM. I certainly wanted ECC!
My X10 board won't boot without ECC RAM.
This one should, and it does, see below.
Also, does your board have a VGA plug? My Supermicro boards all only post BIOS over VGA...
Probably wouldn't have thought of it but indeed, this was the key. I'm still getting nothing on DisplayPort but can now happily follow it booting up via the VGA port, get into BIOS, can boot from a USB Stick... Everything seems to actually work, so maybe the beeps are normal for this board (and hence I didn't find any translation for them).
Supermicro boards expect industrial fans so using standard ones makes them believe they are stalling.
It's an easy fix, read the following resource.
Ah yes, I remember reading about this now, thanks!
Make sure you connect the ethernet into the IPMI port.
I made sure to use the correct port. Since I now have a visual and can get into the BIOS, I also have the correct IPv4 address to use. After aligning the subnet mask and assigning an IP that fits the rest of the network, I can now see the web interface.
Your board should work with non-ECC RAM so there could be issues there.
And it does, see above. Apparently all beeps are normal.
Also, I guess your motherboard came without the backplate, since the Alpine 12 CO isn't compatible with it.
Honestly, the backplate thing still had me a bit confused initially, but yes, it did come with a piece of metal on the backside and I could mount the Alpine 12 CO no problem.

So in conclusion things are looking much better already, thanks again!
I'll now order the ECC RAM, and in the meantime create a bootable USB stick from which I can run ipmitool to set up a user and perform the above fix for the fans. Somehow Rufus' FreeDOS + ipmicfg didn't seem to work, so I'll try a Linux live image with ipmitool next.
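For anyone following along, the ipmitool steps I have in mind look roughly like this. User ID 3, the username, and the addresses are placeholders; adjust for your network, and note the fan threshold values need to sit below your fans' minimum RPM.

```shell
# Create and enable an admin user on the BMC (user ID 3 is a free slot
# on many Supermicro boards; check with `ipmitool user list 1` first)
ipmitool user set name 3 myadmin
ipmitool user set password 3
ipmitool channel setaccess 1 3 link=on ipmi=on callin=on privilege=4
ipmitool user enable 3

# Give the IPMI interface a static IPv4 address
ipmitool lan set 1 ipsrc static
ipmitool lan set 1 ipaddr 192.168.1.50
ipmitool lan set 1 netmask 255.255.255.0
ipmitool lan set 1 defgw ipaddr 192.168.1.1

# Lower the fan lower thresholds (non-recoverable, critical,
# non-critical, in RPM) so standard fans don't trip the BMC's
# stall detection and get spun up to full speed
ipmitool sensor thresh FAN1 lower 100 200 300
ipmitool sensor thresh FAN2 lower 100 200 300
```

Run `ipmitool sensor` first to see the actual fan sensor names on the board before setting thresholds.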

For RAM, to be even more aligned with Supermicro's list, I switched to KSM32ED8/16HD with Hynix instead of Micron (although I'd hope the board isn't that picky).
 

Davvo

MVP
Joined
Jul 12, 2022
Messages
3,222
and in the meantime create a bootable USB stick from which I can run ipmitool to set a user as well as performing the above fix for the fans. Somehow Rufus' FreeDOS + ipmicfg didn't seem to work, so I'll try a Linux live image with ipmitool next.
You can use the TrueNAS Shell or the CLI for it.

Honestly, the backplate thing still had me a bit confused initially, but yes, it did come with a piece of metal on the backside and I could mount the Alpine 12 CO no problem.
So you got a backplate without the mounting screws, lucky you I guess.
 