BUILD: Powerful, used home server build for future expansion

Status
Not open for further replies.

TremorAcePV

Explorer
Joined
Jun 20, 2013
Messages
88
Hi guys,

First, the hardware:
  • Dual Intel Xeon X5472 SLANR LGA771 3.0GHz Quad Core CPUs
  • Super Micro Computer X7DWN+ LGA 771/Socket J Intel Motherboard
  • 32GB (8x4GB) HP 398708-061 ECC DDR2 PC2-5300F 667MHz FBD DIMMs
  • NZXT Hale82-M 750w PSU (from gaming PC after PSU upgrade) 80+ Bronze IIRC
  • A single 4TB Seagate consumer HDD (to be upgraded to a WD Red or Seagate NAS drive in the future, and expanded first to a 2 x 4TB RAID 0 for 8TB usable, then to a 4 x 4TB RAID 10, also 8TB usable).
  • Two simple 32GB SSDs (got for very cheap) for L2ARC for the single ZPool.
  • Whatever I can find to sit it in for the case.
As you can probably tell, this is going to be a bit of a Frankenstein's-monster type of machine.

If you are wondering what usage this will be for, I intend to make use of FreeNAS' new Domain Controller feature, run some plugins such as Plex Media Server and Transmission as well as perhaps a Minecraft server (if I can get that going). I've already got the DC up and running (and mostly working) with my current hardware (far weaker than the above) and it's doing pretty good. Some DNS issues as well as lack of support for some basic Domain Controller functionality, but it's a new feature, so I'll take what I can get.

My questions regarding this build are the following:
  1. The SSDs are SATA II. Should I even bother trying to use them as cache? Will they make a difference for read/write speeds for the single HDD setup? Note that I'm not going to be using them once I get to RAID 0 and 10, but that will take time to save up money.
  2. Will the fact that it's DDR2 RAM significantly affect performance, since FreeNAS lives in RAM? Since it's quad-channel, it should be roughly equivalent in speed to DDR3 in dual-channel (if I've done the math correctly). However, I'm not sure whether having 2 CPUs would mitigate that (although with this setup I have 2 separate quad-channel banks, so each CPU could have its own allotted RAM bus).
  3. What are some recommended cases that can fit this motherboard? Note that it is Enhanced Extended ATX, which is bigger than Extended ATX. I'd rather not have a server rack case myself, but I can't imagine there are many normal desktop PC cases that support this standard. If I have little choice, I'd accept getting one.
One other thing: I purchased the RAM, CPUs, and motherboard used off of eBay (from trusted sellers with a money-back guarantee) for $250. This is meant to show what good hardware you can get (for a FreeNAS system) on a budget. For those three components and with this kind of power, I think I did pretty well.

In terms of RAM usage, I'm budgeting 4GB for the DC, 4GB for the OS, and 1GB per TB for 16TB of raw storage (16GB), which should leave me at least 8GB of headroom, with the option to upgrade to 64GB one day (roughly $150 for another 32GB).

I'm open to any other thoughts and considerations about this build. Advice is welcome.

Thanks,
Vitalius.
 

cyberjock

Inactive Account
Joined
Mar 25, 2012
Messages
19,525
My thoughts:

1. You are throwing an L2ARC in there and it's not going to help you (but it could hurt you later because of the added complexity). Clearly you haven't read my noobie guide...
2. That hardware was NOT worth $250. That thing is going to be a power hog. /smh
3. Your quad-channel is not your limiting factor. That 7-year-old CPU is. Yes, that CPU came out in 2007. FSB.. for the fail. My 2010 laptop can outbench that thing!
4. I'd go shopping for a Supermicro case that supports it on ebay.
 

TremorAcePV

Explorer
Joined
Jun 20, 2013
Messages
88
cyberjock said:
My thoughts:

1. You are throwing an L2ARC in there and it's not going to help you (but it could hurt you later because of the added complexity). Clearly you haven't read my noobie guide...
2. That hardware was NOT worth $250. That thing is going to be a power hog. /smh
3. Your quad-channel is not your limiting factor. That 7-year-old CPU is. Yes, that CPU came out in 2007. FSB.. for the fail. My 2010 laptop can outbench that thing!
4. I'd go shopping for a Supermicro case that supports it on ebay.

Thanks for the response.
  1. That's what I figured. Thanks. I have read it. It's just been a while. I assume you can't RAID 0 two L2ARCs.
  2. I live in Texas. Our electricity tends to be cheap. I'm not personally worried about it. $200 per year (roughly based on my guesstimations) is kind of a lot, but for the utility, eh.
  3. But can it outbench 2 of them? /joke Hmm, the FSB is 1600MHz. The obvious question then is: Is that split between both CPUs or do they each get that speed for themselves? I'm not sure, but I imagine if they are separate then it won't be so bad.
  4. Alright, I figured as much. Was just checking.
 

cyberjock

Inactive Account
Joined
Mar 25, 2012
Messages
19,525
FSB is between all CPUs and cores. Now you probably saw "CPUs and cores" and then thought "that's gotta suck". Well, now you see why the FSB is gone. It was removed in Nov 2008 when the first-gen "i7" hit the market as Nehalem.

FSB really hurt multi-CPU/core systems for a while. In case you weren't around back when AMD first started beating Intel in the CPU wars, AMD's chips had no FSB; AMD ditched theirs a few years before Intel did. And let me tell you, removing that FSB made a BIG difference in performance. Some people have theorized that if Intel had designed their CPUs without an FSB at the same time AMD did, Intel's CPUs would have beaten AMD's.
 

TremorAcePV

Explorer
Joined
Jun 20, 2013
Messages
88
cyberjock said:
FSB is between all CPUs and cores. Now you probably saw "CPUs and cores" and then thought "that's gotta suck". Well, now you see why the FSB is gone. It was removed in Nov 2008 when the first-gen "i7" hit the market as Nehalem.

FSB really hurt multi-CPU/core systems for a while. In case you weren't around back when AMD first started beating Intel in the CPU wars, AMD's chips had no FSB; AMD ditched theirs a few years before Intel did. And let me tell you, removing that FSB made a BIG difference in performance. Some people have theorized that if Intel had designed their CPUs without an FSB at the same time AMD did, Intel's CPUs would have beaten AMD's.
Admittedly then, I can't expect the CPUs' performance to be all that good. However, I'm now wondering if it was a worthy trade-off for getting that much RAM. Getting that much ECC RAM any other way would cost a lot more than $200. Most new 32GB sets of ECC DDR3 RAM are $400+ on their own.

And from what I've read and experienced, limited RAM either is or isn't a huge detriment to a FreeNAS system (with little grey area in between).

Oh well. This will be a fun test system then. I'll start saving up for real server hardware in the mean time. Thanks for your knowledge.
 

HoneyBadger

actually does care
Administrator
Moderator
iXsystems
Joined
Feb 6, 2014
Messages
5,112
My thoughts,

Yes, it's going to be a power hog; those are X5400-series Xeons with FB-DIMMs. However, unless you turn on pool encryption, it's likely not going to be as weak as you or cyberjock suggest. They might not be as fast per thread as a modern Core-series chip, but you do have eight of those threads.

Re: those SSDs, what model are they? They might be decent SLOG devices, or you could repurpose them in your desktop or another rig - or even use them as a small, high-performance pool to install jails onto.
 

cyberjock

Inactive Account
Joined
Mar 25, 2012
Messages
19,525
I was just mentioning what I think the limiting factor will be. I wasn't saying it was going to be something that's likely to make him unhappy. For me, the power usage is disgusting.

As for the cost of RAM, your RAM can't be dropped into newer hardware. It's dead-end technology. You also don't likely need 32GB of RAM. For the vast majority of home users 16GB of RAM is more than enough.
 

HoneyBadger

actually does care
Administrator
Moderator
iXsystems
Joined
Feb 6, 2014
Messages
5,112
It was more so the "my 2010 laptop can outbench it" line. Maybe a little hyperbolic there. You'll certainly have him schooled on performance-per-watt, though.

Two 125W TDP Xeons and I'm assuming eight sticks of FB-DIMMs that suck down 10W each at idle ... not on my bill, thanks. OP lives in Texas so electricity is cheap, but don't discount the fact that you're going to be churning out that much extra heat in the summer. Speaking from downtown Houston experience, y'all got more than enough of that already. ;)
 

TremorAcePV

Explorer
Joined
Jun 20, 2013
Messages
88
HoneyBadger said:
My thoughts,

Yes, it's going to be a power hog; those are X5400-series Xeons with FB-DIMMs. However, unless you turn on pool encryption, it's likely not going to be as weak as you or cyberjock suggest. They might not be as fast per thread as a modern Core-series chip, but you do have eight of those threads.

Re: those SSDs, what model are they? They might be decent SLOG devices, or you could repurpose them in your desktop or another rig - or even use them as a small, high-performance pool to install jails onto.

Thank you for giving your input.

Quite. That's what I was basically thinking. Beat the inefficiency by having more cores.

One is a 40GB Mushkin IIRC, and the other is an OCZ 60GB (I thought they were 32GB when listing the specs). I'm not sure of the exact model numbers, but I'll add those when I get home (I'm at work). I like that idea. Maybe the 60GB for L2ARC (if it's even worth it, though probably not) and the 40GB for the jails. Or both for jails.

I was thinking of tinkering with a Minecraft server so maybe have the 60GB dedicated to that one if it won't help as an L2ARC.

cyberjock said:
I was just mentioning what I think the limiting factor will be. I wasn't saying it was going to be something that's likely to make him unhappy. For me, the power usage is disgusting.

As for the cost of RAM, your RAM can't be dropped into newer hardware. It's dead-end technology. You also don't likely need 32GB of RAM. For the vast majority of home users 16GB of RAM is more than enough.

I agree about the power usage. Then again, I also have a 290X machine mining cryptocurrencies while I'm at work, which pulls about the same wattage as this server will, so I'm not that bothered by it.

As I said above though, I'll be saving up for better (both performance & efficiency) server parts later. Give it a year and I'll probably replace this build. But for now...

Correct. I actually happen to know a guy who works for a research organization. They get most of their hardware through donations. They have a machine that uses DDR2 FB-DIMMs and they would like to upgrade, though it isn't urgent, so I have a place to send these later anyway. I'll probably write it off on my taxes or something.

Well, I'm not really an average home user though I understand what you mean. My math:

1GB per TB means I'll need 16GB of RAM eventually as I plan to have 16TB of raw storage in RAID 10 (so 8TB of actual storage).
4GB for the OS itself (as per the manual IIRC)
4GB for the Domain Controller (from the wiki IIRC)

So that's 24GB already. Then there's Jails, the potential MC server (still not 100% certain on that though), and whatever else I decide to toy with. I feel like 8GB is good overhead for me.
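The budget above works out as quick shell arithmetic (the 1GB-per-TB figure is the common rule of thumb, not a hard requirement):

```shell
# Rough FreeNAS RAM budget (1 GB of RAM per TB of raw storage is a rule of thumb)
os_gb=4              # base OS allowance
dc_gb=4              # Domain Controller allowance
raw_tb=16            # planned raw capacity (4 x 4 TB in RAID 10)
storage_gb=$raw_tb   # ~1 GB RAM per TB raw

total_gb=$(( os_gb + dc_gb + storage_gb ))
headroom_gb=$(( 32 - total_gb ))
echo "budget: ${total_gb} GB, headroom on 32 GB: ${headroom_gb} GB"
```

That gives a 24GB budget and 8GB of headroom on a 32GB system, matching the numbers above.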

HoneyBadger said:
It was more so the "my 2010 laptop can outbench it" line. Maybe a little hyperbolic there. You'll certainly have him schooled on performance-per-watt, though.

Two 125W TDP Xeons and I'm assuming eight sticks of FB-DIMMs that suck down 10W each at idle ... not on my bill, thanks. OP lives in Texas so electricity is cheap, but don't discount the fact that you're going to be churning out that much extra heat in the summer. Speaking from downtown Houston experience, y'all got more than enough of that already. ;)


10W each at idle. Huh. I didn't even consider that Fully Buffered means each stick basically has a tiny processor (the Advanced Memory Buffer) on it.

I'd do the math, but guesstimating system usage and efficiency is a bit time-consuming, so let's just say 300W average consistently throughout the year: 0.3kW * $0.10/kWh = $0.03 per hour * 24 * 365 = $262.80 per year in power usage. Eh. Bad, but not terrible. The actual cost will more than likely be between $200 and $300. On par with my mining rig.
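Spelled out as a quick calculation (the 300W average draw and $0.10/kWh rate are the assumptions above, not measurements):

```shell
# Annual power cost at an assumed 300 W average draw and $0.10/kWh
watts=300
cents_per_kwh=10

annual_kwh=$(( watts * 24 * 365 / 1000 ))           # 2628 kWh per year
annual_cost_cents=$(( annual_kwh * cents_per_kwh )) # 26280 cents
printf 'about $%d.%02d per year\n' $(( annual_cost_cents / 100 )) $(( annual_cost_cents % 100 ))
```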

As for the extra heat, that is a valid point. What I lose in the summer, I gain in the winter. Swings and roundabouts. You'd be surprised how warm my mining rig kept my room. Bonus. Considering that, I may move it in there during the winter (my bedroom is separate from the rest of the house, so it isn't connected to the AC/heating).
 

HoneyBadger

actually does care
Administrator
Moderator
iXsystems
Joined
Feb 6, 2014
Messages
5,112
L2ARC in general is worth it if you have a working set larger than your ARC but smaller than your proposed L2ARC, and no easy/economical way to expand primary cache (RAM).

Most home users don't need it unless they're doing virtualization. In your case, with it being used mostly for media storage and transcoding, you won't be using L2ARC unless you plan on watching the same show over and over again.

Minecraft, on the other hand, absolutely loves SSDs. I say put your server on there.
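For reference, attaching or detaching an L2ARC device is a one-line zpool operation either way, so trying it isn't a big commitment; the pool and device names below are illustrative, not from this thread:

```shell
# Attach an SSD as an L2ARC (cache) device to a pool named "tank"
# (pool and device names are illustrative)
zpool add tank cache /dev/ada2

# Detach it again later if it isn't earning its keep
zpool remove tank /dev/ada2
```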
 

TremorAcePV

Explorer
Joined
Jun 20, 2013
Messages
88
HoneyBadger said:
L2ARC in general is worth it if you have a working set larger than your ARC but smaller than your proposed L2ARC, and no easy/economical way to expand primary cache (RAM).

Most home users don't need it unless they're doing virtualization. In your case, with it being used mostly for media storage and transcoding, you won't be using L2ARC unless you plan on watching the same show over and over again.

Minecraft, on the other hand, absolutely loves SSDs. I say put your server on there.

By "Working set", do you mean "Data which you access regularly"? I may be misunderstanding that somewhere.

Assuming I have it right, then that makes sense as to why it's fairly pointless. I'll just store the MC stuff on the 60GB SSD and store other things that I'd prefer were fast on the 40GB. Thanks.
 

HoneyBadger

actually does care
Administrator
Moderator
iXsystems
Joined
Feb 6, 2014
Messages
5,112
Nope, you nailed it re: working set. That's why most home users, who access a variety of different media, aren't a good match - they tend to pull data randomly from their entire pool, which L2ARC can't cache effectively.
 
Joined
Dec 7, 2013
Messages
95
One question that comes to mind: why use hardware that dated? Do you have it lying around gathering dust otherwise? (Don't get me wrong, it does work - I have a Xeon of that generation on an X7 board in my own box, but that was because severe budget restrictions forced me to buy used in order to get server-grade hardware with the money I had.)

As for the Minecraft server, that's quite easy to set up: create a jail, install Java in it, put the server.jar somewhere, and run it. I've done that myself recently; I still need to find out how to autorun the server instead of starting it manually through the console.
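One common way to handle that autorun is an @reboot cron entry inside the jail; this is just a sketch, and the install path and -Xmx value are assumptions, not from this thread:

```shell
# Line to append to /etc/crontab inside the jail.
# The /usr/local/minecraft path and -Xmx2G flag are illustrative.
@reboot root cd /usr/local/minecraft && /usr/local/bin/java -Xmx2G -jar server.jar nogui
```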
 

TremorAcePV

Explorer
Joined
Jun 20, 2013
Messages
88
Quote:
One question that comes to mind: why use hardware that dated? Do you have it lying around gathering dust otherwise? (Don't get me wrong, it does work - I have a Xeon of that generation on an X7 board in my own box, but that was because severe budget restrictions forced me to buy used in order to get server-grade hardware with the money I had.)

As for the Minecraft server, that's quite easy to set up: create a jail, install Java in it, put the server.jar somewhere, and run it. I've done that myself recently; I still need to find out how to autorun the server instead of starting it manually through the console.
Basically the same reason you have this hardware in your machine.

Because it was cheap, but it covered my requirements. One major requirement with FreeNAS using ZFS is that you need lots of RAM, depending on what you plan to do. You want ECC RAM too, and I'm the type that demands it as long as the budget allows.

16GB of DDR3 ECC RAM is around $200 new from what I saw in my research, and that isn't enough for what I want to do. So $400+ just for the RAM.

Basically, this old platform was a cheap way to get 32GB of ECC RAM. That's why I went with it. The CPUs were cheap, so I could get two quad-cores for around half the price of a consumer AMD "8-core" CPU (they kinda aren't 8 cores, but kinda are). And forget server components (a minimum server motherboard and CPU is $400+ as well).

So instead of buying 32GB of ECC RAM and an Opteron + Server motherboard for $800, I got an equivalent system, albeit a bit slower (FSB and all) for $250.

And no, a weaker system wasn't adequate for my uses. I intend to hit this server heavily in some way or another (Domain Controller, various plugins/jails, scrubs, backups, torrents, on-the-fly encoding with Plex, etc.), so fewer bottlenecks = better.


HoneyBadger said:
Nope, you nailed it re: working set. That's why most home users, who access a variety of different media, aren't a good match - they tend to pull data randomly from their entire pool, which L2ARC can't cache effectively.

Gotcha. Thanks a lot.
 