19" Racks for home...

Status
Not open for further replies.

rvassar

Guru
Joined
May 2, 2018
Messages
972
Someone brought up the Dell R510/520 in another thread... I had forgotten about these. Not a bad configuration for a home NAS, except for the rack mount. It got me wondering what it would take to build a noise-managed 19-inch half-rack for home use, containing no more than two 2U servers, a switch or two, and a UPS. Anyone have any thoughts? My primary concern is noise. I have to be able to do an hour or three on the phone at a five-foot radius from the rack, and work an 8-hour day without going crazy.

I've been a software engineer for the last 20 years... My SysAdmin "hammock between the racks" days are well in the past. Thoughts?
 

Inxsible

Guru
Joined
Aug 14, 2017
Messages
1,123
Build your own like I did using these plans:

https://tombuildsstuff.blogspot.com/2014/02/diy-server-rack-plans.html


I built an open rack since my rack was going to be in a closet and I didn't want to enclose it because I was worried about adequate cooling for my equipment.

You can add a bit of soundproofing in the cabinet to deaden the sound.
 

Stux

MVP
Joined
Jun 2, 2016
Messages
4,419
It is possible to design a rack mounted system to be virtually silent.
 

rvassar

Guru
Joined
May 2, 2018
Messages
972
Build your own like I did using these plans:

https://tombuildsstuff.blogspot.com/2014/02/diy-server-rack-plans.html

You can add a bit of soundproofing in the cabinet to deaden the sound.

That's almost exactly what I was thinking, except I was pondering full enclosure with some kind of filtered air intake in the bottom, and baffled duct exhaust at the top using 120mm fans. Use carpet in the air ducts and rubber sheet to mount the side panels to dampen the noise radiation thru the wood.
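For sizing those 120mm exhaust fans, there's a common HVAC rule of thumb (not from this thread, and assuming roughly sea-level air): CFM ≈ 3.16 × watts ÷ allowed temperature rise in °F. A minimal sketch, with the ~400 W load being a made-up example figure:

```python
# Back-of-the-envelope exhaust airflow for an enclosed rack.
# Rule of thumb for sea-level air: CFM ~= 3.16 * watts / delta_t_f,
# where delta_t_f is the allowed rise of exhaust air over intake
# air, in degrees Fahrenheit. (Approximation, not a design spec.)

def required_cfm(heat_watts: float, delta_t_f: float) -> float:
    """Airflow (cubic feet per minute) needed to carry heat_watts
    of dissipation at a delta_t_f degree F air temperature rise."""
    return 3.16 * heat_watts / delta_t_f

# Hypothetical load: two 2U servers + switch + UPS losses ~400 W,
# allowing a 20 F rise through the cabinet:
print(f"{required_cfm(400, 20):.1f} CFM")  # ~63 CFM
```

That's in the range a pair of quiet 120mm fans can move, which is why the baffled-duct idea is plausible as long as the ducts don't choke the flow.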
 

Inxsible

Guru
Joined
Aug 14, 2017
Messages
1,123
That's almost exactly what I was thinking, except I was pondering full enclosure with some kind of filtered air intake in the bottom, and baffled duct exhaust at the top using 120mm fans. Use carpet in the air ducts and rubber sheet to mount the side panels to dampen the noise radiation thru the wood.
If you do go this route, you might want to use the square-hole rack uprights instead of the round holes that Tom uses in his build. I found them here:
http://www.starcase.com/Steel_and_aluminum_rack_rail_s/388.htm

I went with the Q-series (L-shaped) instead of the Z-series (? shaped). Didn't find much info on the advantages/disadvantages between the two. Starcase also sells them on Amazon in whatever size you want. I built a 16U, since that was more than sufficient for all my equipment -- pfSense router, switch, patch panel, 2U FreeNAS, 1U VM server, PDU, 1U shelf for the modem and AP. It has left me enough space to add a UPS, which is the only thing missing from my setup that I must have. A 12U would have sufficed, but I put in an additional 4U for future expansion in case I add another box for something.
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
It is possible to design a rack mounted system to be virtually silent.

Yes, but you aren't likely to do that with current or recent enterprise offerings, because they value density above all else, which means that all the components are packed together and are reliant on static pressure differential to force air through the thing.

We've built a lot of 4U gear here over the years that's near dead silent, because it only requires modest airflow from a few gentle 120mm fans. Of course, it isn't dissipating much power and isn't crammed full of 24 3.5" drives...
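As a rough sketch of why a few gentle fans stay quiet: incoherent noise sources add on a log scale, so n identical fans only add 10·log10(n) dB over one fan (textbook acoustics, not a claim from this thread, and the 20 dBA per-fan figure is just an assumed example):

```python
import math

def combined_dba(levels_dba):
    """Combine incoherent sound sources (e.g. several fans measured
    individually in dBA) into one overall level:
    L_total = 10 * log10(sum of 10^(L_i/10))."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_dba))

# Four quiet 120mm fans at ~20 dBA each: doubling the count costs
# only ~3 dB, so four fans land around 26 dBA -- still near silent.
print(round(combined_dba([20, 20, 20, 20]), 1))  # 26.0
```

The flip side is that one screaming 40mm server fan dominates everything, which is why the dense 1U/2U gear is so hard to tame.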
 

rvassar

Guru
Joined
May 2, 2018
Messages
972
If you do go this route, you might want to use the square-hole rack uprights instead of the round holes that Tom uses in his build. I found them here:
http://www.starcase.com/Steel_and_aluminum_rack_rail_s/388.htm

I went with the Q-series (L-shaped) instead of the Z-series (? shaped). Didn't find much info on the advantages/disadvantages between the two.

I was thinking the square holes as well, as virtually all the equipment I'd be likely to run would have been designed for that style of rail. I'm not sure what the Z-style rail is used for. I have used C-channel racks in telco/network closets as single column racks. Routers and switches usually just hang from a center mounted L bracket, and rack shelves are added for the oddball stuff. The Z-rail initially looks similar, but I'm guessing has some other application. Maybe travel case enclosed racks? In which case it might be easier to use for my application.
 

rvassar

Guru
Joined
May 2, 2018
Messages
972
Yes, but you aren't likely to do that with current or recent enterprise offerings, because they value density above all else, which means that all the components are packed together and are reliant on static pressure differential to force air through the thing.

We've built a lot of 4U gear here over the years that's near dead silent, because it only requires modest airflow from a few gentle 120mm fans. Of course, it isn't dissipating much power and isn't crammed full of 24 3.5" drives...

That's one of the reasons I've stuck with the desktop tower format for so long. But that same push for density has altered the market in second-hand gear. The systems that can be usefully deployed for FreeNAS & ESXi duty are getting kind of old / rare, and hard to come by. Used rack servers go at a discount. Compare the prices between a Dell T7500 and an R510 on eBay. They're practically the same machine, yet there's a 4x price spread because the T7500 is quiet and has PCIe x16 GPU slots.

I have considered trying to modify a rack mount server with water cooling so I could cut the static pressure requirement, but then you find yourself hacking on their fan controllers to fool them about the speed of the fans, and running tubing thru or around slide-back lids, etc...
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
One of the big issues is that there isn't this fantastic race forward any longer. At the beginning of the 1990's we had just introduced the 486DX-33 and we entered the 2000's with the Pentium III's, and the difference in capability was ... massive. By way of comparison, the Sandy Bridge cores we were buying back in ~2011 (7.5 years ago) are still about 80-85% of a modern core. Some of us are holding on to gear longer.

I've got some R510's here in the 12-bay variant and they're kinda sucky and watt-hungry for NAS use. A lot of the newer stuff, though, people still recognize has some value. Oddly it's been used PC desktops that have been cheap lately.
 

rvassar

Guru
Joined
May 2, 2018
Messages
972
One of the big issues is that there isn't this fantastic race forward any longer. At the beginning of the 1990's we had just introduced the 486DX-33 and we entered the 2000's with the Pentium III's, and the difference in capability was ... massive. By way of comparison, the Sandy Bridge cores we were buying back in ~2011 (7.5 years ago) are still about 80-85% of a modern core. Some of us are holding on to gear longer.

I've always held on to my gear longer than average. Consider... At the beginning of the 90's the SPARCstation 2 had something like a 4-to-1 performance advantage over that 486, running at pretty much the same clock speed. Back in 2000 I had an UltraSPARC 1 on my desk at work, and a Sun 690MP with two 75MHz SuperSPARC II's at home, running in an old Sun 4/110 single-slot deskside VME case. It was the PIII where Intel & AMD finally pulled ahead on clock rates and started beating the RISC chips. The P3/P4 also brought in the pipelines and compiler tricks that made the differences in instruction sets moot, and the advancements in x86 MMUs finished them... RISC lost after that because the vendors were fab-less and always one integration generation behind. So I view the hardware life-cycle model of the late 90's thru 2011 as kind of an aberration. The Internet boom provided R&D funding that allowed Intel to wrap up the Moore's Law game. Now we're back to features, but most of the players have quit.

I've got some R510's here in the 12-bay variant and they're kinda sucky and watt-hungry for NAS use. A lot of the newer stuff, though, people still recognize has some value. Oddly it's been used PC desktops that have been cheap lately.

Which is why my FreeNAS is running on an OptiPlex 790 at the moment. But I suspect this is a short-lived time. Desktops are becoming unicorns. Laptops and SFF clinic/POS terminals are the bulk of end users' "desktop" machines now. Since I'm looking to move to something with ECC & more bays, I'm going to either have to bite the bullet and order new, or convert to rack mount. I actually have an old PowerEdge SC1430 tower, but it's 10 years old, and I can hear it across the house thru closed doors. The thing that gives me pause on a new tower-format server like a Dell T30 is that I have no idea if it's any quieter, and it only supports a couple of 3.5" drives...


I'm actually thinking I'll go ahead and build the rack enclosure, somewhat like the plans above, and use the SC1430 to prove out the noise abatement. If it makes that tolerable, I can probably run a 2U rack mount of some sort.
 

pro lamer

Guru
Joined
Feb 16, 2018
Messages
626
water cooling so I could cut the static pressure requirement, but then you find yourself hacking on their fan controllers
Why is hacking needed?

Is it because the water cooling system's software (controlling the fan and pump speeds) is designed for M$ Windows, and FreeNAS cannot deal with both pumps and fans?
Or because the water cooling system's fans are much different from regular fans? (If they are.)
 

rvassar

Guru
Joined
May 2, 2018
Messages
972
Why is hacking needed?

Is it because the water cooling system's software (controlling the fan and pump speeds) is designed for M$ Windows, and FreeNAS cannot deal with both pumps and fans?
Or because the water cooling system's fans are much different from regular fans? (If they are.)

I literally meant both types of hacking... Hacking holes in the chassis to plumb the liquid cooling blocks on the CPUs. But beyond the mechanical integration, rack mount servers have highly integrated cooling systems, with IPMI controllers that are aware of, and report failures in, each individual fan, etc... You can't just delete the fans you don't want. You have to work around the fan reporting, which will pause a reboot, etc... And even if you manage to get rid of the complaining, you don't really know if the PSUs are getting their required airflow. There are usually no gaps in the "wall of fans" behind the front of the server. You pull one, and the remaining ones fail to make the designed number of inches of water-column pressure difference, because the air will blow back thru the gap. It's not worth the effort.


On edit: Apologies for using the arcane US engineering unit "inches water column". Millimeters of Mercury (aka 'Torr') would probably be a better unit to reply to our European friends. 1 inch W.C == ~1.866 mmHg (Torr).
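The conversion above can be checked mechanically. A minimal sketch, assuming the common 60 °F reference for the inch of water column (1 inWC ≈ 248.84 Pa, 1 mmHg ≈ 133.322 Pa), which is where the ~1.866 figure comes from:

```python
# Pressure unit conversions for the figures above.
PA_PER_INWC = 248.84    # 1 inch water column (60 F reference), in pascals
PA_PER_MMHG = 133.322   # 1 mmHg (Torr), in pascals

def inwc_to_mmhg(inwc: float) -> float:
    """Inches of water column -> millimeters of mercury (Torr)."""
    return inwc * PA_PER_INWC / PA_PER_MMHG

print(round(inwc_to_mmhg(1.0), 3))  # 1.866
```

(The constant shifts slightly with the water reference temperature — about 249.09 Pa at 4 °C — so quoted conversions vary in the third decimal.)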
 

pro lamer

Guru
Joined
Feb 16, 2018
Messages
626
You have to work around the fan reporting, which will pause a reboot, etc...
Does it refer to rackmount-designed motherboards rather than ATX-compatible mobos? May I use water coolers with a Supermicro X9DRL-I mobo more easily than with, e.g., an X9DRW-iF?
 

rvassar

Guru
Joined
May 2, 2018
Messages
972
Does it refer to rackmount-designed motherboards rather than ATX-compatible mobos? May I use water coolers with a Supermicro X9DRL-I mobo more easily than with, e.g., an X9DRW-iF?

Now... That I couldn't say. I'm thinking about utilizing second-hand equipment sourced from local resellers. I live in Pflugerville, Texas, and drive past the Dell HQ in Round Rock every day on my way to work. :)
 

Linkman

Patron
Joined
Feb 19, 2015
Messages
219
Which is why my FreeNAS is running on an OptiPlex 790 at the moment. But I suspect this is a short-lived time. Desktops are becoming unicorns. Laptops and SFF clinic/POS terminals are the bulk of end users' "desktop" machines now. Since I'm looking to move to something with ECC & more bays, I'm going to either have to bite the bullet and order new, or convert to rack mount. I actually have an old PowerEdge SC1430 tower, but it's 10 years old, and I can hear it across the house thru closed doors. The thing that gives me pause on a new tower-format server like a Dell T30 is that I have no idea if it's any quieter, and it only supports a couple of 3.5" drives...

If the T30 is anything like the prior-gen T20 or the HPE ML10 Gen9, then it's practically silent except at boot (for the HPE, 100% fans for a few seconds) or when going 100% CPU (you can hear the CPU fan sometimes). They're limited in the number of HDDs they can handle, but great value for server hardware and ECC capability (though you sacrifice IPMI / iDRAC / iLO capability), and could be swapped into a larger case with some effort.
 


joeinaz

Contributor
Joined
Mar 17, 2016
Messages
188
For a home rack for multiple servers and storage, you might look at an IBM Office Enablement Kit (part # 44X2077). It is a self-contained 11U rack on wheels, with a front door and foam dampening in the rear. I once deployed one with 6 servers, 24 disks, a network switch, and a UPS; this system was quiet enough to sit next to a receptionist's desk. The best part is it looks more like a metal cabinet than a computer rack. They are hard to find because they are discontinued, but if you are looking to deploy a FreeNAS box and a number of servers in an enclosure that would be presentable even in your living room, this might be a solution.
 

rvassar

Guru
Joined
May 2, 2018
Messages
972
For a home rack for multiple servers and storage, you might look at an IBM Office Enablement Kit (part # 44X2077).

Found one on eBay... $400 plus $250 for shipping! I can buy a new Dell T30 for $549.

They do look nice though.
 

IQless

Contributor
Joined
Feb 13, 2017
Messages
142
If noise is the main issue, then you do have the APC NetShelter CX line. BUT it will cost you a kidney and a firstborn.
 

Inxsible

Guru
Joined
Aug 14, 2017
Messages
1,123
Found one on eBay... $400 plus $250 for shipping! I can buy a new Dell T30 for $549.

They do look nice though.
What size are you looking for?

Here are a few 12U options that are not too highly priced. You'd have to research a bit into each to make sure it has all the features you want, like casters, glass vs. metal doors, and most importantly a depth that fits your deepest piece of equipment.
  1. https://www.ebay.com/itm/12U-19-Wal...689865?hash=item41e0850b89:g:yZYAAOSwUwla4Nja
  2. https://www.ebay.com/p/Raising-8u-S...g-and-Data-System/2212762420?iid=152682399938
  3. https://www.ebay.com/p/12u-Wall-Mou...-Glass-Door-black/9008557274?iid=352200813402
  4. https://www.ebay.com/itm/12U-Wallmo...Network-Rack-Locking-Glass-Door-/182871721061
  5. https://www.ebay.com/itm/StarTech-1...404305&hash=item5b4142c120:g:N~cAAOSweEFbESww
  6. https://www.ebay.com/itm/12U-Wall-M...255100?hash=item1edf5a497c:g:1iIAAOSw-H1a3za0
  7. https://www.ebay.com/itm/12U-Wall-M...hash=item41e261472e:m:mA_1b0ehKILVsXOU5CWEomQ

I didn't order them by price, but they range from $79.99 - $202.99. Above $200, though, I feel you can build your own and get a better configuration based on the depth and height you want.

It cost me about $121.74 to build my own 4-post open rack. This included the square-hole uprights + lumber + pocket screws + supplies (glue, screws, etc.). Other than the uprights ($90 for 16U 4-posts), everything else was in single digits.

Enclosing it with plywood + fans + doors + hinges, etc., would cost more.
 