Water Cooling


Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
Looks like I underestimated PLX's current lineup. They even have an immense 96-lane switch that supports 24 ports. That's nearly enough to support 24 drives...
Looking at what sizes they have available, I'd guess they're using two non-cascaded 64-lane switches, each directly connected to the host with up to 16 lanes (probably x8 or x4 if they cheaped out).
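Rough lane math for that guess (just a sketch; the uplink widths tried here are my assumption, not anything from a datasheet):

# Back-of-the-envelope lane budget for the guessed topology: two 64-lane
# switches, each with its own uplink to the host, fanning out to x4 NVMe
# drives. The uplink widths are assumptions, not published specs.

def drives_per_switch(switch_lanes, uplink_lanes, lanes_per_drive=4):
    """Lanes left over after the host uplink, divided among x4 drives."""
    return (switch_lanes - uplink_lanes) // lanes_per_drive

for uplink in (16, 8, 4):
    per_switch = drives_per_switch(64, uplink)
    print(f"64-lane switch, x{uplink} uplink: {per_switch} drives per switch, "
          f"{2 * per_switch} across both")

With full x16 uplinks that works out to 24 x4 drives across the pair; dropping to x8 or x4 uplinks frees up a few more downstream lanes at the cost of host bandwidth.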
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
Someone looking to cut out the SAS middleman. Haha, awesome. Obviously that sort of platform could be very useful for certain types of applications, but we're going to need a faster layer than ZFS for storage redundancy if we want to be able to take advantage of the potential speeds something like that can offer.
 

Mlovelace

Guru
Joined
Aug 19, 2014
Messages
1,111
Certainly one possibility. I mean, look at this, this is crazy:

http://www.supermicro.com/products/system/2U/2028/SSG-2028R-NR48N.cfm

But on the other hand, just making *all* the storage faster isn't necessarily the way to go. Storage tiering is a thing for a reason.



True, but one of my goals in life is to take every optical disc here, create an ISO on the fileserver, and then never need to go hunting for a particular magic disc ever again.
Hmm, 48 NVMe SSDs (1.2TB) @ $1,100 ea + dual Xeon E5-2699 v3 @ $4,900 ea + 1.5TB RAM (24x 64GB DDR4 LRDIMMs @ $1,200 ea) + $6,300 for the case = $97,700
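Same numbers as a quick sanity check (ballpark street prices from above, nothing official):

# Rough BOM total using the ballpark prices quoted above (not official pricing).
parts = {
    "48x 1.2TB NVMe SSD @ $1,100 ea": 48 * 1100,
    "2x Xeon E5-2699 v3 @ $4,900 ea": 2 * 4900,
    "24x 64GB DDR4 LRDIMM @ $1,200 ea": 24 * 1200,
    "Chassis": 6300,
}
for item, cost in parts.items():
    print(f"{item:36s} ${cost:>7,}")
print(f"{'Total':36s} ${sum(parts.values()):>7,}")  # $97,700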

I'd rather have a blade center :)
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
I'd rather have a blade center :)

Me too. Generally speaking I've never really found a useful application for the fancy stuff. Almost everything I do is network-centric, so the instant we're talking about a shelf of devices that is interfaced to the system at 120Gbit/sec, but only has 40Gbit connectivity options, I have a little trouble thinking up what purpose it serves.
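Putting rough numbers on that mismatch (using the figures above; real-world throughput would be lower still):

# The shelf's host-facing bandwidth versus the network uplink, as quoted above.
# These are the round numbers from the post, not measurements.
shelf_gbit = 120  # host-side interface to the drive shelf
nic_gbit = 40     # fastest network option available on the box

print(f"Shelf interface: ~{shelf_gbit} Gbit/s, network egress: {nic_gbit} Gbit/s")
print(f"Roughly {shelf_gbit / nic_gbit:.0f}:1 oversubscription; only about "
      f"{nic_gbit / shelf_gbit:.0%} of the shelf's bandwidth is reachable over the wire")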

A blade center, though, that could make sense... though I tend to do them as discrete machines, because I hate the idea of one single chassis letting out the magic smoke which makes them run.
 

Csaba Gajdos

Cadet
Joined
Oct 12, 2015
Messages
3
Different and fun? Without a doubt! I envision a huge amount of inconvenience if you ever need to change out a hard drive.
As far as I know, they don't make a hot swap hard drive water block. o_O
Not to mention, a mixture of Grinch and frivolity would be JUST WRONG man! :p
That's a fair point. The main reason one would build something like this is the FUN. I agree with jgreco on that.
Going with the classic air-cooled solution would be very boring but safe (though you wouldn't learn anything new about heat and water-cooling systems along the way).
 

Fuganater

Patron
Joined
Sep 28, 2015
Messages
477
I love my Case Labs box ;)
Ya it is so awesome. I just re-purposed this case to be my new gaming tower.

Wow. Could you be a bit more specific?
I see that it could be a pain to change any hardware parts in the build, but other than that, what drawbacks did it have?
Why did you choose to build it this way?
Well firstly I was using Windows 7 as a "server". Worked fine for dishing out media to my PS3 and holding my data but that was about all it could do. I had to completely change out hardware because I was using gaming gear instead of server grade gear.
 

BigDave

FreeNAS Enthusiast
Joined
Oct 6, 2013
Messages
2,479

Ya it is so awesome. I just re-purposed this case to be my new gaming tower.
Two weeks ago, I tore down my Mercury S8 water cooled gaming rig and re-purposed it to hold my FreeNAS hardware. lol
[Attached image: IMG_1517.JPG]

I have room for another 4in3 hotswap unit, then I can fit 6 more hard drives :D
 