Migrating to FreeNAS & Pool Flexibility/Parity

Status
Not open for further replies.

Apocrathia

Cadet
Joined
Nov 14, 2013
Messages
3
I've used FreeNAS several times in the past, in production environments, and I've had awesome experiences with it. Well, I'm looking into it as an option for my next home NAS. Here's where I am now:

Currently, I've been running all of my operations with a Drobo attached to a Mac Mini. It's a setup I've seen done a million times, and it's been running okay since 2009. The Drobo is getting slower and slower, and the Mac Mini has better things to do than to wait on the Drobo all the time.

So, fast forward to the past few months: I've been searching, building VMs of different NAS OSes to test with, and speccing out systems to build a new NAS. I considered another Drobo, but the proprietary filesystem just scares me (I've had one crash before and I lost EVERYTHING [and their support sucks]). Same with Synology (plus they're also painfully expensive). I looked at FreeNAS first, but it didn't have the RAIDZ flexibility that I wanted (mixed drive sizes, growing and shrinking, etc.). So, I ended up settling on unRAID...

Alright, so I've got this little mini-ITX system: dual-core Atom, 4GB RAM, a Lian Li PC-Q25B case (sweet little case, btw), and some drives. Now, here's the trouble: all of my main drives are still in the Drobo (4x2TB). I have a single 2TB drive free, and a 256GB SSD that I intended to use as cache. So, the idea here is that I would slowly move data to the new NAS, rsync to verify, delete it from the Drobo, free a drive, put it in the new NAS. Cool. This should work.

Well, unRAID... *ahem* ReiserFS decided that something didn't look right, and data was corrupted. (It hasn't made me want to kill my wife yet, but luckily I don't have one.) Ugh. Pull the array into maintenance mode, run reiserfsck, and it reports clean. Well, what the f**k happened to my data? I can only mount the drive read-only. Useless. I searched their forums, found people with the same issues, and I've just lost all faith in ReiserFS (what little I had to begin with). The OS itself was already a convoluted mess (I never did like Slackware). Thankfully, everything is still on the Drobo. Okay, back to FreeNAS; let's see if we can make this work.
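In case it helps anyone picture it, the copy-verify-delete loop I have in mind looks roughly like this (the directories here are throwaway stand-ins for the Drobo mount and the NAS share, just to sketch the idea; `rsync -c` forces checksum comparison instead of trusting timestamps):

```shell
#!/bin/sh
# Stand-in directories for the Drobo mount (SRC) and the new NAS share (DST).
SRC=$(mktemp -d)/
DST=$(mktemp -d)/
echo "some file contents" > "${SRC}demo.txt"

# 1. Copy the data over.
rsync -a "$SRC" "$DST"

# 2. Verify with checksums: -c re-reads and hashes every file on both sides,
#    -n (dry run) only reports differences instead of copying anything.
#    Empty itemized output means the trees match.
if [ -z "$(rsync -acn --itemize-changes "$SRC" "$DST")" ]; then
    echo "copies verified, safe to delete source"
else
    echo "mismatch found, NOT deleting source" >&2
fi
```

Only after the verify pass comes back clean would I delete the source copy and pull the freed drive out of the Drobo.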

That's my situation: I have a single drive at the moment, and I will add new ones as they become available. I understand that I can go ahead and set up the first drive as a stripe and the SSD as a cache. Now, when I add drives, I can add them as stripes. What if I want to replace one? Can I shrink the pool, take the drive out, add the new one, and expand the pool to the new size? How does that work? What about parity with the striped drives? I know that unRAID gave me a single parity drive, which was okay. I'm sifting through the mountains of documentation to try to catch up, but I'm hoping someone can provide a concise answer.

Holy wall of text, batman!
TL;DR: Can I get parity when I stripe drives? How flexible is a ZFS pool?

Also, yay first post.
 

cyberjock

Inactive Account
Joined
Mar 25, 2012
Messages
19,526
Hate to burst your bubble on your first post, but this build has "fail" written all over it.

4GB of RAM - too little; 8GB is the minimum for ZFS.
Mini-ITX - will probably end badly as soon as you start expanding, because you have only 1 PCIe slot.
CPU - too weak a performer if you're after performance, not to mention that board maxes out at either 4GB or 8GB of RAM, whereas ZFS needs 8GB minimum.
256GB SSD as cache - not unless you have at least 64GB of RAM (a 1:5 RAM-to-L2ARC ratio by my calc). Hint: a cache drive is not a substitute for RAM, period.

You can't add single disks to an existing RAIDZ vdev; the only way to add one disk at a time to a pool is as a new non-redundant stripe (aka RAID0).
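To make that concrete, here's roughly what those pool operations look like from the shell. These are illustrative only: "tank" and the ada1-ada4 device names are placeholders, and you should not run any of this on a box holding data you care about.

```
# Start with one disk: a single-disk, zero-redundancy pool.
zpool create tank ada1

# "Adding a drive" later just bolts on ANOTHER non-redundant stripe vdev.
# One dead disk now kills the whole pool -- this is the trap.
zpool add tank ada2

# Parity only exists if you create a RAIDZ vdev up front, with all of
# its member disks present at creation time:
#   zpool create tank raidz ada1 ada2 ada3 ada4

# You CAN replace a disk in place, and the pool grows once every disk
# in the vdev has been swapped for a bigger one (with autoexpand on):
zpool set autoexpand=on tank
zpool replace tank ada2 ada3

# An SSD can be attached as L2ARC cache -- but it only helps if you
# already have enough RAM feeding it:
zpool add tank cache ada4

# Note what's missing: there is no "zpool shrink", and vdevs cannot
# be removed from a pool once added.
```

That last point is the answer to your "shrink the pool, take the drive out" question: ZFS doesn't do that.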

You should read through my presentation (link in my sig), read the manual (and I mean read it, not flip to the install chapter while you're installing), and check out every sticky in this forum. Those stickies are words of warning from others who have lost their data permanently. You need to slow down and do this right, as many things, once done, can never be undone.

Unfortunately, if you don't slow down, you're probably going to lose your data and not even understand what you did wrong until you realize it's gone for good. There are no recovery tools for ZFS, so if you lose your data, it is really lost.
 

Apocrathia

Cadet
Joined
Nov 14, 2013
Messages
3
I actually read through your PowerPoint right after posting this and just went "well... f**k...". It looks like I'm going to have to keep searching for a NAS platform that I can use. I just need something scalable with some data protection.
 

cyberjock

Inactive Account
Joined
Mar 25, 2012
Messages
19,526
Yeah, probably the best choice unless you are ready to make the jump to the "big boy" NAS.
 

Apocrathia

Cadet
Joined
Nov 14, 2013
Messages
3
And every time I've used FreeNAS, it's been in large production environments where I had a lot of horsepower to throw at it. (I've been getting spoiled by NetApp for the past couple of years now, though.) *sigh* Oh well... Thanks for doing that PowerPoint. It really cleared up a ton.
 

cyberjock

Inactive Account
Joined
Mar 25, 2012
Messages
19,526
Yeah. FreeNAS is one of those things that really works best with larger systems than your "average home user" needs. Some people do it for the e-penis. Some do it for the experience. Others have a genuine need to use it because a hardware RAID controller is just too damn expensive. :p
 