leenux_tux
I have seen a few comments around this issue previously and spotted a ticket in the bug tracker for something similar (https://bugs.freenas.org/issues/4419). That ticket has been closed as fixed; however, I'm not 100% sure it actually is.
I finally found some time to rebuild my FreeNAS server with all of the new components I have purchased over the last few months:
- An extra IBM M1015, so I now have two.
- Five additional 2TB hard drives, so I now have 4 x 1TB dedicated as an iSCSI block device for VMware ESXi and 10 x 2TB in a single RAIDZ2 pool.
- A new rack-mounted case.
- A new rack.

When I use the command line to look at the amount of disk space available, I get a very different number; please see below.
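To be clear about what I'm comparing, the commands I'm running over SSH are along these lines ("tank" is just a placeholder for my actual pool name):

```sh
# "tank" is a placeholder pool name, not my real one.
zpool list tank    # SIZE/ALLOC/FREE here are raw capacity, parity included
zfs list tank      # USED/AVAIL here are usable space, after parity
```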

Interestingly, though, the pool I have for my VMs (which has NOT been destroyed and re-created) also reports an incorrect size, but not by a margin that concerns me.
Even the amount listed via SSH doesn't look right; it looks like too much. I was expecting to lose two drives' worth of space to parity; my rough maths is below.
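For reference, this is the back-of-the-envelope calculation I had in mind, assuming a "2TB" drive is 2 x 10^12 bytes (the way manufacturers count):

```sh
# 10 x 2TB drives in RAIDZ2: two drives' worth of capacity goes to parity,
# leaving 8 x 2TB = 16TB (decimal) of raw data space.
# zfs list reports in binary units (TiB), so convert:
echo "scale=2; 8 * 2 * 10^12 / 1024^4" | bc
# => ~14.55 TiB, before ZFS metadata and any reservation overhead
```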
It's a conundrum, or is it?