Unusable drive space

syogod

Cadet
Joined
Sep 14, 2015
Messages
5
Sorry for the dumb question, but I'm still fairly new to FreeNAS/ZFS/RAIDZ.

I just set up a RAIDZ system using 4x3TB drives and ended up with a total capacity of 7.63TB, which felt a little off to me.

I understand the 3TB drives are actually around 2.73TB due to the whole TiB <> TB thing. By my rudimentary math, I should be at around 2.73 * 3 = 8.19TB. Losing almost 600GB seems like a lot of overhead for the system, but like I said, I'm new, so this could be normal. Am I missing something?
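Here's a quick Python sanity check of that math (a rough sketch only; it models the TB-to-TiB conversion and one drive's worth of RAIDZ1 parity, and none of the filesystem overhead):

Code:
# Expected data space for 4 x 3 TB drives in RAIDZ1.
TIB = 2**40                     # one tebibyte (binary) in bytes
drive_bytes = 3 * 10**12        # one "3 TB" drive in decimal bytes

per_drive_tib = drive_bytes / TIB              # ~2.73 TiB per drive
data_tib = (4 - 1) * per_drive_tib             # RAIDZ1: one drive's worth of parity
print(f"per drive:  {per_drive_tib:.2f} TiB")  # 2.73
print(f"data space: {data_tib:.2f} TiB")       # 8.19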
 

DrKK

FreeNAS Generalissimo
Joined
Oct 15, 2013
Messages
3,630
syogod said:
Sorry for the dumb question, but I'm still fairly new to FreeNAS/ZFS/RAIDZ.

I just set up a RAIDZ system using 4x3TB drives and ended up with a total capacity of 7.63TB, which felt a little off to me.

I understand the 3TB drives are actually around 2.73TB due to the whole TiB <> TB thing. By my rudimentary math, I should be at around 2.73 * 3 = 8.19TB. Losing almost 600GB seems like a lot of overhead for the system, but like I said, I'm new, so this could be normal. Am I missing something?
More than likely the problem is the TiB vs TB comparison.

FreeNAS reports sizes to you in binary-prefix TiB, while the drive manufacturer quotes decimal TB. That gap is about the size of the difference you're reporting.
 

DrKK

FreeNAS Generalissimo
Joined
Oct 15, 2013
Messages
3,630
Code:
root@murmur:~ # units
586 units, 56 prefixes
You have: 7.63 terabyte
You want: byte
  * 8.3892737e+12
  / 1.1919983e-13

Thus would it seem that 7.63 TiB equates to 8.389 trillion bytes. (Note that this version of units treats "terabyte" as a binary unit, 2^40 bytes, which is why the result corresponds to TiB.)
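Double-checking that arithmetic without units (plain Python, nothing FreeNAS-specific):

Code:
# 7.63 TiB expressed in plain bytes -- same figure units printed above.
print(7.63 * 2**40)   # 8389273719930.88, i.e. ~8.3892737e+12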
 

Bidule0hm

Server Electronics Sorcerer
Joined
Aug 5, 2013
Messages
3,710
No, he's right. He has already taken the TB/TiB difference into account (he's just writing TB where he means TiB in his post).

The raw data space is equal to 9 TB, or 8.185 TiB, but he only sees 7.63 TiB.

Some of that is overhead, but I'd guess the rest is because ZFS makes some assumptions when it computes the free space, so the number is probably not that accurate.

Remember that you can't fill the pool past 80 % if you don't want to run into performance problems, so you really have about 6.5 TiB usable.
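If you want to check those numbers, here's a minimal Python sketch (same assumptions as the posts above: one drive of parity, decimal 3 TB drives):

Code:
# Raw data space after RAIDZ1 parity, then the 80 % fill guideline.
TIB = 2**40
raw_tib = 3 * 3 * 10**12 / TIB                     # three data drives -> TiB
print(f"raw data space: {raw_tib:.3f} TiB")        # 8.185
print(f"usable at 80 %: {0.8 * raw_tib:.2f} TiB")  # 6.55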
 

DrKK

FreeNAS Generalissimo
Joined
Oct 15, 2013
Messages
3,630
Bidule0hm said:
No, he's right. He has already taken the TB/TiB difference into account (he's just writing TB where he means TiB in his post).

The raw data space is equal to 9 TB, or 8.185 TiB, but he only sees 7.63 TiB.

Some of that is overhead, but I'd guess the rest is because ZFS makes some assumptions when it computes the free space, so the number is probably not that accurate.

Remember that you can't fill the pool past 80 % if you don't want to run into performance problems, so you really have about 6.5 TiB usable.
Holy crap, what the hell is wrong with me. He had it in blazing huge letters that he already took that into account.

I'm sorry. Ignore me.
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
600GB does sound like a lot. Are the drives reporting their expected size?
 

joeschmuck

Old Man
Moderator
Joined
May 28, 2011
Messages
10,994
How are you getting the "reported" size of the pool? What does "zpool list" show?
 

syogod

Cadet
Joined
Sep 14, 2015
Messages
5
I'll have to wait until I get home to get the exact location/phrasing, but I think it was "View Disks" under "Storage", where I saw they were all at 3TB.
 

joeschmuck

Old Man
Moderator
Joined
May 28, 2011
Messages
10,994
I believe they are all 3TB, but I want to see the reported pool size from the command I referenced; 600GB is a bit extreme to be missing. If "zpool list" shows the same results, then also post "zpool status" and "df" so we can see the configuration of your pool, although I would think any other configuration would give you even larger space issues. Also include a screenshot of the "Storage" tab. Let's just get it all out there so the problem can be identified.
 

syogod

Cadet
Joined
Sep 14, 2015
Messages
5
Sorry this took so long; I forgot about this until now.

Code:
[root@freenas] ~# zpool list
NAME           SIZE  ALLOC   FREE  EXPANDSZ   FRAG    CAP  DEDUP  HEALTH  ALTROOT
Primary       10.9T  1.32T  9.55T         -     6%    12%  1.00x  ONLINE  /mnt
freenas-boot  14.6G   989M  13.7G         -      -     6%  1.00x  ONLINE  -
[root@freenas] ~# zpool status
  pool: Primary
state: ONLINE
  scan: none requested
config:

        NAME                                            STATE     READ WRITE CKSUM
        Primary                                         ONLINE       0     0     0
          raidz1-0                                      ONLINE       0     0     0
            gptid/6baf38cb-5b2e-11e5-bd97-fcaa144f465a  ONLINE       0     0     0
            gptid/6c1055e0-5b2e-11e5-bd97-fcaa144f465a  ONLINE       0     0     0
            gptid/6c63ec7e-5b2e-11e5-bd97-fcaa144f465a  ONLINE       0     0     0
            gptid/6cb1ee48-5b2e-11e5-bd97-fcaa144f465a  ONLINE       0     0     0

errors: No known data errors

  pool: freenas-boot
state: ONLINE
  scan: scrub repaired 0 in 0h0m with 0 errors on Wed Sep  2 03:45:40 2015
config:

        NAME                                            STATE     READ WRITE CKSUM
        freenas-boot                                    ONLINE       0     0     0
          mirror-0                                      ONLINE       0     0     0
            da0p2                                       ONLINE       0     0     0
            gptid/0d60b8a5-3563-11e5-ab45-fcaa144f465a  ONLINE       0     0     0

errors: No known data errors
[root@freenas] ~# df
Filesystem                                                1K-blocks      Used      Avail Capacity  Mounted on
freenas-boot/ROOT/FreeNAS-9.3-STABLE-201509220011          14371989    527325   13844664     4%    /
devfs                                                             1         1          0   100%    /dev
tmpfs                                                         32768      5408      27360    17%    /etc
tmpfs                                                          4096         8       4088     0%    /mnt
tmpfs                                                       2764112     85912    2678200     3%    /var
freenas-boot/grub                                          13851617      6952   13844664     0%    /boot/grub
Primary                                                  7188776182       139 7188776042     0%    /mnt/Primary
Primary/Movies                                           7843475117 654699075 7188776042     8%    /mnt/Primary/Movies
Primary/Music                                            7195917175   7141133 7188776042     0%    /mnt/Primary/Music
Primary/TV                                               7557052188 368276146 7188776042     5%    /mnt/Primary/TV
Primary/jails                                            7188776182       139 7188776042     0%    /mnt/Primary/jails
Primary/.system                                          7188776193       151 7188776042     0%    /var/db/system
Primary/.system/cores                                    7188776949       906 7188776042     0%    /var/db/system/cores
Primary/.system/samba4                                   7188776548       505 7188776042     0%    /var/db/system/samba4
Primary/.system/syslog-f1ae6c68bbe041c7bb38cadeec088781  7188776589       546 7188776042     0%    /var/db/system/syslog-f1ae6c68bbe041c7bb38cadeec088781
Primary/.system/rrd-f1ae6c68bbe041c7bb38cadeec088781     7188776182       139 7188776042     0%    /var/db/system/rrd-f1ae6c68bbe041c7bb38cadeec088781
Primary/.system/configs-f1ae6c68bbe041c7bb38cadeec088781 7188776705       662 7188776042     0%    /var/db/system/configs-f1ae6c68bbe041c7bb38cadeec088781


[Attached screenshot: Free_NAS_Storage.png]
 

joeschmuck

Old Man
Moderator
Joined
May 28, 2011
Messages
10,994
Looking at the results you posted, everything looks fine. I see 10.9TiB of storage, which is spot on where it should be: "zpool list" reports the raw pool size with parity included, so four 2.73TiB drives come out to about 10.9TiB.
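Quick arithmetic to confirm (a sketch; decimal 3TB drives assumed):

Code:
# zpool list's SIZE column counts all four drives, parity included.
TIB = 2**40
pool_tib = 4 * 3 * 10**12 / TIB
print(f"{pool_tib:.1f} TiB")   # 10.9 -- matches the SIZE column in the output above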
 