Unable to decrypt encrypted pools after reboot (GPT partitions gone on ALL drives?)

rfang9524

Cadet
Joined
Jan 3, 2019
Messages
3
I had the exact same issue. I thought about trying the solutions posted above, but having to buy 20 large drives would be quite expensive. Luckily, I was able to fix it after hours and hours of troubleshooting. I hope this helps anyone who has the same issue.



TL;DR:
  1. Use geli to attach and decrypt each drive, one at a time, with the recovery key
  2. For the drives that fail to decrypt, attach them with the encryption key instead
  3. Import the pool, then remount/relocate it if necessary

What Happened
I purchased four new hard drives and added them to the pool as a new vdev. I restarted my server a few hours later. On startup I couldn't decrypt the pool anymore: it prompted with an error stating that four of the (new) drives couldn't be decrypted. I had the recovery key, encryption key, and passphrase from the day before, but none of them worked, and I had not issued or received a new key since. After hours and hours of research and trial and error, I was finally able to decrypt the pool.

Important Things to Note:
  • The encrypted data is on partition 2 (i.e. /dev/da#p2), so you can't just pass /dev/da# to geli
  • da8-da11 are the new drives that failed to decrypt with the recovery key
  • After importing the pool, I had to use zfs to remount it at the correct location in the filesystem
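To make the first note concrete, here's how the device names relate (da8 is just an example disk): geli attach takes the p2 partition, and once attached the decrypted provider appears with an .eli suffix, which is what geli status and zpool report:

```shell
#!/bin/sh
# Name mapping for one disk: raw disk -> GELI provider -> decrypted device.
disk="da8"                    # example disk name
provider="/dev/${disk}p2"     # what you hand to geli attach
decrypted="${disk}p2.eli"     # what shows up in geli status and zpool status
echo "$provider"              # /dev/da8p2
echo "$decrypted"             # da8p2.eli
```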

Here's What I Did
Code:
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da0p2
root@freenas:~ # geli status
     Name  Status  Components
da0p2.eli  ACTIVE  da0p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da1p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da2p2
root@freenas:~ # geli status
     Name  Status  Components
da0p2.eli  ACTIVE  da0p2
da1p2.eli  ACTIVE  da1p2
da2p2.eli  ACTIVE  da2p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da3p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da4p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da5p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da6p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da7p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da8p2
geli: Wrong key for da8p2.
root@freenas:~ # geli attach -k pool_RAID_encryption\ \(3\).key -p /dev/da8p2
root@freenas:~ # geli attach -k pool_RAID_encryption\ \(3\).key -p /dev/da9p2
root@freenas:~ # geli attach -k pool_RAID_encryption\ \(3\).key -p /dev/da10p2
root@freenas:~ # geli attach -k pool_RAID_encryption\ \(3\).key -p /dev/da11p2
root@freenas:~ # geli status
      Name  Status  Components
 da0p2.eli  ACTIVE  da0p2
 da1p2.eli  ACTIVE  da1p2
 da2p2.eli  ACTIVE  da2p2
 da3p2.eli  ACTIVE  da3p2
 da4p2.eli  ACTIVE  da4p2
 da5p2.eli  ACTIVE  da5p2
 da6p2.eli  ACTIVE  da6p2
 da7p2.eli  ACTIVE  da7p2
 da8p2.eli  ACTIVE  da8p2
 da9p2.eli  ACTIVE  da9p2
da10p2.eli  ACTIVE  da10p2
da11p2.eli  ACTIVE  da11p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da12p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da13p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da14p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da15p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da16p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da17p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da18p2
root@freenas:~ # geli attach -p -k pool_RAID_recovery\ \(2\).key /dev/da19p2
root@freenas:~ # zpool import
   pool: RAID
     id: 628020232310xxxxxx
  state: ONLINE
 action: The pool can be imported using its name or numeric identifier.
 config:

    RAID            ONLINE
      raidz1-0      ONLINE
        da12p2.eli  ONLINE
        da13p2.eli  ONLINE
        da14p2.eli  ONLINE
        da19p2.eli  ONLINE
      raidz1-1      ONLINE
        da18p2.eli  ONLINE
        da16p2.eli  ONLINE
        da17p2.eli  ONLINE
        da15p2.eli  ONLINE
      raidz1-2      ONLINE
        da3p2.eli   ONLINE
        da5p2.eli   ONLINE
        da6p2.eli   ONLINE
        da7p2.eli   ONLINE
      raidz1-3      ONLINE
        da0p2.eli   ONLINE
        da1p2.eli   ONLINE
        da2p2.eli   ONLINE
        da4p2.eli   ONLINE
      raidz1-4      ONLINE
        da11p2.eli  ONLINE
        da8p2.eli   ONLINE
        da9p2.eli   ONLINE
        da10p2.eli  ONLINE
    cache
      nvd0p1
root@freenas:~ # zpool import -a
root@freenas:~ # zpool status -v
  pool: RAID
 state: ONLINE
  scan: resilvered 2.57T in 0 days 11:39:28 with 0 errors on Sun Mar  7 17:32:58 2021
config:

    NAME            STATE     READ WRITE CKSUM
    RAID            ONLINE       0     0     0
      raidz1-0      ONLINE       0     0     0
        da12p2.eli  ONLINE       0     0     0
        da13p2.eli  ONLINE       0     0     0
        da14p2.eli  ONLINE       0     0     0
        da19p2.eli  ONLINE       0     0     0
      raidz1-1      ONLINE       0     0     0
        da18p2.eli  ONLINE       0     0     0
        da16p2.eli  ONLINE       0     0     0
        da17p2.eli  ONLINE       0     0     0
        da15p2.eli  ONLINE       0     0     0
      raidz1-2      ONLINE       0     0     0
        da3p2.eli   ONLINE       0     0     0
        da5p2.eli   ONLINE       0     0     0
        da6p2.eli   ONLINE       0     0     0
        da7p2.eli   ONLINE       0     0     0
      raidz1-3      ONLINE       0     0     0
        da0p2.eli   ONLINE       0     0     0
        da1p2.eli   ONLINE       0     0     0
        da2p2.eli   ONLINE       0     0     0
        da4p2.eli   ONLINE       0     0     0
      raidz1-4      ONLINE       0     0     0
        da11p2.eli  ONLINE       0     0     0
        da8p2.eli   ONLINE       0     0     0
        da9p2.eli   ONLINE       0     0     0
        da10p2.eli  ONLINE       0     0     0
    cache
      nvd0p1        ONLINE       0     0     0

errors: No known data errors

  pool: freenas-boot
 state: ONLINE
  scan: scrub repaired 0 in 0 days 04:26:38 with 0 errors on Fri Mar  5 08:15:47 2021
config:

    NAME        STATE     READ WRITE CKSUM
    freenas-boot  ONLINE       0     0     0
      mirror-0  ONLINE       0     0     0
        da21p2  ONLINE       0     0     0
        da20p2  ONLINE       0     0     0

errors: No known data errors
root@freenas:/ # zfs set mountpoint=/mnt/RAID RAID
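To confirm the remount took, you can check the pool's mountpoint property afterwards. A dry-run sketch (it only prints the commands, since they need the live pool; the pool name RAID is from my setup):

```shell
#!/bin/sh
# Dry run: print the follow-up checks to run on the live system.
check_cmd="zfs get -H -o value mountpoint RAID"   # should print /mnt/RAID
mount_cmd="zfs mount -a"                          # mount any datasets that didn't auto-mount
echo "$check_cmd"
echo "$mount_cmd"
```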
 