freenas-boot pool recovery issues

Status
Not open for further replies.

ChicagoBud

Cadet
Joined
Jun 21, 2016
Messages
8
Sort of leaving this here for anyone else who hits this issue, and maybe for someone to check what I did. Specifically, is it OK to have a smaller boot partition (100M) than what the GUI now tries to create (260M)?

TL;DR: Recovery was a pain. The GUI didn't work. It seems FreeNAS may have changed the boot-device partitioning scheme. I fixed it from the command line.

I'm on FreeNAS-11.1-RELEASE. Someone physically took (stole?) one of the two USB flash drives out of one of my FreeNAS servers. Luckily(?) I had set up a mirror, so the server hasn't stopped working. Unfortunately, I had trouble recovering from this even though the spare flash drive I was using to replace the missing one was the same brand and (so I thought) the same size.

The first issue I ran into was that even though the old drive was physically removed, it was still part of the freenas-boot pool. I tried to remove it from the pool via the GUI, and that failed with a Python traceback which I did not save (sorry). Google indicated that I could detach it from the command line, so I did that and the old drive was gone from the GUI - yay.
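
For anyone following along, the detach looked roughly like this (a sketch, not my exact session; the name or numeric GUID that zpool status reports for your missing drive will differ, and "da18p2" below is just a placeholder):

Code:
```shell
# Identify the missing/UNAVAIL member of the boot pool first.
zpool status freenas-boot

# Then detach it by the device name (or numeric GUID) that
# zpool status reports for the missing drive.
zpool detach freenas-boot da18p2
```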

Then I was trying to attach the new disk to the boot pool. Doing that from the GUI, I was getting:

Code:
[MiddlewareError: Command '('gpart', 'add', '-t', 'freebsd-zfs', '-i', '2', '-a', '4k', '-s', '31973138432B', 'da17')' returned non-zero exit status 1.]


Hmm. It turns out the drives are not exactly the same size, and there appears to be a new partitioning scheme for boot devices:

For me, da16 is the original (good) device in the pool and da17 is the replacement:

Code:
root@storage03:~ # gpart show da16 da17
=>      34  62652349  da16  GPT  (30G)
        34    204800     1  efi  (100M)
    204834         6        - free -  (3.0K)
    204840  62447536     2  freebsd-zfs  (30G)
  62652376         7        - free -  (3.5K)

=>      40  62668720  da17  GPT  (30G)
        40    532480     1  efi  (260M)
    532520  62136240        - free -  (30G)
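
A quick sanity check on those numbers (plain sector arithmetic, assuming 512-byte sectors; the figures come straight from the error message and the gpart output above):

Code:
```shell
# Bytes the GUI tried to allocate for partition 2, converted to sectors:
echo $((31973138432 / 512))    # 62447536 - exactly the size of da16p2

# Sectors left on da17 after the new 260M efi partition:
echo $((62668720 - 532480))    # 62136240 - less than the 62447536 needed
```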


The 2nd partition failed to get created because there wasn't enough space left on the disk, due to the larger size of the 1st partition. I ended up deleting the first partition and recreating both partitions with:

Code:
gpart show da16 da17
gpart delete -i 1 da17
gpart add -b 40 -s 204800 -t efi da17
gpart add -t freebsd-zfs -s 62447536 da17


basically matching the sizes of the partitions on da16, even though they don't start at the same place on the device. Then I attached the new device to the pool and updated the boot code in the first partition (I hope):

Code:
zpool attach freenas-boot /dev/da16p2 /dev/da17p2
gpart bootcode -b /boot/pmbr -p /boot/gptzfsboot -i 1 da17
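
One extra step worth doing after the attach (not in my original notes, just a sanity check): watch the resilver finish before trusting the new mirror.

Code:
```shell
# Both da16p2 and da17p2 should show ONLINE once the (small)
# resilver completes; rerun until "resilvered" appears.
zpool status freenas-boot
```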


Hopefully this will be good even though partition 1 is smaller than the new default. I'd appreciate any feedback on this. -- Bud
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
It's probably fine. I do wish more care had been taken when making this change, though.
 

ChicagoBud

Cadet
Joined
Jun 21, 2016
Messages
8
I assume the change was made to accommodate a larger bootcode. Is there anything you are aware of in the pipeline that would make the 100M size too small?

> I do wish more care had been taken when making this change, though.

Yeah, I feel I'm pretty experienced and have the confidence to do this sort of CLI work, but I would think the average FreeNAS user would be lost if this happened to them.
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,194
> Is there anything you are aware of in the pipeline that would make the 100M size too small?
No, but I think the UEFI spec mandates a 200 MB partition or something like that.
 