SOLVED Help! FN9.10 iohyve VMs don't work in FN11.1

Status
Not open for further replies.

nickt

Contributor
Joined
Feb 27, 2015
Messages
131
[EDIT]
Resolved - see post #5
[/EDIT]

Hi,

I've just taken the plunge and upgraded to FreeNAS 11.1 from FreeNAS 9.10. Everything went swimmingly, except for my iohyve VMs created in 9.10. I have googled extensively, particularly in these forums, but nothing seems to work. I can see that FreeNAS attempts to start the VMs (using iohyve), but they just sit dumbly at the grub prompt.

I've also attempted to create a new VM using the FN11.1 GUI. For this, I cloned one of my VMs (using iohyve clone) and pointed the VM to the cloned zvol. In this case, I instead get an EFI prompt, but when attempting to use the boot maintenance manager (as described here) I get a file explorer with nothing in it at all.

I have three VMs, all Debian, built with properties similar to the following:

Code:
root@saturn:~ # iohyve getall pan
Getting pan iohyve properties...
bargs		  -A_-H_-P
boot		   1
con			nmdm2
cpu			1
description	Local DNS Server
install		no
loader		 grub-bhyve
name		   pan
os			 d8lvm
persist		1
ram			256M
size		   10G
tap			tap1
template	   NO
vnc			NO
vnc_h		  600
vnc_ip		 127.0.0.1
vnc_tablet	 NO
vnc_w		  800
vnc_wait	   NO

One thing I am nervous about is that all of my Debian VMs were built using an LVM partition. I have seen hints that the bhyve UEFI driver doesn't play well with LVM?

At the grub prompt, I see:

Code:
grub> ls
(hd0) (hd0,msdos5) (hd0,msdos1) (cd0) (cd0,msdos5) (cd0,msdos1) (host) (lvm/pan
--vg-swap_1) (lvm/pan--vg-root)

But the following doesn't give me confidence - I am no grub expert, but I thought this should allow me to explore the filesystem in the VM:

Code:
grub> ls (lvm/pan--vg-root)
Device lvm/pan--vg-root: Filesystem type ext* - Last modification time
2018-05-12 11:53:30 Saturday, UUID f4fe1f4b-cd3c-4f56-98e4-74bae4b38caf - Total
size 19521536 sectors
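
(A side note: grub's ls only lists a filesystem's contents when given a path or trailing slash - e.g. ls (lvm/pan--vg-root)/boot - so the device summary above is actually the expected output. From this prompt a Debian LVM guest can usually be booted by hand; the commands below assume Debian's standard /vmlinuz and /initrd.img symlinks and this thread's LV names, so adjust to suit:)

```
grub> set root=(lvm/pan--vg-root)
grub> linux /vmlinuz root=/dev/mapper/pan--vg-root ro
grub> initrd /initrd.img
grub> boot
```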

The other thing I am confused by is that iohyve doesn't seem to show the new VM I created in the GUI. Is it that FreeNAS 11.1 uses a different wrapper around bhyve?

I'd love some help - I'm completely stumped. I'm about to boot back into FN9.10, hoping that the wonders of ZFS make for a clean rollback... (edit: they did - phew)

Nick
 

KrisBee

Wizard
Joined
Mar 20, 2017
Messages
1,288
I've also attempted to create a new VM using the FN11.1 GUI. For this, I cloned one of my VMs (using iohyve clone) and pointed the VM to the cloned zvol. In this case, I instead get an EFI prompt, but when attempting to use the boot maintenance manager (as described here) I get a file explorer with nothing in it at all.

Not surprising: your existing VMs are not going to have an EFI system partition (ESP) or the necessary EFI boot files, hence you get dropped to the EFI shell. So your clone will be unbootable using the bhyve UEFI firmware. If the bhyve UEFI-CSM firmware variant worked you might be OK, but AFAIK it doesn't, and there's been no upstream fix.

VMs created using iohyve at the CLI are not going to show up in the web UI. IIRC, iohyve was meant as a stop-gap, introduced in 9.10 before VM creation via the web UI arrived in later versions of FreeNAS. Whether migration of VMs was ever taken into account is not clear. But shouldn't iohyve continue to work in FreeNAS 11.1?

The bhyve UEFI firmware will boot a VM that uses LVM; I know it's handled OK by CentOS 7 and Ubuntu 16.04 and later. Debian just needs checking: /boot may need to lie outside the LVM on a separate partition, together with a separate EFI partition.
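
(For anyone checking their own guest's layout: running something like this from inside the Debian VM shows whether /boot and an ESP sit on separate partitions - the exact output will vary per install:)

```
# lsblk -o NAME,TYPE,FSTYPE,MOUNTPOINT
```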
 

nickt

Contributor
Joined
Feb 27, 2015
Messages
131
But shouldn't iohyve continue to work in FreeNAS 11.1?

Well yes, exactly - that was what I expected. I am not too bothered about the GUI; I've been happily working with iohyve at the command line for quite a while now. So my main mission is to get iohyve to keep doing its thing in 11.1.

With a bit more research, I think I've figured out what's going on - if all else fails, read the source code.

In FreeNAS 9.10, iohyve was at v0.7.7, but in FN11.1 it's been bumped to v0.7.9. What I've discovered is that v0.7.9 doesn't handle the d8lvm os correctly: it is missing from the code that builds the grub-bhyve command line arguments, so it falls through to the generic handling and, most importantly, loses the -d /grub switch. I assume grub then can't find its config file and, rather than booting, just sits waiting for input - hence the grub command prompt.
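
To illustrate the shape of the bug (this is only a sketch - the -d /grub detail comes from the real code, but the function name and the -r root values here are my assumptions, not iohyve's actual source):

```shell
#!/bin/sh
# Hypothetical sketch of how iohyve picks grub-bhyve arguments per os.
# Only the presence/absence of "-d /grub" reflects the real bug; the
# -r root values are illustrative assumptions.
build_grub_args() {
  case "$1" in
    d8lvm)  echo "-r hd0,msdos1 -d /grub" ;;   # Debian LVM: grub.cfg lives under /grub
    custom) echo "-r host -d /iohyve/$2" ;;    # os=custom: user-supplied grub.cfg
    *)      echo "-r hd0,msdos1" ;;            # generic fallback: no -d, grub finds no config
  esac
}
build_grub_args d8lvm pan    # prints: -r hd0,msdos1 -d /grub
```

In v0.7.9 the d8lvm case is simply absent, so it takes the fallback branch.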

I haven't gone back to FN11.1 yet, but under FN9.10 I can simulate the same thing. By setting the os to something that produces the wrong command line arguments, the VMs fail to start, falling through to the grub prompt.

In v0.7.9, I can see that os=ubuntu will produce the correct command line, so when I next go back to FN11.1, I will try changing the os to ubuntu to see if that fixes the issue.

Oddly enough, I've also tried setting os to something other than d8lvm that nevertheless produces the correct command line (I chose centos6). The VM still starts correctly, but something about the network stack gets mangled and things don't work that well. I don't understand this: from inspecting the source code, the os setting *only* seems to affect the construction of the grub-bhyve command line and nothing else. But it is reproducible.

I'll post again once I've had a chance to try FN11.1.

For reference, here are the relevant source code links:
 

KrisBee

Wizard
Joined
Mar 20, 2017
Messages
1,288
If stuff is broken in v0.7.9, maybe using your own custom grub.cfg and device.map files will enable you to boot your d8lvm VMs in FN11.

I see you raised issues at https://github.com/pr1ntf/iohyve , wonder if you'll get an answer.
 

nickt

Contributor
Joined
Feb 27, 2015
Messages
131
So I have successfully resolved this with a workaround, and can confirm that it was caused by a bug in iohyve v0.7.9. My GitHub issue hasn't received a response at this point.

For anyone with Debian LVM VMs that won't start after upgrade to FreeNAS 11.1, there are three different workarounds, all of which work for me.

Option 1 - easy; works in both 9.10.2 and 11.1
While still at 9.10.2, configure:
Code:
# iohyve stop <vm>
# iohyve set <vm> os=centos6
# iohyve start <vm>
Oddly enough, this setting works in both releases, which is handy if switching between releases while resolving other issues (as I am at the moment).

Option 2 - easy; works only in 11.1
While at 11.1, configure:
Code:
# iohyve stop <vm>
# iohyve set <vm> os=ubuntu
# iohyve start <vm>
The ubuntu setting is fine in 11.1, but won't work in 9.10.2.

Option 3 - bit harder; works in both 9.10.2 and 11.1
  • While still at 9.10.2, ssh into your VM and extract the contents of /boot/grub/grub.cfg.
  • Put this into a new file in FreeNAS called /iohyve/<vm>/grub.cfg
  • Check that /iohyve/<vm>/device.map already exists (it should from previous successful runs with os=d8lvm)
  • Now configure
Code:
# iohyve stop <vm>
# iohyve set <vm> os=custom
# iohyve start <vm>
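
(If /iohyve/<vm>/device.map is missing, it can be recreated by hand; grub-bhyve's device.map just maps a grub device name to the backing zvol. The pool name tank below is an assumption - substitute your own:)

```
(hd0) /dev/zvol/tank/iohyve/<vm>/disk0
```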

Additionally, I found that after messing around with the os setting, the VMs (for reasons I don't understand) started booting with the virtualised hardware clock set to local time rather than UTC. This happened in both 9.10.2 and 11.1 and caused a range of issues for my VMs.

To fix this, configure:
Code:
iohyve set <vm> bargs="-A -H -P -u"
This adds the -u switch to the other standard switches; it is passed through to bhyve and causes the guest's RTC to keep UTC time.
 

KrisBee

Wizard
Joined
Mar 20, 2017
Messages
1,288
Thanks for reporting back. I see the iohyve GitHub repo shows the last commit was on 27 June 2017. Just as well there is an os=custom option.
 