Will bhyve VMs still work in 11.1 when upgrading from 9.10.2?

Status
Not open for further replies.

scott2500uk

Dabbler
Joined
Nov 17, 2014
Messages
37
bhyve support was added in 9.10 and I have been using it to build a number of VMs, but I cannot find any information on whether anything needs doing when upgrading to later versions of FreeNAS. I know a GUI has now been built, but I suspect the later version of FreeNAS is not going to know about my VMs and I will have to do some sort of import.

Can anyone confirm whether VMs set up with bhyve on FreeNAS 9.10 will still work after upgrading to FreeNAS 11.1?

Can anyone share any documentation on what needs to be done to get FreeNAS 11.1 to see my VMs?

Thanks
 

kdragon75

Wizard
Joined
Aug 7, 2016
Messages
2,457
I personally do not use bhyve due to its limitations. You could just set up a virtual machine running 9.10 and test.
 

Kennyvb8

Contributor
Joined
Mar 18, 2017
Messages
112
I personally do not use bhyve due to its limitations. You could just set up a virtual machine running 9.10 and test.

According to the bhyve project it can do a lot, but the implementation in FreeNAS is weak.


 

scott2500uk

Dabbler
Joined
Nov 17, 2014
Messages
37
OK, since no one who knows has replied, I went ahead and set up a test environment to see what happens to my VMs when upgrading from 9.10.2 to the latest FreeNAS.

For my test I did the following:
  • Installed a fresh copy of FreeNAS 9.10.2
  • Configured iohyve
  • Created a VM with the following settings:
    Code:
    iohyve getall foo
    Getting foo iohyve properties...
    bargs        -A_-H_-P
    boot         0
    con          nmdm0
    cpu          4
    description  Mon Nov 12 18:30:36 GMT 2018
    install      no
    loader       grub-bhyve
    name         foo
    os           ubuntu
    persist      1
    ram          1024M
    size         8G
    tap          tap0
    template     NO
    vnc          NO
    vnc_h        600
    vnc_ip       127.0.0.1
    vnc_tablet   NO
    vnc_w        800
    vnc_wait     NO
    

  • Installed a fresh copy of Ubuntu 16.04 inside the VM.
  • Set up the VM as a standard single-partition whole disk, without LVM, using the standard BIOS GRUB bootloader
  • Set the rc tunables iohyve_enable="YES" and iohyve_flags="kmod=1 net=em0"
  • Rebooted FreeNAS and the VM
  • Confirmed the VM was working as it should with iohyve start foo. The VM boots fine and is controllable using iohyve console foo.
  • Upgraded FreeNAS via the GUI to FreeNAS-11.2-RC1.
  • Once the new version came up, the VM did not show under Virtual Machines (I didn't expect it to, tbh)
  • The iohyve commands still work, so I continued to boot my VM as normal:
  • Code:
    root@freenas:~ # iohyve list
    Guest  VMM?  Running  rcboot?  Description
    foo	NO	NO	   NO	   Mon Nov 12 18:30:36 GMT 2018
    root@freenas:~ # iohyve start foo
    Starting foo... (Takes 15 seconds for FreeBSD guests)
    root@freenas:~ # GRUB Process does not run in background....
    If your terminal appears to be hanging, check iohyve console foo in second terminal to complete GRUB process...
    iohyve console foo
    Starting console on foo...
    ~~. to escape console [uses cu(1) for console]
    Connected
    grub>
    
At this point, I'm not sure how to get my VM to boot.
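
For anyone stuck at the same grub> prompt: since the guest was installed as a standard single-partition BIOS/GRUB Ubuntu, it may be possible to boot it by hand from that prompt. This is only a sketch; the partition name (hd0,msdos1) and the use of the guest's own grub.cfg are assumptions and may differ on your VM.

Code:
ls
set root=(hd0,msdos1)
configfile /boot/grub/grub.cfg

ls shows which devices and partitions GRUB can see; if configfile finds the guest's menu, booting proceeds as normal. Failing that, the kernel and initrd can be loaded manually with linux /boot/vmlinuz-<version> root=/dev/vda1 followed by initrd /boot/initrd.img-<version> and boot (tab completion works at the grub> prompt, and the root= device is an assumption).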

I also tried adding a new VM via the GUI, selecting the existing VM disk, and booting that. I tried both UEFI and UEFI-CSM as the boot method; both failed to boot the VM set up in 9.10.2.

Luckily, on my live system all my VMs are Ubuntu 16.04, so if I can get one to boot on 11.2 I can get them all.

Has anyone got any suggestions?
 

sretalla

Powered by Neutrality
Moderator
Joined
Jan 1, 2016
Messages
9,703
At this point, I'm not sure how to get my VM to boot.
Have you tried typing exit in the console? This should take you to the BIOS and give you some options for booting. You may need to look for the thread about UEFI boot to see how to make that automatic.

I can never remember which Ubuntu release is the problem version on 11.1 (I use Debian), but 16 may be it... try the 11.2 RC to see if 11.2 will be worth waiting for.
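
For reference, the usual trick for making a UEFI guest boot automatically (instead of dropping you to the firmware) is to copy GRUB's EFI binary to the fallback path the firmware looks for. This is a sketch, assuming a stock Ubuntu UEFI install with the ESP mounted at /boot/efi; run it inside the guest, and note the paths may differ on other distros:

Code:
sudo mkdir -p /boot/efi/EFI/BOOT
sudo cp /boot/efi/EFI/ubuntu/grubx64.efi /boot/efi/EFI/BOOT/BOOTX64.EFI

This only applies to guests that are actually installed for UEFI; a BIOS/GRUB install needs the CSM loader or a conversion first.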
 

scott2500uk

Dabbler
Joined
Nov 17, 2014
Messages
37
Thanks for the suggestion. Unfortunately, my VMs are old-school BIOS GRUB and are not installed to boot via UEFI.

I have seen the posts on how to fix booting for UEFI systems, and I have tested those on fresh installs of Ubuntu and can confirm they work, but they don't apply to my existing VMs unless I can convert them to UEFI.

I'm currently playing around with the install disk, going into the broken-system/rescue option, to see if there is any way to convert the boot setup to UEFI. At the moment I'm not having much luck: I have a single-partition filesystem, and from what I understand UEFI boot needs its own FAT32 partition at the start of the disk. Basically I'm having to play around with disk partitioning, and I'm well over my head at this point.

I'll keep googling and trying things. I have seen there are some bootable tools out there that might be able to convert my installs to UEFI, but I haven't got around to testing them yet.
 

scott2500uk

Dabbler
Joined
Nov 17, 2014
Messages
37
OK, I feel like a bit of a doughnut right now.

If I add my existing VM with the UEFI-CSM bootloader, select my existing disk, and start it up, all in the GUI, I can then connect to the serial console. The VM starts and shows the GRUB Ubuntu boot menu, where I can select to boot Ubuntu.

The problem now is that if I select Ubuntu, it just goes to a blank screen with a blinking cursor. After a while I get some output:

Code:
Loading Linux 4.4.0-116-generic ...
Loading initial ramdisk ...


Then it seems to hang there.

So by the looks of it, I don't have to mess around with converting to UEFI. Now to figure out why it isn't booting...
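
A blank screen after "Loading initial ramdisk" over a serial console often just means the kernel has switched its output to the (absent) VGA console, rather than failed to boot. A possible fix, assuming a standard Ubuntu GRUB setup, is to point the kernel console at the serial port bhyve exposes; this is a sketch, and the exact speed/unit values are assumptions:

Code:
# /etc/default/grub (inside the guest, e.g. mounted from a rescue environment)
GRUB_CMDLINE_LINUX_DEFAULT="console=ttyS0,115200n8"
GRUB_TERMINAL=serial
GRUB_SERIAL_COMMAND="serial --speed=115200 --unit=0 --word=8 --parity=no --stop=1"

Then run sudo update-grub and reboot the VM; kernel and login output should then appear on the iohyve/bhyve serial console.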
 