[HOWTO] How to Boot Linux VMs using UEFI

jcpro

Cadet
Joined
Aug 8, 2017
Messages
7
The problem here is that when I ran the boot-repair tool, Secure Boot was already unchecked, so I don't know how to make this a permanent fix. Every time the VM restarts, for whatever reason, I have to boot it up manually using the UEFI Shell prompts. This is silly.
 

Zwck

Patron
Joined
Oct 27, 2016
Messages
371
The problem here is that when I ran the boot-repair tool, Secure Boot was already unchecked, so I don't know how to make this a permanent fix. Every time the VM restarts, for whatever reason, I have to boot it up manually using the UEFI Shell prompts. This is silly.

hi after a successful boot you just have to copy some files around and then the reboot will work until you update the boot partition/grub.

Code:
/usr/local/share/uefi-firmware/BHYVE_UEFI.fd

This file provides bhyve with the firmware needed to support UEFI guests. It is based on OVMF from the TianoCore project, which is also used as the basis of VirtualBox's virtual machine UEFI support. The bhyve UEFI firmware follows the standard "default boot behaviour" and looks for the file \EFI\BOOT\bootx64.efi in the EFI system partition of your VM. If it's not present, you end up in the EFI shell.

One simple remedy is to create this \EFI\BOOT\bootx64.efi file in your VM, which is straightforward once your VM has booted.

But how do you boot your VM if you find yourself in the EFI shell at first? Just type exit at the shell prompt, and in the EFI menu system navigate to "Boot Maintenance Manager" and then select "Boot from file" to locate and select your grubx64.efi file.
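
If you would rather skip the menus, you can usually launch grub straight from the EFI shell prompt (this is generic EDK2 shell behaviour, not something specific to this guide; FS0: is assumed to be your EFI system partition and the number may differ on your VM):

Code:
FS0:
cd \EFI\ubuntu
grubx64.efi

Either way, once the VM has booted you can apply the copy fix described in the next post.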
 

Zwck

Patron
Joined
Oct 27, 2016
Messages
371
How do you create the file?


# as root: copy the existing Ubuntu grub binary to the default boot path (think rename)
cd /boot/efi/EFI
mkdir -p BOOT
cp ./ubuntu/grubx64.efi ./BOOT/bootx64.efi

to get

root@ubuntu-vm:/boot/efi# tree -L 3 .
.
└── EFI
    ├── BOOT
    │   └── bootx64.efi
    └── ubuntu
        ├── fbx64.efi
        ├── grub.cfg
        ├── grubx64.efi
        ├── mmx64.efi
        └── shimx64.efi
 

Latty

Dabbler
Joined
Sep 26, 2017
Messages
14
There are a lot of people in this thread putting the blame on Ubuntu here, which is wrong - this is a bug in FreeNAS's VM setup.

The location being suggested here is a default intended for removable media - permanently installed systems should add a boot loader entry instead. EFI implementations contain their own boot manager that can look anywhere you choose - you can just add an entry to it pointing to the correct location. You could manually muck around with
Code:
efibootmgr
to do this, but the easiest way is to run
Code:
sudo grub-install /dev/sda
(assuming /dev/sda as your boot disk). This is what the Ubuntu installer does.
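
To illustrate the efibootmgr route, a rough sketch (the disk, partition number, and loader path below are assumptions for a typical Ubuntu guest with the ESP on partition 1 of /dev/sda - adjust to your layout):

Code:
# hypothetical example: register Ubuntu's shim as a proper UEFI boot entry
sudo efibootmgr --create --disk /dev/sda --part 1 --label "ubuntu" --loader '\EFI\ubuntu\shimx64.efi'

On these VMs, though, any entry created this way only lives in memory and is lost again after a full power-off, which is the problem described below.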

I assume that, because no storage is provided for VM management, the EFI variables are only held in memory and are wiped when the machine is turned off - this is like swapping out your motherboard each time you boot.

You can see this by doing
Code:
efibootmgr -v
- it will show the two default entries. If you run the grub installation, it will show a proper entry for Ubuntu that should work - do a software reboot and it does (presumably because the VM never gets turned off, so the variables stay in memory), but do a full power-off and power-on and the entry has disappeared.

The reason other OSes work is that a lot of them install to that location to cope with broken motherboards that don't implement the spec properly and only look in that place. Ubuntu is following the spec properly.
 

Ericloewe

Server Wrangler
Moderator
Joined
Feb 15, 2014
Messages
20,176
There are a lot of people in this thread putting the blame on Ubuntu here,
Of course, since they're the ones with the non-standard behavior.
 

Zwck

Patron
Joined
Oct 27, 2016
Messages
371
There are a lot of people in this thread putting the blame on Ubuntu here, which is wrong - this is a bug in FreeNAS's VM setup.

Well, I have been following the thread for quite some time and I am not sure who you are referring to when you say "a lot of people put the blame on Ubuntu", as this thread is more or less a guide to getting Ubuntu running, with some problem solving in between.

Cheers.

Edit: I made it an exact quote.
 

Latty

Dabbler
Joined
Sep 26, 2017
Messages
14
Of course, since they're the ones with the non-standard behavior.

Quite the opposite - they are the ones with the standard behaviour; others are doing the wrong thing. If my motherboard were wiping its EFI variables every boot, I'd be complaining to the motherboard maker, not to Ubuntu. Likewise, the FreeNAS VMs not storing that data is a failure of the VM system, not of Ubuntu.

This is like having someone come along and punch your ice cream out of your hand, and then going to the ice cream van to complain about it. Maybe the van down the street superglues the ice cream to your hand for you, but that doesn't make it a good idea.

Well, I have been following the thread for quite some time and I am not sure who you are referring to when you say "most people put the blame on Ubuntu", as this thread is more or less a guide to getting Ubuntu running, with some problem solving in between.

Cheers.

You literally quoted me saying "a lot", then a sentence later charged me with claiming "most". You also replied directly after someone blaming Ubuntu for it. I think it's reasonable for me to try to clarify the cause and hopefully push for a better resolution - it would be annoying for this to get brushed off as "needs fixing upstream" when it is something that needs fixing here (or at least, in a different upstream if the VM stuff is pulled in from somewhere else).
 

Zwck

Patron
Joined
Oct 27, 2016
Messages
371

Fixed my paraphrased quote.

I did not see too many complain specifically about Ubuntu in this thread; to be frank, I have not found any (I just quickly skimmed most of the posts, so take it with a rock of salt). What I did find in the bug report section of FreeNAS is that "a lot" of people have problems with the new VM frontend of FreeNAS and that the FreeNAS developers are aware of this.

https://bugs.freenas.org/projects/freenas/search?utf8=✓&q=VM&scope=&all_words=&all_words=1&titles_only=&issues=1&attachments=0&options=0&commit=Submit

cheers

There are still a lot of things to do on the VM part of FreeNAS, but if you want to have a running VM right now, the forum is where people tend to help you with it.
 

Terry Pounds

Dabbler
Joined
Jan 21, 2017
Messages
21
So, just to make this clear: there is no permanent fix that makes the VM boot automatically after a power cycle? No matter what, you have to boot the VM OS back up manually? Is this correct?
 

Zwck

Patron
Joined
Oct 27, 2016
Messages
371
So, just to make this clear: there is no permanent fix that makes the VM boot automatically after a power cycle? No matter what, you have to boot the VM OS back up manually? Is this correct?

The first post in this thread explains how to fix this issue pseudo-permanently, or at least until you update the boot/grub part of your Ubuntu installation, which does not happen that often; if it does, you have to re-apply the fix. Call it pseudo-permanent.

My Ubuntu VM has now been running for over 4 months and has survived every update and restart.
 

Terry Pounds

Dabbler
Joined
Jan 21, 2017
Messages
21
The first post in this thread explains how to fix this issue pseudo-permanently, or at least until you update the boot/grub part of your Ubuntu installation, which does not happen that often; if it does, you have to re-apply the fix. Call it pseudo-permanent.

My Ubuntu VM has now been running for over 4 months and has survived every update and restart.

I have a VM set up and running CentOS 7 minimal. The problem is I have to manually boot it from the shell each time the VM is restarted. I have tried copying my grub boot file to /EFI/BOOT/bootx64.efi, but nothing I tried has fixed the issue. I have tried making a folder EFI/BOOT/ in the root directory of the VM and copying bootx64.efi there, and I also tried making the folder at "/EFI/BOOT/" and then putting the file bootx64.efi there, and that doesn't work either.

I guess there is some step or command or something I am missing.
 

Zwck

Patron
Joined
Oct 27, 2016
Messages
371
I have a VM set up and running CentOS 7 minimal. The problem is I have to manually boot it from the shell each time the VM is restarted. I have tried copying my grub boot file to /EFI/BOOT/bootx64.efi, but nothing I tried has fixed the issue. I have tried making a folder EFI/BOOT/ in the root directory of the VM and copying bootx64.efi there, and I also tried making the folder at "/EFI/BOOT/" and then putting the file bootx64.efi there, and that doesn't work either.

I guess there is some step or command or something I am missing.

Hi, I have not experimented with CentOS 7; however, this thread recommends using the CentOS Everything DVD and installing minimal from there. Apparently CentOS 7 should not have this problem.
 

Terry Pounds

Dabbler
Joined
Jan 21, 2017
Messages
21
Hi, I have not experimented with CentOS 7; however, this thread recommends using the CentOS Everything DVD and installing minimal from there. Apparently CentOS 7 should not have this problem.

I finally got it to work using the following: [cp grubx64.efi /boot/efi/EFI/BOOT/]. It now boots to the CentOS start screen and not the shell. However, I still have to connect with VNC before it automatically boots. Is there a way to make it boot without VNC being connected?
 

Stux

MVP
Joined
Jun 2, 2016
Messages
4,367
I finally got it to work using the following: [cp grubx64.efi /boot/efi/EFI/BOOT/]. It now boots to the CentOS start screen and not the shell. However, I still have to connect with VNC before it automatically boots. Is there a way to make it boot without VNC being connected?

Turn off 'wait for connection to boot' in the VNC device properties on your VM in FreeNAS.

Or whatever the setting is called.
 

Journer

Dabbler
Joined
Jun 20, 2017
Messages
17
I was pissing around for a while trying to get CentOS 7 to work, then finally stumbled upon this thread. THANK YOU!

This led me in the right direction, but didn't fix it completely for me... I had to select different boot files. This is the gist:
  • Create ZVOL & VM (UEFI, not UEFI-CSM)
  • Attach ISO (I was using http://isoredirect.centos.org/centos/7/isos/x86_64/CentOS-7-x86_64-DVD-1708.iso)
  • Set the VNC viewer quality to "high" instead of automatic
  • Follow the CentOS 7 installer screens.
  • Instead of allowing the installer to pick partitions, I manually created /boot, /boot/efi, /, and swap. All the settings were default (EFI System Partition for the EFI partition, for example). You might not need to do this - I did it because I wanted ext4 and not XFS.
  • Once it's done, it will reboot once and work; however, all subsequent reboots failed.
  • Using the guide in the first post, get yourself to the manual "Boot from file" loader
  • Select EFI/boot/centos/shimx64-centos.efi (all the other files failed for me, including grub)
  • That will boot you into your install. Log in as root
  • Follow the steps in the first post to overwrite /EFI/BOOT/BOOTX64.efi, except copy shimx64-centos.efi instead of grubx64.efi (see the sketch after this list)
  • Reboot
  • ...
  • Profit
Shut down your VM, snapshot the zvol, move on with life :)
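
For reference, a sketch of that copy step on a default CentOS 7 EFI layout (paths are assumptions; adjust if your ESP is mounted somewhere other than /boot/efi):

Code:
# run as root inside the CentOS VM
mkdir -p /boot/efi/EFI/BOOT
cp /boot/efi/EFI/centos/shimx64-centos.efi /boot/efi/EFI/BOOT/BOOTX64.EFI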
 

keitalbame

Cadet
Joined
May 28, 2014
Messages
6
I was also having issues with the CentOS 7 minimal ISO when copying from grubx64.efi, although I was able to boot when I selected grubx64.efi via "Boot from file" in the Boot Manager.

After Journer's post, I'm able to shut down and start without issues.
These were my steps:
  1. Created new zvols and VM from scratch, only changing the NIC and disk to use virtio
  2. Default install from CentOS 7-1708 minimal (auto partition)
  3. restart after install completion
  4. login
  5. sudo su
  6. cp /boot/efi/EFI/centos/shimx64-centos.efi /boot/efi/EFI/BOOT/BOOTX64.EFI
  7. yes to overwrite /boot/efi/EFI/BOOT/BOOTX64.EFI
  8. shutdown
I'm now able to power on after every shutdown.
 