Windows VM with GPU passthrough

pleysje
Cadet · Joined Aug 16, 2023 · Messages: 2
I can't get the Windows ISO to boot when I set up my VM to use a GPU.

If I configure it to use only VNC, the ISO boots and I can install and use Windows, but as soon as I remove the VNC display and assign my GPU it no longer boots.
I get a UEFI shell; when I exit that and try to boot from the CD, the screen goes black and comes back to the UEFI.
The same thing happens after installing through VNC: if I exit the UEFI shell and try to boot from the virtual disk, the screen goes black and comes back to the UEFI.

I have tried various Windows ISOs (10 and 11, both x64).

System details:
Version: TrueNAS-SCALE-22.12.3.3
CPU: i7-13700K
Motherboard: MSI MAG Z690 (ddr5)
RAM: Corsair VENGEANCE DDR5 RAM 64GB (2x32GB) 5200MHz
GPU: GeForce RTX 3060 Ti

The GPU is set up as isolated:
[Screenshot: isolatedgpu.png]
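For reference, one way to double-check the isolation from a shell is to walk the IOMMU groups in sysfs and confirm the 3060 Ti (and its HDMI audio function) sit in their own group. This is only a minimal sketch, assuming a Linux host (TrueNAS SCALE) with IOMMU enabled; the paths are the standard sysfs ones.

```python
# Minimal sketch: list each IOMMU group and the PCI devices in it, so you can
# confirm the GPU and its audio function are cleanly isolatable.
# Assumes IOMMU is enabled; otherwise /sys/kernel/iommu_groups will not exist.
import os

IOMMU_ROOT = "/sys/kernel/iommu_groups"

def read(path):
    with open(path) as f:
        return f.read().strip()

for group in sorted(os.listdir(IOMMU_ROOT), key=int):
    devices = sorted(os.listdir(os.path.join(IOMMU_ROOT, group, "devices")))
    print(f"IOMMU group {group}:")
    for dev in devices:
        dev_path = os.path.join("/sys/bus/pci/devices", dev)
        vendor = read(os.path.join(dev_path, "vendor"))   # e.g. "0x10de" for NVIDIA
        device = read(os.path.join(dev_path, "device"))
        print(f"  {dev} [{vendor[2:]}:{device[2:]}]")
```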


And these are the VM settings:
[Screenshots: vmsettings1.png – vmsettings6.png]


What I get when I boot the machine:
[Photo: PXL_20230816_134529585.jpg]


With the last ISO I tried, it seems to start but gets stuck on "Press any key to boot from CD or DVD....".
Am I missing something? What else could I try?
 

lesserthere
Dabbler · Joined Aug 10, 2023 · Messages: 39
I only got that shell prompt when the boot order wasn't correct (but that was on Proxmox and ESXi).
Is this SCALE on bare metal?
Let me know how Windows goes once it's running as a VM. I had very little success, so I went with Proxmox and ran SCALE virtualized.
Not my preferred route, but if Windows can in fact work well on SCALE with GPU passthrough, let me know.
It was laggy and slow in my case. Maybe I was doing something wrong, but I read that SCALE was not really designed for Windows VMs.
Correct me if I'm wrong.
 

pleysje
Cadet · Joined Aug 16, 2023 · Messages: 2
It is TrueNAS SCALE on bare metal (DIY build).

I managed to get it up and running, but it took a day of trial and error.

I think the shell was an issue with using the GPU output right away. I figured out that I have to boot TrueNAS with nothing plugged into the GPU, even though it is set up as isolated; somehow TrueNAS grabs it during boot when a monitor is connected to the GPU.
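If you want to verify that theory after a boot, you can check which kernel driver currently owns each NVIDIA PCI function. A minimal sketch, again assuming standard sysfs paths on the SCALE host; if the host grabbed the card you'd typically see "nvidia" or "nouveau" (or no driver at all) rather than "vfio-pci".

```python
# Minimal sketch: report which kernel driver is bound to each NVIDIA PCI function.
# Vendor ID 0x10de is NVIDIA; a GPU reserved for passthrough should show "vfio-pci".
import os

PCI_ROOT = "/sys/bus/pci/devices"

for dev in sorted(os.listdir(PCI_ROOT)):
    dev_path = os.path.join(PCI_ROOT, dev)
    with open(os.path.join(dev_path, "vendor")) as f:
        vendor = f.read().strip()
    if vendor != "0x10de":          # skip everything that is not NVIDIA
        continue
    driver_link = os.path.join(dev_path, "driver")
    driver = os.path.basename(os.readlink(driver_link)) if os.path.islink(driver_link) else "none"
    print(f"{dev}: driver in use = {driver}")
```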

With that out of the way, I configured my VM just as in my initial post, but with VNC enabled as well. I then connected my monitor to the GPU over HDMI. While booting the VM it finally showed "Press any key to start from CD..." on both VNC and my monitor. After pressing a key I noticed the VNC screen continued, but the monitor hooked up to the GPU seemed frozen on "Press any key...".

I finished the entire Windows setup through VNC, and as soon as it was up and running, updated, and had the NVIDIA driver installed, the monitor came to life.

The next problem was that Windows now saw two monitors: VNC and the real one. So I disabled the virtual monitor in Device Manager, and now only the real one is active. After that I also removed the VNC display from the virtual machine's devices, and now I can boot the VM and it works perfectly fine over HDMI.

Somewhere in between all of this I also bricked my TrueNAS setup (it wouldn't boot anymore) and had to reset the entire thing.

Once the Windows VM booted fine, I tested rebooting the entire machine and starting the VM again, and it's all stable now. I tested a game in the VM (Skyrim) and it ran just fine.

The only thing I can't seem to get working now is setting up a network bridge so I can access my TrueNAS shares from the virtual machine.
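For completeness, this is roughly what the bridge setup amounts to, written as a minimal sketch. The interface name "eno1" and the address 192.168.1.10/24 are hypothetical placeholders, and on TrueNAS SCALE the persistent route is the web UI (Network → Interfaces → add a Bridge with the physical NIC as member and move the host IP onto it), since the middleware can overwrite changes made directly with ip. Run as root.

```python
# Minimal sketch of the manual bridge steps, assuming a hypothetical NIC "eno1"
# and host address 192.168.1.10/24 -- adjust both to your system.
import subprocess

NIC = "eno1"                    # hypothetical physical interface name
BRIDGE = "br0"
HOST_ADDR = "192.168.1.10/24"   # hypothetical host IP, moved from the NIC to the bridge

def ip(*args):
    subprocess.run(["ip", *args], check=True)

ip("link", "add", "name", BRIDGE, "type", "bridge")   # create the bridge
ip("link", "set", NIC, "master", BRIDGE)              # enslave the physical NIC
ip("addr", "flush", "dev", NIC)                       # the host IP must live on the bridge
ip("addr", "add", HOST_ADDR, "dev", BRIDGE)
ip("link", "set", BRIDGE, "up")
# Finally, point the VM's network device at br0 instead of the physical NIC so the
# guest can reach the host's shares.
```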

I hope this process can help you or someone else.
 