Isolated GPU not being isolated.

Quace

Cadet
Joined
Dec 21, 2022
Messages
2
Hi,

I've recently upgraded one of the GPUs in my TrueNAS SCALE 22.12.0 box; however, the new GPU does not appear to be isolated on boot.

Previously I had a Quadro P600 and a GTX 1650. I had the 1650 selected in System Settings -> Advanced -> Isolated GPU Device(s) and it functioned as expected. The Quadro P600 was available to applications like Jellyfin and the 1650 was reserved for VMs.

I've since swapped the GTX 1650 for an RTX 3060; however, despite showing up under Isolated GPU Device(s), the RTX does not get isolated. VMs fail to start if the card is attached as a device.

If I run nvidia-container-cli info I can see that both devices are included:

Code:
# nvidia-container-cli info
NVRM version:   515.65.01
CUDA version:   11.7

Device Index:   0
Device Minor:   0
Model:          NVIDIA GeForce RTX 3060
Brand:          GeForce
GPU UUID:       GPU-[UUID]
Bus Location:   00000000:03:00.0
Architecture:   8.6

Device Index:   1
Device Minor:   1
Model:          Quadro P600
Brand:          Quadro
GPU UUID:       GPU-[UUID]
Bus Location:   00000000:04:00.0
Architecture:   6.1
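
For reference, the kernel driver that owns the card tells you whether it is actually isolated. A quick check, using the bus address from the output above ("Kernel driver in use: vfio-pci" would mean isolated; "nvidia" means the host still owns it):

Code:
# Which driver owns the 3060? vfio-pci = isolated, nvidia = host-owned.
lspci -nnk -s 03:00.0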


As a workaround I can stop all the Apps, un-tick Enable GPU support under Apps -> Settings -> Advanced Settings, start a VM with the GPU attached, then re-tick the setting and restart the Apps.

When I do this I see only the Quadro P600 in the output of nvidia-container-cli:

Code:
root@nas-01[~]# nvidia-container-cli info
NVRM version:   515.65.01
CUDA version:   11.7

Device Index:   0
Device Minor:   1
Model:          Quadro P600
Brand:          Quadro
GPU UUID:       GPU-[UUID]
Bus Location:   00000000:04:00.0
Architecture:   6.1


I have to do this every time my NAS restarts.

I've tried clearing and resetting the Isolated GPU Device(s) list; however, it doesn't seem to help.
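
In case it helps anyone scripting around this, the isolated list can apparently also be inspected and set from a shell via the middleware client. This is only a sketch: I believe the SCALE API calls the field isolated_gpu_pci_ids, but check the output of the config call first.

Code:
# Show current advanced settings, including the isolated GPU list
# (the field name isolated_gpu_pci_ids is my assumption -- verify here).
midclt call system.advanced.config
# Set the RTX 3060 (bus address 0000:03:00.0 on my system) as isolated.
midclt call system.advanced.update '{"isolated_gpu_pci_ids": ["0000:03:00.0"]}'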

Another symptom I've noticed is that I can't just select the GPU when I edit the VM; I have to add both PCI devices (the card and its audio output) manually under Devices. When I do this, though, it does work.
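
The card exposes the GPU and its HDMI audio controller as two separate PCI functions, which I assume is why both have to be added. If anyone needs to find theirs, something like this should list them (10de is NVIDIA's PCI vendor ID):

Code:
# List all NVIDIA PCI functions; the audio device is typically the same
# bus address as the GPU with function .1 (e.g. 03:00.0 and 03:00.1).
lspci -nn -d 10de: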

Any help appreciated.

Thanks.
 

Quace

Cadet
Joined
Dec 21, 2022
Messages
2
Another workaround has been to set a VM with the GPU I want to isolate to start on boot. VM startup seems to happen before the containers come up.

It would still be good to find a better solution, as this isn't always the VM I want to use with the GPU.
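
An alternative I've been thinking about (an untested sketch, assuming the standard Linux sysfs interface; the bus addresses are for my 3060) would be a post-init script under System Settings -> Advanced -> Init/Shutdown Scripts that force-binds the card to vfio-pci before anything else claims it:

Code:
#!/bin/sh
# Untested sketch: bind the RTX 3060 (GPU + HDMI audio functions)
# to vfio-pci so the NVIDIA driver and containers cannot claim it.
modprobe vfio-pci
for dev in 0000:03:00.0 0000:03:00.1; do
    # Detach the device from whatever driver currently owns it, if any.
    if [ -e "/sys/bus/pci/devices/$dev/driver" ]; then
        echo "$dev" > "/sys/bus/pci/devices/$dev/driver/unbind"
    fi
    # Restrict the device to vfio-pci, then bind it.
    echo vfio-pci > "/sys/bus/pci/devices/$dev/driver_override"
    echo "$dev" > /sys/bus/pci/drivers/vfio-pci/bind
done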

Thanks.
 

gsrcrxsi

Explorer
Joined
Apr 15, 2018
Messages
86
Hmm, this workaround isn't working for me. I can't isolate the GPU at all, and I can't even manually add the PCI device to the VM.
 