Performance Problems in FreeNAS using VBox

AndiM202

Dabbler
Joined
Jan 6, 2019
Messages
13
Hi guys!

I am not quite sure if this thread is in the right place. Please let me know or move the post if needed.

Basically, I am using a Dell PowerEdge R620 Server which has following specs:

128GB ECC RAM
Intel(R) Xeon(R) CPU E5-2630 v2 @ 2.60 GHz
Dell Inc. 01W23F Mainboard
3x 2TB Seagate Barracuda 2.5" Hard Drives
240GB SanDisk SSD
3x 300GB Dell SAS Hard Drives
Dell PERC H710 RAID Controller

The host operating system is Debian (little-endian), which runs FreeNAS as a guest in VirtualBox.
Since I cannot do any pass-through and do not have an HBA card, I am using 2x RAID5 for the storage drives and 1x RAID0 for the SSD. I am completely aware that running ZFS on top of hardware RAID is strictly not recommended. Unfortunately, I cannot avoid it, because the RAID controller offers no pass-through or JBOD option.
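For what it's worth, the effect of the RAID layer is visible from inside FreeNAS; a quick check, assuming FreeBSD-style device names like da0 (yours may differ):

```shell
# List the devices FreeNAS actually sees -- behind a PERC in RAID mode
# these are the controller's virtual volumes, not the physical disks.
camcontrol devlist

# SMART data typically isn't available through a hardware RAID volume;
# with a real HBA, each disk answers for itself. (da0 is a placeholder.)
smartctl -a /dev/da0
```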

Inside FreeNAS I use one jail for my NextCloud instance, and I plan another one for GitLab.

My virtual machine has over 60GB of RAM and 5 physical CPU cores, which is, in my opinion, more than enough.
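For reference, this kind of allocation is set on the host side; a sketch with VBoxManage, assuming the VM is named "FreeNAS" (the name is a placeholder):

```shell
# Allocate 60GB of RAM and 5 vCPUs to the guest (VM must be powered off).
VBoxManage modifyvm "FreeNAS" --memory 61440 --cpus 5

# Confirm the settings took effect.
VBoxManage showvminfo "FreeNAS" | grep -E "Memory size|Number of CPUs"
```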

Normally, my server shuts down at 1:00 AM and starts again at 7:00 AM. I have encountered some problems whenever FreeNAS has been running for several days: I keep getting Python errors such as handshake timeouts and "Sentry responded with API error".
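Since the host powers off on a schedule, I ask the guest to shut down cleanly first, so ZFS isn't cut off mid-write; a sketch of the host-side cron entries (file path and VM name are placeholders):

```shell
# /etc/cron.d/nas-schedule (hypothetical) -- send the guest an ACPI power
# button event a few minutes before the host powers off, so FreeNAS can
# shut down cleanly instead of being killed mid-write.
55 0 * * * root VBoxManage controlvm "FreeNAS" acpipowerbutton
0  1 * * * root /sbin/shutdown -h now
```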

But my general problem with FreeNAS is that it occasionally gets so slow that I cannot do anything in the GUI or on the command line. I was thinking of dropping virtualization completely and using FreeBSD directly as the OS, but that doesn't seem to be a proper solution either, as I cannot work around the RAID problem.

When I restart my VM it is always at peak performance and responds within seconds (GUI and CLI).

What would you recommend I do with my setup? Should I go for a completely different solution, or how can I improve the performance and usability of my system?

If I should post any logs or additional system information, please let me know.

Thank you so much and greetings,

Andi
 

artlessknave

Wizard
Joined
Oct 29, 2016
Messages
1,506
the chances of anyone even replying in these forums to this strongly discouraged configuration with anything other than "don't use RAID with ZFS" is really slim; as far as most of us are concerned, your unsupported setup is the reason for your problems, and the solution is to use a supported setup. trying to "fix" something we don't think you should run in the first place is a waste of our free time.

it isn't just not recommended; you can, and should, assume it will kill your data.
get an M1015/LSI 9211 and then don't use the H710. about $50-70.
if you are going to use the H710 you might as well just make a RAID and not bother with FreeNAS at all. you are making things more complicated for questionable benefit while dramatically increasing the risk.
alternatively, it should be possible to flash the h710 to IT mode.
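if you do crossflash it (entirely at your own risk), the LSI tools can confirm the result; a sketch, assuming the card ends up presenting as an LSI SAS2 controller:

```shell
# List all LSI SAS2 controllers and their firmware; an IT-mode flash
# should report an IT firmware product type rather than IR/RAID.
sas2flash -listall
```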
 

AndiM202

Dabbler
Joined
Jan 6, 2019
Messages
13
Hey! Thank you for your fast answer.

Yeah, I expected such an answer. I got this server from a former teacher of mine, and I just think it would be a waste not to use those 128GB of RAM. I will stick to your solution and will probably buy an HBA card or another RAID controller and go that route. Thanks!
 

AndiM202

Dabbler
Joined
Jan 6, 2019
Messages
13
artlessknave said:
the chances of anyone even replying in these forums to this strongly discouraged configuration with anything other than "don't use RAID with ZFS" is really slim; as far as most of us are concerned, your unsupported setup is the reason for your problems, and the solution is to use a supported setup. trying to "fix" something we don't think you should run in the first place is a waste of our free time.

it isn't just not recommended; you can, and should, assume it will kill your data.
get an M1015/LSI 9211 and then don't use the H710. about $50-70.
if you are going to use the H710 you might as well just make a RAID and not bother with FreeNAS at all. you are making things more complicated for questionable benefit while dramatically increasing the risk.
alternatively, it should be possible to flash the h710 to IT mode.

The other solution, with the additional SATA connectors, might be the best one. I will definitely try that out and post my solution here! Thx!!
 

artlessknave

Wizard
Joined
Oct 29, 2016
Messages
1,506
it's still a bit janky, but freenas does have bhyve virtualization. what is the use requirement for this server that you cannot just run freenas and host a debian VM + jails & plugins? it seems like you are nesting freenas on debian with virtualbox (btw, vbox is a type 2 hypervisor, not a type 1, and there is an additional performance cost to that), but I don't see why.
alternatively, you could also use ESXi as a bare-metal hypervisor and run debian + freenas + whatever else with relative ease.
 

AndiM202

Dabbler
Joined
Jan 6, 2019
Messages
13
There is actually no need to have an OS like Debian. When I receive the cables, I will probably use FreeNAS as my main operating system and then build all of my VMs in FreeNAS directly. The thing is that I have encountered many problems with VMs and Docker hosts in 11.2: I was unable to start any of them. There were many bugs reported, and it seems to me they didn't solve that problem completely.
 

artlessknave

Wizard
Joined
Oct 29, 2016
Messages
1,506
they've basically removed the (even jankier) docker-vm. in my experience jails are easier than docker; I couldn't even figure out how to use it.
if you really want docker you are better off going with a linux host and doing everything with docker, KVM (not vbox), and OpenZFS 2.0
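a rough sketch of that route (pool name, layout, and device names are all placeholders):

```shell
# Build a raidz pool from the raw disks and give Docker its own dataset.
zpool create tank raidz /dev/sda /dev/sdb /dev/sdc
zfs create -o mountpoint=/var/lib/docker tank/docker

# Tell Docker to use the zfs storage driver, in /etc/docker/daemon.json:
#   { "storage-driver": "zfs" }
systemctl restart docker
```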
 

AndiM202

Dabbler
Joined
Jan 6, 2019
Messages
13
I was not able to manage any type of VM within FreeNAS, unfortunately. But I have now checked the cables on my server and ordered these SAS-SATA and USB-SATA cables and will go on with this operation in a few weeks. I will post every milestone in here.
 

AndiM202

Dabbler
Joined
Jan 6, 2019
Messages
13
artlessknave said:
the chances of anyone even replying in these forums to this strongly discouraged configuration with anything other than "don't use RAID with ZFS" is really slim; as far as most of us are concerned, your unsupported setup is the reason for your problems, and the solution is to use a supported setup. trying to "fix" something we don't think you should run in the first place is a waste of our free time.

it isn't just not recommended; you can, and should, assume it will kill your data.
get an M1015/LSI 9211 and then don't use the H710. about $50-70.
if you are going to use the H710 you might as well just make a RAID and not bother with FreeNAS at all. you are making things more complicated for questionable benefit while dramatically increasing the risk.
alternatively, it should be possible to flash the h710 to IT mode.

Hey! Thanks again for your expertise. Can you confirm that this HBA card will fit and work with my Dell PowerEdge R620? I have already bought the cables but need a plan B, of course.

Thanks!
 

artlessknave

Wizard
Joined
Oct 29, 2016
Messages
1,506
I do not have an R620; however, it has standard PCIe slots, so I know of no reason to think it wouldn't work. you should also be able to replace the h710 in the dedicated storage slot, but unless you need all the PCIe slots you can just use a regular slot and have both cards in the system.
 

artlessknave

Wizard
Joined
Oct 29, 2016
Messages
1,506
I can't read that, but it should work fine. it's branded LSI, so it should be a legit LSI card at least. there are often major problems with the dirt-cheap Chinese clones of LSI cards that show up on eBay a lot.
 

AndiM202

Dabbler
Joined
Jan 6, 2019
Messages
13
Screenshot from 2020-01-09 21-27-55.png
 

John Doe

Guru
Joined
Aug 16, 2011
Messages
633
maybe consider ESXi as a hypervisor, pass through the HBA, and be happy.

my experience with freenas 11.1 down to 9.x and jails/plugins was bad. maybe they have done fancy stuff in the meantime and it is not applicable anymore, but whenever I have the choice between ESXi and freenas VMs, I go with ESXi.
 

AndiM202

Dabbler
Joined
Jan 6, 2019
Messages
13
I am not sure which setup I am going to use in the future. I was thinking about still using VirtualBox, maybe with a FreeBSD or Linux host, and then virtualizing FreeNAS, but I'm still not sure...

The thing is that I want the best performance and the most convenient solution for backup and restore. I had huge .vmdk files under VBox of 800GB+, while only around 300GB was actually in use. Is that because of ZFS or because of the virtualization itself?
 

artlessknave

Wizard
Joined
Oct 29, 2016
Messages
1,506
vmdk files under VBox with 800GB+
that sounds like you thick provisioned the vmdk, nothing to do with the filesystem. if you want the virtual disk to only "use" the used space you need to thin provision it, but be warned that if you do so and you run out of space on the host filesystem, vm writes will fail mysteriously.
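to illustrate the difference (filenames are placeholders; sizes are in MB):

```shell
# Dynamically allocated (thin) image -- the file grows only as the guest
# writes; "Standard" is the default variant.
VBoxManage createmedium disk --filename freenas.vdi --size 819200 --variant Standard

# Fixed (thick) image -- all ~800GB is allocated up front.
VBoxManage createmedium disk --filename freenas-thick.vdi --size 819200 --variant Fixed

# Reclaim zeroed space from an existing dynamic VDI (the guest must have
# zeroed its free space first, e.g. with zerofree or dd).
VBoxManage modifymedium disk freenas.vdi --compact
```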
 