Dual NICs breaking system

Joined
Jun 24, 2017
Messages
338
OK, at this point, I have to assume I'm doing something wrong...

I've got TrueNAS installed on an HP ML350 G8. My primary access passes through an Intel NIC into a switch linked to my UniFi setup, running on VLAN2 (relatively simple setup... IPs are 192.168.2.x on VLAN2...)
I've been trying to get Transmission to run behind a VPN but have thus far been totally unsuccessful: if it runs through the VPN (PIA specifically), I cannot connect to it by browser, but I can shell into it and see that it is indeed behind the VPN... Or I can disable the VPN and access it by web UI (or IP) and by shell... (This is actually not even the problem I'm posting about... and I'm pretty sure it's entirely a UniFi issue I'm dealing with.)

Anyway, I thought... OK, this is too much work for something that should be relatively easy... But alas, it never is...

So, I decided just to use a second NIC (built in... and no, not Intel; unfortunately it's Broadcom, and I know I'll have performance issues with it, but torrents are a fallback when all else fails, so speed isn't a big issue). Well, the moment I plug in the second NIC, it takes a BUNCH of stuff off the network and drops TrueNAS... TrueNAS then becomes flaky and SLOW (difficulty loading pages, etc.).

Setup is: NIC 1 (Intel) plugs into the switch running VLAN2 (192.168.2.x).
NIC 3 (Broadcom) plugs into the switch running the rest of the LAN (192.168.1.x).
Network is modem to USG to UniFi PoE... the PoE switch branches out and segregates the network: one port is assigned to VLAN2 and goes to a Dell switch which runs out to all the wired stuff on VLAN2 (Kodi boxes, cameras, etc.). All the rest of the ports on the PoE switch run to LAN devices (traffic actually goes to another Dell switch first and then out from there).


Sorry if I missed any information you guys might need to help... Just let me know what you need and I'll happily supply...

Otherwise, does anyone have any clue what would cause this? (It acts as if there are 2 DHCP servers on the network conflicting with each other)
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
So, I decided just to use a second NIC (built in... and no, not Intel; unfortunately it's Broadcom, and I know I'll have performance issues with it, but torrents are a fallback when all else fails, so speed isn't a big issue). Well, the moment I plug in the second NIC, it takes a BUNCH of stuff off the network and drops TrueNAS... TrueNAS then becomes flaky and SLOW (difficulty loading pages, etc.).
You created a bridged loop in your network, most probably.
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
TrueNAS automatically creates bridge interfaces for jails and VMs. That automation can go wrong when you assign jails to a different interface than the one that has the default route, yet have "vnet_default_interface" set to "auto" in the jail settings.

What you need to do is create the bridge interfaces manually, before assigning any jails or VMs, move the IP address configuration from the physical interfaces to the bridge interfaces, then assign jails and/or VMs to the bridge interfaces created.
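As an illustration only (on TrueNAS this is normally done through the network UI rather than the shell, and the interface name em0 and the address below are placeholders, not taken from this thread), the manual setup corresponds roughly to these FreeBSD commands:

```shell
# Create a bridge, add the physical NIC as a member,
# and move the IP address from the NIC to the bridge.
ifconfig bridge0 create
ifconfig bridge0 addm em0               # em0 = physical NIC (placeholder)
ifconfig em0 up                         # NIC stays up but keeps no address
ifconfig bridge0 inet 192.168.2.10/24   # IP now lives on bridge0, not em0
```

Jails and VMs then attach to bridge0 instead of em0, so TrueNAS no longer has to build a bridge on the fly.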

I can help, just not right now. There are various long threads on this exact topic, most with help from me; you can try the search function first.
 
Joined
Jun 24, 2017
Messages
338
TrueNAS automatically creates bridge interfaces for jails and VMs. That automation can go wrong when you assign jails to a different interface than the one that has the default route, yet have "vnet_default_interface" set to "auto" in the jail settings.

What you need to do is create the bridge interfaces manually, before assigning any jails or VMs, move the IP address configuration from the physical interfaces to the bridge interfaces, then assign jails and/or VMs to the bridge interfaces created.

I can help, just not right now. There are various long threads on this exact topic, most with help from me; you can try the search function first.
I appreciate the offer, and I will take you up on it! (I'll send you a PM after posting this.)
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
Hi! Sorry, I was away for a weekend and then I felt a bit sick (*) and did not really visit the forum. Still interested? We could try to get this setup going.

(*) no COVID, all is well ...
 

BluRay

Cadet
Joined
Jun 14, 2021
Messages
8
I have maybe a somewhat similar issue. I have been testing TrueNAS since 12.0-U2.1, experimenting with older hardware until I make a final decision on which hardware to use on a separate PC. I have a couple of dual-boot Windows 10 / Linux Mint 20.1 PCs, both using VMware Workstation 16.1.2.

I bought a couple of Intel X540-T2 network cards. I installed one of the X540s in each PC to test. I installed the latest Intel drivers for Windows to verify working cards and tested with iperf3 to verify 10GE speeds. I then booted up in Linux Mint to do the same. Mint already had a driver 5.1.0-k (not the latest, but close) so I tested with iperf3 again and everything was fine with speed and network connectivity.
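(For reference, the iperf3 check is just a server on one machine and a client on the other; the address below is a placeholder:)

```shell
iperf3 -s                  # on the machine under test: listen as server
iperf3 -c 192.168.1.50     # on the other machine: run the bandwidth test
```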

My Asus MB has 2 built-in NICs, also Intel, which I currently use on both Windows and Linux for 2 different networks. I then started VMware Workstation on Windows to launch the TrueNAS guest. I checked the network config first, and VMware Workstation now showed 4 NICs set as bridged versus the original 2 before. I started the client and was able to connect and log in. Everything was working; network connectivity to TrueNAS and the configured jail was fine.

I then shut down and restarted in Linux Mint. I opened VMware Workstation and checked the network settings, but the Linux version doesn't have the same options to select the NICs. I launched the client, but there was no interface configured. I had been using a DHCP reservation, so I configured it manually. I could not connect to the TrueNAS web GUI or ping the assigned IP address from the host. I used the VMware Workstation GUI to get to a shell prompt. I could ping the IP address configured for TrueNAS, but nothing else on the network. ifconfig showed only em0 (on an Intel X540 port) with "status: no carrier"; no other NICs were showing besides the loopback.

I then tried using Oracle's VirtualBox instead, and it works fine; I can attach each NIC separately like on the Windows client. I also tried creating a new client on VMware, without success.

I see there are lots of "status: no carrier" errors reported for FreeNAS/TrueNAS, but they seem to be legitimate errors from bad cables, fake cards, two NICs with configured IPs on the same network, etc. Have you ever seen this issue with VMware before?

From what I have read, the Intel X540 shouldn't be an issue for TrueNAS when running directly, and obviously there shouldn't be any network/cable/config problems since everything works with VM on Windows/Windows/VirtualBox on Mint and Mint by itself. I am using 3 separate networks on the 2 built-in Intel NICs, plus 1 X540-T2 NIC, the second port on the X540 is not in use.

Please let me know if I should post this to a new topic rather than here.
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
Sorry, if you want some assistance, please break your wall of text into readable paragraphs. I am not going to digest this.
 

BluRay

Cadet
Joined
Jun 14, 2021
Messages
8
Hi Patrick, sorry for the rambling post. I would have edited it, but there was/is no edit button for me. As it turns out, the issue is only with VMware and has nothing to do with TrueNAS. Adding a new network card seems to change the order interfaces get listed in ifconfig. VMware also isn't letting FreeBSD discover the correct driver to use.

VMware works fine on Windows due to more options for network interfaces, and identifies all of them correctly. VirtualBox also works on Linux, again due to more choices of virtual network devices. There was nothing that supported 10GE, but I found one option that provided a working network, without the "no carrier" status.

In any event, TrueNAS has no issues configuring a valid driver when the correct information is passed from the host system.
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
What do you mean, "VMware works fine in Windows"? What is running on what? VMware Workstation on Windows? VirtualBox on a Linux machine? Again: what even is the problem, and why is it TrueNAS related?

VMWare also isn't allowing Freebsd to discover the correct driver to use.
This also doesn't make much sense - I have been running FreeBSD in VMware ESX(i) literally for decades.
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
Adding a new network card seems to change the order they get listed using ifconfig.

If you mean in the FreeBSD VM, yes, VMware has broken PCIe enumeration and there are a bunch of people who need to be beaten with a stick.

VMWare also isn't allowing Freebsd to discover the correct driver to use.

That, however, doesn't make any sense. You can pick either the VMXNET3 or E1000E and it should work swimmingly well. It used to only work swimmingly well with E1000{,E} and that's really so incredibly stable that I often still use it as the preferred choice.
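(For reference, the adapter model is set per NIC in the VM's .vmx file; ethernet0 below is just an example index, and the key names are standard VMware settings:)

```ini
ethernet0.present = "TRUE"
ethernet0.virtualDev = "e1000e"
ethernet0.connectionType = "bridged"
```

Setting virtualDev to "vmxnet3" selects the paravirtualized adapter instead.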

Sorry I broke the quoting on this message and didn't notice. -JG
 

BluRay

Cadet
Joined
Jun 14, 2021
Messages
8
If you mean in the FreeBSD VM, yes, VMware has broken PCIe enumeration and there are a bunch of people who need to be beaten with a stick.

That, however, doesn't make any sense. You can pick either the VMXNET3 or E1000E and it should work swimmingly well. It used to only work swimmingly well with E1000{,E} and that's really so incredibly stable that I often still use it as the preferred choice.
I am using VMware Workstation (the free one, not ESX(i)). I have a dual-boot PC running Windows 10 and Linux Mint 20.04, both with the latest version of VMware Workstation (16.1.2) installed. The Windows VMware allows me to choose any or all of the 4 Intel interfaces under the advanced tab. I am using the bridged interface option. I had a guest TrueNAS (FreeBSD) client configured and working prior to adding the X540-T2.

Adding the X540 card on Windows caused no issues for TrueNAS. I could connect with the web interface at the original IP. I can see the X540 interfaces and configure them without issue. TrueNAS is still using the original (motherboard) Intel interface for the web gui, no issues.

In Linux, the X540 works fine on the host system. When I launch the TrueNAS guest client, I can neither ping nor connect to the TrueNAS web GUI. When I connect using the shell and run ifconfig, only one interface is listed (em0), the same as before; however, it is now using the primary port of the X540 instead of the motherboard interface it used previously. The interface status shows no carrier, and it cannot ping anything on the network. The advanced tab on the Linux version of VMware Workstation has no options for cards to choose.

I tried installing VirtualBox instead. It doesn't detect the actual interfaces I have, but there are multiple choices. Some interfaces gave me the same no carrier status, but I found one that does work. Likely not at 10GE looking at the description, but at least I have connectivity.

As I said in my last post, I don't see this as a problem with TrueNAS or FreeBSD. It is obviously an issue with the Linux version of VMware Workstation only. I am aware that the X540 works fine in a bare-metal TrueNAS install. I can also see that from the correct 10GE Intel driver that is loaded in TrueNAS when I use Windows as the host OS.
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
Short version: you can virtualize TrueNAS in ESXi or Proxmox or similar bare metal hypervisors. You will not be able to do anything productive beyond experimenting with the UI and seeing if you like it with TrueNAS in a desktop hypervisor like VMware Workstation or VirtualBox.

Just don't. Not supported, won't work.

It's supposed to be the other way round. Run TrueNAS on the bare metal, use TrueNAS to run VMs ...
 

BluRay

Cadet
Joined
Jun 14, 2021
Messages
8
Yes, I intend to use bare metal for TrueNAS once I finish choosing the rest of my hardware and testing. I am aware that running it in a VM is not recommended. It seemed like one of the primary reasons for not using a VM was the pitfalls associated with passing through hardware like HBA controllers.

I also see from the comments that many people use VMs anyway, with no mention of issues with network cards. At this point it has at least allowed me to become familiar with the setup and operation of TrueNAS, as well as how easy it is to recover from hardware and/or software issues by just transferring the data (pool) drives to another TrueNAS installation. I am surprised by the issue with VMware WS on Linux, as I thought most people running a VM for TrueNAS would be using some form of Linux.

The edit and delete options have appeared now. Yay!
 

BluRay

Cadet
Joined
Jun 14, 2021
Messages
8
I found a solution to the issues I was having with the new network interfaces. It just required modifying two VMware files in /etc/vmware: networking and netmap.conf. Somehow the interface I was using originally had been changed to enp12s0 instead of the correct enp16s0. I was also able to add the new X540-T2 interface I wanted to use.
/etc/vmware/networking
Code:
add_bridge_mapping enp16s0 0
add_bridge_mapping enp5s0f0 2

/etc/vmware/netmap.conf
Code:
network2.name = "VMNet2"
network2.device = "vmnet2"

I will add a link to the external website I found this on if permitted.
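One note for anyone trying this: the edits only take effect after VMware's networking service is restarted. The command below is the one that ships with VMware Workstation on Linux (verify the name against your install):

```shell
sudo vmware-networks --stop
sudo vmware-networks --start
```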
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
I thought most people running a VM for TrueNAS would be using some form of Linux.
Most use a bare metal hypervisor like ESXi. I have a system like this with three TrueNAS SCALE installations, too. All my TrueNAS CORE are bare metal, though.
 