LAG+LACP = Extremely slow RSYNC

Status
Not open for further replies.

Michael Hanna

Dabbler
Joined
Jun 17, 2017
Messages
43
First I'll start with my hardware:

Supermicro X10SRH-CLN4F
64GB DDR4 ECC RAM
Intel Xeon E5-2620
8× 6 TB WD Red Pro HDDs connected to the two SAS Mini-HD headers on the motherboard
Ubiquiti Unifi US-24 Managed Switch
CAT6 Ethernet patch cables
Onboard Intel i350-AM4 Quad Port Gigabit Ethernet

I'm a fairly new user to FreeNAS, only about two weeks, give or take. I've been scouring the forums daily (for hours), reading all the posts to bring myself as up to date as possible on the inner workings of FreeNAS. For the last two days I've been looking specifically for something that describes my issue, but I have not found anything that matches what I am seeing.

I've set up a 3-way LAG interface on FreeNAS and configured the corresponding ports on my switch. I have full connectivity to FreeNAS. SMB, NFS, and FTP transfers all behave as expected, maxing out at ~115 MB/s pulling from FreeNAS and ~95 MB/s pushing to it. I've tested with multiple hosts pushing and pulling, and traffic is split between the LAG group members as one would expect.

My issue is with an rsync task pushing data to a Synology box. Before the LAG was created, I was using a single gigabit connection on the onboard Intel quad-port NIC, and the rsync job was happily moving along at ~100 MB/s. After the creation of the LAG group it is crawling, at times as slow as KB/s. This is confirmed by the network utilization stats on both the FreeNAS box and the Synology box. While the rsync job is running, I can still push and pull data to and from various hosts at the expected speeds listed above.

I can find nothing in my setup/configuration that should be causing this issue specifically with rsync, so I thought I would reach out to forum members and see if anyone has any ideas. If any more information is needed from me, please let me know.
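For context on why the traffic splits the way it does: LACP distributes *flows*, not packets, across the member links. Each flow is hashed to exactly one member, so a single rsync TCP connection is always pinned to one link and a LAG can never make one flow faster than 1 Gbit/s. A minimal Python sketch of the idea (the hash below is a made-up simplification, not the actual algorithm FreeNAS or the Unifi switch uses; real devices offer vendor-specific layer2 / layer2+3 / layer3+4 policies, and the IPs and ports here are placeholders):

```python
# Sketch of per-flow hashing as done by LACP link aggregation.
# Hypothetical simplification: real hash policies are vendor-specific.

def lag_member(src_ip: str, dst_ip: str, src_port: int, dst_port: int,
               n_links: int) -> int:
    """Pick the LAG member link for a flow by hashing its address/port fields."""
    def ip_bits(ip: str) -> int:
        a, b, c, d = (int(x) for x in ip.split("."))
        return (a << 24) | (b << 16) | (c << 8) | d
    h = ip_bits(src_ip) ^ ip_bits(dst_ip) ^ src_port ^ dst_port
    return h % n_links

# One rsync-over-SSH-style TCP connection is a single flow: every packet
# hashes to the same member link, so it never exceeds one link's bandwidth.
a = lag_member("192.168.1.10", "192.168.1.20", 50000, 873, 3)
b = lag_member("192.168.1.10", "192.168.1.20", 50000, 873, 3)
assert a == b  # the flow is pinned to one member for its whole lifetime
print(f"rsync flow pinned to member link {a} of 3")
```

This explains why multiple SMB/NFS clients spread nicely across the three links while a lone rsync stream does not; it does not by itself explain the drop to KB/s, which points more toward a hash-policy or LACP negotiation mismatch between the two ends.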
 
