JayanWarden (Dabbler) · Joined: Nov 26, 2017 · Messages: 22
Hello everyone,
I recently rebuilt my FreeNAS server out of real server components.
Everything went fine, from pool creation to adding an L2ARC (I know it may be overkill) to bhyve VM creation.
But I now have the following scenario:
Transfer from my PC to my NAS runs at around 400-500 Mbit/s. Not the best, I know.
Transfer from my NAS to my PC runs at barely 15 Mbit/s, sometimes even less. Yes, Mbit.
I am using an SMB share that lives directly on my ZFS pool.
Specs:
Server:
Motherboard: Supermicro X9SRL-F-0 (new)
CPU: Intel Xeon E5-2620 v2 (used, from eBay, but working perfectly)
RAM: 4 x 16 GB Samsung ECC DDR3-1333 (also used, but free of errors)
Drives: 5 x 8 TB Seagate IronWolf Pro in RAID-Z2
PC: (used for testing with iPerf and data transfer)
Motherboard: MSI X299 Gaming Pro Carbon
CPU: Intel Core i7-7740X
RAM: 32 GB (as if that matters)
The server and my PC both use the onboard NICs.
The server has an Intel 82574L dual GbE controller; my PC has an Intel I219-V GbE controller.
The PC and the server are connected through a generic 8-port TP-Link GbE switch, using Cat 6 cables (two 2-meter cables).
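In case the link negotiation matters, I can post the output of the following from the FreeNAS side. I'm assuming the onboard 82574L ports show up as em0/em1 (the usual FreeBSD driver name for that chip); adjust the interface name if yours differs:

# Negotiated media, speed and duplex of the onboard Intel port
ifconfig em0
# Live per-second packet/error/collision counters while a transfer runs
netstat -I em0 -w 1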
Here is a screenshot of a Windows file transfer from my PC to my server:
Throughput bounces between 35 and ~70 MB/s.
The server is barely doing anything during this transfer: CPU load is at ~40% (I use gzip-1 compression, which I know may also be bad), and HDD load is at around 30% for all drives.
But when I try to copy a single large file from my server to my PC, bandwidth drops to barely 1 MB/s...
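For completeness, this is how the compression setting I mentioned can be checked; "tank/smbshare" is just an example name, not my actual dataset:

# Compression algorithm and achieved ratio on the SMB dataset (example name)
zfs get compression,compressratio tank/smbshare
# Per-vdev disk activity while a copy is running
zpool iostat -v 1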
iPerf also shows very interesting results.
In this screenshot, my PC is running iPerf as a client connecting to my server with one parallel stream.
Bandwidth is all over the place and averages around 400 Mbit/s.
But when I bump the stream count to 10, suddenly the whole gigabit connection gets used:
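For reference, these are roughly the commands behind those tests (classic iperf2 syntax; the server IP is just a placeholder):

# On the FreeNAS box: listen for incoming tests
iperf -s
# On the PC: one stream, then 10 parallel streams, 30 seconds each
iperf -c 192.168.1.10 -t 30
iperf -c 192.168.1.10 -t 30 -P 10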
So far so bad. But when I turn things around, everything gets worse.
I now set iPerf on my PC to server mode and tell my server to connect to my PC.
Bandwidth is now very bad, averaging around 14 Mbit/s for one stream.
Even worse, my server cannot connect to my PC at all when using 10 parallel streams: the server console reports timeouts and my PC never sees an iPerf connection.
With four streams I can get iPerf to connect, but bandwidth does not really get any better.
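The reverse direction looks roughly like this (again, the PC's IP is just a placeholder):

# On the PC: listen for incoming tests
iperf -s
# On the FreeNAS box: one stream, then 4 and 10 parallel streams
iperf -c 192.168.1.20 -t 30
iperf -c 192.168.1.20 -t 30 -P 4
iperf -c 192.168.1.20 -t 30 -P 10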
TL;DR:
My PC gets around 400 Mbit/s with one stream to the server; with ten streams it saturates the full gigabit link.
My server, however, only gets around 14 Mbit/s when transferring data to my PC, and around 39 Mbit/s when using four streams.
Ten streams run into a timeout.
How can this be?
Has anyone ever seen behaviour like this?
How can I fix this?
Thanks in advance.
I will try to answer any questions that are posted.