I routinely ran a ZFS root pool and a separate data pool on a 2014 laptop with a dual-core (single thread per core) Celeron and 2GB of memory. (The actual tech and design were probably from 2012.) Until Mr. Meltdown and Ms. Spectre arrived, Gentoo Linux compiles of OS updates were fine, perhaps taking all day. Now it can take several days... even close to a week for certain package updates. But I can live with that since it is not my primary computer.
I have since bought a "real" laptop with more cores (6) and dual threads per core (12 threads), plus a lot more. However, I have decided to keep the old laptop until it dies because it is a 12", versus my new one which is 14", so it is smaller and easier to travel with. In addition, if it is stolen, so what? My new one cost close to $2,000 with the upgraded dual SSDs (for ZFS mirroring). The old laptop cost $400, plus $500 for a 1TB SATA SSD (back in 2014). I have long since recouped my money.
The real subject with memory for TrueNAS is reliability and reasonable performance. Using less than the minimum memory is an exercise in masochism. That said, I can completely understand someone still using the minimum memory with 100TBs of storage if the application is simply archive or backup. ARC is not all that useful in some use cases, like write once, read only on scrub.
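One way to judge whether ARC is earning its RAM on a given workload is to look at its hit ratio. A rough sketch below, assuming the Linux OpenZFS arcstats format (on a live system the counters live in /proc/spl/kstat/zfs/arcstats); the sample text here is illustrative, not real output:

```python
# Sketch: estimate the ZFS ARC hit ratio from arcstats-style text.
# Assumes the Linux OpenZFS kstat layout: a header row, then
# "name  type  value" columns. Sample numbers are made up.

SAMPLE = """\
name                            type data
hits                            4    91000
misses                          4    9000
"""

def arc_hit_ratio(arcstats_text: str) -> float:
    stats = {}
    for line in arcstats_text.splitlines()[1:]:  # skip the header row
        parts = line.split()
        if len(parts) == 3:
            name, _kind, value = parts
            stats[name] = int(value)
    total = stats["hits"] + stats["misses"]
    return stats["hits"] / total if total else 0.0

print(f"ARC hit ratio: {arc_hit_ratio(SAMPLE):.1%}")  # ARC hit ratio: 91.0%
```

On an archive/backup box that mostly writes and only re-reads during scrubs, this ratio tends to stay low, which is exactly why extra ARC memory buys little there.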