Workflow and Backup Strategy

dakotta

Dabbler
Joined
Oct 12, 2018
Messages
42
Hello,

I'm trying to develop a workflow and backup strategy for myself and my wife.

There are many threads here that discuss how to back up your NAS. I'm wondering what people here use to back up *to* their NAS in the first place.

1. A dedicated plugin running on FreeNAS? (e.g., BackupPC, Asigra, Tarsnap, BRU Server)
2. The same type of program, manually installed, running inside a jail and scheduled through cron?
3. Synchronization software running on both the client and server? (e.g., NextCloud, Syncthing, rsync, git-annex)
4. A program running on the client that pushes data up to a mounted share on FreeNAS? (e.g., rsync, executed either manually or automatically; see the sketch after this list)
5. Or... do you always keep your data on FreeNAS and access it only as needed? (e.g., via a virtual machine, or as a file server [shares] or an application server [Plex, Emby]?)
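
For option 4, I'm picturing something as simple as this, with the FreeNAS share mounted locally (mountpoint and paths are just placeholders):

Code:
#!/bin/sh
# Option 4 sketch: push a local folder to a FreeNAS SMB/NFS share
# mounted at /mnt/nas (hypothetical mountpoint and dataset layout).
# -a preserves permissions and times, -v lists what was transferred,
# --delete mirrors deletions (drop it for a purely additive copy).
rsync -av --delete "$HOME/Documents/" /mnt/nas/backups/desktop/documents/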

I am in the planning stage for a new NAS, and my choice of workflow/backup strategy is going to determine the new NAS specs.

For example, right now I'm thinking of backing up three machines (somehow) to the NAS, but also using the NAS to run 6-12 virtual machines (for testing) with no more than two powered on at any time.

The client machines are:
  • work laptop - 500 GB
  • home desktop - 250 GB
  • home laptop - 10 GB [one data folder only]
Right now...
  • Important data from my work laptop is backed up to rsync.net via a scheduled script (sketched just below this list).
  • Important data from my home computers is backed up to my local FreeNAS (to a dataset on an SMB share) at irregular intervals.
  • Music and other audio files are stored in a separate dataset (using FreeNAS as a file server).
  • The NAS itself is not backed up at all.
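The rsync.net script is nothing fancy; stripped to its essentials, it looks something like this (hostname, user, and paths are placeholders, not my real account):

Code:
#!/bin/sh
# Push the important work data to rsync.net over SSH (rsync.net accounts
# are plain SSH targets, so rsync can push to them directly).
# Hostname, user, and paths below are placeholders.
rsync -avz -e ssh "$HOME/work-data/" user@server.rsync.net:work-laptop/

# Scheduled from cron, e.g. nightly at 02:30:
# 30 2 * * * /home/user/bin/backup-to-rsyncnet.sh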
I'd like to improve this.

Also, as part of this workflow and backup strategy, I'd like to protect against ransomware affecting multiple machines. (I don't know if that is possible.)

Cheers,
 

ChrisRJ

Wizard
Joined
Oct 23, 2020
Messages
1,909
For me it's 3, 4, and 5. The preferred approach is to keep data on the server (no. 5). This is how I have been handling things since NetWare 3.12 (back in 1996).
 

dakotta

Dabbler
Joined
Oct 12, 2018
Messages
42
Thanks. That makes a lot of sense.

It seems like option #5 (keeping the data on the server) can take advantage of ZFS checksums and scrubs to detect and repair corrupted data, while #3 and #4 can actually propagate corruption. (If, for example, data on my non-ECC laptop becomes corrupted and then gets pushed up to TrueNAS...)
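
As I understand it, a scrub is just a pool-wide verification pass; it can be kicked off by hand (or scheduled in the FreeNAS UI) like this, with "tank" standing in for the real pool name:

Code:
# Walk every block in the pool and verify checksums; with redundancy
# (mirror/RAIDZ), bad copies are repaired from good ones.
zpool scrub tank
# Check progress and any checksum errors found or repaired.
zpool status -v tank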

Cheers,
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,740
Snapshots with a sufficiently long retention period help here.

dakotta

Dabbler
Joined
Oct 12, 2018
Messages
42
Patrick M. Hausen said:
Snapshots with a sufficiently long retention period help here.
Yes. I can see that. If I know that a file is corrupted, I can search through old snapshots and backups until I find a good copy, but what I'm worried about is "silent" corruption. I have files on my laptop that are 17 years old...
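
At least the hunt for a good copy can be scripted, since ZFS exposes every snapshot read-only under the dataset's .zfs/snapshot directory. A sketch, with made-up dataset and file paths:

Code:
#!/bin/sh
# Print a checksum of one file as it appears in each snapshot, to spot
# the point where it changed. Dataset mountpoint and file are made up.
FILE="documents/budget.xlsx"
for snap in /mnt/tank/home/.zfs/snapshot/*/; do
    if [ -f "${snap}${FILE}" ]; then
        printf '%s  ' "$(basename "$snap")"
        sha256 -q "${snap}${FILE}"   # FreeBSD; use sha256sum on Linux
    fi
done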

I guess I don't really know what data corruption looks like. I mean, if metadata is corrupted, I might not be able to open the file at all, right? That would be good, in a sense, because then I'd know the file is corrupted and could restore from an earlier copy.

But what about the file data itself?

Can data corruption change numbers in a spreadsheet? Can it cause an MP3 to "skip"? Can it cause artifacts or blurring in a photo? I might never notice these types of corruption.

It seems like the safest approach is to keep my data on TrueNAS as much as possible. And for data that primarily lives on my laptop? Maybe the best approach would be some sort of syncing program that asks for permission before overwriting existing files. That way, I could make sure that the only files updated are the ones I've actually been working on...
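
With rsync, for instance, I could get close to that by doing a dry run first and parking any overwritten files off to the side instead of destroying them (paths are placeholders):

Code:
#!/bin/sh
# Preview what would change before rsync touches anything (-n = dry run).
rsync -avn "$HOME/Documents/" /mnt/nas/laptop/documents/
# If the list looks right, run for real, but keep any files that would
# be overwritten in a dated directory inside the destination instead
# of silently replacing them.
rsync -av --backup --backup-dir="overwritten-$(date +%Y%m%d)" \
    "$HOME/Documents/" /mnt/nas/laptop/documents/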

Anyway, thanks for your feedback.

Cheers,
 