Backing Up Select Folders on a FreeNAS Server to an External USB

Status
Not open for further replies.

stephenlew

Cadet
Joined
May 7, 2015
Messages
5
Hi,

I am not familiar with FreeNAS, but I have a basic understanding of how to use Linux and navigate around the CLI. I am not familiar with ZFS, rsync, or the other more detailed features.

The place where I work wants me to create automated backups of a folder in FreeNAS in case something goes wrong, which I think is a great idea, but I don't know how to do it.

I have looked around the forums and seen the basics of creating ZFS snapshots, using rsync, and other posts which recommend against doing exactly this. But the higher-ups gave me a 4TB external USB HDD and said to use it to create automated backups of our stuff. The current size of the FreeNAS system is 8TB, with 4.2TB currently in use.

Is there anywhere I can find a step-by-step guide on how to do this? I have tried an rsync push to the external HDD, which I mounted onto the system, but when I came in the next morning the drive was empty.

Thanks in advance, and sorry for being a total noob.

If there are any resources for someone like me that suddenly has to manage a FreeNAS server with zero prior experience could someone please link them here as well?
 

pirateghost

Unintelligible Geek
Joined
Feb 29, 2012
Messages
4,219

stephenlew

Cadet
Joined
May 7, 2015
Messages
5
Thanks, Robocopy looks really straightforward: select the source, select the destination, build the script, schedule the script. Is there anything else I should know before I go off and back up terabytes of data using this? It would suck if I am trusting my data to these scripts and I make a noob mistake.
 

anodos

Sambassador
iXsystems
Joined
Mar 6, 2014
Messages
9,553
Thanks, Robocopy looks really straightforward: select the source, select the destination, build the script, schedule the script. Is there anything else I should know before I go off and back up terabytes of data using this? It would suck if I am trusting my data to these scripts and I make a noob mistake.
That's why you verify your backups regularly. To be honest, your situation is not very good. It seems like you are under-equipped for the job / set up for failure. I suggest you do the following:

1) Enable snapshots on your datasets. When in doubt, I do them once per day and retain them for two weeks.
2) Post the hardware in the server, the version of FreeNAS, and the zpool configuration [zpool status -v]. Based on the imposed backup strategy, I have a feeling that your server is improperly configured / the hardware improperly selected. This can have catastrophic consequences. Post the information enclosed in Code tags.
3) If you decide to use robocopy, enable logging, verify your backups, and make sure that permissions are set correctly on the share (robocopy can only copy data you have access to).
4) Research how to set up a better backup solution than a single USB hard drive (which is at best a band-aid).
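
If you'd rather script point 1 than use the GUI (Storage -> Periodic Snapshot Tasks), the underlying commands look roughly like this — `tank/data` here is a placeholder dataset name, substitute your own:

```shell
# Take a recursive snapshot of the dataset, tagged with today's date
zfs snapshot -r tank/data@daily-$(date +%Y%m%d)

# List snapshots to confirm it was created
zfs list -t snapshot -r tank/data

# Remove a snapshot once its retention period is up
zfs destroy tank/data@daily-20150512
```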

When I run robocopy, I tend to use the following flags:

Code:
robocopy "[source]" "[destination]" /MIR /COPY:DT /Z /W:5 /R:15 /FFT /XF ntuser.* *.dat *.db *.tmp /LOG:[log destination]

/MIR - mirror the two folders
/COPY:DT - copy the data and timestamps (but not attributes - this is important if you disable "store dos attributes" in your CIFS config)
/Z - restartable
/W - wait time between copy attempts
/R - number of retries before moving onwards
/FFT - Assume FAT file times. This may be necessary to prevent robocopy from needlessly backing up files and folders due to differences in how time is stored in Samba vs. Windows.
/XF - exclude files
/LOG:[log destination] - Set the folder to which robocopy will log.
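
Before letting the script loose on terabytes, it's worth doing a dry run first — /L is robocopy's list-only switch, so it reports what would be copied without actually writing anything:

```shell
:: List-only pass: same selection logic as the real job, but no files are copied
robocopy "[source]" "[destination]" /MIR /COPY:DT /FFT /L /LOG:[log destination]

:: Then skim the log for errors before scheduling the real job
findstr /I "ERROR" [log destination]
```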
 

timb_yyc

Dabbler
Joined
Dec 23, 2013
Messages
12
I agree with anodos, you're on a path to failure! My math might be wrong but I'm not sure robocopy is going to work well to backup 4.2TB of data to a 4TB external drive... Even if you don't copy all files, you won't have incremental (historical) backups (i.e. if a file is found to be corrupt, you only have last night's backup to restore from). One of my clients had a virus that infected all of his JPG files and he didn't find out until a couple weeks later. He lost everything!

For backup systems I have been recommending fire/water proof backup systems made by ioSafe for years. They have both USB and NAS solutions. Then combine that with a good backup program like Acronis True Image which can compress as it backs up, plus you can select how long you want to keep your backups.

I guess it just depends on how much you value your data...

Good luck!
 

stephenlew

Cadet
Joined
May 7, 2015
Messages
5
So I plug in the USB drive and run a robocopy for now using the above parameters, and I set the server to take daily snapshots which expire after two weeks, as outlined by Anodos. This is the band-aid solution and I make that clear to everyone involved, but at least this way I have SOME form of a backup, which is infinitely better than nothing at all. Following that, I get the server information, post it here in code tags, and look for a better backup solution.

But how catastrophic is it if the hardware is wrong? I know this is never an excuse for keeping a bad configuration, but things have been working fine so far... (nearly a year now)

I get the feeling that the best solution is to have a separate server (does it need proper hardware?) set up somewhere else in the building and have the first one remotely send all its data to the second one (rsync?), so that if the first goes down we could switch them around. But is a perfect duplicate FreeNAS server running as a redundancy really the most cost-effective solution? Is there a viable backup solution that doesn't require thousands of dollars to set up?
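
From the little reading I've done, that two-server idea seems to be what ZFS replication is for: snapshots from the first box get streamed to the second with zfs send/receive (FreeNAS apparently exposes this in the GUI as Replication Tasks). If I understand it right, the underlying commands would look something like this — the host and destination pool names here are made up:

```shell
# First full send: copy a snapshot of studio to the backup machine over SSH
zfs snapshot studio@repl-1
zfs send studio@repl-1 | ssh backup-box zfs receive -F tank/studio

# Later sends can be incremental, transferring only what changed since repl-1
zfs snapshot studio@repl-2
zfs send -i studio@repl-1 studio@repl-2 | ssh backup-box zfs receive tank/studio
```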
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
Is there a viable backup solution that doesn't require thousands of dollars to set up?
A TS140 is under $300. Add $100 for RAM, plus disks of your choice. The server, RAM, and 4x 4TB disks would be around $1k.
 

stephenlew

Cadet
Joined
May 7, 2015
Messages
5
I have just tested Robocopy and it works great. I have set up periodic snapshots to be taken daily when no one else is around. Can the server still function normally while this is going on, in case someone is staying back late? The lifetime of these snapshots is set to two weeks.
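
To double-check that the daily snapshots are actually happening, I've been listing them from the shell (studio being our main pool):

```shell
# Show each snapshot of the studio pool with its size and creation time
zfs list -t snapshot -r -o name,used,creation studio
```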

Hardware information is as follows:
Code:
Build    FreeNAS-9.2.1.5-RELEASE-x64 (80c1d35)
Platform    Intel(R) Xeon(R) CPU E3-1220 V2 @ 3.10GHz
Memory    16087MB
System Time    Tue May 12 12:16:13 EST 2015
Uptime    12:16PM up 12 days, 24 mins, 0 users
Load Average    0.14, 0.08, 0.08


Output from zpool status -v in shell
Code:
  pool: GalaxyBackup
 state: ONLINE
  scan: none requested
config:

        NAME                                            STATE     READ WRITE CKSUM
        GalaxyBackup                                    ONLINE       0     0     0
          gptid/e73ea7d3-f3b6-11e4-afcf-001e67aa856f    ONLINE       0     0     0

errors: No known data errors

  pool: studio
 state: ONLINE
status: The pool is formatted using a legacy on-disk format. The pool can
        still be used, but some features are unavailable.
action: Upgrade the pool using 'zpool upgrade'. Once this is done, the
        pool will no longer be accessible on software that does not support feature
        flags.
  scan: scrub repaired 0 in 8h30m with 0 errors on Sun May 10 08:30:53 2015
config:

        NAME                                            STATE     READ WRITE CKSUM
        studio                                          ONLINE       0     0     0
          raidz1-0                                      ONLINE       0     0     0
            ada1p2                                      ONLINE       0     0     0
            ada0p2                                      ONLINE       0     0     0
            ada2p2                                      ONLINE       0     0     0
            ada3p2                                      ONLINE       0     0     0

errors: No known data errors


GalaxyBackup is the external HDD
studio is where all the data is kept.

Will look into the TS140 solution proposed, as well as other possible solutions. Do you guys/gals have any other ideas?
 