Moving Data Between Datasets


KMR

Contributor
Joined
Dec 3, 2012
Messages
199
Hey folks,

When I set up my pool I foolishly didn't create datasets for various things. Now my pool is quite full (I'll actually need to expand it soon, I think) and I want to create datasets for shares and so on. As I understand it, a move from a folder on the volume into a dataset is a full read/write, so is there a recommended method for moving large chunks of data around (multiple TB)? If it's just a mv from the command line, can I expect it to run at the full speed of my pool, and will I run into issues given that I'm at 80% capacity? I'm thinking I should probably delete some stuff before doing this so I don't run into performance issues.

Thanks,
 

danb35

Hall of Famer
Joined
Aug 16, 2011
Messages
15,504
You can do mv from the command line. It's not as fast as on many other filesystems, but it's likely the fastest way available. Note that if you have snapshots, they'll get very large in a hurry when you start moving large amounts of data around.
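To make that concrete, here's a rough sketch of the usual folder-to-dataset shuffle, assuming a pool named tank mounted at /mnt/tank and an existing folder called media (both names are placeholders, adjust for your own layout):

# rename the existing folder out of the way so the dataset can take its name
mv /mnt/tank/media /mnt/tank/media_old

# create the new dataset; it gets mounted at /mnt/tank/media
zfs create tank/media

# move the contents into the dataset; this crosses a filesystem boundary,
# so mv does a full copy-then-delete and will take a while for multiple TB
# (note: this glob skips dotfiles, so check for hidden files afterwards)
mv /mnt/tank/media_old/* /mnt/tank/media/

# once everything checks out, remove the now-empty folder
rmdir /mnt/tank/media_old

One thing worth keeping in mind at 80% full: when mv falls back to copy-and-delete across filesystems, the source isn't removed until the copy of that item finishes, so the pool temporarily holds two copies of whatever is in flight.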
 

KMR

Contributor
Joined
Dec 3, 2012
Messages
199
No snapshots, fortunately. I'm not sure they fit my use case of large media files that don't really change once they are dumped on the drives. Thanks for the advice.
 

KMR

Contributor
Joined
Dec 3, 2012
Messages
199
The two folders I want to move (along with their subdirectories) hold most of the data on the pool. Does anyone know if I'm going to run into issues moving them to a dataset when the pool is almost full (80%)?
 

Knowltey

Patron
Joined
Jul 21, 2013
Messages
430
The two folders I want to move (along with their subdirectories) hold most of the data on the pool. Does anyone know if I'm going to run into issues moving them to a dataset when the pool is almost full (80%)?

Possibly. If you have a snapshot scheme running on the datasets in question, moving the data out of one dataset into another won't immediately free the space in the source dataset: the space stays reserved until you delete the snapshot that references it or that snapshot expires. If the data you're moving amounts to 80% of the pool's size, you'd effectively need 160% of the pool's capacity until that reserved space is freed.

So, for example, say you have 100G in DataSetA and you want to move it to DataSetB, and DataSetA has a snapshot schedule with a two-week retention. You go ahead and move the data from DataSetA to DataSetB. For the next two weeks that data effectively takes up 200G: 100G in DataSetB, plus the 100G still referenced by DataSetA's snapshots in case you need to roll back.
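If anyone wants to see where that reserved space is sitting, the standard zfs commands will show it; the pool, dataset, and snapshot names below are just placeholders:

# per-dataset breakdown of space used by live data vs. snapshots
zfs list -o space -r tank

# list the snapshots holding space on the source dataset
zfs list -t snapshot -r tank/DataSetA

# destroying a snapshot you no longer need releases the space it was holding
zfs destroy tank/DataSetA@auto-20130801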
 