Hey folks,
When I set up my pool I foolishly didn't create datasets for various things. Now the pool is quite full (I'll actually need to expand it soon) and I want to create datasets for shares and so on. As I understand it, moving a folder on the pool into a new dataset is a full read/write of the data, so is there a recommended method for moving large amounts of data around (multiple TB)? If it's just `mv` from the command line, can I expect it to run at the full speed of my pool, and will I run into issues given that I'm at 80% capacity? I'm thinking I should probably delete some stuff before doing this so I don't hit performance problems.
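For reference, here's the rough sequence I'm picturing (the pool/dataset names `tank` and `tank/shares` are just placeholder examples, and I'm not sure this is the right approach):

```shell
# Create the new dataset (tank/shares is an example name)
zfs create tank/shares

# Copy with rsync rather than mv so the transfer can be resumed
# if interrupted; mv across dataset boundaries is a full
# copy-then-delete anyway since they are separate filesystems
rsync -aHAX --info=progress2 /mnt/tank/old-shares/ /mnt/tank/shares/

# Only after verifying the copy succeeded, remove the originals
# rm -rf /mnt/tank/old-shares
```

Happy to be told there's a better way.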
Thanks,