Dataset full, now cannot delete files

Status
Not open for further replies.

jwall

Dabbler
Joined
May 7, 2013
Messages
36
I have a dataset with a share mapped to it and a 2GB quota on it. There are still terabytes of free space on the volume, but I managed to fill the 2GB quota on this dataset, and now I cannot delete any of the files on the share. The share does not have the "Export Recycle Bin" setting enabled.

Is there a way to remove these files?
 

ProtoSD

MVP
Joined
Jul 1, 2011
Messages
3,348
What happens if you increase the Quota for the dataset temporarily?

You can always open a shell prompt from the GUI and delete stuff from the command line.
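If you'd rather do it from the shell, raising the quota is a one-liner; something like this should do it (tank/share is just a placeholder for whatever your pool/dataset is actually called):

Code:
zfs get quota tank/share
zfs set quota=3G tank/share

Once you've deleted what you need to, you can set it back down the same way.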
 

jwall

Dabbler
Joined
May 7, 2013
Messages
36
I considered increasing the quota, but then thought there's probably a "proper" way to deal with this. From what I've read, a ZFS volume/dataset is not intended to be filled to capacity, and doing so causes problems like the one I'm having. It seems like a design flaw, though: can't we be protected from ourselves? I suppose setting an appropriately sized quota on the volume could at least let n00bs like me dig themselves out of the hole when this happens. Am I correct in these assumptions?

Can you please tell me how to delete this file via the shell?
 

ProtoSD

MVP
Joined
Jul 1, 2011
Messages
3,348
I haven't heard of this happening to anyone else, so I don't think it's an issue of protecting people from themselves. Usually people don't protect themselves from themselves by enabling quotas :)

Filling a drive to capacity is different from filling a dataset to capacity, and those issues don't apply to datasets.

I'm suggesting that if you want to put a quota on yourself, temporarily increase it, then try to delete files from the share and see if you still have the problem. If that works, and you still feel compelled to set quotas, you can decrease it again after you delete things. Doing it this way is a lot easier and safer than trying to walk you through deleting files from the command line, because I can already see the next post saying you accidentally deleted all your files and don't have a backup.
 

jwall

Dabbler
Joined
May 7, 2013
Messages
36
Haha, but I like to live on the edge... I do have all my data backed up, but I see where you're coming from. I increased the quota to 3GB, browsed to the share from my Win7 machine, and some of the files had already removed themselves. I deleted the rest just to prove that I could, and that worked as well.

I realize I sound like a numbskull with this and my other posts, but I'd consider myself fairly technically astute, just not with Unix or Linux; give me a Windows server and I'm set.

Thanks for the help once again.
 

ProtoSD

MVP
Joined
Jul 1, 2011
Messages
3,348
I realize I sound like a numbskull with this and my other posts, but I'd consider myself fairly technically astute, just not with Unix or Linux; give me a Windows server and I'm set.

Thanks for the help once again.

You're welcome, glad to hear it worked. I realize it's difficult for people to convey their skills in a situation like this; it's like calling a tech support line and trying to get past the level 1 and 2 people by telling them you've worked with computers for 30 years...
 

Caesar

Contributor
Joined
Feb 22, 2013
Messages
114
I ran into this problem when I forgot that I had set a 50MB quota on a dataset and then tried to copy a 200MB video. I got the disk full error and then couldn't delete anything. I just nano'd the file and removed some lines, and then I was able to delete it. I saw someone suggest this command:

Code:
echo '' > path/to/file
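
I think the redirect works because it truncates the file to zero bytes, which frees its blocks without needing any new space; once there's that bit of headroom, the file can be deleted normally. On FreeBSD the truncate utility should do the same thing (using the same placeholder path as above):

Code:
truncate -s 0 path/to/file
rm path/to/file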
 

paleoN

Wizard
Joined
Apr 22, 2012
Messages
1,402
I considered increasing the quota but then thought there's probably a "proper" way to deal with this. From what I've read a ZFS volume/dataset is not intended to be filled to capacity, doing so causes problems like I'm having, but it seems like a design flaw, can we not be protected from ourselves?
The "proper" way is to increase the quota. Quota's are one of the ways of dealing with this. Some people also create a small ZFS dataset instead. The other way would be to truncate a large file or files. If you have the space for it.
 