Dataset empty after Volume Import

Status
Not open for further replies.

ShuzZzle

Cadet
Joined
Nov 17, 2018
Messages
6
Hi,

So recently my boot device got corrupted during an upgrade (no idea why). I thought it didn't matter much, since the data itself should be untouched. So I quickly reinstalled FreeNAS and imported my volume through the WebGUI wizard. The pool showed up instantly and was imported without a hitch. Unfortunately, the only thing I saw was a 100G vm_disk in my dataset; ~3TB of files were missing. Any ideas what could have happened? I'm absolutely clueless. Maybe I can try to restore the config or zpool.cache from the corrupted boot device, but I don't understand how this could have happened in the first place.


Kind Regards
 

DrKK

FreeNAS Generalissimo
Joined
Oct 15, 2013
Messages
3,630
Well, let's see if the data is actually gone first. Let us see the complete output of:
Code:
zfs get -t filesystem all | grep used
 

ShuzZzle

Cadet
Joined
Nov 17, 2018
Messages
6
Output as requested. It is gone, isn't it? :(

Code:
Vault  used  106G  -
Vault  usedbysnapshots  0  -
Vault  usedbydataset  117K  -
Vault  usedbychildren  106G  -
Vault  usedbyrefreservation  0  -
Vault  logicalused  9.97G  -
Vault/.system  used  6.16M  -
Vault/.system  usedbysnapshots  0  -
Vault/.system  usedbydataset  128K  -
Vault/.system  usedbychildren  6.03M  -
Vault/.system  usedbyrefreservation  0  -
Vault/.system  logicalused  57.6M  -
Vault/.system/configs-a421eaccddb44c098d96b72146b5211d  used  117K  -
Vault/.system/configs-a421eaccddb44c098d96b72146b5211d  usedbysnapshots  0  -
Vault/.system/configs-a421eaccddb44c098d96b72146b5211d  usedbydataset  117K  -
Vault/.system/configs-a421eaccddb44c098d96b72146b5211d  usedbychildren  0  -
Vault/.system/configs-a421eaccddb44c098d96b72146b5211d  usedbyrefreservation  0  -
Vault/.system/configs-a421eaccddb44c098d96b72146b5211d  logicalused  36.5K  -
Vault/.system/configs-a7c4a4a3d45a4720a1e6b8ad799731fd  used  117K  -
Vault/.system/configs-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbysnapshots  0  -
Vault/.system/configs-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbydataset  117K  -
Vault/.system/configs-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbychildren  0  -
Vault/.system/configs-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbyrefreservation  0  -
Vault/.system/configs-a7c4a4a3d45a4720a1e6b8ad799731fd  logicalused  31K  -
Vault/.system/cores  used  421K  -
Vault/.system/cores  usedbysnapshots  0  -
Vault/.system/cores  usedbydataset  421K  -
Vault/.system/cores  usedbychildren  0  -
Vault/.system/cores  usedbyrefreservation  0  -
Vault/.system/cores  logicalused  1.78M  -
Vault/.system/rrd-a421eaccddb44c098d96b72146b5211d  used  117K  -
Vault/.system/rrd-a421eaccddb44c098d96b72146b5211d  usedbysnapshots  0  -
Vault/.system/rrd-a421eaccddb44c098d96b72146b5211d  usedbydataset  117K  -
Vault/.system/rrd-a421eaccddb44c098d96b72146b5211d  usedbychildren  0  -
Vault/.system/rrd-a421eaccddb44c098d96b72146b5211d  usedbyrefreservation  0  -
Vault/.system/rrd-a421eaccddb44c098d96b72146b5211d  logicalused  36.5K  -
Vault/.system/rrd-a7c4a4a3d45a4720a1e6b8ad799731fd  used  4.62M  -
Vault/.system/rrd-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbysnapshots  0  -
Vault/.system/rrd-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbydataset  4.62M  -
Vault/.system/rrd-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbychildren  0  -
Vault/.system/rrd-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbyrefreservation  0  -
Vault/.system/rrd-a7c4a4a3d45a4720a1e6b8ad799731fd  logicalused  54.3M  -
Vault/.system/samba4  used  320K  -
Vault/.system/samba4  usedbysnapshots  0  -
Vault/.system/samba4  usedbydataset  320K  -
Vault/.system/samba4  usedbychildren  0  -
Vault/.system/samba4  usedbyrefreservation  0  -
Vault/.system/samba4  logicalused  1.24M  -
Vault/.system/syslog-a421eaccddb44c098d96b72146b5211d  used  117K  -
Vault/.system/syslog-a421eaccddb44c098d96b72146b5211d  usedbysnapshots  0  -
Vault/.system/syslog-a421eaccddb44c098d96b72146b5211d  usedbydataset  117K  -
Vault/.system/syslog-a421eaccddb44c098d96b72146b5211d  usedbychildren  0  -
Vault/.system/syslog-a421eaccddb44c098d96b72146b5211d  usedbyrefreservation  0  -
Vault/.system/syslog-a421eaccddb44c098d96b72146b5211d  logicalused  36.5K  -
Vault/.system/syslog-a7c4a4a3d45a4720a1e6b8ad799731fd  used  117K  -
Vault/.system/syslog-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbysnapshots  0  -
Vault/.system/syslog-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbydataset  117K  -
Vault/.system/syslog-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbychildren  0  -
Vault/.system/syslog-a7c4a4a3d45a4720a1e6b8ad799731fd  usedbyrefreservation  0  -
Vault/.system/syslog-a7c4a4a3d45a4720a1e6b8ad799731fd  logicalused  31K  -
Vault/.system/webui  used  117K  -
Vault/.system/webui  usedbysnapshots  0  -
Vault/.system/webui  usedbydataset  117K  -
Vault/.system/webui  usedbychildren  0  -
Vault/.system/webui  usedbyrefreservation  0  -
Vault/.system/webui  logicalused  36.5K  -
Vault/d1  used  102G  -
Vault/d1  usedbysnapshots  0  -
Vault/d1  usedbydataset  117K  -
Vault/d1  usedbychildren  102G  -
Vault/d1  usedbyrefreservation  0  -
Vault/d1  logicalused  4.94G  -
Vault/iocage  used  831K  -
Vault/iocage  usedbysnapshots  0  -
Vault/iocage  usedbydataset  128K  -
Vault/iocage  usedbychildren  703K  -
Vault/iocage  usedbyrefreservation  0  -
Vault/iocage  logicalused  256K  -
Vault/iocage/download  used  117K  -
Vault/iocage/download  usedbysnapshots  0  -
Vault/iocage/download  usedbydataset  117K  -
Vault/iocage/download  usedbychildren  0  -
Vault/iocage/download  usedbyrefreservation  0  -
Vault/iocage/download  logicalused  36.5K  -
Vault/iocage/images  used  117K  -
Vault/iocage/images  usedbysnapshots  0  -
Vault/iocage/images  usedbydataset  117K  -
Vault/iocage/images  usedbychildren  0  -
Vault/iocage/images  usedbyrefreservation  0  -
Vault/iocage/images  logicalused  36.5K  -
Vault/iocage/jails  used  117K  -
Vault/iocage/jails  usedbysnapshots  0  -
Vault/iocage/jails  usedbydataset  117K  -
Vault/iocage/jails  usedbychildren  0  -
Vault/iocage/jails  usedbyrefreservation  0  -
Vault/iocage/jails  logicalused  36.5K  -
Vault/iocage/log  used  117K  -
Vault/iocage/log  usedbysnapshots  0  -
Vault/iocage/log  usedbydataset  117K  -
Vault/iocage/log  usedbychildren  0  -
Vault/iocage/log  usedbyrefreservation  0  -
Vault/iocage/log  logicalused  36.5K  -
Vault/iocage/releases  used  117K  -
Vault/iocage/releases  usedbysnapshots  0  -
Vault/iocage/releases  usedbydataset  117K  -
Vault/iocage/releases  usedbychildren  0  -
Vault/iocage/releases  usedbyrefreservation  0  -
Vault/iocage/releases  logicalused  36.5K  -
Vault/iocage/templates  used  117K  -
Vault/iocage/templates  usedbysnapshots  0  -
Vault/iocage/templates  usedbydataset  117K  -
Vault/iocage/templates  usedbychildren  0  -
Vault/iocage/templates  usedbyrefreservation  0  -
Vault/iocage/templates  logicalused  36.5K  -
Vault/jails  used  3.92G  -
Vault/jails  usedbysnapshots  0  -
Vault/jails  usedbydataset  149K  -
Vault/jails  usedbychildren  3.92G  -
Vault/jails  usedbyrefreservation  0  -
Vault/jails  logicalused  4.95G  -
Vault/jails/.warden-template-pluginjail-11.0-x64  used  592M  -
Vault/jails/.warden-template-pluginjail-11.0-x64  usedbysnapshots  589M  -
Vault/jails/.warden-template-pluginjail-11.0-x64  usedbydataset  3.56M  -
Vault/jails/.warden-template-pluginjail-11.0-x64  usedbychildren  0  -
Vault/jails/.warden-template-pluginjail-11.0-x64  usedbyrefreservation  0  -
Vault/jails/.warden-template-pluginjail-11.0-x64  logicalused  921M  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356  used  592M  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356  usedbysnapshots  589M  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356  usedbydataset  3.56M  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356  usedbychildren  0  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356  usedbyrefreservation  0  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356  logicalused  925M  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925  used  593M  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925  usedbysnapshots  589M  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925  usedbydataset  3.56M  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925  usedbychildren  0  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925  usedbyrefreservation  0  -
Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925  logicalused  925M  -
Vault/jails/.warden-template-standard-11.0-x64  used  2.18G  -
Vault/jails/.warden-template-standard-11.0-x64  usedbysnapshots  2.18G  -
Vault/jails/.warden-template-standard-11.0-x64  usedbydataset  3.59M  -
Vault/jails/.warden-template-standard-11.0-x64  usedbychildren  0  -
Vault/jails/.warden-template-standard-11.0-x64  usedbyrefreservation  0  -
Vault/jails/.warden-template-standard-11.0-x64  logicalused  2.24G  -
freenas-boot  used  1.04G  -
freenas-boot  usedbysnapshots  0  -
freenas-boot  usedbydataset  64K  -
freenas-boot  usedbychildren  1.04G  -
freenas-boot  usedbyrefreservation  0  -
freenas-boot  logicalused  2.11G  -
freenas-boot/ROOT  used  1.04G  -
freenas-boot/ROOT  usedbysnapshots  0  -
freenas-boot/ROOT  usedbydataset  29K  -
freenas-boot/ROOT  usedbychildren  1.04G  -
freenas-boot/ROOT  usedbyrefreservation  0  -
freenas-boot/ROOT  logicalused  2.09G  -
freenas-boot/ROOT/Initial-Install  used  1K  -
freenas-boot/ROOT/Initial-Install  usedbysnapshots  0  -
freenas-boot/ROOT/Initial-Install  usedbydataset  1K  -
freenas-boot/ROOT/Initial-Install  usedbychildren  0  -
freenas-boot/ROOT/Initial-Install  usedbyrefreservation  0  -
freenas-boot/ROOT/Initial-Install  logicalused  512  -
freenas-boot/ROOT/Wizard-2018-11-17_17:08:45  used  1K  -
freenas-boot/ROOT/Wizard-2018-11-17_17:08:45  usedbysnapshots  0  -
freenas-boot/ROOT/Wizard-2018-11-17_17:08:45  usedbydataset  1K  -
freenas-boot/ROOT/Wizard-2018-11-17_17:08:45  usedbychildren  0  -
freenas-boot/ROOT/Wizard-2018-11-17_17:08:45  usedbyrefreservation  0  -
freenas-boot/ROOT/Wizard-2018-11-17_17:08:45  logicalused  512  -
freenas-boot/ROOT/default  used  1.04G  -
freenas-boot/ROOT/default  usedbysnapshots  2.51M  -
freenas-boot/ROOT/default  usedbydataset  1.03G  -
freenas-boot/ROOT/default  usedbychildren  0  -
freenas-boot/ROOT/default  usedbyrefreservation  0  -
freenas-boot/ROOT/default  logicalused  2.09G  -
freenas-boot/grub  used  6.97M  -
freenas-boot/grub  usedbysnapshots  0  -
freenas-boot/grub  usedbydataset  6.97M  -
freenas-boot/grub  usedbychildren  0  -
freenas-boot/grub  usedbyrefreservation  0  -
freenas-boot/grub  logicalused  12.0M  -

 

DrKK

FreeNAS Generalissimo
Joined
Oct 15, 2013
Messages
3,630
Well, you've got 106GB of something under a child dataset of Vault, but that's the only thing I see with any significant storage.

But 3TB of pool data doesn't just disappear, sir, because the boot device failed. "zpool history" should tell you what's been done to each pool. There is no such thing as 3TB of pool data disappearing without the user doing something to make it so. See if zpool history can give you a clue. But the missing 3TB is not a FreeNAS or ZFS thing; something had to happen.
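As a sketch of what to look for, the history can be filtered for destructive operations. The pool name Vault and the sample lines are taken from this thread; on the live box you would pipe zpool history Vault straight into grep instead of printf:

```shell
# On the live system: zpool history Vault | grep -E 'destroy|rename|receive'
# Sample history lines stand in for the live output so this runs anywhere:
printf '%s\n' \
  '2018-11-17.13:30:44 zfs destroy -r Vault/jails/plexmediaserver_1' \
  '2018-11-17.14:57:09 zpool scrub Vault' \
  '2018-11-17.13:37:52 zpool export -f Vault' \
  | grep -E 'destroy|rename|receive'
# → prints only the "zfs destroy" line
```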
 

ShuzZzle

Cadet
Joined
Nov 17, 2018
Messages
6
Thank you for your time. I've posted my history below, because I don't really see the problem.
Code:
2017-08-22.09:35:55 zpool create -o cachefile=/data/zfs/zpool.cache -o failmode=continue -o autoexpand=on -O compression=lz4 -O aclmode=passthrough -O aclinherit=passthrough -f -m /Vault -o altroot=/mnt Vault raidz /dev/gptid/f67517e3-8757-11e7-acda-40167eae680e /dev/gptid/f71b11f3-8757-11e7-acda-40167eae680e /dev/gptid/f7c5ab4a-8757-11e7-acda-40167eae680e
2017-08-22.09:35:57 zfs inherit mountpoint Vault
2017-08-22.09:35:57 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-08-22.09:35:57 zfs create -o mountpoint=legacy Vault/.system
2017-08-22.09:35:57 zfs create -o mountpoint=legacy Vault/.system/cores
2017-08-22.09:35:58 zfs create -o mountpoint=legacy Vault/.system/samba4
2017-08-22.09:35:58 zfs create -o mountpoint=legacy Vault/.system/syslog-a7c4a4a3d45a4720a1e6b8ad799731fd
2017-08-22.09:35:58 zfs create -o mountpoint=legacy Vault/.system/rrd-a7c4a4a3d45a4720a1e6b8ad799731fd
2017-08-22.09:36:00 zfs create -o mountpoint=legacy Vault/.system/configs-a7c4a4a3d45a4720a1e6b8ad799731fd
2017-08-22.09:36:48 zfs create -o aclmode=restricted -o casesensitivity=sensitive Vault/d1
2017-08-22.09:36:54 zfs set org.freenas:description=Dataset 1 Vault/d1
2017-08-22.09:42:58 zfs create -o volblocksize=16K -V 100G Vault/d1/VM_DISK
2017-08-22.09:43:03 zfs set org.freenas:description= Vault/d1/VM_DISK
2017-08-22.09:43:16 zfs create Vault/jails
2017-08-22.09:44:00 zfs create -o mountpoint=/Vault/jails/.warden-template-pluginjail -p Vault/jails/.warden-template-pluginjail
2017-08-22.09:45:27 zfs snapshot Vault/jails/.warden-template-pluginjail@clean
2017-08-22.09:45:28 zfs clone Vault/jails/.warden-template-pluginjail@clean Vault/jails/plexmediaserver_1
2017-08-23.08:29:25 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-08-23.08:29:25 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-08-26.18:28:32 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-08-26.18:28:32 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-08-30.14:26:45 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-08-30.14:26:45 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-08-31.13:52:35 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-08-31.13:52:35 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-09-30.15:00:07 zpool scrub Vault
2017-11-04.16:00:07 zpool scrub Vault
2017-11-16.10:19:26 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-11-16.10:19:26 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-11-25.08:16:08 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-11-25.08:16:08 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-11-27.09:33:28 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-11-27.09:33:28 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-11-28.09:21:38 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-11-28.09:21:38 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-11-30.09:50:34 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-11-30.09:50:34 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-12-01.07:13:57 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-12-01.07:13:57 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-12-01.07:16:48 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-12-01.07:16:48 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-12-06.10:27:16 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-12-06.10:27:16 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-12-13.15:35:03 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-12-13.15:35:03 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-12-16.15:00:07 zpool scrub Vault
2018-01-01.12:06:11 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-01-01.12:06:11 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-01-01.12:08:25 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-01-01.12:08:25 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-01-01.12:08:40 zfs set mountpoint=none Vault/jails/.warden-template-pluginjail
2018-01-01.12:08:40 zfs rename -f Vault/jails/.warden-template-pluginjail Vault/jails/.warden-template-pluginjail-11.0-x64
2018-01-01.12:08:41 zfs set mountpoint=/Vault/jails/.warden-template-pluginjail-11.0-x64 Vault/jails/.warden-template-pluginjail-11.0-x64
2018-01-01.12:08:46 <iocage> zfs set org.freebsd.ioc:active=yes Vault
2018-01-01.12:22:44 zfs create -o mountpoint=/Vault/jails/.warden-template-pluginjail -p Vault/jails/.warden-template-pluginjail
2018-01-01.12:25:31 zfs snapshot Vault/jails/.warden-template-pluginjail@clean
2018-01-01.12:25:32 zfs clone Vault/jails/.warden-template-pluginjail@clean Vault/jails/sickrage_1
2018-01-07.06:48:14 zfs create -o mountpoint=/Vault/jails/.warden-template-standard -p Vault/jails/.warden-template-standard
2018-01-07.06:50:44 zfs snapshot Vault/jails/.warden-template-standard@clean
2018-01-07.06:51:05 zfs clone Vault/jails/.warden-template-standard@clean Vault/jails/csgo
2018-01-07.06:53:51 zfs destroy -fr Vault/jails/csgo
2018-01-27.15:01:08 zpool scrub Vault
2018-03-10.15:00:07 zpool scrub Vault
2018-04-10.09:41:11 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-04-10.09:41:11 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-04-10.09:43:41 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-04-10.09:43:41 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-04-10.09:43:57 zfs set mountpoint=none Vault/jails/.warden-template-pluginjail
2018-04-10.09:43:57 zfs rename -f Vault/jails/.warden-template-pluginjail Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356
2018-04-10.09:43:57 zfs set mountpoint=/Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356 Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356
2018-04-10.09:43:57 zfs set mountpoint=none Vault/jails/.warden-template-standard
2018-04-10.09:43:57 zfs rename -f Vault/jails/.warden-template-standard Vault/jails/.warden-template-standard-11.0-x64
2018-04-10.09:44:02 zfs set mountpoint=/Vault/jails/.warden-template-standard-11.0-x64 Vault/jails/.warden-template-standard-11.0-x64
2018-04-21.15:00:11 zpool scrub Vault
2018-04-29.11:18:39 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-04-29.11:18:39 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-05-03.14:16:48 zfs create -o mountpoint=/Vault/jails/.warden-template-pluginjail -p Vault/jails/.warden-template-pluginjail
2018-05-03.14:18:23 zfs snapshot Vault/jails/.warden-template-pluginjail@clean
2018-05-03.14:18:24 zfs clone Vault/jails/.warden-template-pluginjail@clean Vault/jails/syncthing_1
2018-05-09.13:49:09 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-05-09.13:49:09 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-05-09.13:49:25 zfs set mountpoint=none Vault/jails/.warden-template-pluginjail
2018-05-09.13:49:26 zfs rename -f Vault/jails/.warden-template-pluginjail Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925
2018-05-09.13:49:30 zfs set mountpoint=/Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925 Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925
2018-06-02.15:00:11 zpool scrub Vault
2018-06-16.14:36:01 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-06-16.14:36:01 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-07-14.15:00:10 zpool scrub Vault
2018-07-31.15:59:22 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-07-31.15:59:22 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-08-25.15:00:11 zpool scrub Vault
2018-08-30.14:34:28 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-08-30.14:34:28 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-10-07.05:21:05 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-10-07.05:21:05 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-10-13.15:00:11 zpool scrub Vault
2018-11-14.08:57:58 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-14.08:57:58 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-14.09:00:24 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-14.09:00:24 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.09:58:11 <iocage> zfs mount Vault/iocage
2018-11-17.09:58:11 <iocage> zfs mount Vault/iocage/download
2018-11-17.09:58:11 <iocage> zfs mount Vault/iocage/images
2018-11-17.09:58:11 <iocage> zfs mount Vault/iocage/jails
2018-11-17.09:58:12 <iocage> zfs mount Vault/iocage/log
2018-11-17.09:58:12 <iocage> zfs mount Vault/iocage/releases
2018-11-17.09:58:17 <iocage> zfs mount Vault/iocage/templates
2018-11-17.10:44:58 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-17.10:44:58 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.13:30:08 zpool import -f -R /mnt 8987588993181897069
2018-11-17.13:30:11 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.13:30:11 zfs set aclmode=passthrough Vault
2018-11-17.13:30:12 zfs set aclinherit=passthrough Vault
2018-11-17.13:30:13  zfs set mountpoint=legacy Vault/.system
2018-11-17.13:30:44 zfs destroy -r Vault/jails/plexmediaserver_1
2018-11-17.13:30:47 zfs destroy -r Vault/jails/sickrage_1
2018-11-17.13:30:52 zfs destroy -r Vault/jails/syncthing_1
2018-11-17.13:37:52 zpool export -f Vault
2018-11-17.13:41:24 zpool import -f -R /mnt 8987588993181897069
2018-11-17.13:41:28 zfs inherit -r mountpoint Vault
2018-11-17.13:41:28 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.13:41:28 zfs set aclmode=passthrough Vault
2018-11-17.13:41:30 zfs set aclinherit=passthrough Vault
2018-11-17.13:41:35  zfs set mountpoint=legacy Vault/.system
2018-11-17.13:49:09 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-17.13:49:09 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.14:00:35 zpool export Vault
2018-11-17.14:03:54 zpool import Vault
2018-11-17.14:09:39 zpool export Vault
2018-11-17.14:23:02 zpool import -f -R /mnt 8987588993181897069
2018-11-17.14:23:05 zfs inherit -r mountpoint Vault
2018-11-17.14:23:05 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.14:23:05 zfs set aclmode=passthrough Vault
2018-11-17.14:23:07 zfs set aclinherit=passthrough Vault
2018-11-17.14:23:12  zfs set mountpoint=legacy Vault/.system
2018-11-17.14:57:09 zpool scrub Vault
2018-11-17.17:08:29 zpool import -f -R /mnt 8987588993181897069
2018-11-17.17:08:33 zfs inherit -r mountpoint Vault
2018-11-17.17:08:33 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.17:08:33 zfs set aclmode=passthrough Vault
2018-11-17.17:08:35 zfs set aclinherit=passthrough Vault
2018-11-17.17:08:37  zfs set mountpoint=legacy Vault/.system
2018-11-17.17:10:50 zpool scrub Vault
2018-11-17.17:41:35 zpool export Vault
2018-11-17.17:46:07 zpool import -f -R /mnt 8987588993181897069
2018-11-17.17:46:10 zfs inherit -r mountpoint Vault
2018-11-17.17:46:10 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.17:46:11 zfs set aclmode=passthrough Vault
2018-11-17.17:46:13 zfs set aclinherit=passthrough Vault
2018-11-17.17:46:13  zfs set mountpoint=legacy Vault/.system
2018-11-17.18:26:30 zpool import -f -R /mnt 8987588993181897069
2018-11-17.18:26:30 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.18:26:39 zfs inherit -r mountpoint Vault
2018-11-17.18:26:39 zfs set aclmode=passthrough Vault
2018-11-17.18:26:40 zfs set aclinherit=passthrough Vault
2018-11-17.18:26:40  zfs set mountpoint=legacy Vault/.system
2018-11-17.18:26:40  zfs set mountpoint=legacy Vault/.system/cores
2018-11-17.18:26:41  zfs set mountpoint=legacy Vault/.system/samba4
2018-11-17.18:26:42  zfs set mountpoint=legacy Vault/.system/webui
2018-11-18.04:45:13 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-18.04:45:13 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-19.17:24:40 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-19.17:24:40 zpool set cachefile=/data/zfs/zpool.cache Vault



The crash happened on 17.11; as you can see, the only things I destroyed were some jails, not any relevant data. Unless I am blind, I don't really understand what is going on.

EDIT: Also, I don't even see the 106G VM_DISK.
 

Allan Jude

Dabbler
Joined
Feb 6, 2014
Messages
22
What dataset was this 3TB of data in? The zpool history shows that no datasets were ever created except d1, and the zvol for your VM: VM_DISK. (Note that VM_DISK doesn't appear in your zfs get output above because the -t filesystem filter excludes volumes.)

How many disks are in your FreeNAS box? This pool (Vault) seems to be only 3 disks. Is there another pool that might contain your 3TB of files?
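A sketch of listing space accounting for volumes as well as filesystems; the -t value is the only change from the command used earlier in the thread, and the guard is just so the sketch runs on hosts without ZFS installed:

```shell
# zvols are hidden by "-t filesystem"; adding "volume" makes them visible.
if command -v zfs >/dev/null 2>&1; then
  zfs get -t filesystem,volume used,logicalused
else
  echo "zfs not available on this host"
fi
```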

Can you also provide the output of:
Code:
zpool status
zpool list -v
zfs list
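For convenience, those three can be captured in one paste-able dump. This loop is not from the thread, just a sketch; the guard lets it run even on a host where the tools are missing:

```shell
# Run each requested diagnostic and label its output for easy pasting.
for cmd in 'zpool status' 'zpool list -v' 'zfs list'; do
  echo "== $cmd =="
  if command -v "${cmd%% *}" >/dev/null 2>&1; then
    $cmd
  else
    echo "(${cmd%% *} not available on this host)"
  fi
done
```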
 

ShuzZzle

Cadet
Joined
Nov 17, 2018
Messages
6
2017-12-06.10:27:16 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-12-06.10:27:16 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-12-13.15:35:03 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2017-12-13.15:35:03 zpool set cachefile=/data/zfs/zpool.cache Vault
2017-12-16.15:00:07 zpool scrub Vault
2018-01-01.12:06:11 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-01-01.12:06:11 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-01-01.12:08:25 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-01-01.12:08:25 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-01-01.12:08:40 zfs set mountpoint=none Vault/jails/.warden-template-pluginjail
2018-01-01.12:08:40 zfs rename -f Vault/jails/.warden-template-pluginjail Vault/jails/.warden-template-pluginjail-11.0-x64
2018-01-01.12:08:41 zfs set mountpoint=/Vault/jails/.warden-template-pluginjail-11.0-x64 Vault/jails/.warden-template-pluginjail-11.0-x64
2018-01-01.12:08:46 <iocage> zfs set org.freebsd.ioc:active=yes Vault
2018-01-01.12:22:44 zfs create -o mountpoint=/Vault/jails/.warden-template-pluginjail -p Vault/jails/.warden-template-pluginjail
2018-01-01.12:25:31 zfs snapshot Vault/jails/.warden-template-pluginjail@clean
2018-01-01.12:25:32 zfs clone Vault/jails/.warden-template-pluginjail@clean Vault/jails/sickrage_1
2018-01-07.06:48:14 zfs create -o mountpoint=/Vault/jails/.warden-template-standard -p Vault/jails/.warden-template-standard
2018-01-07.06:50:44 zfs snapshot Vault/jails/.warden-template-standard@clean
2018-01-07.06:51:05 zfs clone Vault/jails/.warden-template-standard@clean Vault/jails/csgo
2018-01-07.06:53:51 zfs destroy -fr Vault/jails/csgo
2018-01-27.15:01:08 zpool scrub Vault
2018-03-10.15:00:07 zpool scrub Vault
2018-04-10.09:41:11 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-04-10.09:41:11 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-04-10.09:43:41 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-04-10.09:43:41 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-04-10.09:43:57 zfs set mountpoint=none Vault/jails/.warden-template-pluginjail
2018-04-10.09:43:57 zfs rename -f Vault/jails/.warden-template-pluginjail Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356
2018-04-10.09:43:57 zfs set mountpoint=/Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356 Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356
2018-04-10.09:43:57 zfs set mountpoint=none Vault/jails/.warden-template-standard
2018-04-10.09:43:57 zfs rename -f Vault/jails/.warden-template-standard Vault/jails/.warden-template-standard-11.0-x64
2018-04-10.09:44:02 zfs set mountpoint=/Vault/jails/.warden-template-standard-11.0-x64 Vault/jails/.warden-template-standard-11.0-x64
2018-04-21.15:00:11 zpool scrub Vault
2018-04-29.11:18:39 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-04-29.11:18:39 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-05-03.14:16:48 zfs create -o mountpoint=/Vault/jails/.warden-template-pluginjail -p Vault/jails/.warden-template-pluginjail
2018-05-03.14:18:23 zfs snapshot Vault/jails/.warden-template-pluginjail@clean
2018-05-03.14:18:24 zfs clone Vault/jails/.warden-template-pluginjail@clean Vault/jails/syncthing_1
2018-05-09.13:49:09 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-05-09.13:49:09 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-05-09.13:49:25 zfs set mountpoint=none Vault/jails/.warden-template-pluginjail
2018-05-09.13:49:26 zfs rename -f Vault/jails/.warden-template-pluginjail Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925
2018-05-09.13:49:30 zfs set mountpoint=/Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925 Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925
2018-06-02.15:00:11 zpool scrub Vault
2018-06-16.14:36:01 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-06-16.14:36:01 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-07-14.15:00:10 zpool scrub Vault
2018-07-31.15:59:22 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-07-31.15:59:22 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-08-25.15:00:11 zpool scrub Vault
2018-08-30.14:34:28 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-08-30.14:34:28 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-10-07.05:21:05 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-10-07.05:21:05 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-10-13.15:00:11 zpool scrub Vault
2018-11-14.08:57:58 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-14.08:57:58 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-14.09:00:24 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-14.09:00:24 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.09:58:11 <iocage> zfs mount Vault/iocage
2018-11-17.09:58:11 <iocage> zfs mount Vault/iocage/download
2018-11-17.09:58:11 <iocage> zfs mount Vault/iocage/images
2018-11-17.09:58:11 <iocage> zfs mount Vault/iocage/jails
2018-11-17.09:58:12 <iocage> zfs mount Vault/iocage/log
2018-11-17.09:58:12 <iocage> zfs mount Vault/iocage/releases
2018-11-17.09:58:17 <iocage> zfs mount Vault/iocage/templates
2018-11-17.10:44:58 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-17.10:44:58 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.13:30:08 zpool import -f -R /mnt 8987588993181897069
2018-11-17.13:30:11 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.13:30:11 zfs set aclmode=passthrough Vault
2018-11-17.13:30:12 zfs set aclinherit=passthrough Vault
2018-11-17.13:30:13  zfs set mountpoint=legacy Vault/.system
2018-11-17.13:30:44 zfs destroy -r Vault/jails/plexmediaserver_1
2018-11-17.13:30:47 zfs destroy -r Vault/jails/sickrage_1
2018-11-17.13:30:52 zfs destroy -r Vault/jails/syncthing_1
2018-11-17.13:37:52 zpool export -f Vault
2018-11-17.13:41:24 zpool import -f -R /mnt 8987588993181897069
2018-11-17.13:41:28 zfs inherit -r mountpoint Vault
2018-11-17.13:41:28 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.13:41:28 zfs set aclmode=passthrough Vault
2018-11-17.13:41:30 zfs set aclinherit=passthrough Vault
2018-11-17.13:41:35  zfs set mountpoint=legacy Vault/.system
2018-11-17.13:49:09 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-17.13:49:09 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.14:00:35 zpool export Vault
2018-11-17.14:03:54 zpool import Vault
2018-11-17.14:09:39 zpool export Vault
2018-11-17.14:23:02 zpool import -f -R /mnt 8987588993181897069
2018-11-17.14:23:05 zfs inherit -r mountpoint Vault
2018-11-17.14:23:05 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.14:23:05 zfs set aclmode=passthrough Vault
2018-11-17.14:23:07 zfs set aclinherit=passthrough Vault
2018-11-17.14:23:12  zfs set mountpoint=legacy Vault/.system
2018-11-17.14:57:09 zpool scrub Vault
2018-11-17.17:08:29 zpool import -f -R /mnt 8987588993181897069
2018-11-17.17:08:33 zfs inherit -r mountpoint Vault
2018-11-17.17:08:33 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.17:08:33 zfs set aclmode=passthrough Vault
2018-11-17.17:08:35 zfs set aclinherit=passthrough Vault
2018-11-17.17:08:37  zfs set mountpoint=legacy Vault/.system
2018-11-17.17:10:50 zpool scrub Vault
2018-11-17.17:41:35 zpool export Vault
2018-11-17.17:46:07 zpool import -f -R /mnt 8987588993181897069
2018-11-17.17:46:10 zfs inherit -r mountpoint Vault
2018-11-17.17:46:10 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.17:46:11 zfs set aclmode=passthrough Vault
2018-11-17.17:46:13 zfs set aclinherit=passthrough Vault
2018-11-17.17:46:13  zfs set mountpoint=legacy Vault/.system
2018-11-17.18:26:30 zpool import -f -R /mnt 8987588993181897069
2018-11-17.18:26:30 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-17.18:26:39 zfs inherit -r mountpoint Vault
2018-11-17.18:26:39 zfs set aclmode=passthrough Vault
2018-11-17.18:26:40 zfs set aclinherit=passthrough Vault
2018-11-17.18:26:40  zfs set mountpoint=legacy Vault/.system
2018-11-17.18:26:40  zfs set mountpoint=legacy Vault/.system/cores
2018-11-17.18:26:41  zfs set mountpoint=legacy Vault/.system/samba4
2018-11-17.18:26:42  zfs set mountpoint=legacy Vault/.system/webui
2018-11-18.04:45:13 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-18.04:45:13 zpool set cachefile=/data/zfs/zpool.cache Vault
2018-11-19.17:24:40 zpool import -c /data/zfs/zpool.cache.saved -o cachefile=none -R /mnt -f 8987588993181897069
2018-11-19.17:24:40 zpool set cachefile=/data/zfs/zpool.cache Vault



The crash happened on 17.11. As you can see, the only things I destroyed were some jails, not any relevant data. Unless I'm blind, I don't really understand what is going on.
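One way to double-check a history dump like this is to filter it for destructive operations. A minimal sketch (the embedded lines are a few samples from the dump above, not the full history, and the choice of which commands count as "destructive" is my assumption):

```shell
# Write a few sample lines from the zpool history to a temp file,
# then filter for commands that can remove or rename data.
cat > /tmp/zpool_history.txt <<'EOF'
2017-08-22.09:36:48 zfs create -o aclmode=restricted -o casesensitivity=sensitive Vault/d1
2018-11-17.13:30:44 zfs destroy -r Vault/jails/plexmediaserver_1
2018-11-17.13:30:47 zfs destroy -r Vault/jails/sickrage_1
2018-11-17.13:30:52 zfs destroy -r Vault/jails/syncthing_1
2018-11-17.13:37:52 zpool export -f Vault
EOF

# Only the three jail datasets show up as ever destroyed.
grep -E 'zfs (destroy|rename)|zpool (destroy|labelclear)' /tmp/zpool_history.txt
```

On a live system the input would come from `zpool history Vault` instead of the sample file.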
What dataset was this 3TB of data in? The zpool history shows that no datasets were ever created except d1 and the zvol for your VM: VM_DISK.

How many disks are in your FreeNAS? This pool (Vault) seems to only be 3 disks. Is there another pool that might contain your 3TB of files?

Can you also provide:
zpool status
zpool list -v
zfs list

That's correct. I'm a beginner and only have 3 disks in my only pool (or system, for that matter). The missing files should be in the d1 dataset, for example: /mnt/Vault/d1/Media/bla/bla/bla.mkv

zpool status
Code:
  pool: Vault
 state: ONLINE
status: Some supported features are not enabled on the pool. The pool can
	still be used, but some features are unavailable.
action: Enable all features using 'zpool upgrade'. Once this is done,
	the pool may no longer be accessible by software that does not support
	the features. See zpool-features(7) for details.
  scan: scrub repaired 0 in 0 days 00:01:02 with 0 errors on Sat Nov 17 17:11:46 2018
config:

	NAME											STATE	 READ WRITE CKSUM
	Vault										   ONLINE	   0	 0	 0
	  raidz1-0									  ONLINE	   0	 0	 0
		gptid/f67517e3-8757-11e7-acda-40167eae680e  ONLINE	   0	 0	 0
		gptid/f71b11f3-8757-11e7-acda-40167eae680e  ONLINE	   0	 0	 0
		gptid/f7c5ab4a-8757-11e7-acda-40167eae680e  ONLINE	   0	 0	 0

errors: No known data errors

  pool: freenas-boot
 state: ONLINE
  scan: none requested
config:

	NAME		STATE	 READ WRITE CKSUM
	freenas-boot  ONLINE	   0	 0	 0
	  da0p2	 ONLINE	   0	 0	 0

errors: No known data errors



zpool list -v

Code:
NAME									 SIZE  ALLOC   FREE  CKPOINT  EXPANDSZ   FRAG	CAP  DEDUP  HEALTH  ALTROOT
Vault								   10.9T  11.7G  10.9T		-		 -	 0%	 0%  1.00x  ONLINE  /mnt
  raidz1								10.9T  11.7G  10.9T		-		 -	 0%	 0%
	gptid/f67517e3-8757-11e7-acda-40167eae680e	  -	  -	  -		-		 -	  -	  -
	gptid/f71b11f3-8757-11e7-acda-40167eae680e	  -	  -	  -		-		 -	  -	  -
	gptid/f7c5ab4a-8757-11e7-acda-40167eae680e	  -	  -	  -		-		 -	  -	  -
freenas-boot							29.5G   759M  28.8G		-		 -	  -	 2%  1.00x  ONLINE  -
  da0p2								 29.5G   759M  28.8G		-		 -	  -	 2%



zfs list

Code:
NAME															  USED  AVAIL  REFER  MOUNTPOINT
Vault															 106G  6.91T   117K  /mnt/Vault
Vault/.system													17.2M  6.91T   128K  legacy
Vault/.system/configs-83e186d6839a4480b1a3e80d1bf343a2			250K  6.91T   250K  legacy
Vault/.system/configs-a421eaccddb44c098d96b72146b5211d			117K  6.91T   117K  legacy
Vault/.system/configs-a7c4a4a3d45a4720a1e6b8ad799731fd			117K  6.91T   117K  legacy
Vault/.system/cores											   980K  6.91T   980K  legacy
Vault/.system/rrd-83e186d6839a4480b1a3e80d1bf343a2			   10.0M  6.91T  10.0M  legacy
Vault/.system/rrd-a421eaccddb44c098d96b72146b5211d				117K  6.91T   117K  legacy
Vault/.system/rrd-a7c4a4a3d45a4720a1e6b8ad799731fd			   4.62M  6.91T  4.62M  legacy
Vault/.system/samba4											  330K  6.91T   330K  legacy
Vault/.system/syslog-83e186d6839a4480b1a3e80d1bf343a2			 245K  6.91T   245K  legacy
Vault/.system/syslog-a421eaccddb44c098d96b72146b5211d			 117K  6.91T   117K  legacy
Vault/.system/syslog-a7c4a4a3d45a4720a1e6b8ad799731fd			 117K  6.91T   117K  legacy
Vault/.system/webui											   117K  6.91T   117K  legacy
Vault/d1														  102G  6.91T   117K  /mnt/Vault/d1
Vault/d1/VM_DISK												  102G  7.01T  3.77G  -
Vault/iocage													  831K  6.91T   128K  /mnt/Vault/iocage
Vault/iocage/download											 117K  6.91T   117K  /mnt/Vault/iocage/download
Vault/iocage/images											   117K  6.91T   117K  /mnt/Vault/iocage/images
Vault/iocage/jails												117K  6.91T   117K  /mnt/Vault/iocage/jails
Vault/iocage/log												  117K  6.91T   117K  /mnt/Vault/iocage/log
Vault/iocage/releases											 117K  6.91T   117K  /mnt/Vault/iocage/releases
Vault/iocage/templates											117K  6.91T   117K  /mnt/Vault/iocage/templates
Vault/jails													  3.92G  6.91T   149K  /mnt/Vault/jails
Vault/jails/.warden-template-pluginjail-11.0-x64				  592M  6.91T  3.56M  /mnt/Vault/jails/.warden-template-pluginjail-11.0-x64
Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356   592M  6.91T  3.56M  /mnt/Vault/jails/.warden-template-pluginjail-11.0-x64-20180410184356
Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925   593M  6.91T  3.56M  /mnt/Vault/jails/.warden-template-pluginjail-11.0-x64-20180509224925
Vault/jails/.warden-template-standard-11.0-x64				   2.18G  6.91T  3.59M  /mnt/Vault/jails/.warden-template-standard-11.0-x64
freenas-boot													  758M  27.8G	64K  none
freenas-boot/ROOT												 758M  27.8G	29K  none
freenas-boot/ROOT/Initial-Install								   1K  27.8G   756M  legacy
freenas-boot/ROOT/default										 758M  27.8G   756M  legacy



It should also be noted that I didn't use ECC RAM. I knew the risk, but if the RAM were the problem, why would my pool be healthy?
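The free-space figures above are also consistent with an essentially empty pool rather than with data that is merely hidden. A back-of-the-envelope check, accounting only for raidz1 parity overhead (metadata reservations ignored, so the real AVAIL comes out a bit lower):

```shell
# Rough raidz1 usable-space estimate: raw pool size times (n-1)/n disks.
# 10.9 TiB raw across 3 disks leaves roughly 7.3 TiB for data, which
# lines up with the ~6.91T AVAIL that `zfs list` reports for a pool
# holding almost nothing.
raw_tib=10.9
disks=3
awk -v raw="$raw_tib" -v n="$disks" \
    'BEGIN { printf "approx usable: %.2f TiB\n", raw * (n - 1) / n }'
```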
 

Allan Jude

Dabbler
Joined
Feb 6, 2014
Messages
22
Are you sure your media files were not in the three datasets you destroyed?
2018-11-17.13:30:44 zfs destroy -r Vault/jails/plexmediaserver_1
2018-11-17.13:30:47 zfs destroy -r Vault/jails/sickrage_1
2018-11-17.13:30:52 zfs destroy -r Vault/jails/syncthing_1

There is no data in the d1 dataset, just the 117K used by the empty dataset itself.
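The "empty" reading comes straight from the REFER column: a filesystem that actually holds files refers to far more than ~117K. A sketch that flags such datasets from a saved `zfs list` (sample lines taken from this thread; the kilobytes-means-empty heuristic is an assumption):

```shell
# Columns in `zfs list` output: NAME USED AVAIL REFER MOUNTPOINT.
# A filesystem whose REFER is only kilobytes holds no files itself,
# even if USED is large because of children (here, the zvol).
cat > /tmp/zfs_list.txt <<'EOF'
Vault             106G  6.91T   117K  /mnt/Vault
Vault/d1          102G  6.91T   117K  /mnt/Vault/d1
Vault/d1/VM_DISK  102G  7.01T  3.77G  -
EOF

awk '$4 ~ /K$/ { print $1, "looks empty (REFER " $4 ")" }' /tmp/zfs_list.txt
```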
 

ShuzZzle

Cadet
Joined
Nov 17, 2018
Messages
6
Are you sure your media files were not in the three datasets you destroyed?


There is no data in the d1 dataset, just the 117K used by the empty dataset itself.
Yes, those were jails auto-created by the FreeNAS plugin system. My data was exposed to the plexmediaserver jail, but never stored inside it. The data was already "gone" before I destroyed the jails.
 

DrKK

FreeNAS Generalissimo
Joined
Oct 15, 2013
Messages
3,630
Sir, you are describing something that just seems impossible. 3TB of data blocks don't just disappear in ZFS with no record as to "why" in the zpool history. We would love to help you recover your data, but the only plausible explanations at this point are:
  • You don't realize it, but you are telling us something that is not correct
  • You don't realize it, but you did not import the pool you think you imported
I wish we could be more helpful. If you think of anything else, we would love to help.
 

ShuzZzle

Cadet
Joined
Nov 17, 2018
Messages
6
Sir, you are describing something that just seems impossible. 3TB of data blocks don't just disappear in ZFS with no record as to "why" in the zpool history. We would love to help you recover your data, but the only plausible explanations at this point are:
  • You don't realize it, but you are telling us something that is not correct
  • You don't realize it, but you did not import the pool you think you imported
I wish we could be more helpful. If you think of anything else, we would love to help.

Alright, I guess that's fair. Just to sum up and potentially clear up some misunderstandings.

  • You don't realize it, but you are telling us something that is not correct
Theoretically, the disks could have been encrypted, but then again, I shouldn't be able to get any information from an encrypted disk. That's the only "possible" scenario I can think of.

  • You don't realize it, but you did not import the pool you think you imported

No idea what you mean, but Vault was the only pool I ever created, so that's highly unlikely (which the zpool history reflects).

To me, the only possible explanation is that the upgrade somehow caused a kernel panic, and the panic completely f***** my data since I didn't have any ECC RAM. And for whatever reason, I could still mount my pool after a reinstall??? Oh well.
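On the encryption point: as far as I know, GELI-encrypted pool members on FreeNAS show up with an `.eli` suffix on the gptid device names in `zpool status`, and the gptids earlier in this thread have none. A sketch of that check against the status output posted above (on a live box you would pipe `zpool status` in instead of using a sample file):

```shell
# GELI-encrypted FreeNAS pool members appear as gptid/<uuid>.eli.
# These sample lines are the vdev members from the zpool status above.
cat > /tmp/zpool_status.txt <<'EOF'
gptid/f67517e3-8757-11e7-acda-40167eae680e  ONLINE  0  0  0
gptid/f71b11f3-8757-11e7-acda-40167eae680e  ONLINE  0  0  0
gptid/f7c5ab4a-8757-11e7-acda-40167eae680e  ONLINE  0  0  0
EOF

if grep -q '\.eli' /tmp/zpool_status.txt; then
    echo "pool members are GELI-encrypted"
else
    echo "no .eli suffix: pool members are not GELI-encrypted"
fi
```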
 

DrKK

FreeNAS Generalissimo
Joined
Oct 15, 2013
Messages
3,630
Well if you only had one pool, then that can't be the problem :)

As for your last idea, about a kernel panic and/or non-ECC fandango, I sincerely doubt it. While we all strongly recommend ECC if someone is going to take the trouble to use FreeNAS, I don't think any of us believe that running non-ECC RAM is such a precarious situation that you would lose your pool in the scenario you describe. I don't know anything about the encryption and what impact that might have. I decided early on that the risk-reward ratio for encrypting your pool was absurdly poor, so I never considered it.
 