This posting is mostly for gcooper.
While attempting to auto-import my pool, which was actually listed, the import failed with a message saying a middleware error had occurred and that I should check my pool status.
Since I didn't collect any data at the time of the issue, I'll rely on my memory as best I can. "farm" is my pool's name, and it has several datasets.
1) zpool status -v reported that my pool was ONLINE and there were no problems.
2) zpool list reported that no pools were available.
3) zpool import farm told me the pool was recently used by another system, but otherwise reported no problems.
4) zpool import -f farm told me it couldn't mount many of the datasets I had. It also referred me to a web address, but I didn't follow that.
5) After some internet searching I ended up having to type "zpool export farm".
6) The pool was no longer listed in Auto-Import.
7) I typed zpool export again, after which farm was listed in Auto-Import and I was able to import it fine (rough command sequence reconstructed below).
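For reference, here is the sequence I believe I ran, reconstructed from memory and shown in the zpool forms of the commands, since that's what the shell actually accepts; treat it as approximate rather than an exact transcript:
Code:
# 1-2) check health, then try to list pools
zpool status -v        # reported the pool ONLINE with no problems
zpool list             # reported no pools available
# 3) a plain import was refused because the pool looked in use by another system
zpool import farm
# 4) a forced import went through but couldn't mount many of the datasets
zpool import -f farm
# 5-7) exporting (twice, in my case) got farm listed in Auto-Import again
zpool export farm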
Here is what my pool looks like right now, practically the same as before:
Code:
[root@freenas] ~# zpool status -v
  pool: farm
 state: ONLINE
 scrub: none requested
config:

        NAME                                            STATE     READ WRITE CKSUM
        farm                                            ONLINE       0     0     0
          raidz1                                        ONLINE       0     0     0
            gptid/4004da37-0d40-11e1-9d47-50e549b78964  ONLINE       0     0     0
            gptid/40310ae1-0d40-11e1-9d47-50e549b78964  ONLINE       0     0     0
            gptid/405d2497-0d40-11e1-9d47-50e549b78964  ONLINE       0     0     0
            gptid/408a1ff1-0d40-11e1-9d47-50e549b78964  ONLINE       0     0     0

errors: No known data errors
-----------------------------------------------------------------
[root@freenas] ~# zfs list
NAME           USED  AVAIL  REFER  MOUNTPOINT
farm          1.22T  3.96T  24.0M  /mnt/farm
farm/Madyson   163K  3.96T   163K  /mnt/farm/Madyson
farm/Mark      163K  3.96T   163K  /mnt/farm/Mark
farm/Rebecca   163K  3.96T   163K  /mnt/farm/Rebecca
farm/backups   735G  3.96T   735G  /mnt/farm/backups
farm/ftp       163K  10.0G   163K  /mnt/farm/ftp
farm/main      123G  3.96T   123G  /mnt/farm/main
farm/movies    372G  3.96T   372G  /mnt/farm/movies
farm/music     163K  3.96T   163K  /mnt/farm/music
farm/photos   22.5G  3.96T  22.5G  /mnt/farm/photos
[root@freenas] ~#
-----------------------------------------------------------------
[root@freenas] ~# df -h
Filesystem             Size    Used   Avail Capacity  Mounted on
/dev/ufs/FreeNASs2a    927M    529M    324M    62%    /
devfs                  1.0K    1.0K      0B   100%    /dev
/dev/md0               4.6M    1.8M    2.3M    44%    /etc
/dev/md1               824K    2.0K    756K     0%    /mnt
/dev/md2               149M     13M    124M    10%    /var
/dev/ufs/FreeNASs4      20M    651K     18M     3%    /data
farm                   4.0T     24M    4.0T     0%    /mnt/farm
farm/Madyson           4.0T    163K    4.0T     0%    /mnt/farm/Madyson
farm/Mark              4.0T    163K    4.0T     0%    /mnt/farm/Mark
farm/Rebecca           4.0T    163K    4.0T     0%    /mnt/farm/Rebecca
farm/backups           4.7T    735G    4.0T    15%    /mnt/farm/backups
farm/ftp                10G    163K     10G     0%    /mnt/farm/ftp
farm/main              4.1T    123G    4.0T     3%    /mnt/farm/main
farm/movies            4.3T    372G    4.0T     8%    /mnt/farm/movies
farm/music             4.0T    163K    4.0T     0%    /mnt/farm/music
farm/photos            4.0T     23G    4.0T     1%    /mnt/farm/photos
[root@freenas] ~#
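If it helps, these are the read-only checks I understand I can run to confirm that every dataset actually mounted (as far as I know, neither command changes the pool):
Code:
# list the ZFS filesystems that are currently mounted
zfs mount
# show the "mounted" property for farm and everything beneath it
zfs get -r mounted farm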
I'm not sure what happened, but during this entire process I also tried 8.0.2-RELEASE, just in case my build was corrupt, and still could not mount the pool. This was the first time I'd tried 8.0.2 or 8.0.3 on my real NAS. Since I rotate my flash drives, I always have the previous working version; I was able to install it and still access my pool without issue.
If there is something I can do for testing purposes without destroying my pool, just let me know the exact commands you want me to perform. I'm truly a novice when it comes to Linux/Unix, but I'm learning; I just don't need to experience too much pain if I can help it. I do have all my data backed up elsewhere, so if I screw something up, recovery would only take a 30+ hour day of copying the data back.
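In the meantime, these are the non-destructive commands I'd be comfortable running on my own (my understanding is that they only read pool state, nothing more):
Code:
zpool status -v farm   # detailed health of the pool and its vdevs
zpool history farm     # log of past operations performed on the pool
zpool get all farm     # pool properties, including the on-disk version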
-Mark