Is the behavior the same on a different OS?
I have booted the machine off of a USB stick with the latest Ubuntu installer. I have installed the package zfsutils-linux, imported the zpool and started a scrub. I will report back.
  pool: ultraman
 state: ONLINE
  scan: scrub repaired 0 in 0h19m with 0 errors on Mon Aug 28 11:51:56 2017
...
...
2017-08-28.10:44:49 zpool import -f ultraman
2017-08-28.10:45:20 zpool clear ultraman
2017-08-28.10:45:37 zpool scrub ultraman
2017-08-28.11:32:26 zpool scrub ultraman
2017-08-28.12:41:35 zpool export ultraman
...
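For anyone repeating this cross-check, a small sketch of how the result can be read off programmatically. It only keys on the "scrub repaired ... with 0 errors" wording that `zpool status` prints (as in the output above); `scrub_clean` is a hypothetical helper name, not a real command.

```shell
# Sketch: decide from `zpool status` output (read on stdin) whether the
# most recent scrub finished without errors.
scrub_clean() {
    if grep -Eq 'scrub repaired .* with 0 errors'; then
        echo clean
    else
        echo check-needed
    fi
}

# Live use (pool name from this thread):
#   zpool status ultraman | scrub_clean
```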
Perhaps you should tabulate which drives in which bays have which firmware?
Device       Model Family           Device Model             Firmware
/dev/da0     Western Digital Gold   WDC WD6002FRYZ-01WD5B0   01.01M02
/dev/da1     Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da2     Western Digital Gold   WDC WD6002FRYZ-01WD5B0   01.01M02
/dev/da3     Western Digital Red    WDC WD60EFRX-68L0BN1     82.00A82
/dev/da4     Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da5     (not reported)         WDC WD6002FFWX-68TZ4N0   83.H0A83
/dev/da6     Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da7     Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da8     Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da9     Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da10    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da11    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da12    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da13    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da14    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da15    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da16    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da17    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da18    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da19    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da20    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da21    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da22    Western Digital Red    WDC WD60EFRX-68MYMN1     82.00A82
/dev/da23    Western Digital Gold   WDC WD6002FRYZ-01WD5B0   01.01M02

User Capacity (all drives): 6,001,175,126,016 bytes [6.00 TB]
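A listing like the one above can be pulled with a loop over `smartctl -i`. The parsing is split into a function so it can be checked against canned output; `summarize` is a hypothetical helper name, and the `/dev/da*` glob assumes a FreeBSD/FreeNAS system with smartmontools installed.

```shell
# Sketch: print "model firmware" for a disk, given `smartctl -i` output
# on stdin. The awk splits each line on "colon plus spaces".
summarize() {
    awk -F': *' '
        /^Device Model/     { model = $2 }
        /^Firmware Version/ { fw = $2 }
        END                 { print model, fw }
    '
}

# Live use (run as root):
#   for d in /dev/da*; do
#       printf '%s  ' "$d"
#       smartctl -i "$d" | summarize
#   done
```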
Feel free to compile whatever you find into a thread. I'd be interested in finding out what exactly is going on. Maybe I should start a new thread focusing on this type of disk (WD60EFRX-68L0BN1)?
I have offlined the disk I suspect (da3/WX11DC61YDPV) and started a scrub.
scan: scrub repaired 0 in 14h52m with 0 errors on Tue Aug 29 03:47:00 2017
I'm aware of three types of the WD60EFRX.
This is the status of each type in regards to this thread:
WD60EFRX-68L0BN1 (Bad in current system)
WD60EFRX-68MYMN1 (No problems in current system)
WD60EFRX-68TGBN1 (Have not tried in current system)
Does anyone know of other types of WD60EFRX (This should really be a new thread)?
- Verify that the CCC in the reported model has changed from "0" to "1".
- WDC WD60EFRX-68MYMN0 changes to WDC WD60EFRX-68MYMN1
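The verification step above can be scripted: per the example, a successful update changes the trailing "0" of the model suffix to "1" (WD60EFRX-68MYMN0 becomes WD60EFRX-68MYMN1). A sketch, assuming `smartctl -i` output on stdin; `check_suffix` is a hypothetical helper name:

```shell
# Sketch: print OK when the reported model already ends in "1"
# (i.e. the new firmware is active), OLD otherwise.
check_suffix() {
    if grep -q 'Device Model:.*1$'; then
        echo OK
    else
        echo OLD
    fi
}

# Live use: smartctl -i /dev/da3 | check_suffix
```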
Perhaps try that firmware update procedure. 68L0BN1 could be an older firmware than 68MYMN1, since L comes before M ;)
That makes this a live issue, and it would explain why you managed to find half a dozen people with possibly the same results.
Model Family:     Western Digital Red
Device Model:     WDC WD60EFRX-68L0BN1
LU WWN Device Id: 5 0014ee 0596e991f
Firmware Version: 82.00A82

Model Family:     Western Digital Red
Device Model:     WDC WD60EFRX-68MYMN1
LU WWN Device Id: 5 0014ee 2b745e38b
Firmware Version: 82.00A82
(da2:mpr0:0:2:0): READ(16). CDB: 88 00 00 00 00 01 37 a5 79 a0 00 00 00 38 00 00
(da2:mpr0:0:2:0): CAM status: SCSI Status Error
(da2:mpr0:0:2:0): SCSI status: Check Condition
(da2:mpr0:0:2:0): SCSI sense: MEDIUM ERROR asc:11,0 (Unrecovered read error)
(da2:mpr0:0:2:0): Info: 0x137a579a0
(da2:mpr0:0:2:0): Error 5, Unretryable error
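As an aside on reading that error: the "Info" field in a CAM MEDIUM ERROR is the failing LBA in hex, so the byte offset of the unreadable sector follows from a single multiplication. A sketch, assuming the 512-byte logical sectors these drives report:

```shell
# Sketch: convert the LBA from the CAM error's Info field (hex) into a
# byte offset, assuming 512-byte logical sectors.
lba=0x137a579a0
offset=$(( lba * 512 ))
echo "$offset"
```

That offset is where tools like `dd` would have to seek to hit the same sector.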
I have the same disks in use as you do.
...
But I guess that's just a normal HDD hiccup. I will keep your findings in mind, though; maybe there is an issue with that disk model. Maybe the issue only shows up with more than 8 drives?
[root@ultraman] ~# badblocks -p 4 -b 4096 -wsv /dev/da3
Checking for bad blocks in read-write mode
From block 0 to 1465130645
Testing with pattern 0xaa: set_o_direct: Inappropriate ioctl for device
...
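One quick sanity check on that run: "From block 0 to 1465130645" with `-b 4096` is an inclusive range of 4096-byte blocks, and the bytes covered work out to exactly the capacity smartctl reports, so the whole disk is being exercised. The arithmetic:

```shell
# Sketch: bytes covered by the badblocks run above.
# The block range is inclusive, hence the +1.
last=1465130645
bs=4096
covered=$(( (last + 1) * bs ))
echo "$covered"   # 6001175126016, matching the reported User Capacity
```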
1. Re-adding the bad drive to my zpool (expecting errors to reappear, and not only on the bad drive).
[root@ultraman] ~# dmesg | grep -i scsi\ status\ error | sort | uniq -c
   1 (da10:mps0:0:22:0): CAM status: SCSI Status Error
   1 (da11:mps0:0:23:0): CAM status: SCSI Status Error
   1 (da12:mps0:0:24:0): CAM status: SCSI Status Error
   1 (da16:mps0:0:28:0): CAM status: SCSI Status Error
   1 (da17:mps0:0:29:0): CAM status: SCSI Status Error
   1 (da19:mps0:0:31:0): CAM status: SCSI Status Error
   1 (da20:mps0:0:32:0): CAM status: SCSI Status Error
   1 (da22:mps0:0:34:0): CAM status: SCSI Status Error
   3 (da26:mps0:0:9:0): CAM status: SCSI Status Error
   1 (da7:mps0:0:19:0): CAM status: SCSI Status Error
   1 (da9:mps0:0:21:0): CAM status: SCSI Status Error
  pool: ultraman
 state: ONLINE
status: One or more devices has experienced an unrecoverable error. An
        attempt was made to correct the error. Applications are unaffected.
action: Determine if the device needs to be replaced, and clear the errors
        using 'zpool clear' or replace the device with 'zpool replace'.
   see: http://illumos.org/msg/ZFS-8000-9P
  scan: scrub repaired 1.04M in 15h11m with 0 errors on Wed Aug 30 23:55:26 2017
config:

        NAME                                            STATE     READ WRITE CKSUM
        ultraman                                        ONLINE       0     0     0
          mirror-0                                      ONLINE       0     0     0
            gptid/2e00ac23-183d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
            gptid/6e71919e-1618-11e7-a3b7-0025901ef244  ONLINE       0     0     0
          mirror-1                                      ONLINE       0     0     0
            gptid/6f22c98c-1618-11e7-a3b7-0025901ef244  ONLINE       0     0     0
            gptid/6fe54bfe-1618-11e7-a3b7-0025901ef244  ONLINE       0     0     0
          mirror-2                                      ONLINE       0     0     0
            gptid/70bfd5c6-1618-11e7-a3b7-0025901ef244  ONLINE       0     0     0
            gptid/a20e4ec0-8c8d-11e7-ac97-0cc47a5312f0  ONLINE       0     0     0
          mirror-3                                      ONLINE       0     0     0
            gptid/7ad0f185-1619-11e7-a3b7-0025901ef244  ONLINE       0     0     0
            gptid/9e899578-183c-11e7-ae9d-0025901ef244  ONLINE       0     0     0
          mirror-4                                      ONLINE       0     0     0
            gptid/427c2189-184c-11e7-ae9d-0025901ef244  ONLINE       0     0     0
            gptid/4342a98c-184c-11e7-ae9d-0025901ef244  ONLINE       0     0     0
          mirror-5                                      ONLINE       0     0     0
            gptid/a1ab9a69-184c-11e7-ae9d-0025901ef244  ONLINE       0     0     0
            gptid/a2851364-184c-11e7-ae9d-0025901ef244  ONLINE       0     0     3
          mirror-6                                      ONLINE       0     0     0
            gptid/0dcbcccd-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
            gptid/0e9ed582-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
          mirror-7                                      ONLINE       0     0     0
            gptid/2b56cf1c-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
            gptid/2c2a1c62-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
          mirror-8                                      ONLINE       0     0     0
            gptid/69718320-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
            gptid/6a4e6afa-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
          mirror-9                                      ONLINE       0     0     0
            gptid/8fccb2c6-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
            gptid/90b48d70-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
          mirror-10                                     ONLINE       0     0     0
            gptid/aeda4f88-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
            gptid/afb5bfc3-184d-11e7-ae9d-0025901ef244  ONLINE       0     0     0
          mirror-11                                     ONLINE       0     0     0
            gptid/bca6e4f0-1b71-11e7-ae9d-0025901ef244  ONLINE       0     0     0
            gptid/d06f5f86-7b79-11e7-b91b-0cc47a5312f0  ONLINE       0     0     0

errors: No known data errors