Failed to Create ZPool

praetorian46

Dabbler
Joined
Apr 26, 2013
Messages
15
Hello,
I'm trying to create a ZPool consisting of 12 drives in a striped RaidZ2 layout (6-disk RaidZ2 + 6-disk RaidZ2).
I'm getting the below error in the UI.

I have an existing RaidZ2 with a Faulted disk and a Degraded disk so I'm afraid to reboot the box to see if this goes away.
It looks to me like the error is complaining about the smartd service failing to start.
The only thing that shows up in the console while trying to create this ZPool is regarding da15 which is the Faulted disk in the other RaidZ2 pool.
Any help would be appreciated.

System Info:
Code:
OS Version:
FreeNAS-11.2-U3
(Build Date: Mar 27, 2019 18:24)
Processor:
Six-Core AMD Opteron(tm) Processor 2419 EE (12 cores)
Memory:
32 GiB
HostName:
freenas
Uptime:
6:33PM up 11 days, 22:03, 1 user


Console Output:
Code:
May 10 18:30:00 freenas savecore: error reading last dump header at offset 2147483136 in /dev/da15p1: Device not configured
May 10 18:30:03 freenas smartd[95803]: Device: /dev/da15 [SAT], 8 Currently unreadable (pending) sectors
May 10 18:30:03 freenas smartd[95803]: Device: /dev/da15 [SAT], 8 Offline uncorrectable sectors


Error:
Code:
 File "/usr/local/lib/python3.6/site-packages/tastypie/resources.py", line 219, in wrapper
    response = callback(request, *args, **kwargs)

  File "./freenasUI/api/resources.py", line 1414, in dispatch_list
    request, **kwargs

  File "/usr/local/lib/python3.6/site-packages/tastypie/resources.py", line 450, in dispatch_list
    return self.dispatch('list', request, **kwargs)

  File "./freenasUI/api/utils.py", line 251, in dispatch
    request_type, request, *args, **kwargs

  File "/usr/local/lib/python3.6/site-packages/tastypie/resources.py", line 482, in dispatch
    response = method(request, **kwargs)

  File "/usr/local/lib/python3.6/site-packages/tastypie/resources.py", line 1384, in post_list
    updated_bundle = self.obj_create(bundle, **self.remove_api_resource_names(kwargs))

  File "/usr/local/lib/python3.6/site-packages/tastypie/resources.py", line 2175, in obj_create
    return self.save(bundle)

  File "./freenasUI/api/utils.py", line 415, in save
    form.save()

  File "./freenasUI/storage/forms.py", line 311, in save
    raise e

  File "./freenasUI/storage/forms.py", line 300, in save
    notifier().create_volume(volume, groups=grouped, init_rand=init_rand)

  File "./freenasUI/middleware/notifier.py", line 760, in create_volume
    vdevs = self.__prepare_zfs_vdev(vgrp['disks'], vdev_swapsize, encrypt, volume)

  File "./freenasUI/middleware/notifier.py", line 695, in __prepare_zfs_vdev
    swapsize=swapsize)

  File "./freenasUI/middleware/notifier.py", line 341, in __gpt_labeldisk
    c.call('disk.wipe', devname, 'QUICK', job=True)

  File "./freenasUI/middleware/notifier.py", line 341, in __gpt_labeldisk
    c.call('disk.wipe', devname, 'QUICK', job=True)

  File "/usr/local/lib/python3.6/site-packages/middlewared/client/client.py", line 477, in call
    raise ClientException(job['error'], trace=job['exception'])

middlewared.client.client.ClientException: [ESERVICESTARTFAILURE] The smartd service failed to start
 

praetorian46

Dabbler
Joined
Apr 26, 2013
Messages
15
A little update:
If I run the following command manually I can create the ZPool just fine:
Code:
root@freenas:~ # zpool create Hill-RaidZ2 raidz2 /dev/da0 /dev/da1 /dev/da2 /dev/da3 /dev/da4 /dev/da5 raidz2 /dev/da8 /dev/da9 /dev/da10 /dev/da11 /dev/da21 /dev/da22
root@freenas:~ # zpool status Hill-RaidZ2
  pool: Hill-RaidZ2
 state: ONLINE
  scan: none requested
config:

    NAME        STATE     READ WRITE CKSUM
    Hill-RaidZ2  ONLINE       0     0     0
      raidz2-0  ONLINE       0     0     0
        da0     ONLINE       0     0     0
        da1     ONLINE       0     0     0
        da2     ONLINE       0     0     0
        da3     ONLINE       0     0     0
        da4     ONLINE       0     0     0
        da5     ONLINE       0     0     0
      raidz2-1  ONLINE       0     0     0
        da8     ONLINE       0     0     0
        da9     ONLINE       0     0     0
        da10    ONLINE       0     0     0
        da11    ONLINE       0     0     0
        da21    ONLINE       0     0     0
        da22    ONLINE       0     0     0

errors: No known data errors


But then when I try to import the ZPool, it doesn't show up in the list in the FreeNAS UI.
So I guess that's not going to work well...
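One likely reason a CLI-created pool doesn't appear in the UI's import list is that the pool is still imported and online at the CLI; import wizards generally only scan for pools that are exported. A hedged sketch of what might make it visible (pool name taken from the post above; this is guarded so it does nothing on a machine without ZFS tools, and on 11.2 the UI may still handle CLI-created pools differently since they lack the usual FreeNAS partition layout):

```shell
# Hedged sketch: export the CLI-created pool so an import scan can find it.
# The pool name comes from the post; the guard makes this a no-op on hosts
# without ZFS tools.
if command -v zpool >/dev/null 2>&1; then
    zpool export Hill-RaidZ2   # release the pool from the running system
    zpool import               # scan for and list pools available for import
else
    echo "zpool not found; run this on the FreeNAS box itself"
fi
```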

Thank you,
Cody Hill
 

praetorian46

Dabbler
Joined
Apr 26, 2013
Messages
15
Well... Update Number 2.

I disabled the S.M.A.R.T. Service... And then created the ZPool.
This seemed to do the trick. I then re-enabled the service and now I'm migrating all my data off of the ZPool with 2x bad disks.
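The same toggle can be done from the shell instead of the UI. This is a hedged sketch, not a confirmed recipe: it assumes FreeNAS 11.2's `midclt` middleware client is available and that the middleware knows the service by the identifier `smartd` (verify with `midclt call service.query`):

```shell
# Hedged sketch of the workaround: stop S.M.A.R.T. monitoring, create the
# pool in the UI, then bring monitoring back. The service identifier
# 'smartd' is an assumption; confirm it with `midclt call service.query`.
if command -v midclt >/dev/null 2>&1; then
    midclt call service.stop smartd    # stop the S.M.A.R.T. service
    # ... create the ZPool from the FreeNAS UI while smartd is down ...
    midclt call service.start smartd   # re-enable monitoring afterwards
else
    echo "midclt not found; run this on the FreeNAS box itself"
fi
```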

Maybe this post will help someone in the future?

Thank you,
Cody Hill
 

diedrichg

Wizard
Joined
Dec 4, 2012
Messages
1,319
Well... Update Number 2.

I disabled the S.M.A.R.T. Service... And then created the ZPool.
This seemed to do the trick. I then re-enabled the service and now I'm migrating all my data off of the ZPool with 2x bad disks.

Maybe this post will help someone in the future?

Thank you,
Cody Hill
Nice job investigating. This sounds like a bug. Would you please create an issue for it at
https://jira.ixsystems.com/projects...rojects-plugin:release-page&status=unreleased

and then post the bug # here. Thanks.
 

Qosmo

Cadet
Joined
Dec 20, 2014
Messages
7
Well... Update Number 2.

I disabled the S.M.A.R.T. Service... And then created the ZPool.
This seemed to do the trick. I then re-enabled the service and now I'm migrating all my data off of the ZPool with 2x bad disks.

Maybe this post will help someone in the future?

Thank you,
Cody Hill
This worked for me tonight. I was replacing a failed drive, and after disabling the SMART service the drive was added successfully. Thanks for replying with how you fixed it!!
 