[EFAULT] Failed to wipe disk sdb: [Errno 5] Input/output error

tcwdjohn

Cadet
Joined
Feb 6, 2022
Messages
9
Hi all, I hope someone can please help me.

I get the above error when trying to create a pool or wipe two of my WD Red 2TB hard drives. These drives were used in an old FreeNAS server that I retired because the PSU went faulty.

I have tried the drives in a Windows machine and Windows cannot see them, so I have converted another machine into a TrueNAS box running the latest version of TrueNAS SCALE, booting from a 250GB SSD.

When I try to create a pool, I get the error below. I am a newbie to FreeBSD/Linux and do not know many commands, so please could you help?


Code:
Error: Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/middlewared/job.py", line 409, in run
    await self.future
  File "/usr/lib/python3/dist-packages/middlewared/job.py", line 445, in __run_body
    rv = await self.method(*([self] + args))
  File "/usr/lib/python3/dist-packages/middlewared/schema.py", line 1137, in nf
    res = await f(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/middlewared/schema.py", line 1269, in nf
    return await func(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/middlewared/plugins/pool.py", line 757, in do_create
    formatted_disks = await self.middleware.call('pool.format_disks', job, disks)
  File "/usr/lib/python3/dist-packages/middlewared/main.py", line 1324, in call
    return await self._call(
  File "/usr/lib/python3/dist-packages/middlewared/main.py", line 1281, in _call
    return await methodobj(*prepared_call.args)
  File "/usr/lib/python3/dist-packages/middlewared/plugins/pool_/format_disks.py", line 56, in format_disks
    await asyncio_map(format_disk, disks.items(), limit=16)
  File "/usr/lib/python3/dist-packages/middlewared/utils/asyncio_.py", line 16, in asyncio_map
    return await asyncio.gather(*futures)
  File "/usr/lib/python3/dist-packages/middlewared/utils/asyncio_.py", line 13, in func
    return await real_func(arg)
  File "/usr/lib/python3/dist-packages/middlewared/plugins/pool_/format_disks.py", line 29, in format_disk
    await self.middleware.call(
  File "/usr/lib/python3/dist-packages/middlewared/main.py", line 1324, in call
    return await self._call(
  File "/usr/lib/python3/dist-packages/middlewared/main.py", line 1292, in _call
    return await self.run_in_executor(prepared_call.executor, methodobj, *prepared_call.args)
  File "/usr/lib/python3/dist-packages/middlewared/main.py", line 1192, in run_in_executor
    return await loop.run_in_executor(pool, functools.partial(method, *args, **kwargs))
  File "/usr/lib/python3.9/concurrent/futures/thread.py", line 52, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/lib/python3/dist-packages/middlewared/plugins/disk_/format.py", line 26, in format
    raise CallError(f'Failed to wipe disk {disk}: {job.error}')
middlewared.service_exception.CallError: [EFAULT] Failed to wipe disk sdb: [Errno 5] Input/output error
 

Alecmascot

Guru
Joined
Mar 18, 2014
Messages
1,177
Search the forum for "Failed to wipe disk"
 

tcwdjohn

Cadet
Joined
Feb 6, 2022
Messages
9
Hi, that's not very helpful. I have searched, but everything I find is either not the same problem or tells me to do things with gpart, which as far as I can see is not even installed on my fresh TrueNAS install.

I also get these errors when trying to install gpart/gparted:

E: Package 'gpart' has no installation candidate
root@truenas[/sbin]# sudo apt update
Err:1 http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/angelfish truenas InRelease
Could not resolve 'apt.tn.ixsystems.com'
Err:2 http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/gluster bullseye InRelease
Could not resolve 'apt.tn.ixsystems.com'
Err:3 http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/docker buster InRelease
Could not resolve 'apt.tn.ixsystems.com'
Err:4 http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/libnvidia bullseye InRelease
Could not resolve 'apt.tn.ixsystems.com'
Err:5 http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/nvidia-container bullseye InRelease
Could not resolve 'apt.tn.ixsystems.com'
Err:6 http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/nvidia-docker bullseye InRelease
Could not resolve 'apt.tn.ixsystems.com'
Err:7 http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/debian bullseye InRelease
Could not resolve 'apt.tn.ixsystems.com'
Err:8 http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/debian-debug bullseye-debug InRelease
Could not resolve 'apt.tn.ixsystems.com'
Err:9 http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/helm all InRelease
Could not resolve 'apt.tn.ixsystems.com'
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
All packages are up to date.
W: Failed to fetch http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/angelfish/dists/truenas/InRelease Could not resolve 'apt.tn.ixsystems.com'
W: Failed to fetch http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/gluster/dists/bullseye/InRelease Could not resolve 'apt.tn.ixsystems.com'
W: Failed to fetch http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/docker/dists/buster/InRelease Could not resolve 'apt.tn.ixsystems.com'
W: Failed to fetch http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/libnvidia/dists/bullseye/InRelease Could not resolve 'apt.tn.ixsystems.com'
W: Failed to fetch http://apt.tn.ixsystems.com/apt-dir...C.2/nvidia-container/dists/bullseye/InRelease Could not resolve 'apt.tn.ixsystems.com'
W: Failed to fetch http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/nvidia-docker/dists/bullseye/InRelease Could not resolve 'apt.tn.ixsystems.com'
W: Failed to fetch http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/debian/dists/bullseye/InRelease Could not resolve 'apt.tn.ixsystems.com'
W: Failed to fetch http://apt.tn.ixsystems.com/apt-dir...2/debian-debug/dists/bullseye-debug/InRelease Could not resolve 'apt.tn.ixsystems.com'
W: Failed to fetch http://apt.tn.ixsystems.com/apt-direct/angelfish/22.02-RC.2/helm/dists/all/InRelease Could not resolve 'apt.tn.ixsystems.com'
W: Some index files failed to download. They have been ignored, or old ones used instead.
root@truenas[/sbin]# sudo apt install --assume-yes gpart
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
Package gpart is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source

E: Package 'gpart' has no installation candidate
root@truenas[/sbin]# sudo apt-get install gparted
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
E: Unable to locate package gparted
root@truenas[/sbin]#
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
gpart is the GEOM partition management tool on FreeBSD.

gparted (GParted) is the GNOME partition editor, a graphical front-end to GNU parted. Since it is a graphical tool, it will not run on either SCALE or CORE, because both lack the graphical environment it needs. It is the go-to partitioning tool for Linux newbies because it resembles something they can understand.

Since Linux has a trainwreck of somewhat competing strategies for disk partitioning and management, and since I don't know what SCALE is doing there, my suggestion would be to use the TrueNAS GUI and find the disk wipe option. A backup alternative might be something like

dd if=/dev/zero of=/dev/sdb bs=1048576 count=1024
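
A variant with progress output makes it obvious whether the wipe is actually moving or has hung on a bad sector (this is only a sketch, assuming the GNU dd and standard dmesg that ship with SCALE; status=progress is a GNU coreutils flag):

Code:
# same wipe, but report bytes written as it goes
dd if=/dev/zero of=/dev/sdb bs=1048576 count=1024 status=progress
# if it stalls, check the kernel log for I/O errors on sdb
dmesg | grep -i 'sdb' | tail -n 20

If dmesg shows repeated read/write errors or resets for sdb, that points at the drive or its cabling rather than anything TrueNAS is doing.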
 

tcwdjohn

Cadet
Joined
Feb 6, 2022
Messages
9
gpart is the GEOM partition management tool on FreeBSD.

gparted (GParted) is the GNOME partition editor, a graphical front-end to GNU parted. Since it is a graphical tool, it will not run on either SCALE or CORE, because both lack the graphical environment it needs. It is the go-to partitioning tool for Linux newbies because it resembles something they can understand.

Since Linux has a trainwreck of somewhat competing strategies for disk partitioning and management, and since I don't know what SCALE is doing there, my suggestion would be to use the TrueNAS GUI and find the disk wipe option. A backup alternative might be something like

dd if=/dev/zero of=/dev/sdb bs=1048576 count=1024
Ah, got you, so it's not installed. I have tried the disk wipe option, but it gives me the same I/O error.

I have just run the command you posted and it still hasn't given me any response in PuTTY/SSH, so maybe it's doing something.

Will update, thank you
 

tcwdjohn

Cadet
Joined
Feb 6, 2022
Messages
9
Ah, got you, so it's not installed. I have tried the disk wipe option, but it gives me the same I/O error.

I have just run the command you posted and it still hasn't given me any response in PuTTY/SSH, so maybe it's doing something.

Will update, thank you
Screenshot 2022-02-06 201532.png
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
Should run pretty quickly, maybe 10-20 seconds. If it doesn't, it's possible the disk is not in good health.
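
If you want to poke at the drive's health yourself, smartmontools ships with SCALE, so something along these lines should work (a sketch only; adjust the device name to match your disk):

Code:
smartctl -a /dev/sdb          # full SMART identity, attributes and error log
smartctl -t short /dev/sdb    # start a short self-test; run 'smartctl -a' again a few minutes later for the result

Climbing pending/reallocated sector counts, or entries in the SMART error log, would line up with the Errno 5 (EIO) you are seeing.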
 

tcwdjohn

Cadet
Joined
Feb 6, 2022
Messages
9
Should run pretty quickly, maybe 10-20 seconds. If it doesn't, it's possible the disk is not in good health.
OK, yes, it still hasn't completed and I'm not sure if it has crashed. The SMART status of the drive says it's fine. Are there any other commands I can run to check the drive? Sorry for being a noob.
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
I may be the wrong person for this question. I typically just junk drives that give me trouble. Plus I'm a FreeBSD guy, so debugging this on Linux isn't a strength here.
 

tcwdjohn

Cadet
Joined
Feb 6, 2022
Messages
9
I may be the wrong person for this question. I typically just junk drives that give me trouble. Plus I'm a FreeBSD guy, so debugging this on Linux isn't a strength here.
OK, could it have anything to do with the drives having been set up with RAID on the last system?
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
Maybe, but sorta unlikely if your last system was FreeNAS.
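
If you want to rule out leftover metadata anyway, wipefs from util-linux (which should be present on SCALE) can list and clear old partition/RAID signatures. Just a sketch, and it will still fail with EIO if the drive itself is returning I/O errors:

Code:
wipefs /dev/sdb       # list any filesystem/RAID/partition-table signatures it finds
wipefs -a /dev/sdb    # erase all of those signatures (destructive)

FreeNAS pools are plain GPT plus ZFS partitions, so there is usually nothing exotic left behind, which is why I doubt that's the cause.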
 

D0doooh

Cadet
Joined
Feb 17, 2022
Messages
1
Were you able to solve your problem?

I have the exact same problem. I am trying to create a pool with eight Seagate Exos X18 18TB HDDs on my IBM M1015 (flashed to IT mode). The disks are all fine: SMART status is good and they were purchased new.
 

stajo

Explorer
Joined
Jan 3, 2020
Messages
71
I have the same problem; I have tried with 3 disks now. Failed to wipe disk sdb: [Errno 5] Input/output error.

IBM M1015 (Flashed in IT Mode)
 
Joined
Oct 10, 2023
Messages
7
Hi. I also have the same problem, but for all 7 disks. None of them can be wiped.
I plugged them into the SATA connectors on the mainboard.
 