Ubuntu jail file count out of sync in directory with millions of files

Status
Not open for further replies.

Forrest Gump

Cadet
Joined
Dec 15, 2014
Messages
5
I've got an Ubuntu Jail on FreeNAS 9.3 (and had one with 9.2.x as well), and I'm seeing a very strange behavior with a directory that has millions of files in it (and growing). The directory in question is shared via the built-in "Add Storage" feature.

root@ubuntu:/mnt/test# ls | wc -l
5432395
root@ubuntu:/mnt/test# ls | wc -l
4587545
root@ubuntu:/mnt/test# ls | wc -l
404516
root@ubuntu:/mnt/test# ls | wc -l
3895953


The first attempt is an accurate count of the number of files in the directory. However, the three subsequent attempts are not. Eventually the count returns to the correct value, but then it will likely cycle again as demonstrated above.

Notes:
  • Yes, I realize that millions of files in a single directory is a bad idea. But I expect consistent behavior.
  • It's not immediately obvious when the file count is below what it should be, which files are missing. Oldest files? Newest files? Something nondeterministic?
  • When doing the same test outside of a jail, or even in a standard FreeBSD jail, the file count is accurate (currently ~5.4 million files and growing).
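One way to get at the "which files are missing?" question above is to take two back-to-back listings of the directory and diff them as sets. This is just a sketch, not something from the thread; the directory path is whatever you have mounted in the jail, and on a directory this size each listing itself takes a while:

```python
import os

def diff_listings(target):
    """List the same directory twice and report which names differ."""
    first = set(os.listdir(target))
    second = set(os.listdir(target))
    missing = first - second   # seen in the first pass but not the second
    extra = second - first     # seen in the second pass but not the first
    return missing, extra

if __name__ == '__main__':
    missing, extra = diff_listings('.')
    print('missing: %d, extra: %d' % (len(missing), len(extra)))
```

If the missing names cluster (all old, all new, or one contiguous range of inode order), that would say something about where the jail's directory read is going wrong; random scatter would point at something nondeterministic.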

Thoughts?
 

dlavigne

Guest
Wonders if xargs needs to be involved given the number of files being piped into wc....
 

Forrest Gump

Cadet
Joined
Dec 15, 2014
Messages
5
Wonders if xargs needs to be involved given the number of files being piped into wc....
No difference. Given that there's no wildcard expansion happening, I wouldn't expect any difference or limit. ls | wc -l is just the simplest example. find behaves the same way, as does this Python:
import os

files = os.listdir('.')
print('Files in directory: %s' % len(files))
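For a directory with millions of entries, building the whole list just to count it is wasteful. A streaming count (a sketch; os.scandir with a context manager needs Python 3.6+, which postdates this thread) exercises the same underlying readdir() path without holding every name in memory:

```python
import os

def count_entries(path):
    # Stream directory entries one at a time instead of materializing
    # the full list the way os.listdir() does.
    with os.scandir(path) as entries:
        return sum(1 for _ in entries)

if __name__ == '__main__':
    print(count_entries('.'))
```

Since both approaches sit on top of the same readdir() calls, a count that fluctuates with os.listdir should fluctuate here too, which is consistent with the problem being below Python rather than in it.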
 