Sorry, but I cannot find specific info. Is there a maximum number of datasets that can be created in a pool? Or per system? (Assuming you have the resources of course).
Please do not redirect users offsite, especially for simple answers. If you want to post a link offsite to something highly complex and detailed, that's fine.
It's effectively "nearly unlimited" but may start getting impractical once you get into the thousands. This is the same way that the number of supported snapshots, while theoretically unlimited, becomes unwieldy after a certain point. The classic Solaris answer of "On systems with 1,000s of ZFS file systems, provision 1 GB of extra memory for every 10,000 mounted file systems including snapshots." may also be relevant.
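If it helps to put that rule of thumb into numbers, here's a quick back-of-the-envelope sketch (the dataset and snapshot counts are just made-up example values):

```python
# Rough estimate based on the classic Solaris guidance:
# roughly 1 GB of extra memory per 10,000 mounted file systems,
# counting snapshots as well.

def extra_ram_gb(datasets: int, snapshots: int) -> float:
    """Suggested extra RAM (in GB) for the given counts."""
    mounted = datasets + snapshots
    return mounted / 10_000  # 1 GB per 10,000 mounted entries

# Hypothetical example: 2,500 datasets and 40,000 snapshots
print(f"{extra_ram_gb(2_500, 40_000):.2f} GB extra")  # -> 4.25 GB extra
```

In other words, the memory overhead only really matters once the combined count of datasets and snapshots reaches the tens of thousands; long before that, management overhead is usually the bigger pain point.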