I was also experimenting with running one cron job every two or three minutes (I was thinking it could save energy), but the jury is still out. It takes time to test properly, because every time I go in to check I end up refreshing the directories myself.
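For reference, the warm-up job is basically just a recursive metadata walk. A minimal sketch of the crontab entry (the pool path `/mnt/tank` and the 3-minute interval are placeholders, use whatever fits your setup):

```shell
# System crontab entry: walk the pool every 3 minutes so directory
# metadata stays warm in ARC. `find -ls` stats every entry, which
# touches the metadata without reading any file contents.
*/3 * * * * root /usr/bin/find /mnt/tank -ls > /dev/null 2>&1
```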
Has anyone else already tried this?
I did this as a test a while back, and also tried it for a customer a week or two ago. It didn't go as well as we had hoped. The major problem is that if you have lots of directories and files, caching all that metadata in RAM requires a system with a HUGE amount of it. Metadata cannot consume more than 30% of your ARC, so if you have 30GB of metadata on your pool you'll need 100GB of ARC! If you play tricks and force it into L2ARC instead, it takes significant time to "warm up", so that doesn't work too well either. People aren't too thrilled with the idea of rebooting their server and then waiting 24 hours while it 'warms up' to full speed.
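To put numbers on that sizing rule: the ARC you need is just metadata size divided by the cap fraction. A quick back-of-the-envelope check (the 30GB figure is the example from above, not a measurement from a real pool):

```shell
# Hypothetical sizing: 30 GB of pool metadata, metadata capped at 30% of ARC
meta_gb=30
cap_pct=30
# ARC needed so meta_gb fits under the cap: metadata / 0.30
required_gb=$(( meta_gb * 100 / cap_pct ))
echo "ARC needed: ~${required_gb} GB"
```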
Then you have stuff like Windows clients that open a directory of pictures, at which point a background process automatically kicks off, thumbnailing every picture and comparing the generated thumbnails against what's in thumbs.db. That puts extra load on the server that no amount of ls or SMB client pre-reading will have cached. Unfortunately it also runs the risk of flushing data out of the ARC/L2ARC that you were deliberately trying to keep cached.
I won't lie: there's no silver bullet short of going all-SSD, so that metadata always has microsecond seek times. But for most people that's not a particularly cost-effective option. :/