2025-01-30, 05:04 AM
I'd already disabled real-time monitoring to no effect. I do think there's something going on with Samba in my case, and I'm going to try to dive into that a bit further.
I tried running a simple script that keeps opening file descriptors until something gives out. This let me test directly on the NAS via SSH, as well as from macOS.
Code:
#!/bin/bash
# Keep opening new file descriptors until the shell hits its limit
i=0
while true; do
    touch "testfile_$i"
    exec {fd}<"testfile_$i" || break   # allocate a new fd; break on "Too many open files"
    echo "Opened file descriptor $i"
    ((i++))
done
echo "Hit limit at $i open files!"
On the NAS, this runs instantly, fails at 1012 open files, and returns "Too many open files".
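That 1012 figure lines up with the common 1024 per-process default minus the handful of descriptors the shell already holds. If the NAS shell has the standard tools (an assumption on my part, it depends on the firmware), this is roughly how I'd confirm the limits:
Code:
# Per-process soft and hard limits for open files
ulimit -Sn
ulimit -Hn

# System-wide ceiling and current usage, if /proc is available
cat /proc/sys/fs/file-max
cat /proc/sys/fs/file-nr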
From the Mac, accessing the same location over Samba, it starts out much slower (no real surprise there) and eventually crawls to roughly one new attempt per minute, but it never hits a "Too many open files" error. So I'm a bit more confident that something related to SMB is the issue here.
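For anyone who wants to look at the server side while the Mac loop runs, this is a rough sketch only; it assumes a Linux /proc on the NAS, that smbd is the serving process, and that smbstatus is installed (run as root):
Code:
# Count file descriptors held by each smbd process
for pid in $(pidof smbd); do
    echo "smbd $pid: $(ls /proc/$pid/fd 2>/dev/null | wc -l) open fds"
done

# Samba's own view of currently locked files
smbstatus -L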
Thanks so much for your time, and let me know if you think of anything else I should try. I do understand that an SMB mount issue is sort of outside the scope of this forum.