2025-07-12, 09:26 PM
Enabling deinterlacing in the client settings seems great for viewing older interlaced content, since it takes load off the server and avoids unnecessary re-encoding. The problem is that, as currently implemented, enabling this setting causes everything to be deinterlaced, including progressive-scan content. While I haven't noticed any visual artifacts from this, it consumes far more client resources and prevents smooth playback of 4K content on some of my devices. Ideally the client deinterlacer would detect whether the source is actually interlaced and only kick in when needed. Forgive me if this is a known bug, technical limitation, or existing feature request; if not, I will submit it as one myself. Just probing to see what's known about this, thanks.
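To illustrate what I mean, here's a rough Python sketch of the kind of check I have in mind. I don't know how the client's playback pipeline is actually built, so this is only an assumption-laden example: it supposes an FFmpeg-based setup, uses ffprobe's field_order metadata as the detection signal, and the pick_video_filter helper is purely hypothetical, not anything from the client's code.

import subprocess

def source_is_interlaced(path: str) -> bool:
    """Return True if ffprobe reports an interlaced field order for the first video stream."""
    result = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=field_order",
            "-of", "default=noprint_wrappers=1:nokey=1",
            path,
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    field_order = result.stdout.strip()
    # "tt", "bb", "tb", "bt" indicate interlaced material; "progressive" (or unknown/empty) does not.
    return field_order in {"tt", "bb", "tb", "bt"}

def pick_video_filter(path: str) -> list[str]:
    # Hypothetical decision point: only add a deinterlace filter when the source needs one.
    if source_is_interlaced(path):
        # yadif's deint=interlaced mode also skips individual frames not flagged as interlaced.
        return ["-vf", "yadif=deint=interlaced"]
    return []

Container metadata isn't always trustworthy, so a real implementation would probably need per-frame detection rather than a single up-front probe, but even a coarse check like this would spare progressive 4K content from being run through the deinterlacer.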