2025-03-05, 08:33 PM
Hi all,
I'm configuring NVDEC hardware decoding in Jellyfin for my RTX 3070 Ti, and I need clarification on how the two HEVC RExt decoding options (8/10-bit and 12-bit) are implemented.
According to NVIDIA’s NVDEC matrix, my GPU supports:
✅ HEVC 4:2:0 (8-bit, 10-bit, 12-bit)
❌ HEVC 4:2:2 (8-bit, 10-bit) → NOT supported
✅ HEVC 4:4:4 (8-bit, 10-bit, 12-bit)
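(Side note: the way I'm checking which bucket a given file falls into is just reading its pixel format with ffprobe, roughly like the Python sketch below. The file name is only a placeholder and ffprobe needs to be on the PATH; yuv420p* means 4:2:0 Main/Main 10, while yuv422p* and yuv444p* are RExt territory.)

```python
import subprocess

def video_pix_fmt(path: str) -> str:
    """Return the first video stream's pixel format, e.g. 'yuv420p10le'."""
    return subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

# yuv420p / yuv420p10le -> 4:2:0 (Main / Main 10, not RExt)
# yuv422p*              -> 4:2:2 (RExt, no NVDEC on the 3070 Ti)
# yuv444p*              -> 4:4:4 (RExt, NVDEC-supported on the 3070 Ti)
print(video_pix_fmt("sample.mkv"))  # "sample.mkv" is just a placeholder
```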
The issue: Does Jellyfin’s HEVC RExt setting apply only to 4:4:4 (which my GPU supports) or does it also include 4:2:2 (which my GPU does NOT support)?
If HEVC RExt also covers 4:2:2, enabling it could send my GPU streams it can't decode and break playback. But if it only covers 4:4:4, I should be safe ticking both boxes, right?
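In case it helps anyone reproduce this, my plan for testing it outside Jellyfin is to force the NVDEC/CUVID decoder directly in ffmpeg on one 4:2:2 and one 4:4:4 sample and see which one fails. Rough sketch (assumes an ffmpeg build with the hevc_cuvid decoder enabled; the sample file names are placeholders):

```python
import subprocess

def nvdec_can_decode(path: str) -> bool:
    """Force the NVDEC (hevc_cuvid) decoder; unlike -hwaccel cuda there is
    no silent software fallback, so a non-zero exit means NVDEC refused it."""
    result = subprocess.run(
        ["ffmpeg", "-v", "error", "-c:v", "hevc_cuvid",
         "-i", path, "-f", "null", "-"],
        capture_output=True, text=True,
    )
    return result.returncode == 0

for sample in ("hevc_422_10bit.mkv", "hevc_444_10bit.mkv"):  # placeholder names
    print(sample, "->", "NVDEC OK" if nvdec_can_decode(sample) else "NVDEC refused")
```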
Has anyone tested this, or can the devs clarify how Jellyfin handles these two settings?
Thanks!
Nick
![Attachment 6980](https://forum.jellyfin.org/attachment.php?aid=6980)
![Attachment 6982](https://forum.jellyfin.org/attachment.php?aid=6982)