2024-05-12, 09:15 AM
(2024-05-11, 11:17 AM)Efficient_Good_5784 Wrote:
(2024-05-11, 10:43 AM)nurunet Wrote:
Why did you disable HW decoding for anything except AV1? I would have thought modern GPUs should be able to handle all those codecs.

Are you getting confused? I'm looking at the picture that @MisterMcDuck shared of his HWA decode settings, and he has different settings.
The options he has enabled:
- H264
- HEVC
- HEVC 10-bit
- VC1
- VP9 10-bit
The options he has disabled:
- MPEG2
- VP8
- VP9
- AV1
Sorry, I'm not a native speaker. What I meant to ask was: why isn't AV1 the sole "deselected" codec? I believed the hardware should support all the others.
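For what it's worth, this is how I would check what the GPU driver actually claims it can decode. It's only a minimal sketch, assuming a Linux host with VA-API and the vainfo tool (from libva-utils) installed; how the profile names map onto Jellyfin's checkboxes is my own assumption:

Code:
import subprocess

# List VA-API profiles that expose a VLD (decode) entrypoint,
# i.e. codecs the GPU driver claims it can fully decode in hardware.
# Assumes "vainfo" (libva-utils) is installed on a Linux host.
def decodable_profiles():
    # vainfo prints one "VAProfileXxx : VAEntrypointYyy" pair per line;
    # VAEntrypointVLD marks full hardware decode support.
    out = subprocess.run(["vainfo"], capture_output=True, text=True)
    return [line.strip()
            for line in out.stdout.splitlines()
            if "VAEntrypointVLD" in line]

if __name__ == "__main__":
    for profile in decodable_profiles():
        print(profile)

As far as I understand, if a codec shows no VLD entrypoint in that output, ticking its box in Jellyfin would not give you hardware decoding for it anyway.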