@MisterMcDuck says he mainly stuck with the default settings, so he hasn't touched what was given to him.
Just in case you don't know: just because hardware is more modern (or recent) doesn't mean it supports more things. Newer hardware may drop support for older ones. With HWA support on a GPU, it's often impractical for a company to build physical support for every new and old video codec into one chip, so they decide when to add or drop each codec. It's not a software thing, it's a hardware thing: there are dedicated fixed-function blocks in the GPU silicon that provide the HWA capability for each codec. I don't know the exact reasons (there are probably several), but I'd imagine die-space constraints are one reason GPU companies give up on certain codecs while adding others.
In this case, MisterMcDuck has an AMD Radeon RX 5700 XT, which uses the Video Core Next (VCN) 2.0 block for HWA encoding and decoding.
If you look at the Wikipedia page for VCN, you'll see a paragraph detailing what VCN supports as a baseline for HWA encoding and decoding, along with a table showing what 2.0 specifically supports beyond that baseline.
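If you're on Linux and would rather ask the card directly instead of reading those tables, here's a rough sketch of how you could do it. It assumes the vainfo tool (from the libva-utils package) is installed and that your driver exposes the GPU through VA-API; nothing here is from the thread itself, it's just one way to check.

```python
# Rough sketch: ask the VA-API driver which profiles it can hardware-decode,
# instead of digging through the Wikipedia tables. Assumes `vainfo` is installed.
import subprocess

def hwa_decode_profiles():
    proc = subprocess.run(["vainfo"], capture_output=True, text=True)
    combined = proc.stdout + proc.stderr  # vainfo splits its output across both streams
    decodable = set()
    for line in combined.splitlines():
        # Lines look like "VAProfileHEVCMain10 : VAEntrypointVLD";
        # VAEntrypointVLD is the decode entrypoint.
        if "VAProfile" in line and "VAEntrypointVLD" in line:
            decodable.add(line.split(":")[0].strip())
    return sorted(decodable)

if __name__ == "__main__":
    for profile in hwa_decode_profiles():
        print(profile)
```

The same trick should work on an Intel GPU/iGPU, since Intel's media driver also talks VA-API on Linux.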
Doing that lookup for MisterMcDuck's card, the decode settings he should enable and disable are as follows (there's also a quick sanity-check sketch after the lists).
Enable:
- H264
- HEVC
- MPEG2
- VC1
- VP9
- HEVC 10bit
- VP9 10bit
Disable:
- VP8
- AV1
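If you want to double-check those toggles against what the card actually does rather than trusting the spec tables, something like the sketch below might help. It's assumption-heavy and not from this thread: the render node path and the sample file names are placeholders you'd swap for your own short clips, and it assumes an ffmpeg build with VA-API support.

```python
# Rough sanity check: run a decode-only pass per codec through ffmpeg's VA-API
# path and report whether it looked hardware-accelerated.
import subprocess

SAMPLES = {
    "h264": "samples/h264.mkv",   # placeholder paths, replace with your own clips
    "hevc": "samples/hevc.mkv",
    "av1":  "samples/av1.mkv",    # expected to fall back / fail on a 5700 XT
}

def try_hw_decode(path):
    cmd = [
        "ffmpeg", "-v", "verbose",
        "-hwaccel", "vaapi",
        "-hwaccel_device", "/dev/dri/renderD128",  # adjust to your GPU's render node
        "-i", path,
        "-f", "null", "-",  # decode only, throw the frames away
    ]
    proc = subprocess.run(cmd, capture_output=True, text=True)
    # ffmpeg logs hwaccel activity on stderr; seeing "vaapi" there plus a clean
    # exit is a decent hint (not proof) that the hardware path was actually used.
    return proc.returncode == 0 and "vaapi" in proc.stderr.lower()

for codec, path in SAMPLES.items():
    status = "hardware decode looks OK" if try_hw_decode(path) else "fell back or failed"
    print(f"{codec}: {status}")
```

Any codec that falls back to software here is a good candidate for leaving unchecked in the decode settings.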
If you're curious about this and have an Intel GPU/iGPU, you can figure this out too by looking at Intel's Quick Sync Wiki page for HWA encode & decode support.