2024-10-25, 11:30 AM
Generally speaking, AMD should be avoided, as it has the poorest encode quality of all supported hardware. Here's the ranking from the documentation:
Apple ≥ Intel ≥ Nvidia >>> AMD
Depending on your media, you may want to stick with the 1070, since it can both encode and decode H.265, unlike the 750 Ti; unless all your content is H.264, or you're willing to convert everything over. Overall, the card you're using at the moment simply has better video codec support all around. You can find all the info here. I'd also advise taking a look at the hardware selection section of the documentation, and the NVIDIA page.
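If you want to verify what your own ffmpeg build can actually use on a given card, a quick way is to list the hardware encoders it exposes. Here's a minimal sketch; the encoder-name suffixes are assumptions based on common ffmpeg builds (`_nvenc` for NVIDIA, `_qsv` for Intel Quick Sync, `_vaapi` for VA-API, `_videotoolbox` for Apple, `_amf` for AMD), so adjust for yours:

```python
import re
import subprocess

# Suffixes commonly used by ffmpeg hardware encoders (assumption; varies by build).
HW_SUFFIXES = ("_nvenc", "_qsv", "_vaapi", "_videotoolbox", "_amf")

def hardware_encoders(encoders_output: str) -> dict:
    """Group hardware encoder names by codec from `ffmpeg -encoders` output."""
    found = {}
    for line in encoders_output.splitlines():
        # Encoder lines look like " V....D h264_nvenc   NVIDIA NVENC H.264 encoder"
        m = re.match(r"\s*V\S*\s+(\S+)", line)
        if m and m.group(1).endswith(HW_SUFFIXES):
            codec = m.group(1).rsplit("_", 1)[0]
            found.setdefault(codec, []).append(m.group(1))
    return found

if __name__ == "__main__":
    out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                         capture_output=True, text=True).stdout
    for codec, names in hardware_encoders(out).items():
        print(codec, "->", ", ".join(names))
```

On a 750 Ti you'd expect only `h264_nvenc` to show up; an HEVC-capable card should also list `hevc_nvenc`. Note this only tells you what your ffmpeg build was compiled with, not whether the driver/card actually supports the codec, so treat it as a first check.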
Unfortunately, many cards with good encoders also draw more power, even at idle and without a graphical environment running. This is a known drawback of Intel's otherwise excellent (for video encoding) Arc series.
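If you want to put a rough number on idle draw without a wall-plug meter, Linux exposes RAPL energy counters in sysfs. A minimal sketch, assuming the usual `intel-rapl:0` path (which varies by machine, and only covers the CPU package, not a discrete GPU, so it's a rough sanity check at best):

```python
import time
from pathlib import Path

# Assumed RAPL counter path; adjust for your machine. Counts microjoules consumed.
RAPL = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

def watts_from_energy(uj_start: int, uj_end: int, seconds: float) -> float:
    """Convert a microjoule counter delta over an interval into average watts."""
    return (uj_end - uj_start) / 1_000_000 / seconds

if __name__ == "__main__":
    if RAPL.exists():
        start = int(RAPL.read_text())
        time.sleep(5)
        end = int(RAPL.read_text())
        print(f"~{watts_from_energy(start, end, 5):.1f} W package draw")
    else:
        print("RAPL counter not found; try powertop or a wall-plug meter instead")
```

For discrete-card idle draw specifically, a plug-in power meter is still the most trustworthy measurement.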
Server specs => OS: Debian 12 | GPU: Arc A380 | CPU: Ryzen 5 5600X | 64GB RAM | 56TB