2023-09-03, 01:43 PM
I currently have a server set up on an old system with an AMD FX 6300 CPU and an ATi Radeon HD 6770 GPU, running Jellyfin on Windows 10. I would like to watch movies in 4K 10-bit HDR HEVC on my playback devices, which are two separate Chromecast devices running the Jellyfin Android app. Unsurprisingly, this antique hardware doesn't support that, so I'm trying to figure out what I should upgrade on this old hunk of junk to be able to play and transcode such content, and I'm looking for recommendations from more experienced users.
Will an upgrade to a newer GPU like a Radeon RX 580 8GB do the trick (would cost me about 70 USD), or would it be beneficial to upgrade the entire system to a new platform, like a 10th or 11th gen Intel CPU with integrated graphics (about 250 USD)?
The GPU upgrade would certainly be a cheaper upfront investment, but it would consume more power. According to the documentation, the RX 580 supports 10-bit HEVC decoding, but not 10-bit encoding. Do I need encoding support, or would decoding support suffice?
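From what I understand, Jellyfin does its transcoding through ffmpeg under the hood, so once new hardware is in I could presumably sanity-check both directions myself before trusting it with the library. A rough sketch of what I'd run (sample.mkv is just a placeholder for any local 4K 10-bit HEVC file; hevc_amf is ffmpeg's encoder for AMD cards, hevc_qsv the one for Intel iGPUs):

Decode-only test (hardware decode via D3D11, output discarded):
ffmpeg -hwaccel d3d11va -i sample.mkv -f null -

Full transcode test on an RX 580 (hardware decode plus AMD AMF encode):
ffmpeg -hwaccel d3d11va -i sample.mkv -c:v hevc_amf -b:v 20M out.mkv

Full transcode test on an Intel iGPU (Quick Sync decode plus encode):
ffmpeg -hwaccel qsv -c:v hevc_qsv -i sample.mkv -c:v hevc_qsv -b:v 20M out.mkv

If the first command runs at well above real-time speed but the second fails or falls back to software, that would tell me decode works while encode doesn't.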