2024-10-17, 02:32 PM
When I use Jellyfin for hardware transcoding, the GPU usage is very high, reaching up to 95% or more.
However, in a previous version (I can't recall which one), the GPU usage remained stable at around 30%.
I've noticed that the transcoding logic seems to encode the configured transcode-ahead portion (I've set it to 3 minutes) as fast as possible, pushing GPU usage extremely high; once that chunk is done, usage drops back to 0%. This causes a problem: whenever someone streams with transcoding, my machine gets very loud. I would prefer the GPU to transcode continuously at a lower load rather than finishing in a burst and then stopping.
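To make the difference clearer, here is a toy Python sketch of what I mean. This is not Jellyfin's actual code, and all the numbers (encode speed, window size) are made up for illustration; it just contrasts "race through the transcode-ahead window, then idle" with "throttle once the encoder is far enough ahead of playback":

```python
import time

# Toy model of the behavior described above -- NOT Jellyfin's real code.
# All numbers are invented for illustration.
ENCODE_SPEED = 8.0    # seconds of video encoded per wall-clock second at full load
AHEAD_TARGET = 180.0  # transcode-ahead window (the "3 minutes" I mentioned)
TICK = 0.1            # simulation step in wall-clock seconds

def transcode(duration: float, throttled: bool) -> None:
    """Simulate transcoding `duration` seconds of video while it plays at 1x."""
    encoded = 0.0
    start = time.monotonic()
    while encoded < duration:
        played = min(time.monotonic() - start, duration)  # playback position
        if throttled and encoded - played >= AHEAD_TARGET:
            # Far enough ahead of the viewer: pause the encoder (GPU ~0%).
            time.sleep(TICK)
            continue
        # Encode at full speed: this is where GPU usage spikes to ~95%.
        encoded += ENCODE_SPEED * TICK
        time.sleep(TICK)

# throttled=False: the GPU is pegged until the whole chunk is done, then idle.
# throttled=True: short bursts keep the buffer ~AHEAD_TARGET seconds ahead of
# playback, so average GPU load (and fan noise) stays much lower.
# (Note: the throttled case runs in roughly real time, since playback is 1x.)
```

The old behavior I remember (steady ~30% usage) felt like the throttled mode; the current behavior looks like the unthrottled one.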
(This is unrelated to the video bitrate; it happens even at very low bitrates.)
My CPU is an AMD Ryzen 7 8845H, and my GPU is an RTX 4060 Laptop GPU.
I'm not sure whether this is a bug or an intentional change in the new version. I hope someone can clarify this, and it would be great if a solution or workaround could be suggested. Thank you.