2024-10-20, 07:53 PM
I'm not sure whether I actually have an issue; if I do, I'm guessing it's with my hardware configuration. I'm just looking for feedback on whether this behavior is normal and, if not, what I should expect.
My TrickPlay scanning is quite slow; it takes days. I've seen people say that's normal, but I have an Nvidia GPU and run Jellyfin in an LXC on Proxmox, and I can see the Jellyfin ffmpeg process on the GPU with nvtop and nvidia-smi. However, Jellyfin is only using about 1% of the GPU overall while the DEC percentage stays at 98-100%. Is that normal behavior?
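For reference, this is roughly how I've been watching utilization from inside the LXC (I can paste exact output later; the "dec" column from nvidia-smi dmon is what I mean by the DEC percentage):

  nvidia-smi dmon -s u    # per-second sm / mem / enc / dec utilization
  watch -n 1 nvidia-smi   # overall view plus the process list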
No matter how many threads I set in the TrickPlay settings, I see two -threads arguments in the ffmpeg command line: the first is always 1 and the second is whatever number I set in Jellyfin. Is it normal for DEC to be maxed out with such low overall GPU use, and for two -threads arguments to appear in the same process?
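This is how I'm grabbing the ffmpeg command line, nothing fancy, so I can paste the full arguments later if that helps:

  ps -eo pid,args | grep [f]fmpeg   # full jellyfin-ffmpeg command line, including both -threads values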
I'm also using the GPU in other containers. It regularly does some work for Frigate (NVR), but when Jellyfin is off the DEC percentage is pretty low, and it runs some AI workloads on occasion. The GPU is a Tesla P100 12GB, and total GPU memory use is around 1.2GB with TrickPlay and Frigate running.
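If it matters, that memory figure comes from something like the following; I can confirm the per-process numbers when I'm back at the machine:

  nvidia-smi --query-compute-apps=pid,process_name,used_memory --format=csv
  nvidia-smi pmon -s um   # per-process sm / mem / enc / dec utilization and framebuffer use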
I tried running an ffmpeg test script, which works in other containers, but it won't run with ffmpeg, jellyfin-ffmpeg, or jellyfin-ffmpeg6 as the argument; it says none of those are valid to run. jellyfin-ffmpeg6 is installed, along with the Nvidia driver (installed with --no-kernel-module), the Nvidia container runtime, cuDNN, etc.
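When I'm back at the machine I'll double-check where the binary actually lives; if I remember right, the jellyfin-ffmpeg deb installs under /usr/lib/jellyfin-ffmpeg/ rather than on PATH, so I'm planning to test roughly like this (the sample file name is just a placeholder):

  command -v ffmpeg jellyfin-ffmpeg jellyfin-ffmpeg6    # see which names resolve at all
  ls -l /usr/lib/jellyfin-ffmpeg/                       # where the deb normally puts ffmpeg/ffprobe
  /usr/lib/jellyfin-ffmpeg/ffmpeg -hwaccel cuda -i sample.mkv -f null -   # quick NVDEC decode sanity check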
My library is about 10TB. This could all be normal. I'm just not sure and want to make sure I'm not missing anything. I'm not in front of a computer now but can provide specific configs or screenshots later.
Thanks to anyone willing to help out!