I just popped a GTX 1070 into my server, installed the newest driver, rebooted, then started Jellyfin and enabled everything for GPU transcoding, but it still uses my CPU to transcode. I followed the instructions on the website and have been googling this issue for about 4 days now and cannot figure it out. I reinstalled my GPU drivers (clean install) and rebooted, then reinstalled Jellyfin and rebooted again.

The server is running the following hardware/software:
OS: Windows 10 64-bit
CPU: Intel i7-4790
GPU: GTX 1070

Watching a video on my gaming PC (through Firefox) sends the server's CPU to 20-40% usage while the GPU stays idle. I have all of the video decoders enabled and the acceleration set to NVENC, and I cannot for the life of me get it to work. If any more information is needed, please ask and I will post it. In case it matters for how I am checking what the hardware is doing: I am monitoring the server over the Microsoft Remote Desktop client, and nothing is plugged into the GPU since the server sits on a rack in the closet. Thank you in advance!
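In case it helps with diagnosing this, here is a rough Python sketch I can run on the server while a stream is playing to see whether NVENC is actually doing anything (this assumes nvidia-smi is on the PATH; the query fields and polling interval are just my guesses at something useful, not anything from the Jellyfin docs):

```python
import subprocess
import time

# Fields reported by nvidia-smi: active NVENC sessions, their average fps,
# and overall GPU utilization. If NVENC is really handling the transcode,
# the session count and fps should be non-zero while the video plays.
QUERY = "encoder.stats.sessionCount,encoder.stats.averageFps,utilization.gpu"

def poll_encoder() -> str:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

if __name__ == "__main__":
    for _ in range(15):          # watch for roughly 30 seconds
        print(poll_encoder())    # e.g. "0, 0, 3 %" would mean NVENC is idle
        time.sleep(2)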
EDIT 1: I just noticed that Plex does not transcode the same video when I use its desktop app, but it does still transcode on the CPU when I watch through Firefox. So now that I know Firefox is what triggers the transcode, is there a reason that both Plex and Jellyfin would still be using my CPU instead of the GPU?
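To figure out why Firefox forces a transcode in the first place, I was thinking of checking the codecs of the file with something like the sketch below (assuming ffprobe is available; the file path is just a placeholder). As far as I know, h264 + aac usually direct-plays in Firefox while hevc usually does not, which would explain the desktop app playing it directly:

```python
import subprocess

# Placeholder path; point this at the actual file being played.
VIDEO = r"D:\Media\example.mkv"

# Print the codec name and stream type for each stream so I can tell
# whether Firefox should be able to direct-play it.
result = subprocess.run(
    ["ffprobe", "-v", "error",
     "-show_entries", "stream=codec_name,codec_type",
     "-of", "csv=p=0", VIDEO],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

That still leaves the main question, though: even when a transcode is required, why is it landing on the CPU instead of NVENC?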