2024-01-16, 07:28 PM
Hey guys. I'm going to try to give you as much info as I possibly can; apologies if it's too much.
Running Proxmox with an Ubuntu Server 20.04 LTS VM. I have GPU (Quadro P2000) passthrough to the VM working, so that fortunately isn't the issue.
This VM runs everything through Docker, managed with Portainer.
Here is my compose file:
Code:
version: '3.6'
services:
  jellyfin:
    image: jellyfin/jellyfin
    user: 1000:1000
    network_mode: 'host'
    volumes:
      - "/home/dylan/Downloads/jellysrv/config:/config"
      - "/srv/Plex/Movies:/data/movies"
      - "/srv/Plex/Shows:/data/tvshows"
    ports:
      - "1900:1900/udp"
      - "7359:7359/udp"
      - "8096:8096/tcp"
    runtime: nvidia
    environment:
      - "PUID=1000"
      - "PGID=1000"
      - "TZ=America/New_York"
      - "PATH=/lsiopy/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
      - "HOME=/root"
      - "LANGUAGE=en_US.UTF-8"
      - "LANG=en_US.UTF-8"
      - "TERM=xterm"
      - "S6_CMD_WAIT_FOR_SERVICES_MAXTIME=0"
      - "S6_VERBOSITY=1"
      - "S6_STAGE2_HOOK=/docker-mods"
      - "VIRTUAL_ENV=/lsiopy"
      - "LSIO_FIRST_PARTY=true"
      - "NVIDIA_DRIVER_CAPABILITIES=compute,video,utility"
    cap_drop:
      - "AUDIT_CONTROL"
      - "BLOCK_SUSPEND"
      - "DAC_READ_SEARCH"
      - "IPC_LOCK"
      - "IPC_OWNER"
      - "LEASE"
      - "LINUX_IMMUTABLE"
      - "MAC_ADMIN"
      - "MAC_OVERRIDE"
      - "NET_ADMIN"
      - "NET_BROADCAST"
      - "SYSLOG"
      - "SYS_ADMIN"
      - "SYS_BOOT"
      - "SYS_MODULE"
      - "SYS_NICE"
      - "SYS_PACCT"
      - "SYS_PTRACE"
      - "SYS_RAWIO"
      - "SYS_RESOURCE"
      - "SYS_TIME"
      - "SYS_TTY_CONFIG"
      - "WAKE_ALARM"
    deploy:
      resources:
        reservations:
          devices:
            - capabilities: [gpu]
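For reference, I know the Docker Compose docs also show a more explicit form of the GPU reservation than the bare capabilities entry I'm using above. I'm not running this, just including it as a sketch in case the difference matters (the NVIDIA_VISIBLE_DEVICES variable is the one the NVIDIA container toolkit reads alongside NVIDIA_DRIVER_CAPABILITIES):
Code:
# Sketch only - not my current config. A more explicit device reservation
# per the Compose docs, instead of relying on runtime: nvidia alone.
services:
  jellyfin:
    environment:
      - "NVIDIA_VISIBLE_DEVICES=all"                          # expose all GPUs to the container
      - "NVIDIA_DRIVER_CAPABILITIES=compute,video,utility"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all                                      # or device_ids: ["0"] for a single card
              capabilities: [gpu]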
Here is a picture showing everything I have enabled; the card meets all the specs on the NVIDIA site for decoding compatibility:
As it stands, streaming works totally fine. Going off the logs, it doesn't look like it's using my CPU. By all accounts, it appears to be using the GPU. Logs here:
Code:
[2024-01-16 10:01:58.098 -05:00] [INF] [1] MediaBrowser.MediaEncoding.Encoder.MediaEncoder: Available filters: ["deinterlace_qsv", "deinterlace_vaapi", "hwupload_cuda", "hwupload_vaapi", "overlay_opencl", "overlay_qsv", "overlay_vaapi", "overlay_cuda", "procamp_vaapi", "scale_cuda", "scale_opencl", "scale_qsv", "scale_vaapi", "tonemap_cuda", "tonemap_opencl", "tonemap_vaapi", "vpp_qsv", "yadif_cuda", "zscale", "alphasrc"]
[2024-01-16 10:01:58.175 -05:00] [INF] [1] MediaBrowser.MediaEncoding.Encoder.MediaEncoder: Available hwaccel types: ["cuda", "vaapi", "qsv", "drm", "opencl", "vulkan"]
[2024-01-16 10:01:58.275 -05:00] [INF] [1] MediaBrowser.MediaEncoding.Encoder.MediaEncoder: FFmpeg: "/usr/lib/jellyfin-ffmpeg/ffmpeg"
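If the actual transcode command line is more useful than the startup log, I can pull that too. Something like the following should show whether ffmpeg is being launched with the NVENC/CUDA flags (the path is an assumption based on my /config mapping above, since Jellyfin writes per-transcode ffmpeg logs under its log folder):
Code:
# Check the newest transcode log for NVENC/CUDA arguments
# (assumes Jellyfin's default log location inside the container)
docker exec -it jellyfin sh -c 'ls -t /config/log/FFmpeg.Transcode*.log | head -n1 | xargs grep -i -m1 "nvenc\|cuda"'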
However, when I run a stream (making sure it's forced to transcode rather than direct play), nothing appears when running nvidia-smi. This is what I get while running 5 concurrent streams:
Code:
dylan@dport:~$ docker exec -it jellyfin nvidia-smi
Tue Jan 16 14:25:02 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.154.05 Driver Version: 535.154.05 CUDA Version: 12.2 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 Quadro P2000 Off | 00000000:01:00.0 Off | N/A |
| 67% 34C P0 18W / 75W | 0MiB / 5120MiB | 0% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
+---------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=======================================================================================|
| No running processes found |
+---------------------------------------------------------------------------------------+
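Side note: I realize a single nvidia-smi call only shows one instant, so if it helps, these are the commands I can run to watch the card continuously while streams are playing; the enc/dec columns from dmon are what should show encoder/decoder load:
Code:
# Refresh the full nvidia-smi view every second while a stream plays
watch -n 1 nvidia-smi

# Or print per-second utilization samples; -s u selects utilization metrics,
# and the enc/dec columns report NVENC/NVDEC load
nvidia-smi dmon -s u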
Any assistance you guys could offer would be greatly appreciated. If there is any information you guys need that I have not provided, please let me know. I'm sure you get asked this all the time...