Jellyfin Forum › Support › Troubleshooting
    Jellyfin Docker not using Vulkan with AMD for Tonemapping

    Jademalo
    Offline

    Junior Member

    Posts: 4
    Threads: 2
    Joined: 2025 Mar
    Reputation: 0
Country: United Kingdom
    #1
    2025-03-09, 04:55 PM
    I've been running Jellyfin as a regular install, and had both transcoding and hardware tonemapping enabled.
    I decided to switch to docker to make things easier to orchestrate, but I'm having a bit of an issue with hardware tonemapping.

    When enabling it, for some reason Jellyfin tries to use OpenCL, which doesn't work.

    Code:
    /usr/lib/jellyfin-ffmpeg/ffmpeg -analyzeduration 200M -probesize 1G -f mov,mp4,m4a,3gp,3g2,mj2 -init_hw_device opencl=ocl:0.0 -filter_hw_device ocl -hwaccel vaapi -hwaccel_output_format vaapi -noautorotate -i file:"/media/exodus.mp4" -noautoscale -map_metadata -1 -map_chapters -1 -threads 0 -map 0:0 -map 0:1 -map -0:s -codec:v:0 h264_vaapi -rc_mode VBR -b:v 42805084 -maxrate 42805084 -bufsize 85610168 -profile:v:0 high -sei -a53_cc -force_key_frames:0 "expr:gte(t,n_forced*3)" -vf "setparams=color_primaries=bt2020:color_trc=smpte2084:colorspace=bt2020nc,hwdownload,format=p010le,hwupload=derive_device=opencl,tonemap_opencl=format=nv12:p=bt709:t=bt709:m=bt709:tonemap=bt2390:peak=100:desat=0,hwdownload,format=nv12,hwupload_vaapi" -codec:a:0 libfdk_aac -ac 2 -ab 256000 -af "volume=2" -copyts -avoid_negative_ts disabled -max_muxing_queue_size 2048 -f hls -max_delay 5000000 -hls_time 3 -hls_segment_type fmp4 -hls_fmp4_init_filename "0d0ced89159a4038f8259906775d5e97-1.mp4" -start_number 0 -hls_segment_filename "/cache/transcodes/0d0ced89159a4038f8259906775d5e97%d.mp4" -hls_playlist_type vod -hls_list_size 0 -y "/cache/transcodes/0d0ced89159a4038f8259906775d5e97.m3u8"


    ffmpeg version 7.0.2-Jellyfin Copyright (c) 2000-2024 the FFmpeg developers
      built with gcc 12 (Debian 12.2.0-14)
      configuration: --prefix=/usr/lib/jellyfin-ffmpeg --target-os=linux --extra-version=Jellyfin --disable-doc --disable-ffplay --disable-ptx-compression --disable-static --disable-libxcb --disable-sdl2 --disable-xlib --enable-lto=auto --enable-gpl --enable-version3 --enable-shared --enable-gmp --enable-gnutls --enable-chromaprint --enable-opencl --enable-libdrm --enable-libxml2 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libfontconfig --enable-libharfbuzz --enable-libbluray --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libopenmpt --enable-libdav1d --enable-libsvtav1 --enable-libwebp --enable-libvpx --enable-libx264 --enable-libx265 --enable-libzvbi --enable-libzimg --enable-libfdk-aac --arch=amd64 --enable-libshaderc --enable-libplacebo --enable-vulkan --enable-vaapi --enable-amf --enable-libvpl --enable-ffnvcodec --enable-cuda --enable-cuda-llvm --enable-cuvid --enable-nvdec --enable-nvenc
      libavutil      59.  8.100 / 59.  8.100
      libavcodec    61.  3.100 / 61.  3.100
      libavformat    61.  1.100 / 61.  1.100
      libavdevice    61.  1.100 / 61.  1.100
      libavfilter    10.  1.100 / 10.  1.100
      libswscale      8.  1.100 /  8.  1.100
      libswresample  5.  1.100 /  5.  1.100
      libpostproc    58.  1.100 / 58.  1.100
    [AVHWDeviceContext @ 0x5d84884f3080] Failed to get number of OpenCL platforms: -1001.
    Device creation failed: -19.
    Failed to set value 'opencl=ocl:0.0' for option 'init_hw_device': No such device
    Error parsing global options: No such device

Running the command to check for Vulkan compatibility shows it as being set up correctly, and that ffmpeg in the container is seeing Vulkan:

    Code:
    docker exec -it jellyfin /usr/lib/jellyfin-ffmpeg/ffmpeg -v debug -init_hw_device vulkan
    ffmpeg version 7.0.2-Jellyfin Copyright (c) 2000-2024 the FFmpeg developers
      built with gcc 12 (Debian 12.2.0-14)
      configuration: --prefix=/usr/lib/jellyfin-ffmpeg --target-os=linux --extra-version=Jellyfin --disable-doc --disable-ffplay --disable-ptx-compression --disable-static --disable-libxcb --disable-sdl2 --disable-xlib --enable-lto=auto --enable-gpl --enable-version3 --enable-shared --enable-gmp --enable-gnutls --enable-chromaprint --enable-opencl --enable-libdrm --enable-libxml2 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libfontconfig --enable-libharfbuzz --enable-libbluray --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libopenmpt --enable-libdav1d --enable-libsvtav1 --enable-libwebp --enable-libvpx --enable-libx264 --enable-libx265 --enable-libzvbi --enable-libzimg --enable-libfdk-aac --arch=amd64 --enable-libshaderc --enable-libplacebo --enable-vulkan --enable-vaapi --enable-amf --enable-libvpl --enable-ffnvcodec --enable-cuda --enable-cuda-llvm --enable-cuvid --enable-nvdec --enable-nvenc
      libavutil      59.  8.100 / 59.  8.100
      libavcodec    61.  3.100 / 61.  3.100
      libavformat    61.  1.100 / 61.  1.100
      libavdevice    61.  1.100 / 61.  1.100
      libavfilter    10.  1.100 / 10.  1.100
      libswscale      8.  1.100 /  8.  1.100
      libswresample  5.  1.100 /  5.  1.100
      libpostproc    58.  1.100 / 58.  1.100
    Splitting the commandline.
    Reading option '-v' ... matched as option 'v' (set logging level) with argument 'debug'.
    Reading option '-init_hw_device' ... matched as option 'init_hw_device' (initialise hardware device) with argument 'vulkan'.
    Finished splitting the commandline.
    Parsing a group of options: global .
    Applying option v (set logging level) with argument debug.
    Applying option init_hw_device (initialise hardware device) with argument vulkan.
    [AVHWDeviceContext @ 0x580f822ab6c0] Supported layers:
    [AVHWDeviceContext @ 0x580f822ab6c0]    VK_LAYER_MESA_device_select
    [AVHWDeviceContext @ 0x580f822ab6c0]    VK_LAYER_MESA_overlay
    [AVHWDeviceContext @ 0x580f822ab6c0] Using instance extension VK_KHR_portability_enumeration
    [AVHWDeviceContext @ 0x580f822ab6c0] GPU listing:
    [AVHWDeviceContext @ 0x580f822ab6c0]    0: AMD Radeon Graphics (RADV RAPHAEL_MENDOCINO) (integrated) (0x164e)
    [AVHWDeviceContext @ 0x580f822ab6c0] Device 0 selected: AMD Radeon Graphics (RADV RAPHAEL_MENDOCINO) (integrated) (0x164e)
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device extension VK_KHR_push_descriptor
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device extension VK_EXT_descriptor_buffer
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device extension VK_EXT_physical_device_drm
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device extension VK_EXT_shader_atomic_float
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device extension VK_KHR_external_memory_fd
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device extension VK_EXT_external_memory_dma_buf
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device extension VK_EXT_image_drm_format_modifier
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device extension VK_KHR_external_semaphore_fd
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device extension VK_EXT_external_memory_host
    [AVHWDeviceContext @ 0x580f822ab6c0] Queue families:
    [AVHWDeviceContext @ 0x580f822ab6c0]    0: graphics compute transfer (queues: 1)
    [AVHWDeviceContext @ 0x580f822ab6c0]    1: compute transfer (queues: 4)
    [AVHWDeviceContext @ 0x580f822ab6c0]    2: sparse (queues: 1)
    [AVHWDeviceContext @ 0x580f822ab6c0] Using device: AMD Radeon Graphics (RADV RAPHAEL_MENDOCINO)
    [AVHWDeviceContext @ 0x580f822ab6c0] Alignments:
    [AVHWDeviceContext @ 0x580f822ab6c0]    optimalBufferCopyRowPitchAlignment: 1
    [AVHWDeviceContext @ 0x580f822ab6c0]    minMemoryMapAlignment:              4096
    [AVHWDeviceContext @ 0x580f822ab6c0]    nonCoherentAtomSize:                64
    [AVHWDeviceContext @ 0x580f822ab6c0]    minImportedHostPointerAlignment:    4096
    [AVHWDeviceContext @ 0x580f822ab6c0] Using queue family 0 (queues: 1) for graphics
    [AVHWDeviceContext @ 0x580f822ab6c0] Using queue family 1 (queues: 4) for compute transfers
    Successfully parsed a group of options.
    Universal media converter
    usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...

    Use -h to get full help or, even better, run 'man ffmpeg'

    Regular hardware transcode works fine, it's specifically tonemapping that fails.
Here's my Docker Compose file; group 104 is render, and the render device is passed through correctly:

    Code:
    services:
      jellyfin:
        image: jellyfin/jellyfin
        container_name: jellyfin
        user: 103:112
        group_add:
          - "104"
          - "44"
        network_mode: host
        volumes:
          - /etc/jellyfin:/config
          - /var/cache/jellyfin:/cache
          - type: bind
            source: /media
            target: /media
          - type: bind
            source: /media/fonts
            target: /usr/local/share/fonts/custom
            read_only: true
        devices:
          - /dev/dri/renderD128:/dev/dri/renderD128
        restart: unless-stopped
        # Optional - alternative address used for autodiscovery
        environment:
          - JELLYFIN_PublishedServerUrl=http://example.com
        # Optional - may be necessary for docker healthcheck to pass if running in host network mode
        extra_hosts:
          - host.docker.internal:host-gateway

    Any ideas? I'm totally stumped at this point.
    Thanks
Jademalo
#2
2025-03-09, 05:34 PM (This post was last modified: 2025-03-10, 10:20 AM by Jademalo. Edited 3 times in total.)
    I've got it working, but I've got a feeling there's still a bug here.

    Adding
    Code:
    - /dev/dri/card0:/dev/dri/card0
    to the docker compose file fixed it, but from my understanding that shouldn't actually be necessary?
    When running Jellyfin in an LXC directly I only need to pass /dev/dri/renderD128 through for it to work, but running a docker container within that LXC requires me to pass both card0 and renderD128.

    If it worked directly with just renderD128, why does the containerised version need card0 as well?
I can only assume it's something strange with hardware detection in the container, since it was attempting to use OpenCL when Vulkan was there and working. Perhaps when it sees card0 it realises it can use Vulkan, but that doesn't explain why running directly in the LXC was fine.
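
For reference, here is what the devices section of the compose file from the first post looks like with the card0 line added (a minimal sketch of just that stanza; the rest of the file is assumed unchanged):

```yaml
# Both DRI nodes passed through: renderD128 for the actual VA-API/Vulkan
# work, and card0, which seems to be needed for hardware detection
# inside the Docker container (but not in a bare LXC).
devices:
  - /dev/dri/renderD128:/dev/dri/renderD128
  - /dev/dri/card0:/dev/dri/card0
```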