2024-05-25, 09:50 PM
(This post was last modified: 2024-05-25, 10:06 PM by zjeffer. Edited 3 times in total.)
I've found something with some help from ChatGPT 

This ffmpeg command is executed by jellyfin:
Code:
/usr/lib/jellyfin-ffmpeg/ffmpeg -analyzeduration 200M -probesize 1G -init_hw_device vaapi=va:/dev/dri/renderD129,driver=i965 -noautorotate -i file:"/media/Films/Poor.Things.2023.1080p.WEBRip.DDP5.1.x265.10bit-GalaxyRG265[TGx]/Poor.Things.2023.1080p.WEBRip.DDP5.1.x265.10bit-GalaxyRG265.mkv" -map_metadata -1 -map_chapters -1 -threads 0 -map 0:0 -map 0:1 -map -0:s -codec:v:0 h264_vaapi -rc_mode CBR -b:v 7289035 -maxrate 7289035 -bufsize 14578070 -sei -a53_cc -force_key_frames:0 "expr:gte(t,n_forced*3)" -vf "setparams=color_primaries=bt709:color_trc=bt709:colorspace=bt709,scale=trunc(min(max(iw\,ih*a)\,min(1790\,1080*a))/2)*2:trunc(min(max(iw/a\,ih)\,min(1790/a\,1080))/2)*2,format=nv12,hwupload_vaapi" -codec:a:0 libfdk_aac -ac 6 -ab 640000 -copyts -avoid_negative_ts disabled -max_muxing_queue_size 2048 -f hls -max_delay 5000000 -hls_time 3 -hls_segment_type fmp4 -hls_fmp4_init_filename "36c3ace70048e6dc9742510797b05bb2-1.mp4" -start_number 0 -hls_segment_filename "/config/data/transcodes/36c3ace70048e6dc9742510797b05bb2%d.mp4" -hls_playlist_type vod -hls_list_size 0 -y "/config/data/transcodes/36c3ace70048e6dc9742510797b05bb2.m3u8"
and produces this error:
Code:
DRM_IOCTL_I915_GEM_APERTURE failed: Invalid argument
Assuming 131072kB available aperture size.
May lead to reduced performance or incorrect rendering.
get chip id failed: -1 [22]
param: 4, val: 0
i915 does not support EXECBUFER2
[AVHWDeviceContext @ 0x58fb4a638bc0] libva: /usr/lib/jellyfin-ffmpeg/lib/dri/i965_drv_video.so init failed
[AVHWDeviceContext @ 0x58fb4a638bc0] Failed to initialise VAAPI connection: -1 (unknown libva error).
[AVFilterGraph @ 0x58fb4930d5c0] Error initializing filters
Error reinitializing filters!
Failed to inject frame into filter network: Input/output error
Error while processing the decoded data for stream #0:0
[libfdk_aac @ 0x58fb4932c500] 2 frames left in the queue on closing
Conversion failed!
But when I change
format=nv12,hwupload_vaapi
to format=nv12,hwupload
in that command and run it manually inside the Docker container, it transcodes fine, with much lower CPU usage (20% instead of 80-90%) than when I disable the "hardware encoding" checkbox. However, it also transcodes much more slowly than with hardware encoding disabled: about 2x instead of 6-7x.

What's the difference between hwupload_vaapi and hwupload? Does hwupload still use some form of hardware encoding?
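For reference, this is roughly the reduced form of the command I ran manually. The input/output paths are placeholders and most of the scaling/HLS options are trimmed for readability, so treat it as a sketch of the change rather than the exact command:

Code:
# Same VAAPI device init as the Jellyfin command, but with the generic hwupload
# filter instead of hwupload_vaapi; scaling and HLS segmenting options omitted here
/usr/lib/jellyfin-ffmpeg/ffmpeg \
  -init_hw_device vaapi=va:/dev/dri/renderD129,driver=i965 \
  -i input.mkv \
  -vf "setparams=color_primaries=bt709:color_trc=bt709:colorspace=bt709,format=nv12,hwupload" \
  -codec:v h264_vaapi -rc_mode CBR -b:v 7289035 -maxrate 7289035 -bufsize 14578070 \
  -codec:a libfdk_aac -ac 6 -ab 640000 \
  output.mkv

Run like this inside the container, it's the version that gives me the ~20% CPU usage and ~2x transcode speed mentioned above.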