2023-09-19, 09:36 AM
Thanks for your reply.
(2023-09-18, 10:45 PM)bitmap Wrote: So...can I ask, since you are very particular about this...why haven't you found a filter and process to de-interlace and upscale these to your liking and provide those while preserving the originals? That would be the ideal -- have an "HD" copy that's pre-upscaled and de-interlaced meaning the client doesn't need to do any post-processing, you've already done it. You get to handle everything. You could run it through VapourSynth or just figure out the ffmpeg options to take care of it to your liking.

Well, this is not so easy. First, I've tested several upscalers, even commercial ones, but especially for SD content the results range from not good to ridiculous. Tiny issues in the source tend to get amplified by not-so-intelligent AI scalers. Then, it takes a lot of time per file, even on an RTX 4090 (let's ignore the huge waste of electricity for the moment). Afterwards, the result would have to be checked (i.e. watched in real time) to make sure it is good enough that the original could be deleted; keeping both would bloat the storage requirements significantly (again, even more waste of resources and electricity). Doing this for the thousands of DVDs I have? At least from my point of view, that isn't feasible, and it isn't ecologically sound either.

On the other hand, I have a video processor that does better upscaling on the fly, within a power budget of roughly 50 W. I can watch a lot of movies/TV shows that way before even reaching the electricity consumption of a single (not so good) upscale. And considering the time saved by doing it on the fly, while actually watching for enjoyment, that is remarkably efficient from an overall perspective.
I can't say that I would agree with the "definition" of the ideal above (I know that it wasn't meant that way). I'd say the ideal would be for the studios to scan their negatives at 4K or 8K (at least) and then re-master them properly for release. Because even they, in almost all cases, can't produce a good upscale to sell as 4K UHD (e.g. Pirates of the Caribbean 1, one of the worst 4K UHD discs ever released). And if they can't, what are our chances of coming even close?
To conclude: I have tested this and found nothing but bad results, and it would also waste a lot of storage space, time and electricity. Even major studios have failed repeatedly with upscales. On the other hand, I do have an efficient process for watching these videos at very good quality. It's just that I like the Jellyfin software as a whole and would like to use it, replacing the Dune + Kodi part of my video chain. But not at the expense of image quality. Hence my inquiry.
(2023-09-18, 10:45 PM)bitmap Wrote: I only ask because forcing specific bounds on transcoding is not particularly feasible. A lot of this is dictated by client device profiles as well as compatibility. So whatever a client says it will work with, the server sends. If the client decides it's going to upscale after the fact, the server has no say in the matter.

Yeah, that's alright. I'm not asking Jellyfin (server) to do anything about it. I'm asking whether the client (i.e. Swiftfin) can be configured to act like this (see the sketch after the list):
- The user selects a video file from the server, let's say an NTSC DVD-based *.mkv (i.e. 480i)
- The client checks with the OS it is running on what the display device connected to the output port is capable of ("Hey, can we do 480i or 480p?")
- The OS checks the EDID and responds ("Yeah, we can do 480p and we have resolution + rate matching.")
- The client then requests the video from the server in original quality and de-interlaced
- Finally, playback is initiated at 480p and sent to the output port
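To make the idea concrete, here is a minimal sketch of that decision logic in Swift. Everything in it is an assumption for illustration: the DisplayMode and SourceVideo types, the nativeOutputMode function and the mode list are all hypothetical and are not part of Swiftfin or any Apple API.

```swift
// Hypothetical sketch only: these types and functions are not part of Swiftfin
// or any Apple API; they just model the steps listed above.
struct DisplayMode {
    let width: Int
    let height: Int
    let refreshRate: Double
    let interlaced: Bool
}

struct SourceVideo {
    let width: Int
    let height: Int
    let frameRate: Double
    let interlaced: Bool
}

/// Returns a progressive output mode that matches the source resolution exactly,
/// if the display's (EDID-derived) mode list offers one. nil means "no native
/// match, fall back to the usual upscaling path".
func nativeOutputMode(for source: SourceVideo,
                      supportedModes: [DisplayMode]) -> DisplayMode? {
    supportedModes.first { mode in
        mode.width == source.width &&
        mode.height == source.height &&
        !mode.interlaced  // only progressive modes; the source gets de-interlaced
    }
}

// Example: an NTSC DVD remux (720x480i) on a display that advertises 480p.
let dvd = SourceVideo(width: 720, height: 480, frameRate: 29.97, interlaced: true)
let modes = [
    DisplayMode(width: 720, height: 480, refreshRate: 59.94, interlaced: false),
    DisplayMode(width: 1920, height: 1080, refreshRate: 60.0, interlaced: false),
    DisplayMode(width: 3840, height: 2160, refreshRate: 60.0, interlaced: false),
]

if let match = nativeOutputMode(for: dvd, supportedModes: modes) {
    // Request the original stream (de-interlaced only) and output it 1:1,
    // leaving any further scaling to the external video processor.
    print("Pass through at \(match.width)x\(match.height)p")
} else {
    print("No native mode available; upscale in the client as usual")
}
```

The point is only the decision itself: if the display already advertises a progressive mode at the source's native resolution, request the stream de-interlaced and output it 1:1; otherwise fall back to the usual behaviour.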
So basically your last sentence ("If the client decides it's going to upscale after the fact...") is what I'm after. Can the client be "taught" to realize that, if the original resolution of a given file is within the display device's capabilities, no further upscaling is required? (Again, as an option; this is not supposed to become the standard MO.)
All these resolutions/display modes are in the HDMI specifications, so it is not like asking the client to output at some odd custom resolution/display mode. It is more like saying: don't bother upscaling, there's another device in the video chain that will take it from here. Just like saying: leave the audio alone, there's an audio processor down the line that will take care of it. It's like resolution and rate matching, but consistently enforced for all(!) HDMI-specified display modes, not just 1080p vs. 2160p, and in accordance with the actual EDID.
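Building on the same idea, and again purely as a hypothetical sketch rather than anything Swiftfin actually does, resolution plus rate matching across a full EDID-style mode table could look roughly like this (the ratesMatch and matchedMode names and the mode table are made up for illustration):

```swift
// Hypothetical continuation of the sketch above: resolution *and* rate matching,
// applied uniformly to every mode the EDID advertises. The names and the mode
// table are made up; a real client would get the mode list from the OS.

/// True if the display refresh rate is (close to) an integer multiple of the
/// source frame rate, e.g. 25 fps -> 50 Hz, 29.97 fps -> 59.94 Hz.
func ratesMatch(frameRate: Double, refreshRate: Double, tolerance: Double = 0.01) -> Bool {
    let ratio = refreshRate / frameRate
    return ratio >= 1.0 && abs(ratio - ratio.rounded()) < tolerance
}

/// Picks the first advertised mode whose resolution equals the source's and
/// whose refresh rate fits the source cadence (all modes listed here are progressive).
func matchedMode(width: Int, height: Int, frameRate: Double,
                 modes: [(width: Int, height: Int, refreshRate: Double)])
    -> (width: Int, height: Int, refreshRate: Double)? {
    modes.first {
        $0.width == width && $0.height == height &&
        ratesMatch(frameRate: frameRate, refreshRate: $0.refreshRate)
    }
}

// Illustrative EDID-style mode table covering common HDMI video formats.
let hdmiModes: [(width: Int, height: Int, refreshRate: Double)] = [
    (720, 480, 59.94),    // 480p
    (720, 576, 50.0),     // 576p
    (1280, 720, 59.94),   // 720p
    (1920, 1080, 23.976), // 1080p24
    (3840, 2160, 59.94),  // 2160p
]

// A PAL DVD remux (720x576 at 25 fps) matches 576p50 (50 Hz = 2 x 25 fps),
// so the client could output 576p and leave the scaling to the video processor.
if let mode = matchedMode(width: 720, height: 576, frameRate: 25.0, modes: hdmiModes) {
    print("Output \(mode.width)x\(mode.height) at \(mode.refreshRate) Hz, no client-side upscaling")
} else {
    print("No native match in the EDID; fall back to the normal scaling path")
}
```

With a table like that, a PAL DVD would land on 576p50 and an NTSC DVD on 480p59.94: the same rule applied to every HDMI mode, rather than a special case for 1080p vs. 2160p.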
I sincerely hope that none of the above sounds offensive in any way. It definitely isn't meant to be.
END OF LINE.