Slow streaming over internet - Printable Version
Jellyfin Forum (https://forum.jellyfin.org)
Thread: Slow streaming over internet (/t-slow-streaming-over-internet)
Slow streaming over internet - jellyfin_fan - 2024-08-22

Hi, I am running a Jellyfin server (10.9.9) inside Proxmox in an LXC, and it works great on my local network. However, when I try direct playing 4K videos over the internet I get a lot of buffering. My internet connection is 100 Mbps fiber with symmetrical upload/download, but it seems to be over-provisioned because a speed test gives 200/200. However, when I run iperf over the internet, it reports 30-40 Mbps, which seems to explain why my 50-80 Mbps 4K videos are buffering. If I increase the number of parallel iperf streams, the result bumps up to 70-100 Mbps.

The speed test reports 200 Mbps upload, but Jellyfin seems to only use 30-40. How can I improve my upload performance? Is there any setting in Jellyfin to improve it, or some Linux network settings I can try?

RE: Slow streaming over internet - TheDreadPirate - 2024-08-22

Are you using a reverse proxy for handling remote clients?

RE: Slow streaming over internet - jennystreaming - 2024-08-22

Might I ask how you do the speed test? I have similar issues as well (also an LXC container inside Proxmox), so I would like to run those tests too. I am behind a reverse proxy (Nginx Proxy Manager).

(2024-08-22, 03:57 PM)jellyfin_fan Wrote: Hi, I am running a jellyfin server (10.9.9) inside proxmox in an LXC that works great on my local network. However when I try direct playing 4k vids over the internet I get a lot of buffering issues.

RE: Slow streaming over internet - Fate - 2024-08-22

What router are you using? Did you port forward the Jellyfin port to the internet, or are you using a VPN? What hardware is your Proxmox host running on, especially which network card? You can run the Speedtest CLI directly on the Proxmox host and in the LXC to compare.

RE: Slow streaming over internet - jellyfin_fan - 2024-08-22

(2024-08-22, 06:42 PM)Fate Wrote: What router are you using?

Yes, I am using Nginx Proxy Manager (NPM) as a reverse proxy in another container. The Proxmox host is a PC with an AMD 3700X CPU and 32 GB of RAM. The network card is a USB 2.5GbE adapter. As for the router, I am running OPNsense in another LXC on the same host.

For the speed tests, I installed the Ookla Speedtest CLI in my Jellyfin instance. It gets 236 Mbps inside the container:

    Idle Latency:    1.03 ms  (jitter: 0.46ms, low: 1.00ms, high: 1.92ms)
        Download:  236.69 Mbps (data used: 184.0 MB)
                     9.43 ms  (jitter: 0.78ms, low: 1.93ms, high: 10.98ms)
          Upload:  236.23 Mbps (data used: 106.3 MB)
                     1.92 ms  (jitter: 23.21ms, low: 1.01ms, high: 213.64ms)
     Packet Loss:    0.0%

For iperf on the local network, I get 2.36 Gbps from the Jellyfin instance through NPM. That is, I set up a port forward on NPM to forward the iperf port to the Jellyfin instance, then connect to the NPM IP for the test:

    [ ID] Interval           Transfer     Bitrate
    [  5]   0.00-10.01  sec  2.77 GBytes  2.37 Gbits/sec                  sender
    [  5]   0.00-10.05  sec  2.76 GBytes  2.36 Gbits/sec                  receiver

For internet iperf, I ran the same test but with the client on a VPN to simulate an internet connection. The Ookla speed test on this client was 195 Mbps down and 140 Mbps up while the VPN was running:

    [ ID] Interval           Transfer     Bitrate
    [  5]   0.00-10.00  sec  57.1 MBytes  47.9 Mbits/sec                  sender
    [  5]   0.00-10.27  sec  55.4 MBytes  45.2 Mbits/sec                  receiver

When I increased the client streams to N=5, bandwidth increased to ~70 Mbps:

    [SUM]   0.00-10.01  sec  93.4 MBytes  78.3 Mbits/sec                  sender
    [SUM]   0.00-10.28  sec  87.7 MBytes  71.6 Mbits/sec                  receiver

Lastly, I tested connecting from my phone on a 5G network. The result is slower than the VPN test. Also, the app I got for iperf does not seem to allow parallel streams. The Ookla speed test on the 5G connection was 577 Mbps down and 16 up:

    [ ID] Interval           Transfer     Bitrate
    [  5]   0.00-5.18   sec  17.2 MBytes  27.9 Mbits/sec                  receiver
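The pattern above, where a single TCP stream tops out at 30-50 Mbps over the internet while several parallel streams roughly add up to the line rate, usually points at the TCP window rather than the link itself. Below is a minimal sketch of how to check and loosen those limits, assuming roughly a 50 ms round trip to the remote client; the 16 MB buffer ceilings and the 4M window are illustrative values, example.com stands in for the hostname that reaches NPM (as in the iperf3 commands in this thread), and on an unprivileged LXC some of these sysctls may need to be set on the Proxmox host instead.

    # Rough bandwidth-delay product: 200 Mbit/s x ~50 ms RTT ~= 1.25 MB in flight,
    # more than the usual default TCP buffer ceilings allow for a single stream.

    # Inspect the current limits inside the Jellyfin LXC:
    sysctl net.core.rmem_max net.core.wmem_max net.ipv4.tcp_rmem net.ipv4.tcp_wmem

    # Temporarily raise the ceilings (16 MB is a generous test value):
    sysctl -w net.core.rmem_max=16777216
    sysctl -w net.core.wmem_max=16777216
    sysctl -w net.ipv4.tcp_rmem="4096 131072 16777216"
    sysctl -w net.ipv4.tcp_wmem="4096 131072 16777216"

    # Optionally try BBR congestion control (needs the tcp_bbr module on the host kernel):
    sysctl -w net.ipv4.tcp_congestion_control=bbr

    # Re-run the single-stream test from the remote client with an explicit window:
    iperf3 -c example.com -p 5201 -w 4M -t 30

If the single-stream iperf3 number climbs toward the speed test result, persist the settings in /etc/sysctl.d/ and retest a 4K direct play.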
RE: Slow streaming over internet - jellyfin_fan - 2024-08-23

(2024-08-22, 06:20 PM)jennystreaming Wrote: Might I ask how you do the speedtest? I have similar issues as well (also LXC container inside Proxmox) so I would like to make those tests as well. I am behind a reverse proxy (Nginx Proxy manager).

For the internet speed test, I am using the Ookla Speedtest CLI. Follow the instructions on the site below to install it in the LXC running Jellyfin; you can then run internet speed tests from its shell: https://www.speedtest.net/apps/cli

For a direct speed test to my clients, I am using iperf3. I installed iperf3 via APT in the Jellyfin LXC. Then I went into NPM and added a new "Stream": I set the forward port to the port iperf3 will listen on (5201 by default), set the forward host to the IP of the Jellyfin LXC, and set the incoming port to match. Lastly, I went into my router and set up a port forward to my NPM host, using the same port as the incoming port of the NPM stream.

For ease of use you can set all of these ports to be the same, but they don't have to be. Just make sure the exit port of the router's port forward matches the incoming port in NPM, and the forward port in NPM matches your iperf3 listening port. If you are running the Proxmox firewall on the LXC, I think you might need to open a port there as well. As a word of caution, it is good practice to remove the port forward once your tests are done, and to disable the NPM stream as well.

Then I run "iperf3 -s" on the Jellyfin LXC to start the server. On my client I use "iperf3 -c example.com -p [port that you forwarded on the router]", where the domain name is the URL that points to your NPM instance.

RE: Slow streaming over internet - Fate - 2024-08-23

(2024-08-22, 08:58 PM)jellyfin_fan Wrote:
    (2024-08-22, 06:42 PM)Fate Wrote: What router are you using?

I was kinda hoping someone had a better idea... but the speed tests look OK. Can you try lowering your WAN MTU to 1420 on the OPNsense to rule out any issues with fragmentation? I don't think that's it, but I have no better idea right now.
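Before touching the WAN MTU on OPNsense, it may be worth checking from the remote client whether path MTU is actually a problem. This is a small sketch using the Linux iputils ping syntax, with example.com again standing in for the hostname that resolves to the OPNsense WAN address; the 1392-byte payload is simply 1420 minus 28 bytes of ICMP/IP headers.

    # A 1472-byte payload + 28 bytes of headers = a full 1500-byte packet with DF set.
    # If this fails ("message too long") while a smaller size succeeds, something on
    # the path has a lower MTU, and lowering the WAN MTU or clamping MSS may help.
    ping -c 4 -M do -s 1472 example.com
    ping -c 4 -M do -s 1392 example.com   # corresponds to the suggested 1420 MTU

If the full-size probe already gets through cleanly, fragmentation is unlikely to be the bottleneck, and the TCP window tuning sketched earlier is probably the more promising lead.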