Excessive RAM consumption - Printable Version

+- Jellyfin Forum (https://forum.jellyfin.org)
+-- Forum: Support (https://forum.jellyfin.org/f-support)
+--- Forum: Troubleshooting (https://forum.jellyfin.org/f-troubleshooting)
+--- Thread: Excessive RAM consumption (/t-excessive-ram-consumption)
RE: Excessive RAM consumption - TheDreadPirate - 2025-04-09

(2025-04-09, 12:51 PM)evrial Wrote:
(2024-02-27, 09:45 PM)TheDreadPirate Wrote: Unused memory is wasted memory.

You're taking my comment out of context. We get many questions about memory usage that are often along the lines of "why does Jellyfin use more than 100MB of memory?" or "why does Jellyfin cache so much data?". I initially assumed this was that kind of question, before understanding the scale of the issue. However, I stand by my comment when it is NOT in the context of OOM issues.

The devs are aware of this thread, but so far no one on the dev team has been able to reproduce the problem.


RE: Excessive RAM consumption - blubkatze - 2025-05-09

I have the same problem: memory usage increases until OOM. For the time being I have added a systemd timer that restarts the Docker container every 30 minutes, so it doesn't hog all the memory and hang the server (the server also hosts DNS and other services on the network).

On investigation, it looks like Jellyfin starts a new thread every other second [1]. Inside the Docker logs there is nothing between the startup message and the OOM [2]. Trickplay is not enabled on any library, and the following plugins are enabled [3]. With debug logging enabled, it looks like the Scheduled Tasks endpoint is checked constantly while the PID count increases [4]. Maybe the threads aren't closed after the check, which would contribute to the memory leak.

Happy to provide more information if needed.
[1]
# while true; do docker stats --no-stream jellyfin; sleep 5; done
CONTAINER ID   NAME       CPU %   MEM USAGE / LIMIT     MEM %   NET I/O   BLOCK I/O       PIDS
744ff4e4d855   jellyfin   1.00%   304.1MiB / 3.501GiB   8.48%   0B / 0B   201kB / 213MB   832
744ff4e4d855   jellyfin   2.68%   304.5MiB / 3.501GiB   8.49%   0B / 0B   201kB / 213MB   839
744ff4e4d855   jellyfin   0.79%   304.8MiB / 3.501GiB   8.50%   0B / 0B   201kB / 213MB   844
744ff4e4d855   jellyfin   1.65%   305.1MiB / 3.501GiB   8.51%   0B / 0B   201kB / 213MB   850
744ff4e4d855   jellyfin   0.82%   305.7MiB / 3.501GiB   8.53%   0B / 0B   201kB / 213MB   859
744ff4e4d855   jellyfin   1.81%   306MiB / 3.501GiB     8.54%   0B / 0B   201kB / 213MB   864
744ff4e4d855   jellyfin   0.98%   306.4MiB / 3.501GiB   8.55%   0B / 0B   201kB / 213MB   870
744ff4e4d855   jellyfin   0.80%   306.6MiB / 3.501GiB   8.55%   0B / 0B   201kB / 213MB   875
744ff4e4d855   jellyfin   1.78%   306.9MiB / 3.501GiB   8.56%   0B / 0B   201kB / 213MB   880
744ff4e4d855   jellyfin   0.89%   307.3MiB / 3.501GiB   8.57%   0B / 0B   201kB / 213MB   887
744ff4e4d855   jellyfin   1.60%   307.2MiB / 3.501GiB   8.57%   0B / 0B   201kB / 213MB   892
744ff4e4d855   jellyfin   0.87%   307.5MiB / 3.501GiB   8.58%   0B / 0B   201kB / 213MB   898
744ff4e4d855   jellyfin   1.17%   307.7MiB / 3.501GiB   8.58%   0B / 0B   201kB / 213MB   901
744ff4e4d855   jellyfin   0.89%   307.9MiB / 3.501GiB   8.59%   0B / 0B   201kB / 213MB   905

[2]
[..]
Emby.Server.Implementations.ApplicationHost: Core startup complete
Main: Startup complete 0:00:36.9432776

[3]
Emby.Server.Implementations.Plugins.PluginManager: Loaded plugin: AniDB 10.0.0.0
Emby.Server.Implementations.Plugins.PluginManager: Loaded plugin: AniList 11.0.0.0
Emby.Server.Implementations.Plugins.PluginManager: Loaded plugin: TMDb 10.10.6.0
Emby.Server.Implementations.Plugins.PluginManager: Loaded plugin: Studio Images 10.10.6.0
Emby.Server.Implementations.Plugins.PluginManager: Loaded plugin: OMDb 10.10.6.0
Emby.Server.Implementations.Plugins.PluginManager: Loaded plugin: MusicBrainz 10.10.6.0
Emby.Server.Implementations.Plugins.PluginManager: Loaded plugin: AudioDB 10.10.6.0

[4]
[11:58:26] [INF] [64] Microsoft.AspNetCore.Mvc.Infrastructure.ObjectResultExecutor: Executing ObjectResult, writing value of type 'Jellyfin.Api.Controllers.ScheduledTasksController+<GetTasks>d__2'.
[11:58:26] [INF] [64] Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker: Executed action Jellyfin.Api.Controllers.ScheduledTasksController.GetTasks (Jellyfin.Api) in 3.0568ms
[11:58:26] [INF] [64] Microsoft.AspNetCore.Routing.EndpointMiddleware: Executed endpoint 'Jellyfin.Api.Controllers.ScheduledTasksController.GetTasks (Jellyfin.Api)'
[11:58:26] [INF] [64] Microsoft.AspNetCore.Hosting.Diagnostics: Request finished HTTP/1.1 GET http://192.168.1.31:8096/ScheduledTasks?IsEnabled=true - 200 null application/json; charset=utf-8 5.6719ms
[11:58:30] [INF] [68] Microsoft.AspNetCore.Hosting.Diagnostics: Request starting HTTP/1.1 GET http://192.168.1.31:8096/ScheduledTasks?IsEnabled=true - null null
[11:58:30] [INF] [68] Microsoft.AspNetCore.Routing.EndpointMiddleware: Executing endpoint 'Jellyfin.Api.Controllers.ScheduledTasksController.GetTasks (Jellyfin.Api)'
[11:58:30] [INF] [68] Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker: Route matched with {action = "GetTasks", controller = "ScheduledTasks"}. Executing controller action with signature System.Collections.Generic.IEnumerable`1[MediaBrowser.Model.Tasks.TaskInfo] GetTasks(System.Nullable`1[System.Boolean], System.Nullable`1[System.Boolean]) on controller Jellyfin.Api.Controllers.ScheduledTasksController (Jellyfin.Api).
[11:58:32] [INF] [69] Microsoft.AspNetCore.Mvc.Infrastructure.ObjectResultExecutor: Executing ObjectResult, writing value of type 'Jellyfin.Api.Controllers.ScheduledTasksController+<GetTasks>d__2'.
[11:58:32] [INF] [69] Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker: Executed action Jellyfin.Api.Controllers.ScheduledTasksController.GetTasks (Jellyfin.Api) in 11.8148ms
[11:58:32] [INF] [69] Microsoft.AspNetCore.Routing.EndpointMiddleware: Executed endpoint 'Jellyfin.Api.Controllers.ScheduledTasksController.GetTasks (Jellyfin.Api)'
[11:58:32] [INF] [69] Microsoft.AspNetCore.Hosting.Diagnostics: Request finished HTTP/1.1 GET http://192.168.1.31:8096/ScheduledTasks?IsEnabled=true - 200 null application/json; charset=utf-8 14.4339ms
[11:58:36] [INF] [73] Microsoft.AspNetCore.Hosting.Diagnostics: Request starting HTTP/1.1 GET http://192.168.1.31:8096/ScheduledTasks?IsEnabled=true - null null
[11:58:36] [INF] [73] Microsoft.AspNetCore.Routing.EndpointMiddleware: Executing endpoint 'Jellyfin.Api.Controllers.ScheduledTasksController.GetTasks (Jellyfin.Api)'
[11:58:36] [INF] [73] Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker: Route matched with {action = "GetTasks", controller = "ScheduledTasks"}. Executing controller action with signature System.Collections.Generic.IEnumerable`1[MediaBrowser.Model.Tasks.TaskInfo] GetTasks(System.Nullable`1[System.Boolean], System.Nullable`1[System.Boolean]) on controller Jellyfin.Api.Controllers.ScheduledTasksController (Jellyfin.Api).
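For anyone wanting the same stopgap, the "restart every 30 minutes" workaround mentioned above could be sketched with a systemd service/timer pair like the following. The unit names, the docker binary path, and the container name `jellyfin` are illustrative assumptions, not taken from the original post:

```ini
# /etc/systemd/system/jellyfin-restart.service  (hypothetical name)
[Unit]
Description=Restart the Jellyfin container

[Service]
Type=oneshot
# Assumes the container is named "jellyfin" and docker lives here:
ExecStart=/usr/bin/docker restart jellyfin

# /etc/systemd/system/jellyfin-restart.timer  (hypothetical name)
[Unit]
Description=Restart Jellyfin every 30 minutes

[Timer]
OnBootSec=30min
OnUnitActiveSec=30min

[Install]
WantedBy=timers.target
```

Enable it with `systemctl daemon-reload && systemctl enable --now jellyfin-restart.timer`. This only masks the leak; it does not fix it.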
RE: Excessive RAM consumption - Larsenv - 2025-05-13

Are any of you using Jellystat? I'm thinking the excessive RAM usage may be caused by it constantly running tasks to fully sync the library.


RE: Excessive RAM consumption - theguymadmax - 2025-05-13

The devs think they have tracked down one of the memory issues; you can follow this thread. There are currently two test Docker images with a fix. I'm running the test version, and it has fixed the excessive memory issues I was having.


RE: Excessive RAM consumption - blubkatze - 2025-05-13

(2025-05-13, 05:48 PM)Larsenv Wrote: Are any of you using Jellystat? I'm thinking the excessive RAM usage may be caused by it constantly running tasks to fully sync the library.

Not using Jellystat. Only two libraries: one with 1717 entries, one with 5630 entries. 3.5G RAM, 16G swap. Docker jellyfin/jellyfin image with only the AniDB and AniList plugins added; the rest is stock.


RE: Excessive RAM consumption - gnattu - 2025-05-13

If you are experiencing this and the RAM usage seems to be related to library scanning, you can try this container image: https://github.com/jellyfin/jellyfin/issues/11588#issuecomment-2875092633

We have already received multiple reports that using a memory allocator that better avoids fragmentation helps stabilize memory usage.


RE: Excessive RAM consumption - MauveJellyFish - 2025-07-22

I'm having the same problems: the DB bloats to gigabytes, memory usage increases steadily, and the server crashes near daily. I don't think I have any library scans set to run; a scan takes a long while with my rather large collection.
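As background on the allocator angle: for glibc-based containers, heap fragmentation is commonly tamed either by capping the number of malloc arenas or by preloading an alternative allocator such as jemalloc. The commands below are an illustrative sketch only, not the configuration of the linked test image (which ships its own allocator change); the jemalloc library path varies by distro and must be verified inside the image before relying on it:

```shell
# Sketch: cap glibc malloc arenas to reduce fragmentation
# (trades some multi-threaded allocation throughput).
docker run -d --name jellyfin -e MALLOC_ARENA_MAX=2 jellyfin/jellyfin

# Alternative sketch: preload jemalloc if it is present in the image.
# Check the actual path first, e.g.:
#   docker exec jellyfin find / -name 'libjemalloc.so*' 2>/dev/null
# docker run -d --name jellyfin \
#   -e LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2 \
#   jellyfin/jellyfin
```

Neither setting addresses a true leak (memory that is still referenced); they only help when the growth is fragmentation of freed memory, which matches what gnattu describes.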