Automatic run backup of your Jellyfin Instance - Printable Version

+- Jellyfin Forum (https://forum.jellyfin.org)
+-- Forum: Support (https://forum.jellyfin.org/f-support)
+--- Forum: Guides, Walkthroughs & Tutorials (https://forum.jellyfin.org/f-guides-walkthroughs-tutorials)
+--- Thread: Automatic run backup of your Jellyfin Instance (/t-automatic-run-backup-of-your-jellyfin-instance)
Automatic run backup of your Jellyfin Instance - cesar_bianchi - 2023-10-13

Hi Guys,

Every day I find one or more forum topics asking "How do I implement a backup solution for my Jellyfin instance?" or requesting a native backup feature from the Jellyfin developers. Based on that, I decided to share my solution here with our community, to help other users with the same question.

First of all, it's important to say: below I'll present *my* solution, OK? There are many different ways to implement a backup for files and folders, and my goal here is just to share one approach: the one I adopted and that is working fine for me.

Second, and no less important: this solution applies to Linux environments, but with a few small adjustments you can extend it to other environments too, such as Windows or Docker. Of course, if you adapt it and your new version works fine, please don't forget to share it in this thread too, to help our community!

Third and final: I'm using an AWS account to store my backup files, so you'll need a valid AWS subscription to apply the same solution. The "Free Tier" offerings provided by AWS won't cover this case. Yes, I can imagine how you feel after reading that, but you need to know: when you are looking for a safe backup solution, you'll probably need to spend some money on it. I can tell you this solution is not cost-free, but it is cheap. Totally low cost.

Before starting, let's talk about the solution and its architecture. In summary, it stores the Jellyfin files and folders related to instance configuration, properties, users, images, and metadata in a public cloud, using safe methods, running automatically every day, week, or month. It assumes basic Linux know-how ("sudo", "chmod", "nano", and other common commands), as well as familiarity with the Linux filesystem and paths.
In the end, you will have a safe place, outside your physical environment, holding your Jellyfin instance files, ready to restore in case of disaster or accidental deletion. There is an important disclaimer here: this backup scheme will not cover your media content files, but after reading this tutorial you will be able to extend the same solution to cover them too.

All the sample files (scripts) you will need are attached at the end of this post as "sample_files.zip". So, let's start!

1 - First step: How to create an AWS Subscription

If you already have an AWS subscription, please skip to Step 2; if not, you'll need to create one. This is a simple step, and you can find all the instructions here: https://portal.aws.amazon.com/billing/signup. To keep this tutorial simple, I won't describe in detail how to create an AWS subscription, because all the details are on the official AWS pages. I suggest reading the official source directly! If you don't know anything about AWS, you can read and learn more here: https://aws.amazon.com/pt/what-is-aws/

2 - Creating the cloud resources and services to store your backup files in AWS

To simplify this tutorial, I'll use an AWS service called "AWS CloudFormation" to create the necessary resources quickly, but if you are an AWS heavy user you can create each one manually and apply your own preferences. Basically, we'll use CloudFormation to create a new S3 bucket, a new IAM policy, and a new IAM user, and then assign the new policy to the new user. To do it, I wrote a CloudFormation script (attached to this post). Please download it, extract the zip file, and store "CloudFormTemplate.json" in a local folder. Pay attention here: after downloading the script file, you'll need to apply some adjustments before running it in the CloudFormation web console.
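For orientation, here is a minimal sketch of what such a CloudFormation template can contain. This is an illustration, not the attached "CloudFormTemplate.json": the bucket name, user name, policy name, and allowed actions are placeholder assumptions you would adjust to your own needs.

```json
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "Sketch: S3 bucket plus IAM user/policy for Jellyfin backups",
  "Resources": {
    "BackupBucket": {
      "Type": "AWS::S3::Bucket",
      "Properties": { "BucketName": "my-jellyfin-backup-bucket" }
    },
    "BackupUser": {
      "Type": "AWS::IAM::User",
      "Properties": { "UserName": "jellyfin-backup-user" }
    },
    "BackupPolicy": {
      "Type": "AWS::IAM::Policy",
      "Properties": {
        "PolicyName": "jellyfin-backup-policy",
        "Users": [ { "Ref": "BackupUser" } ],
        "PolicyDocument": {
          "Version": "2012-10-17",
          "Statement": [ {
            "Effect": "Allow",
            "Action": [ "s3:PutObject", "s3:GetObject", "s3:ListBucket" ],
            "Resource": [
              { "Fn::GetAtt": [ "BackupBucket", "Arn" ] },
              { "Fn::Join": [ "", [ { "Fn::GetAtt": [ "BackupBucket", "Arn" ] }, "/*" ] ] }
            ]
          } ]
        }
      }
    }
  }
}
```

The attached template is the one to actually run; this sketch just shows the three resources (bucket, user, policy) and the policy-to-user assignment described above.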
3 - Running the CloudFormation script in the AWS Console.
4 – Getting the user key and secret key to access cloud storage. After running the CloudFormation script and provisioning all the necessary cloud services, we need to collect the user's access key and secret key for later use, when we configure the local backup script on Linux. To do so, open the AWS Management Console and, in the search field at the top of the screen, search for "IAM".
5 – Getting the S3 bucket URI. An S3 URI is the unique resource identifier within the context of the S3 protocol. It follows this naming convention: s3://bucket-name/key-name. For example, if you have a bucket named "mybucket" and a file called "puppy.jpg" inside it, the S3 URI would be s3://mybucket/puppy.jpg. In our case, we will pass an S3 URI as a parameter to our shell script, as the "destination path" of our backup folder, so we need to collect the S3 URI for further use. To get your S3 URI, follow these steps:
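The naming convention above can be sketched in shell; the bucket and key names here are placeholders, not your real values.

```shell
#!/bin/sh
# Build an S3 URI from a bucket name and an object key (placeholders).
BUCKET="mybucket"
KEY="puppy.jpg"
S3_URI="s3://${BUCKET}/${KEY}"
echo "$S3_URI"    # prints: s3://mybucket/puppy.jpg
```

In the backup script later on, the destination is exactly a string of this form, pointing at the bucket created in Step 2.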
6 – Setting up your Linux environment with the AWS CLI. After finishing all the cloud configuration, we'll now create and configure a local bash script that runs automatically and uploads the Jellyfin files to the AWS cloud. For that, we will use a cron job and the AWS CLI. The first step is to install and configure the AWS CLI in your Linux environment.
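As a sketch, the AWS CLI v2 install commands from the AWS documentation are shown in the comments below, followed by the two small files that `aws configure` creates. The key values are placeholders, and this example writes to a local "./aws-demo" folder instead of the real "~/.aws" so it is safe to run anywhere.

```shell
#!/bin/sh
# Install AWS CLI v2 on Linux x86_64 (per the AWS docs) -- run manually:
#   curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
#   unzip awscliv2.zip
#   sudo ./aws/install
#
# `aws configure` then prompts for your access key, secret key, default
# region, and output format, and stores them under ~/.aws/.  The files it
# writes look like this (placeholder values; written to ./aws-demo here
# so the example does not touch your real ~/.aws):
AWS_DIR="./aws-demo"
mkdir -p "$AWS_DIR"

cat > "$AWS_DIR/credentials" <<'EOF'
[default]
aws_access_key_id = AKIA_PLACEHOLDER_KEY
aws_secret_access_key = placeholder_secret_key
EOF

cat > "$AWS_DIR/config" <<'EOF'
[default]
region = us-east-1
output = json
EOF

echo "wrote credentials and config to $AWS_DIR"
```

Use the access key and secret key you collected in Step 4, and pick the region where your bucket lives.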
Now your AWS CLI is ready to use. 7 – Create a shell script to run a local backup and sync it to AWS. After installing and configuring the AWS CLI in your Linux environment, we need to create a shell script file to perform the backup. I'll provide a simple sample in this post (in the attachments section), but you can extend it with your own preferences. To proceed, run the commands below in your Linux terminal.
Copy and paste the sample script (attached to this post as "backup_jellyfin_on_aws_script.sh") into your terminal, paying attention to these important notes below:
Finally, after replacing the "origin" and "destination" paths, save the script file in your Linux environment. We suggest not changing the path where the file is saved ("/etc/scripts/") and not changing the file name either ("backup_jellyfin_on_aws_script.sh").
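For orientation, here is a minimal sketch of what such a backup script can look like. It is not the attached "backup_jellyfin_on_aws_script.sh": the origin path, destination URI, and archive location are assumptions you must replace with your own values.

```shell
#!/bin/sh
# Sketch of a Jellyfin backup script (placeholder paths; the attached
# sample may differ).  Archives the Jellyfin data folder and uploads
# the archive to S3 with the AWS CLI.
backup_jellyfin() {
    origin="$1"    # e.g. /var/lib/jellyfin (assumed data dir; check yours)
    dest="$2"      # e.g. s3://mybucket/jellyfin (your S3 URI from Step 5)
    stamp=$(date +%Y-%m-%d)
    archive="/tmp/jellyfin-backup-${stamp}.tar.gz"

    # Pack the whole origin folder into one dated archive.
    tar -czf "$archive" -C "$(dirname "$origin")" "$(basename "$origin")"

    # Upload it; if the AWS CLI is missing or unconfigured, keep the
    # archive locally so nothing is lost.
    if command -v aws >/dev/null 2>&1; then
        aws s3 cp "$archive" "$dest/" || echo "upload failed; archive kept at $archive"
    else
        echo "aws CLI not found; archive kept at $archive"
    fi
}

# Example call (adjust both paths before using for real):
# backup_jellyfin /var/lib/jellyfin s3://mybucket/jellyfin
```

The same idea extends to media files: point `origin` at your media library, keeping in mind the archive and transfer will be much larger.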
8 – Last step: Configure crontab to run your local script every day.
The crontab configuration will open. Use the sample attached here as "crontab sample.txt" (copy and paste) to schedule the script from Section 7 of this tutorial to run automatically. In this sample, the script is configured to run every day at 05:00 AM, but you can choose your preferred time; to do so, change the first fields of the line (minute, hour, day of week, etc.). After copying and pasting the crontab example into your terminal, save the file and restart your machine. To check that the backups are performing correctly, log in to the AWS Management Console, open the S3 service, open your bucket, and navigate through the folders. You can also download any file or folder: check (or select) the folders you want to download and use the download buttons.
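For reference, a crontab line for a daily 05:00 run might look like this. This is a sketch, not the attached "crontab sample.txt", and the log file path is an assumption:

```crontab
# min hour day-of-month month day-of-week   command
0 5 * * * /bin/sh /etc/scripts/backup_jellyfin_on_aws_script.sh >> /var/log/jellyfin_backup.log 2>&1
```

The five leading fields are minute, hour, day of month, month, and day of week; `0 5 * * *` therefore means "at minute 0 of hour 5, every day". Redirecting output to a log file makes it easier to check later whether the nightly run succeeded.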
This tier is recommended for backup files with infrequent access, so if one day you need to download your files, you may first need to change the file's storage tier. You can read about how to do that in this article: https://docs.aws.amazon.com/AmazonS3/latest/userguide/storage-class-intro.html Let me know if this solution works for you!
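As an illustration, for non-archived tiers the storage class of an existing object can be changed by copying it over itself with `aws s3 cp --storage-class` (for deep-archive tiers such as Glacier you need a restore operation instead; see the linked article). The bucket and key below are placeholders, and this sketch only prints the command so you can run it once your CLI is configured:

```shell
#!/bin/sh
# Build the command that copies an object over itself with a new storage
# class (placeholder bucket/key; run the printed command manually once
# your AWS CLI is configured and pointed at your real bucket).
BUCKET="mybucket"
KEY="jellyfin/jellyfin-backup-2023-10-13.tar.gz"
CMD="aws s3 cp s3://${BUCKET}/${KEY} s3://${BUCKET}/${KEY} --storage-class STANDARD"
echo "$CMD"
```

After the copy completes, the object is in the STANDARD tier and can be downloaded normally from the console.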