I’m planning on setting up a NAS/home server (primarily storage, with some Jellyfin and Nextcloud and such mixed in), and since it’s primarily for data storage I’d like to follow the 3-2-1 backup rule: 3 copies, on 2 different media, with 1 offsite. Well, actually I’m going more for a 2-1, with 2 copies and one offsite, but that’s beside the point. Now I’m wondering how to do the offsite backup properly.

My main goal is an automatic system that does full system backups at a reasonable rate (I assume daily would be a bit much considering it’s going to be a few TB worth of HDDs, which aren’t exactly fast, but maybe weekly?) and then keeps 2–3 of those backups offsite at once as a sort of version history, if possible.

This has two components, the local upload system and the offsite storage provider. First the local system:

What is good software to encrypt the data before/while it’s uploaded?

While I’d preferably upload the data to a provider I trust, accidents happen, and since they don’t need to access the data, I’d prefer that they can’t, maliciously or not. So what is a good way to encrypt the data before it leaves my system?

What is a good way to upload the data?

After it has been encrypted, it needs to be sent. Is there any good software that can upload backups automatically at regular intervals? Maybe something that also handles the encryption part along the way?

Then there’s the offsite storage provider. Personally I’d appreciate as many suggestions as possible, as there is of course no one-size-fits-all, so if you’ve had good experiences with any, please do share their names. I’m basically just looking for network-attached drives: I send my data to them, I leave it there and trust it stays there, and in case more drives in my system fail than RAID-Z can handle (so 2 or more), I’d like to be able to get the data off there after I’ve replaced my drives. That’s all I really need from them.

For reference, this is going to be my first NAS/server/anything of this sort. I realize it’s mostly a regular computer, and I’m familiar enough with Linux that I can handle the basic stuff, but the things you wouldn’t do with a normal computer are quite unfamiliar to me, so if any questions here seem dumb, I apologize. Thank you in advance for any information!

  • Jimmycakes@lemmy.world · 2 hours ago

    I use Asustor NASes, one at my house in the southeast US, one at my sister’s house in the northeast US. The Asustor OS takes care of the backup every night. It’s not cheap, but it’s worth it if you want it done right.

    Both run 4 drives in RAID 5. Pictures back up to the HDDs and to a RAID 1 set of NVMe drives in the NAS. The rest is just movies and TV shows for Plex, so I don’t really care about those. The pictures are the main thing. I feel like that’s as safe as I can be.

  • doodledup@lemmy.world · 4 hours ago

    I’m just skipping that. How am I going to back up 48 TB to an off-site backup?!

      • doodledup@lemmy.world · 2 hours ago

        In theory. But I already spent my pension on those 64 TB of drives (RAIDZ2) xD. Getting off-site backup for all of that feels like such a waste of money (until you regret it). I know it isn’t a backup, but I’m praying the RAIDZ2 will be enough protection.

  • d00phy@lemmy.world · 4 hours ago

    My dad and I each have a Synology NAS. We do a Hyper Backup sync from one to the other: I back up to his and vice versa. I also use Syncthing to sync my Plex media so he can mount it locally on his Plex server.

  • hperrin@lemmy.ca · 3 hours ago

    I just rsync it once in a while to a home server running in my dad’s house. I want it done manually in a “pull” direction rather than a “push” in case I ever get hit with ransomware.
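    A pull like that can be sketched roughly as below; the SSH user, hostname, and paths are all hypothetical, and the script runs on the offsite machine, not the NAS, so a compromised NAS never holds credentials for the backup box:

```shell
#!/bin/sh
# pull-backup.sh -- run manually (or later via cron) on the OFFSITE server.
# It logs into the NAS with a read-only SSH account and pulls the data,
# so ransomware on the NAS can't reach out and encrypt the backup copy.
rsync -aH --partial --delete \
    backup-reader@nas.example.lan:/srv/data/ \
    /srv/backup/nas/
```

    The `--delete` flag mirrors deletions too, which is a judgment call: drop it if you want the offsite copy to retain files removed (or ransomed) on the source.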

  • ryannathans@aussie.zone · 3 hours ago

    I use Syncthing to push data offsite, encrypted and with staggered versioning, to a tiny ITX box I run at a family member’s house.

    • rumba@lemmy.zip · 1 hour ago

      The best part about Syncthing is that you can set the target as untrusted. The data all gets encrypted and is not accessible whatsoever on the other side.

  • traches@sh.itjust.works · 6 hours ago

    NAS at the parents’ house. Restic nightly job, with some plumbing scripts to automate it sensibly.
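    A minimal sketch of such a nightly restic job (repository URL, password file, and source path are hypothetical; restic encrypts client-side, which also answers the OP’s encryption question):

```shell
#!/bin/sh
# restic-nightly.sh -- e.g. from cron:  0 2 * * *  /usr/local/bin/restic-nightly.sh
# Repo and paths are placeholders; the repo password is what encrypts the data
# before it ever leaves the machine.
export RESTIC_REPOSITORY=sftp:backup@parents-nas.example.lan:/srv/restic
export RESTIC_PASSWORD_FILE=/etc/restic/password

restic backup /srv/data
# Keep a rolling window of snapshots (roughly the "2-3 offsite versions" idea):
restic forget --keep-daily 7 --keep-weekly 4 --prune
```

    Because restic is deduplicating and incremental, only changed data crosses the wire, so even a nightly job on slow HDDs is usually fine after the first full upload.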

  • Psychadelligoat@lemmy.dbzer0.com · 5 hours ago

    Put brand new drive into system, begin clone

    When clone is done, pull drive out and place in a cardboard box

    Take that box to my off-site storage (neighbors house) and bury it

    (In truth, I couldn’t afford to get to the 1 offsite in time and have potentially, tragically, lost almost 4 TB of data that, while replaceable, will take time because I don’t fucking remember what I even had, lol. Gonna take the drives to a specialist though, cuz I think the platters are fine and it’s the actual read mechanism that’s busted.)

    • treeofnik@discuss.online · 4 hours ago

      For this I use a Python script, run via cron, that outputs an HTML directory file listing all the folder contents and pushes it to my cloud storage. That way, if I ever have a critical failure of replaceable media, I can just refer to my latest directory file.
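      A minimal sketch of such a script (the `/srv/media` path is a placeholder; the upload step would be a separate cron line with your cloud tool of choice):

```python
#!/usr/bin/env python3
"""Write an HTML index of a directory tree, so you always know what you had,
even if the media itself is lost. Paths here are hypothetical."""
import html
from pathlib import Path


def build_index(root: str) -> str:
    """Return an HTML page listing every file and folder under root."""
    root_path = Path(root)
    lines = ["<html><body><h1>Index of %s</h1><ul>" % html.escape(str(root_path))]
    for p in sorted(root_path.rglob("*")):
        rel = p.relative_to(root_path)
        suffix = "/" if p.is_dir() else ""
        lines.append("<li>%s%s</li>" % (html.escape(str(rel)), suffix))
    lines.append("</ul></body></html>")
    return "\n".join(lines)


# Example use, e.g. from cron before the cloud push:
#   Path("/srv/media/index.html").write_text(build_index("/srv/media"))
```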

  • Matt The Horwood@lemmy.horwood.cloud · 7 hours ago

    There are some really good options in this thread; just remember, whatever you pick: unless you test your backups, they’re as good as nonexistent.

    • dave@hal9000@lemmy.world · 3 hours ago

      Is there some good automated way of doing that? What would it look like, something that compares hashes?
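      Comparing hashes is indeed one common approach: do a test restore to a scratch directory, then check every file against the live copy. A minimal sketch (directory paths are hypothetical; with tools like restic or borg you could also lean on their built-in `check` commands):

```python
"""Verify a test restore by comparing SHA-256 hashes of every file
against the source tree. Paths in the example call are hypothetical."""
import hashlib
from pathlib import Path


def hash_tree(root: str) -> dict:
    """Map each file's relative path to its SHA-256 digest."""
    root_path = Path(root)
    digests = {}
    for p in sorted(root_path.rglob("*")):
        if p.is_file():
            digests[str(p.relative_to(root_path))] = hashlib.sha256(
                p.read_bytes()
            ).hexdigest()
    return digests


def verify_restore(source: str, restored: str) -> list:
    """Return relative paths that are missing or differ in the restore."""
    src, dst = hash_tree(source), hash_tree(restored)
    return sorted(path for path in src if dst.get(path) != src[path])


# Example: bad = verify_restore("/srv/data", "/tmp/test-restore")
# An empty list means every source file restored byte-identical.
```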

      • thejml@lemm.ee · 8 minutes ago

        Have it sync the backup files from the “2” part. You can then copy them out of the Syncthing folder to a local one with a cron job to rotate them. That way you get the offsite sync, and you can keep copies out of the rotation as long as you want.

      • huquad@lemmy.ml · 6 hours ago

        Agreed. I have it configured on a delay and with multiple file versions. I also have another Pi running rsnapshot (an rsync-based tool).

  • LandedGentry@lemmy.zip · 9 hours ago

    Cloud is kind of the default these days but given you’re on this community, I’m guessing you want to keep third parties out of it.

    Traditionally, at least in the video editing world, we would keep LTO tapes or some other format offsite and pay to house them, or, if you have multiple locations available to you, just have those drives shipped back and forth as they’re updated at regular intervals.

    I don’t know what you really have access to or what you’re willing to compromise on, so it’s kind of hard to answer the question, to be honest. There are lots of ways to do it.

  • rutrum@programming.dev · 9 hours ago

    I use borg backup. It, and another tool called restic, are meant for creating encrypted backups, and they can run regularly while only backing up differences. This means you could take a daily backup without making a new copy of your entire library. They also let you, as part of compressing and encrypting, send the backup to a remote machine over SSH. I think you should start with either of those.

    One provider that’s built for this kind of cloud backup is BorgBase. It can host a borg (and, I think, restic) repository. There are others that are made to be easily accessed with these backup tools.

    Lastly, I’ll mention that borg handles making a backup but doesn’t handle the scheduling. Borgmatic is another tool that, given a YAML configuration file, runs the borg commands on a schedule with the defined arguments. You could also use something like systemd timers or cron to run a schedule.

    Personally, I use borgbackup configured in NixOS (which makes the systemd units for making daily backups) and I back up to a different computer in my house and to borgbase. I have 3 copies, 1 cloud and 2 in my home.
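    The borg workflow described above can be sketched roughly like this (repository URL, passphrase, and paths are all placeholders; borgmatic wraps the same commands behind its config file):

```shell
#!/bin/sh
# Nightly borg job (cron or a systemd timer). Encryption is client-side:
# the passphrase never leaves this machine, so the provider only sees ciphertext.
export BORG_REPO=ssh://user@repo.example.net/./backups
export BORG_PASSPHRASE='change-me'

# borg init --encryption=repokey-blake2   # run once to create the repository

borg create --stats --compression zstd '::{hostname}-{now}' /srv/data
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
borg compact   # reclaim space freed by prune
```

    The `prune` line is what gives you the rolling set of offsite versions the OP asked about, without storing full copies of each one.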

  • irmadlad@lemmy.world · 7 hours ago

    “so if any questions here seem dumb”

    Not dumb. I say the same, but I have a severe inferiority complex and imposter syndrome. Most artists do.

    1 local backup, 1 cloud backup, 1 offsite backup to my tiny house at the lake.

    I use Syncthing.