this post was submitted on 10 May 2025
176 points (99.4% liked)

Selfhosted

46685 readers
996 users here now

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

Resources:

Any issues on the community? Report it using the report flag.

Questions? DM the mods!

founded 2 years ago

I'm planning on setting up a NAS/home server (primarily storage, with some Jellyfin and Nextcloud and such mixed in), and since it's primarily for data storage I'd like to follow the 3-2-1 backup rule: 3 copies on 2 mediums with 1 offsite. Well, actually I'm more trying to go for a 2-1, with 2 copies and one offsite, but that's beside the point. Now I'm wondering how to do the offsite backup properly.

My main goal would be to have an automatic system that does full system backups at a reasonable rate (I assume daily would be a bit much considering it's gonna be a few TB worth of HDDs which aren't exactly fast, but maybe weekly?) and then have 2-3 of those backups offsite at once as a sort of version control, if possible.

This has two components, the local upload system and the offsite storage provider. First the local system:

What is good software to encrypt the data before/while it's uploaded?

While I'd preferably upload the data to a provider I trust, accidents happen, and since they don't need to access the data, I'd prefer them not being able to, maliciously or not, so what is a good way to encrypt the data before it leaves my system?

What is a good way to upload the data?

After it has been encrypted, it needs to be sent. Is there any good software that can upload backups automatically on regular intervals? Maybe something that also handles the encryption part on the way?

Then there's the offsite storage provider. Personally I'd appreciate as many suggestions as possible, as there is of course no one-size-fits-all, so if you've got good experiences with any, please do send their names. I'm basically just looking for network-attached drives: I send my data to them, I leave it there and trust it stays there, and in case more drives in my system fail than RAID-Z can handle (more than two), I'd like to be able to get the data off there after I've replaced my drives. That's all I really need from them.

For reference, this is gonna be my first NAS/Server/Anything of this sort. I realize it's mostly a regular computer and am familiar enough with Linux, so I can handle that basic stuff, but for the things you wouldn't do with a normal computer I am quite unfamiliar, so if any questions here seem dumb, I apologize. Thank you in advance for any information!

(page 2) 50 comments
[–] [email protected] 11 points 1 week ago

I don't πŸ™ƒ

[–] [email protected] 1 points 1 week ago* (last edited 1 week ago)

Veeam Backup & Replication with an NFR license for me.
My personal setup:
First backup: just a backup to a virtual drive stored on my NAS.
Offsite backup: essentially an export of what's available, written as a full or incremental backup to an external USB drive.
I have two of those. One I keep at home in case my NAS explodes; the second is at my workplace.
The off-site one only contains my most important pieces of data.
As for frequency: as often as I remember to make one, as it requires manual interaction.

Our clients have (depending on their size) the following setups:
2 or more endpoints (excluding exceptions):
Veeam BR Server
First backup to NAS
Second backup (copy of the first) to USB drives (minimum of 3: one connected, one stored somewhere in the business, one at home/off-site; daily rotation)
Optionally a S3 compatible cloud backup.

Bigger customers maybe have mirroring but we have those cases very rarely.

Edit: The backups can be encrypted at all steps (first backup or backup copies).
Edit 2: Veeam B/R is not (F)OSS but very reasonable for the free Community Edition. Has support for Windows, macOS and Linux (some distros, only x64/x86). The NFR license can be acquired relatively easily (from here; they didn't check me in any way).
I like the software as it's very powerful and versatile, geared towards both Fortune 500 companies and small shops/deployments.
And the next version will see a full Linux version, both as a single install and as a virtual appliance.
They also have a setup for hardened repositories.

[–] [email protected] 4 points 1 week ago* (last edited 1 week ago)

I use Asustor NASes, one at my house in the southeast US, one at my sister's house in the northeast US. The Asustor OS takes care of the backup every night. It's not cheap, but if you want it done right...

Both run 4 drives in RAID 5. Pictures back up to the HDDs and to a RAID 1 set of NVMe drives in the NAS. The rest is just movies and TV shows for Plex, so I don't really care about those; the pictures are the main thing. I feel like that's as safe as I can be.

[–] [email protected] 4 points 1 week ago (1 children)

I use Syncthing to push data offsite, encrypted and with staggered versioning, to a tiny ITX box I run at a family member's house.

[–] [email protected] 5 points 1 week ago (1 children)

The best part about Syncthing is that you can set the target to untrusted. The data all gets encrypted and is not accessible whatsoever on the other side.

[–] [email protected] 1 points 1 week ago (2 children)

This is exactly what I'm about to do (later this week when I visit their house)

I've been using syncthing for years, but any tips for the encryption?

I was going to use SendOnly at my end to ensure that the data at the other end is an exact mirror, but in that case, how would the restore work if it's all encrypted?

[–] [email protected] 16 points 1 week ago (3 children)

I'm just skipping that. How am I going to back up 48TB to an off-site backup?!

[–] [email protected] 16 points 1 week ago (4 children)

Only back up the essentials like photos and documents or rare media.
Don't care about stuff like Avengers 4K that can easily be reacquired.

[–] [email protected] 3 points 1 week ago (1 children)

Get a tiny ITX box with a couple 20TB refurbished HDDs, stick it at a friend's house

[–] [email protected] 4 points 1 week ago* (last edited 1 week ago) (4 children)

In theory. But I already spent my pension on those 64TB drives (RAID-Z2) xD. Getting off-site backup for all of that feels like such a waste of money (until you regret it). I know it isn't a backup, but I'm praying the RAID-Z2 will be enough protection.

[–] [email protected] 10 points 1 week ago (2 children)

Just a friendly reminder that RAID is not a backup...

Just consider if something accidentally overwrites some / all your files. This is a perfectly legit action and the checksums will happily match that new data, but your file(s) are gone...

[–] [email protected] 5 points 1 week ago (1 children)

Do you have to back up everything off site?

Maybe there are just a few critical files you need a disaster recovery plan for, and the rest is just covered by your raidz

[–] [email protected] 3 points 1 week ago

Understanding the risks is half the battle, but we can only do what we can do.

[–] [email protected] 3 points 1 week ago* (last edited 1 week ago)

I just rsync it once in a while to a home server running in my dad’s house. I want it done manually in a β€œpull” direction rather than a β€œpush” in case I ever get hit with ransomware.

[–] [email protected] 2 points 1 week ago (1 children)

IDrive has built-in local encryption you can enable.

[–] [email protected] 3 points 1 week ago

My dad and I each have a Synology NAS. We do a Hyper Backup from one to the other: I back up to his and vice versa. I also use Syncthing to back up my Plex media so he can mount it locally on his Plex server.

[–] [email protected] 3 points 1 week ago (1 children)

Put brand new drive into system, begin clone

When clone is done, pull drive out and place in a cardboard box

Take that box to my off-site storage (neighbors house) and bury it

(In truth I couldn't afford to get to the 1 off-site in time and have potentially tragically lost almost 4TB of data that, while replaceable, will take time because I don't fucking remember what I even had lol. Gonna take the drives to a specialist tho cuz I think the platters are fine and it's the actual read mechanism that's busted)

[–] [email protected] 2 points 1 week ago* (last edited 1 week ago)

For this I use a python script run via cron to output an html directory file that lists all the folder contents and pushes it to my cloud storage. This way if I ever have a critical failure of replaceable media, I can just refer to my latest directory file.
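The commenter's script is Python, but the same idea fits in a few lines of shell cron can run; the paths and the rclone remote below are placeholders:

```shell
#!/bin/sh
# Writes a plain HTML index of everything under $MEDIA, so a list of what
# was on the drives survives a total failure. Paths are examples.
MEDIA="${MEDIA:-/srv/media}"
OUT="${OUT:-/tmp/media-index.html}"

{
  echo "<html><body><h1>Media index $(date +%F)</h1><ul>"
  find "$MEDIA" -type f 2>/dev/null | sort | sed 's|.*|<li>&</li>|'
  echo "</ul></body></html>"
} > "$OUT"

# Then push the index to cloud storage, e.g. (remote name is a placeholder):
# rclone copy "$OUT" remote:indexes/
```

A few kilobytes of file listing is cheap insurance against "I don't remember what I even had".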

[–] [email protected] 2 points 1 week ago

Rclone to Dropbox (it was cheapest for 2TB at the time).

[–] [email protected] 16 points 1 week ago* (last edited 1 week ago) (1 children)

NAS at the parents’ house. Restic nightly job, with some plumbing scripts to automate it sensibly.

[–] [email protected] 2 points 1 week ago

This is mine exactly, except mine sends to Backblaze B2.

[–] [email protected] 8 points 1 week ago (1 children)

I used to say restic and b2; lately, the b2 part has become more iffy, because of scuttlebutt, but for now it's still my offsite and will remain so until and unless the situation resolves unfavorably.

Restic is the core. It supports multiple cloud providers, making configuration and use trivial. It encrypts before sending, so the destination never has access to unencrypted blobs. It does incremental backups, and supports FUSE vfs mounting of backups, making accessing historical versions of individual files extremely easy. It's OSS, and a single binary executable; IMHO it's at the top of its class, commercial or OSS.

B2 has been very good to me, and is a clear winner for this use case: writes and space are pennies a month, and it only gets more expensive if you're doing a lot of reads. The UI is straightforward and easy to use, and the API is good; if it weren't for their recent legal and financial drama, I'd still unreservedly recommend them. As it is, you'd have to evaluate them yourself.
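The restic-to-B2 workflow described above boils down to only a few commands. The bucket name, key IDs, and paths here are placeholders, and real credentials are required, so treat this as a sketch:

```shell
#!/bin/sh
# Sketch of a restic + Backblaze B2 flow. Repo, key IDs, and paths are
# placeholders; restic encrypts client-side, so B2 only ever sees blobs.
export B2_ACCOUNT_ID="xxxxxxxx"                 # placeholder key ID
export B2_ACCOUNT_KEY="xxxxxxxx"                # placeholder key
export RESTIC_REPOSITORY="b2:my-backup-bucket:/nas"
export RESTIC_PASSWORD_FILE="/root/.restic-pass"

restic init                     # once, creates the encrypted repository
restic backup /srv/data         # incremental after the first run
restic forget --keep-daily 7 --keep-weekly 4 --prune   # retention policy
restic mount /mnt/restic        # FUSE-browse historical snapshots
```

The `restic mount` step is what makes pulling back an old version of a single file so easy: snapshots appear as ordinary directories.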

[–] [email protected] 1 points 1 week ago

I spend my days working on a MacBook, and have several old external USB drives duplicating my important files live off my server (Unraid) via Resilio to my MacBook (yes, I know Syncthing exists, but Resilio is easier). My off-site backups are to a Hetzner Storage Box using Duplicacy, which is amazing and supports encrypted snapshots (a cheap GUI alternative to Borgbackup).

So for me, Resilio and Duplicacy.

[–] [email protected] 1 points 1 week ago

My automated workflow is to package up backup sources into tars (uncompressed), and encrypt with gpg, then ship the tar.gpg off to backblaze b2 and S3 with rclone. I don't trust cloud providers so I use two just in case. I've not really been in the need for full system backups going off site, rather just the things I'd be severely hurting for if my home exploded.

But to your main questions, I like gpg because you have good options for encrypting things safely within bash/ash/sh scripting, and the encryption itself is considered strong.

And I really like rclone because it covers the main cloud providers and wrangles everything down to an rsync-like experience, which is also pretty tidy for shell scripting.
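The tar → gpg → rclone pipeline described above can be sketched roughly like this; the source path, passphrase file, and rclone remote names are all placeholders:

```shell
#!/bin/sh
# Package one backup source as an uncompressed tar, encrypt it with gpg
# BEFORE it leaves the machine, then ship it with rclone. Paths, remote
# names, and the passphrase file are placeholders.
SRC="${SRC:-/srv/data/documents}"
OUT="${OUT:-/tmp/backup-$(date +%F).tar.gpg}"
PASSFILE="${PASSFILE:-/root/.backup-pass}"

# Symmetric AES256 encryption straight off the tar pipe; nothing
# unencrypted ever touches the cloud provider.
tar -cf - "$SRC" 2>/dev/null | gpg --batch --pinentry-mode loopback \
    --symmetric --cipher-algo AES256 \
    --passphrase-file "$PASSFILE" -o "$OUT"

# Two providers, so one disappearing doesn't hurt (remotes are placeholders):
# rclone copy "$OUT" b2:my-bucket/backups/
# rclone copy "$OUT" s3:my-bucket/backups/
```

Keeping the tar uncompressed, as the commenter does, keeps the pipeline simple; compression can always be added between `tar` and `gpg` if bandwidth matters more than CPU.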

[–] [email protected] 2 points 1 week ago

Right now I sneakernet it. I stash a LUKS-encrypted drive in my locker at work and bring it home once a week or so to update the backup.

At some point I'm going to set up a RPI at a friend's house, but that's down the road a bit.

[–] [email protected] 5 points 1 week ago (1 children)

so if any questions here seem dumb

Not dumb. I say the same, but I have a severe inferiority complex and imposter syndrome. Most artists do.

1 local backup, 1 cloud backup, 1 offsite backup to my tiny house at the lake.

I use Syncthing.

[–] [email protected] 25 points 1 week ago (2 children)

There are some really good options in this thread; just remember that whatever you pick, unless you test your backups, they're as good as nonexistent.

[–] [email protected] 2 points 1 week ago (2 children)

How does one realistically test their backups, if they are doing the 3-2-1 backup plan?

I validate (or whatever the term used is) my backups once a month, and trust that it means something 😰

[–] [email protected] 3 points 1 week ago (1 children)

Until you test a backup it's not complete; how you test it is up to you.

If you upload to a remote location, pull it down and unpack it. Check that you can open important files; if you can't open them, then the backup isn't worth the disk space.

[–] [email protected] 3 points 1 week ago (1 children)

Deploy the backup (or some part of it) to a test system. If it can boot or you can get the files back, they work.

[–] [email protected] 6 points 1 week ago (2 children)

Is there some good automated way of doing that? What would it look like, something that compares hashes?
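One simple automated version of the hash-comparison idea: record a checksum manifest of the source at backup time, then verify a test restore against it. Paths below are placeholders:

```shell
#!/bin/sh
# Hash-based backup verification sketch. Record checksums of the source
# when the backup is made, then check a restored copy against them.
SRC="${SRC:-/srv/data}"
RESTORED="${RESTORED:-/tmp/restore-test}"
MANIFEST="${MANIFEST:-/tmp/manifest.sha256}"

# At backup time: write a manifest with relative paths.
( cd "$SRC" && find . -type f -exec sha256sum {} + ) > "$MANIFEST"

# After a test restore: every file must match, or the backup is suspect.
( cd "$RESTORED" && sha256sum --quiet -c "$MANIFEST" ) \
    && echo "backup verified" \
    || echo "MISMATCH: investigate before trusting this backup"
```

Tools like restic also have a built-in `check` command for this, but a manifest like the above works with any backup method, including plain rsync copies.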
