3-2-1 Backups: How do you do the 1 offsite backup?
from CatLikeLemming@lemmy.blahaj.zone to selfhosted@lemmy.world on 10 May 18:27
https://lemmy.blahaj.zone/post/25788600

I’m planning on setting up a NAS/home server (primarily storage, with some Jellyfin and Nextcloud and such mixed in) and since it is primarily for data storage I’d like to follow the 3-2-1 backup rule: 3 copies of the data, on 2 different media, with 1 offsite - well, actually I’m more going for a 2-1, with 2 copies and one offsite, but that’s beside the point. Now I’m wondering how to do the offsite backup properly.

My main goal would be an automatic system that does full system backups at a reasonable rate (I assume daily would be a bit much considering it’s gonna be a few TB worth of HDDs, which aren’t exactly fast, but maybe weekly?) and then keeps 2-3 of those backups offsite at once, as a sort of version control, if possible.

This has two components, the local upload system and the offsite storage provider. First the local system:

What is good software to encrypt the data before/while it’s uploaded?

While I’d preferably upload the data to a provider I trust, accidents happen, and since they don’t need to access the data, I’d prefer they not be able to, maliciously or not. So what is a good way to encrypt the data before it leaves my system?

What is a good way to upload the data?

After it has been encrypted, it needs to be sent. Is there any good software that can upload backups automatically on regular intervals? Maybe something that also handles the encryption part on the way?

Then there’s the offsite storage provider. Personally I’d appreciate as many suggestions as possible, as there is of course no one-size-fits-all, so if you’ve got good experiences with any, please do send their names. I’m basically just looking for network-attached drives: I send my data to them, I leave it there and trust it stays there, and in case more drives in my system fail than RAID-Z can handle (so 2), I’d like to be able to get the data back after I’ve replaced my drives. That’s all I really need from them.

For reference, this is gonna be my first NAS/server/anything of this sort. I realize it’s mostly a regular computer and am familiar enough with Linux, so I can handle the basic stuff, but for the things you wouldn’t do with a normal computer I am quite unfamiliar, so if any questions here seem dumb, I apologize. Thank you in advance for any information!

#selfhosted


[deleted] on 10 May 18:36 next collapse

.

rutrum@programming.dev on 10 May 18:44 next collapse

I use Borg Backup. It, and another tool called restic, are meant for creating encrypted backups. Further, they can run regularly and back up only the differences, which means you could take a daily backup without making a new copy of your entire library. They also allow you, as part of compressing and encrypting, to back up to a remote machine over SSH. I think you should start with either of those.

One provider that’s built for cloud backups is BorgBase. It can host a borg (and I think restic) repository. There are others that are made to be easily accessed with these backup tools.

Lastly, I’ll mention that borg handles making a backup, but doesn’t handle the scheduling. Borgmatic is another tool that, given a YAML configuration file, runs the borg commands on a schedule with the defined arguments. You could also use something like systemd timers or cron to run a schedule.
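To make that concrete, a minimal borgmatic config might look roughly like this (paths, the repo URL, and the retention numbers are placeholders, not anything from the comment; `borgmatic config generate` prints the full current schema):

```yaml
# /etc/borgmatic/config.yaml (hypothetical)
source_directories:
  - /srv/nas/data

repositories:
  - path: ssh://user@offsite.example.net/./nas.borg
    label: offsite

# borg encrypts before anything leaves the machine
encryption_passphrase: "use-a-secret-manager-not-a-literal"

# prune policy: roughly daily/weekly/monthly version history
keep_daily: 7
keep_weekly: 4
keep_monthly: 6
```

Run `borgmatic` from a cron job or systemd timer and it handles create and prune in one go.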

Personally, I use borgbackup configured in NixOS (which makes the systemd units for making daily backups) and I back up to a different computer in my house and to borgbase. I have 3 copies, 1 cloud and 2 in my home.

drkt@scribe.disroot.org on 10 May 18:47 next collapse

I rsync a copy of it to a friend’s house every night. It’s straightforward, simple, and free.
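A nightly setup like this can be a single crontab line; the host and paths here are made up for illustration:

```shell
# crontab entry: push to the friend's box over ssh at 02:30 every night
30 2 * * * rsync -az --delete /srv/nas/ backup@friend.example.net:/srv/backup/nas/
```

Note that `--delete` mirrors deletions too, so pair it with snapshots on the receiving side if you want any version history.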

diegantobass@lemmy.world on 10 May 18:59 next collapse

I rsync a copy to mom’s

qjkxbmwvz@startrek.website on 10 May 21:41 collapse

Same — rsync to a pi 3 with a (single) ZFS drive at family’s house. Retain some daily/weekly/monthly snapshots.

I have a (free) VPS with static IPv4 which is how I connect everything.

Both the VPS and the remote site have limited network speed (I think 50Mbps for VPS), so the initial sync was done sneakernet (well…“airplane net”). Nightly rsync is no problem bandwidth-wise, and is mostly just any new videos I’ve uploaded to my local Immich instance.

WeirdGoesPro@lemmy.dbzer0.com on 10 May 18:48 next collapse

My ratchet way of doing it is Backblaze. There is a docker container that lets you run the unlimited personal plan on Linux by emulating a windows environment. They let you set an encryption key so that they can’t access your data.

I’m sure there are a lot more professional and secure ways to do it, but my way is cheap, easy, and works.

BlueEther@no.lastname.nz on 10 May 20:25 next collapse

I use Backblaze as well. Got a link to that Docker container? It may save me a few bucks a week and thus keep SWMBO happier.

turmacar@lemmy.world on 10 May 21:01 next collapse

Probably a me problem, but I kept having problems with that Docker container on Unraid (it’s just in the Community Apps ‘store’). The VM seemed to crash randomly.

I switched over to their B2 storage and just use rclone with an encrypted bucket, and it’s under $5/mo, which I’m good with. The biggest cost is if I let it run too often and it spends a bunch of their compute time listing files to see if it needs to update them.
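An encrypted-bucket setup like that is rclone’s `crypt` remote layered over a `b2` remote; a sketch (the bucket and remote names are hypothetical, and the passwords have to be generated with `rclone config` or `rclone obscure`):

```ini
# ~/.config/rclone/rclone.conf
[b2raw]
type = b2
account = <keyID>
key = <applicationKey>

[b2crypt]
type = crypt
remote = b2raw:my-backup-bucket
filename_encryption = standard
password = <obscured-password>
```

`rclone sync /srv/nas b2crypt:nas --fast-list` then uploads everything encrypted; `--fast-list` batches the listing calls, which helps with the transaction costs mentioned above.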

redbr64@lemmy.world on 11 May 00:40 collapse

What’s the container’s name? I was about to get backblaze and then was frustrated at the cost difference between the desktop personal plan and the one for deploying on my server

WeirdGoesPro@lemmy.dbzer0.com on 11 May 00:42 collapse
Onomatopoeia@lemmy.cafe on 10 May 18:50 next collapse

As others have said, use tools like borg and restic.

Shop around for cloud storage with good pricing for your use-case. Many charge for different usage patterns, like restoring data or uploading.

Check out storj.io, I like their pricing - they charge for downloading/restore (IIRC), and I figure that’s a cost I can live with if I need to restore.

Otherwise I keep 3 local copies of data:

1 is live, and backed up to storj.io

2 is mirrored from 1 every other week

3 is mirrored from 1 every other week, opposite 2

This works for my use-case, where I’m concerned about local failures and mistakes (and don’t trust my local stores enough to use a backup tool), but my data doesn’t change a lot in a week. If I were to lose 1 week of changes, it would be a minor issue. And I’m trusting my cloud backup to be good (I do test it quarterly, and do a single file restore test monthly).

This isn’t an ideal (or even recommended) approach; it just works with the storage I currently have, and my level of trust in it.

neidu3@sh.itjust.works on 10 May 18:53 next collapse

A huge tape archive in a mountain. It’s pretty standard for geophysical data. I have some (encrypted) personal stuff on a few tapes there.

tuhriel@infosec.pub on 10 May 18:54 next collapse

I have an RPi 4 with an external HDD at my parents’ house. I connect via a WireGuard VPN, mount and decrypt the external HDD, and then trigger a restic backup to a restic REST server in append-only mode.

The whole thing is done via a python script

I chose the REST server because it allows “append only”, so the data can’t easily be deleted from my side of the VPN.
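The append-only arrangement looks roughly like this (addresses and repo names are hypothetical, not the commenter's actual setup):

```shell
# on the offsite pi: serve the repo so clients can add but not delete
rest-server --path /srv/restic --append-only

# on the NAS, over the wireguard tunnel
restic -r rest:http://10.8.0.2:8000/nas init      # once
restic -r rest:http://10.8.0.2:8000/nas backup /srv/data
```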

SirMaple__@lemmy.ca on 10 May 18:58 next collapse

I use Proxmox Backup Server (PBS) for all my backups. The datastore is on my file server at home. I sync the datastore daily to a little NAS at a family member’s house and to a super cheap storage VPS on the other side of the country. I also do a manual sync to an external drive that I keep offline at home.

Any super important documents such as tax records, health related files, backup of the data volume from vaultwarden, or anything related to wills & estates get backed up as well to 2 USB thumb drives that are LUKS encrypted. I keep 1 in my go bag and another is hidden somewhere… Thumb drives get updated once a month, or sooner if anything major changes.

dan@upvote.au on 10 May 19:00 next collapse

For storing the backups, I use a storage VPS. I got one from HostHatch a few years ago during Black Friday sales, with 10TB space for $10/month. Hetzner have good deals with their storage boxes, too - they offer 5TB space for $13/month if you’re in the USA (you need to add VAT if you’re in Europe).

A good rule of thumb is to never pay more than $5/TB/month, and during Black Friday it’s closer to $2/TB/month. The LowEndTalk forum has the best Black Friday deals.

I use Borgbackup for backups, and Borgmatic to handle scheduling them. Borgbackup is a fantastic piece of software.

Borg has an “append only” mode which lets you configure particular SSH keys to only be able to add data to the backup, not delete it. Even if someone or something (ransomware, a malicious user, etc.) gains access to your system and tries to delete the backups, they can’t. Essentially, this is protection against ransomware.

This is a very common issue with other backup solutions - the client has full access to the backup, so malware on the client system could potentially delete all the backups.
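For reference, the append-only restriction is enforced on the server side with a forced command in `authorized_keys`; a sketch with hypothetical paths and key:

```shell
# ~/.ssh/authorized_keys on the backup server
command="borg serve --append-only --restrict-to-path /srv/borg/client1",restrict ssh-ed25519 AAAA... client1-backup-key
```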

I have two backup copies of most things: one copy on my home server and one copy on my storage VPS. If you do multiple backups, the Borg developers recommend doing two separate backups rather than doing one and then rsyncing it to another server.

cron@feddit.org on 10 May 19:05 next collapse

Rclone to cloud storage (Hetzner in my case). Rclone is easy to configure and offers full encryption, even of the file names.

As the data is only uploaded once, a daily backup uploads only the added or changed files.

Just as a side note: make sure you can retrieve your data even in case your main system fails. Make sure you have all the passwords/crypto keys available.

Fermiverse@gehirneimer.de on 10 May 19:28 collapse

I do the same using rclone, partly encrypted, partly just a plain dump.

I use batch scripts in cron.daily to start it.

hendrik@palaver.p3x.de on 10 May 19:10 next collapse

Next to paying for cloud storage, I know people who store an external HDD at their parents' or with friends. I don't do the whole backup thing for all the recorded TV shows and ripped Blu-rays... If my house burns down, they're gone. But that makes the amount of data a bit more manageable, and I can replace those. I currently don't have a good strategy. My data is somewhat scattered between my laptop, the NAS, an external HDD which is in a different room but not off-site, one cheap virtual server I pay for, and critical things like the password manager, which are synced to the phone as well. The main thing I'm worried about is one of the mobile devices getting stolen, so I focus on having those backed up to the NAS or synced to Nextcloud. But I should work on a solid strategy in case something happens to the NAS.

I don't think the software is a big issue. We've got several good backup tools which can do incremental or full backups, schedules, encryption and whatever else someone might need.

ladfrombrad@lemdro.id on 10 May 20:02 next collapse

Yeah, me too; photos and videos I’ve recorded are the only things I’m bothered about. Backing up all my arrrrr booty off-site is redundant, since I’ve already shared it to a 2.1 ratio and can hopefully download it again from people with larger storage than my family member has.

It’s how I handle backing up those photos/videos, though. I bought them a 512GB card and shoved it in a GL.iNet AP they have down there, which I sync my DCIM folder to (the app was removed from the Play Store since it didn’t need updating, but Google’s stupid policies meant it went RIP…), and I also back that up to the old Synology NAS I handed down to them. I suppose I could use Syncthing, but I like that old app; if it ain’t broke, don’t fix it.

Along with them having Tailscale on a Pi 4 (on a UPS; it’s their/my backup TVHeadend server) and their little N100 media box, I don’t even bother them with my meager photo collection, and it works well.

tburkhol@lemmy.world on 10 May 20:54 collapse

It really depends on what your data is and how hard it would be to recreate. I keep a spare HD in a $40/year bank box & rotate it every 3 months. Most of the content is media - pictures, movies, music. Financial records would be annoying to recreate, but if there’s a big enough disaster to force me to go to the off-site backups, I think that’ll be the least of my troubles. Some data logging has a replica database on a VPS.

My upload speed is terrible, so I don’t want to put a media library in the cloud. If I did any important daily content creation, I’d probably keep that mirrored offsite with rsync, but I feel like the spirit of an offsite backup is offline and asynchronous, so things like ransomware don’t destroy your backups, too.

hendrik@palaver.p3x.de on 10 May 21:10 collapse

Sure. With data that might be skipped, I meant something like the Jellyfin server, which probably consists of pirated TV and music or movie rips. Those tend to be huge in size and easy to recreate. With personal content, pictures and videos there is no chance of getting it back. And I'd argue with a lot of documents and data it's not even worth the hassle to decide which might be stored somewhere else, maybe in paper form... Just back them up, storage is cheap and most people don't generate gigabytes worth of content each month. For large data that doesn't change a lot, something like one or two rotated external disks might do it. And for smaller documents and current projects which see a lot of changes, we have things like Nextcloud, Syncthing and a $80 a year VPS or other cloud storage solutions.

SreudianFlip@sh.itjust.works on 10 May 19:11 next collapse

Most of my work is with Macs, and even one server is running macOS, so for those who don’t know how it works ‘over there’, one runs Time Machine which is a versioning system keeping hourlies for a day, dailies for a week, then just weeklies after that. It accommodates using multiple disks, so I have a networked drive that services all the mac computers, and each computer also has a USB drive it connects to. Each drive usually services a couple of computers.

Backups happen automatically without interruption or drama.

I just rotate the USB drives out of the building into a storage unit once a month or so and bring the offsite drives back into circulation. Time Machine nags you about missing backup drives if it’s been too long, which is great.

It’s not perfect, but it’s very reliable, and I wish everyone had access to a similar system. It’s very easy; Apple got this one thing right.

piefood@feddit.online on 10 May 19:14 next collapse

I use LUKS and back up to a USB drive that I have at home. I rsync those backups to my work once a week. Not everyone can back up to their office, but as others have said, backing up to a friend's or family member's house is doable. The nice thing about rsync is that you can limit the bandwidth, so that even though it takes longer, it doesn't saturate their internet connection.
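The bandwidth cap is rsync’s `--bwlimit` option (in KiB/s by default); a hypothetical weekly crontab entry:

```shell
# Sunday 01:00, capped around 5 MB/s so the remote connection isn't saturated
0 1 * * 0 rsync -az --bwlimit=5120 /backups/ user@office-host:/srv/offsite/
```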

bandwidthcrisis@lemmy.world on 10 May 19:16 next collapse

I use rsync.net

It’s not the lowest price, but I like the flexibility of access.

For instance, I was able to run rclone on their servers to do a direct copy from OneDrive to rsync.net, 400GB without having to go through my connection.

I can mount backups with sshfs if I want to, including the daily zfs snapshots.

amorpheus@lemmy.world on 10 May 19:19 next collapse

External drives that I keep in my office at work. Also cloud storage.

henfredemars@infosec.pub on 10 May 20:42 next collapse

Same, I just throw it in a desk at work. It’s encrypted anyway.

harsh3466@lemmy.ml on 10 May 21:00 collapse

Same here. LUKS-encrypted drive in my work locker.

huquad@lemmy.ml on 10 May 19:50 next collapse

Syncthing to a pi at my parents place.

Malatesta@lemmy.world on 10 May 20:10 next collapse

Low-power server in a friend’s basement running Syncthing.

AtariDump@lemmy.world on 10 May 20:55 next collapse

But doesn’t that sync in real time, making it not a true backup?

huquad@lemmy.ml on 10 May 21:48 next collapse

Agreed. I have it configured with a delay and with multiple file versions. I also have another Pi running rsnapshot (an rsync-based tool).

AtariDump@lemmy.world on 11 May 01:03 collapse

How’d you do that?

rumba@lemmy.zip on 11 May 02:04 next collapse

Edit the share, enable file versioning, choose which flavor.

huquad@lemmy.ml on 11 May 03:37 collapse

For the delay, I just reduce how often it checks for new files instead of instantaneously.

thejml@lemm.ee on 11 May 03:14 next collapse

Have it sync the backup files from the -2- part. You can then copy them out of the Syncthing folder to a local one with a cron job that rotates them. That way you get the sync offsite, and you can keep copies out of the rotation as long as you want.
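The copy-and-rotate step can be a few lines of shell; the paths in the usage comments are hypothetical:

```shell
# rotate_keep DIR N: delete all but the N newest files in DIR (a sketch)
rotate_keep() {
  dir=$1
  keep=$2
  # list newest first, skip the first $keep entries, remove the rest
  ls -1t "$dir" | tail -n +$((keep + 1)) | while read -r old; do
    rm -f "$dir/$old"
  done
}

# hypothetical usage from a daily cron job:
#   cp /srv/syncthing/backup.tar "/srv/rotation/$(date +%F)-backup.tar"
#   rotate_keep /srv/rotation 3
```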

Appoxo@lemmy.dbzer0.com on 11 May 07:20 next collapse

In theory you could set up a cron job with Docker Compose to fire up a container, sync, and shut down once all endpoint jobs are synced.
As it seemingly has an API, it should be possible.

sugar_in_your_tea@sh.itjust.works on 11 May 15:08 collapse

You could use scheduled snapshots to provide the backup portion.

Matriks404@lemmy.world on 11 May 07:30 next collapse

Huh, that’s a pretty good idea. I already have a Raspberry Pi set up at home, and it wouldn’t be hard to duplicate in another location.

SorteKanin@feddit.dk on 11 May 07:30 next collapse

A pi with multiple terabytes of storage?

huquad@lemmy.ml on 11 May 11:16 collapse

My most critical data is only ~2-3TB, including backups of all my documents and family photos, so I have a 4TB SSD attached, which the Pi also boots from. I have ~40TB of other Linux ISOs that have 2-drive redundancy, but no backups. If I lose those, I can always redownload.

dave@lemmy.wtf on 11 May 14:29 collapse

Using a mesh VPN like Tailscale or Netbird would be another option. It would allow you to use proper backup software like restic or whatever, and with Tailscale on both devices, restic would be able to find the Pi even if the other person moved to a new house. (Although a Pi with Ethernet would be preferable, so all they have to do is plug it into their new network and everything would be good; if it was a Pi Zero, someone would have to update the WiFi password.)

huquad@lemmy.ml on 11 May 20:44 collapse

Funny you mention it. This is exactly what I do. I don’t use the relay servers for Syncthing, just my tailnet for device-to-device networking.

umbrella@lemmy.ml on 10 May 20:03 next collapse

You can rent a VPS and set up encrypted backups.

mhzawadi@lemmy.horwood.cloud on 10 May 20:47 next collapse

There are some really good options in this thread. Just remember, whatever you pick: unless you test your backups, they’re as good as nonexistent.

redbr64@lemmy.world on 11 May 00:38 next collapse

Is there some good automated way of doing that? What would it look like? Something that compares hashes?

mhzawadi@lemmy.horwood.cloud on 11 May 08:16 next collapse

That very much depends on your backup tool of choice; that’s also the point. How do you recover your backup?

Start with a manual recovery: pull down a backup and unpack it, and check that important files open. Write down all the steps you did, then work out how to automate them.

sugar_in_your_tea@sh.itjust.works on 11 May 15:10 collapse

I don’t trust automation for restoring from backup, so I keep the restoration process extremely simple:

  1. automate recreating services - have my podman files in a repository
  2. manually download and extract data to a standard location
  3. restart everything and verify that each service works properly

Do that once a year in a VM or something and you should be good. If things are simple enough, it shouldn’t take long (well under an hour).

Showroom7561@lemmy.ca on 11 May 06:04 collapse

How does one realistically test their backups, if they are doing the 3-2-1 backup plan?

I validate (or whatever the term used is) my backups, once a month, and trust that it means something 😰

Appoxo@lemmy.dbzer0.com on 11 May 07:21 next collapse

Deploy the backup (or some part of it) to a test system. If it can boot or you can get the files back, they work.

Showroom7561@lemmy.ca on 11 May 14:54 collapse

For context, I have a single Synology NAS, so recovering and testing the entire backup set would not be practical in my case.

I have been able to test single files or entire folders and they work fine, but obviously I’d have no way of testing the entire backup set due to the above consideration. It is my understanding that the verify feature that Synology uses is to ensure that there’s no bit rot and that the file integrity is intact. My hope is that because of how many isolated backups I do keep, the chance of not being able to recover is slim to none.

mhzawadi@lemmy.horwood.cloud on 11 May 08:13 collapse

Until you test a backup, it’s not complete; how you test it is up to you.

If you upload to a remote location, pull it down and unpack it. Check that you can open important files; if you can’t, then the backup is not worth the disk space.

Jakeroxs@sh.itjust.works on 11 May 16:07 collapse

Heh

irmadlad@lemmy.world on 10 May 20:52 next collapse

so if any questions here seem dumb

Not dumb. I say the same, but I have a severe inferiority complex and imposter syndrome. Most artists do.

1 local backup, 1 cloud backup, 1 offsite backup to my tiny house at the lake.

I use Syncthing.

ryannathans@aussie.zone on 11 May 00:16 collapse

+1 syncthing

ryannathans@aussie.zone on 11 May 15:04 collapse

Weird down votes?

harsh3466@lemmy.ml on 10 May 21:00 next collapse

Right now I sneakernet it. I stash a LUKS-encrypted drive in my locker at work and bring it home once a week or so to update the backup.

At some point I’m going to set up an RPi at a friend’s house, but that’s down the road a bit.

Getting6409@lemm.ee on 10 May 21:02 next collapse

My automated workflow is to package backup sources into tars (uncompressed), encrypt them with gpg, then ship the tar.gpg off to Backblaze B2 and S3 with rclone. I don’t trust cloud providers, so I use two just in case. I’ve never really needed full system backups going off-site, just the things I’d be severely hurting for if my home exploded.

But to your main questions, I like gpg because you have good options for encrypting things safely within bash/ash/sh scripting, and the encryption itself is considered strong.

And I really like rclone because it covers the main cloud providers and wrangles everything down to an rsync-like experience, which is also pretty tidy for shell scripting.
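A workflow like the one described might look roughly like the following cron-driven script. The paths, remote names, and the symmetric-passphrase variant are assumptions; the commenter may well use public-key gpg instead:

```shell
#!/bin/sh
# package, encrypt locally, then ship to two independent providers
tar -cf - /srv/important \
  | gpg --batch --symmetric --cipher-algo AES256 \
        --passphrase-file /root/.backup-pass \
        -o /tmp/backup.tar.gpg
rclone copy /tmp/backup.tar.gpg b2remote:my-bucket/backups/
rclone copy /tmp/backup.tar.gpg s3remote:my-bucket/backups/
```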

q7mJI7tk1@lemmy.world on 10 May 21:06 next collapse

I spend my days working on a MacBook, and have several old external USB drives duplicating my important files, live, off my server (Unraid) via Resilio to my MacBook (yes I know syncthing exists, but Resilio is easier). My off-site backups are to a Hetzner Storage Box using Duplicacy which is amazing and supports encrypted snapshots (a cheap GUI alternative to Borgbackup).

So for me, Resilio and Duplicacy.

sxan@midwest.social on 10 May 21:07 next collapse

I used to say restic and b2; lately, the b2 part has become more iffy, because of scuttlebutt, but for now it’s still my offsite and will remain so until and unless the situation resolves unfavorably.

Restic is the core. It supports multiple cloud providers, making configuration and use trivial. It encrypts before sending, so the destination never has access to unencrypted blobs. It does incremental backups, and supports FUSE vfs mounting of backups, making accessing historical versions of individual files extremely easy. It’s OSS, and a single binary executable; IMHO it’s at the top of its class, commercial or OSS.

B2 has been very good to me, and is a clear winner for this use case: writes and storage are pennies a month, and it only gets more expensive if you’re doing a lot of reads. The UI is straightforward and easy to use, and the API is good; if it weren’t for their recent legal and financial drama, I’d still unreservedly recommend them. As it is, you’d have to evaluate them yourself.

foobaz@lemmy.world on 11 May 12:39 collapse

I’m running the same setup, restic to B2. Offsite, I have a daily rclone job to pull (the diffs) from B2. Works perfectly; cost is under 1€ per month.

traches@sh.itjust.works on 10 May 21:20 next collapse

NAS at the parents’ house. Restic nightly job, with some plumbing scripts to automate it sensibly.

makingStuffForFun@lemmy.ml on 10 May 21:42 collapse

This is mine exactly, except mine sends to Backblaze B2.

palarith@aussie.zone on 10 May 22:36 next collapse

Rclone to Dropbox (it was the cheapest for 2TB at the time).

gamermanh@lemmy.dbzer0.com on 10 May 22:50 next collapse

Put brand new drive into system, begin clone

When clone is done, pull drive out and place in a cardboard box

Take that box to my off-site storage (neighbors house) and bury it

(In truth, I couldn’t afford to get to the 1 offsite in time and have potentially, tragically, lost almost 4TB of data that, while replaceable, will take time because I don’t fucking remember what I even had lol. Gonna take the drives to a specialist though, cuz I think the platters are fine and it’s the actual read mechanism that’s busted.)

treeofnik@discuss.online on 10 May 23:24 collapse

For this I use a Python script, run via cron, that outputs an HTML directory file listing all the folder contents and pushes it to my cloud storage. That way, if I ever have a critical failure of replaceable media, I can just refer to my latest directory file.
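The same index idea in plain shell (the commenter uses a Python script producing HTML; this is a simplified, hypothetical equivalent):

```shell
# make_index MEDIA_DIR INDEX_FILE: write a sorted list of every file path,
# so a lost-but-replaceable library can be rebuilt from the list
make_index() {
  find "$1" -type f | sort > "$2"
}

# hypothetical usage, e.g. from cron.daily, then pushed offsite:
#   make_index /srv/media /tmp/media-index.txt
#   rclone copy /tmp/media-index.txt remote:indexes/
```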

d00phy@lemmy.world on 10 May 23:18 next collapse

My dad and I each have a Synology NAS. We do a Hyper Backup from one to the other: I back up to his and vice versa. I also use Syncthing to back up my Plex media so he can mount it locally on his Plex server.

surewhynotlem@lemmy.world on 10 May 23:30 next collapse

IDrive has built-in local encryption you can enable.

Tiger@sh.itjust.works on 11 May 13:59 collapse

I’m also liking IDrive for cloud backup storage.

hperrin@lemmy.ca on 10 May 23:33 next collapse

I just rsync it once in a while to a home server running at my dad’s house. I want it done manually, in a “pull” direction rather than a “push”, in case I ever get hit with ransomware.
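In pull mode the job lives on the backup box, which holds credentials for the primary but not the other way round, so malware on the primary can’t touch the backup. A hypothetical invocation from the offsite server:

```shell
# run on the backup server: pull changes from the primary over ssh
rsync -az primary.example.net:/srv/nas/ /srv/backup/nas/
```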

doodledup@lemmy.world on 10 May 23:41 next collapse

I’m just skipping that. How am I going to back up 48TB off-site?!

ryannathans@aussie.zone on 11 May 00:15 next collapse

Get a tiny ITX box with a couple 20TB refurbished HDDs, stick it at a friend’s house

doodledup@lemmy.world on 11 May 01:21 collapse

In theory. But I already spent my pension on those 64TB of drives (RAID-Z2) xD. Getting off-site backup for all of that feels like such a waste of money (until you regret it). I know it isn’t a backup, but I’m praying the RAID-Z2 will be enough protection.

jimerson@lemmy.world on 11 May 01:59 next collapse

Understanding the risks is half the battle, but we can only do what we can do.

PeriodicallyPedantic@lemmy.ca on 11 May 06:09 next collapse

Do you have to back up everything off site?

Maybe there are just a few critical files you need a disaster recovery plan for, and the rest is just covered by your raidz

doodledup@lemmy.world on 11 May 20:04 collapse

I do back up like 1TB off-site. But it would still be a major blow if I lost the rest. I just try to live with that risk, which I’m fully aware exists.

Cyber@feddit.uk on 11 May 07:25 next collapse

Just a friendly reminder that RAID is not a backup…

Just consider: if something accidentally overwrites some or all of your files, that’s a perfectly legit action and the checksums will happily match the new data, but your file(s) are gone…

sugar_in_your_tea@sh.itjust.works on 11 May 15:06 next collapse

That’s a great use-case for snapshots! RAID still isn’t a backup, but it can be quite robust.

doodledup@lemmy.world on 11 May 18:12 collapse

I do weekly ZFS snapshots, though, and I’m very diligent with my SMART tests and scrubs. I also have a UPS and a lot of surge protection. And ECC RAM. It’s as safe as it gets. But having a backup would definitely be better, you’re right. I just can’t afford it for this much storage.

cwista@lemmy.world on 11 May 12:07 collapse

The cost of storage is always more than double the sticker price. The hidden fee is that you need a second, and maybe a third, drive and a system to put it all in. Most of our operational lab cost is backups; I can’t replace the data if it’s lost.

Appoxo@lemmy.dbzer0.com on 11 May 07:17 next collapse

Only back up the essentials like photos and documents or rare media.
Don’t care about stuff like Avengers 4K that can easily be reacquired.

dave@lemmy.wtf on 11 May 14:12 next collapse

A “poor man’s” backup (just an index of filenames) can be useful for things like this, movie/TV/music collections, and will only be a few MB instead of TB.

If things go south, at least you can rebuild your collection in time. Obviously, if there are some rare files that were hard to get, you can back those up separately, but even then it will probably still be a small backup.

sugar_in_your_tea@sh.itjust.works on 11 May 15:05 collapse

Yup. My “essential” backups are well under 1TB, the rest can be reacquired.

nfreak@lemmy.ml on 11 May 15:41 next collapse

This is what I’m currently doing. I use Backblaze B2 for basically everything that’s not movies/shows/music/ROMs, along with backing up my Docker stacks etc. to the same external drive my media’s currently on.

I’m looking at a few good steps to upgrade this but nothing excessive:

  • NAS for media and storing local backups
  • Regular backups of everything but media to a small USB drive
  • Get a big-ass external HDD that I’ll update once a month with everything, keep in my storage unit, and ditch Backblaze

Not the cleanest setup but it’ll do the job. The media backup is definitely gonna be more of a 2-1-Pray system LMAO but at least the important things will be regularly taken care of

doodledup@lemmy.world on 11 May 18:08 collapse

I don’t have Avengers 4K. It’s all just Linux ISOs.

Censed@lemmy.zip on 11 May 16:34 collapse

You ought to only be 3-2-1-ing your irreplaceable/essential files like personal photos, videos, and documents. Unless you’re a huge photography guy, I can’t believe that takes up 48TB.

dataprolet@lemmy.dbzer0.com on 10 May 23:44 next collapse

Hetzner Storagebox

Carol2852@discuss.tchncs.de on 11 May 09:37 collapse

Just recently moved from an S3 cloud provider to a Storage Box. Prices are OK, and sub-accounts help keep things organized.

ryannathans@aussie.zone on 11 May 00:14 next collapse

I use Syncthing to push data off-site, encrypted and with staggered versioning, to a tiny ITX box I run at a family member’s house.

rumba@lemmy.zip on 11 May 02:00 collapse

The best part about Syncthing is that you can set the target as untrusted. The data all gets encrypted and is not accessible whatsoever on the other side.

Cyber@feddit.uk on 11 May 07:18 collapse

This is exactly what I’m about to do (later this week when I visit their house)

I’ve been using syncthing for years, but any tips for the encryption?

I was going to use SendOnly at my end to ensure that the data at the other end is an exact mirror, but in that case, how would the restore work if it’s all encrypted?

rumba@lemmy.zip on 11 May 23:24 collapse

www.youtube.com/watch?v=hT373XZHNvk

You add another node to the untrusted node (or the reinstalled current node) and let it sync back.

Alternatively, there’s a decrypt command: you could go to the untrusted node, copy the data off to a disk, and decrypt it with the tool, but I think that f’s up the structure.

Cyber@feddit.uk on 12 May 06:18 collapse

Ah, ok if Tom Lawrence has made a video then I know it’ll work!

Thanks

kebab@endlesstalk.org on 11 May 00:30 next collapse

I rent a cheap 4.5 TB VPS for 60€/year. Encryption and synchronization are handled by Proxmox, as I have Proxmox Backup Server installed on it. I’m planning to change this and just buy a mini PC with a cheap HDD that I’ll place at my family’s place to save money

Jimmycakes@lemmy.world on 11 May 01:28 next collapse

I use Asustor NAS boxes, one at my house in the southeast US, one at my sister’s house in the northeast US. The Asustor OS takes care of the backup every night. It’s not cheap, but that’s what it takes if you want it done right.

Both run 4 drives in RAID 5. Pictures back up to the HDDs and to a RAID 1 set of NVMe drives in the NAS. The rest is just movies and TV shows for Plex, so I don’t really care about those. The pictures are the main thing. I feel like that’s as safe as I can be.

Appoxo@lemmy.dbzer0.com on 11 May 07:09 next collapse

Veeam Backup & Replication with an NFR license for me.
My personal setup:
First backup: just a backup to a virtual drive stored on my NAS.
Offsite backup: essentially an export of what is available, which then creates a full or incremental backup to an external USB drive.
I have two of those. One I keep at home in case my NAS explodes; the second is at my workplace.
The off-site one only contains my most important pieces of data.
As for frequency: as often as I remember to make one, as it requires manual interaction.

Our clients have (depending on their size) the following setups:
2 or more endpoints (excluding exceptions):
Veeam B&R server
First backup to a NAS
Second backup (copy of the first) to USB drives (minimum of 3: one connected, one stored somewhere in the business, one at home/off-site; daily rotation)
Optionally an S3-compatible cloud backup.

Bigger customers may have mirroring, but we see those cases very rarely.

Edit: The backups can be encrypted at all steps (first backup or backup copies).
Edit 2: Veeam B&R is not (F)OSS, but the free Community Edition is very reasonable. It has support for Windows, macOS, and Linux (some distros, x64/x86 only). The NFR license can be acquired relatively easily (from here), and they didn’t check me in any way.
I like the software as it’s very powerful and versatile, geared towards both Fortune 500 companies and small shops/deployments.
And the next version will see a full Linux version, both as a single install and as a virtual appliance.
They also have a setup for hardened repositories.

Matriks404@lemmy.world on 11 May 07:29 next collapse

I don’t 🙃

InFerNo@lemmy.ml on 11 May 09:59 next collapse

I bring one of my backup disks to my in-laws. I go there regularly, so it’s a matter of swapping them when I’m there.

pinguin@fault.su on 11 May 10:18 next collapse

I tend to just store all my backups off-site in multiple geographically distant locations, seems to work well

sugar_in_your_tea@sh.itjust.works on 11 May 15:04 collapse

So, you’re geocaching a bunch of USB drives then?

LifeInMultipleChoice@lemmy.world on 11 May 23:40 collapse

I’ve got a box of 800 flash drives I’ve picked up while hiking in the garage, I was wondering whose they were.

/s, I can’t afford a garage

7rokhym@lemmy.ca on 11 May 10:51 next collapse

I use Linux, so encryption is easy with LUKS, plus FreeFileSync to drives that rotate to a safety deposit box at the bank for catastrophic events, such as a house fire. Usually anything from the last few months is still on my mobile devices anyway.
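The LUKS side of a rotating drive like that is only a few commands. A sketch, assuming a btrfs or ext4 filesystem inside the container (the device path and mapper name are hypothetical, and `luksFormat` destroys whatever is on the device, so double-check it):

```shell
DEV=/dev/disk/by-id/usb-rotating-backup   # hypothetical device path
MAPPED=offsite-backup                     # hypothetical mapper name

# One-time setup (DESTROYS existing data on $DEV):
#   cryptsetup luksFormat "$DEV"
#   cryptsetup open "$DEV" "$MAPPED" && mkfs.btrfs "/dev/mapper/$MAPPED"

# Each rotation: unlock, mount, sync, lock.
# Guard: only proceed if the device actually exists.
if [ -b "$DEV" ]; then
  cryptsetup open "$DEV" "$MAPPED"
  mount "/dev/mapper/$MAPPED" /mnt/offsite
  # ... run FreeFileSync or rsync against /mnt/offsite here ...
  umount /mnt/offsite
  cryptsetup close "$MAPPED"
fi
```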

pHr34kY@lemmy.world on 11 May 11:21 next collapse

I have a job, and the office is 35km away. I get a locker in my office.

I have two backup drives, and every month or so, I will rotate them by taking one into the office and bringing the other home. I do this immediately after running a backup.

The drives are LUKS encrypted btrfs. Btrfs allows snapshots and compression. LUKS enables me to securely password protect the drive. My backup job is just a btrfs snapshot followed by an rsync command.
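Roughly, that kind of job looks like this (subvolume layout and mount points are made up for the sketch):

```shell
SRC=/home                         # hypothetical source directory
DST=/mnt/backup                   # unlocked LUKS+btrfs backup drive
SNAP="backup-$(date +%Y-%m-%d)"   # snapshot name, e.g. backup-2025-05-11

if command -v btrfs >/dev/null 2>&1 && [ -d "$DST" ]; then
  # Freeze the previous state as a read-only snapshot on the backup drive
  btrfs subvolume snapshot -r "$DST/current" "$DST/snapshots/$SNAP"

  # Then refresh the live copy; -aHAX preserves hardlinks, ACLs, and xattrs
  rsync -aHAX --delete "$SRC/" "$DST/current/"
fi
```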

I don’t trust cloud backups. There was an event at work where Google Cloud accidentally deleted an entire company just as I was about to start a project there.

iknowitwheniseeit@lemmynsfw.com on 11 May 11:56 next collapse

I just use restic.

I’m pretty sure it uses checksums to verify data on the backup target, so it doesn’t need to copy all of the data there.
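Right, restic content-addresses everything, so repeat runs only upload new or changed chunks. A minimal sketch against a local repo path (paths and secret are hypothetical; a remote like SFTP or S3 just changes the repository URL):

```shell
export RESTIC_REPOSITORY=/mnt/offsite/restic-repo   # hypothetical repo path
export RESTIC_PASSWORD='use-a-real-secret'          # better: RESTIC_PASSWORD_FILE

if command -v restic >/dev/null 2>&1; then
  restic init               # once: creates the encrypted repository
  restic backup /srv/data   # incremental + deduplicated after the first run
  restic check              # verifies repository integrity (the checksums)
  restic forget --keep-weekly 4 --keep-monthly 6 --prune   # retention
fi
```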

lightnsfw@reddthat.com on 11 May 14:16 next collapse

It’s not all my data, but I use Backblaze for offsite backup. It’s one of the reasons I can’t drop Windows. I don’t have anywhere I travel often enough to do a physical drop-off, and when I tried setting up a file server at my parents’ place, they would break shit by fucking with their router every time they had an internet outage, or by moving it around (despite repeatedly being told to call me first).

kalpol@lemm.ee on 11 May 15:29 next collapse

Same - can sync snapshots from TrueNAS to Backblaze.

If you want to get real fancy, you could stash an N40L cube server at your mom’s house where she will never find it, VPN back to your local network, and replicate snapshots to it

Scrollone@feddit.it on 11 May 15:40 collapse

I can’t stand the fact that they don’t support Linux

sudneo@lemm.ee on 11 May 18:53 collapse

Object storage is something I prefer over their app anyway. Restic (or rustic) does the backup client-side; B2 or any other storage just holds the data. This way you also have no vendor lock-in.

liliumstar@lemmy.dbzer0.com on 11 May 14:36 next collapse

I have a storage VPS and use Borg backup with Borgmatic. In my case, I have multiple systems in different repos on the remote. There are several providers, such as hetzner, borgbase, and rsync.net that offer borg storage, in the event you don’t want to manage the server yourself.
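The raw borg commands behind a setup like that look roughly as follows (host, user, and repo path are placeholders; borgmatic wraps the same steps in a YAML config plus a schedule):

```shell
# Hypothetical remote repository over SSH
REPO='ssh://u12345@storage.example.net/./backups/laptop'

if command -v borg >/dev/null 2>&1; then
  borg init --encryption=repokey-blake2 "$REPO"   # once per repo

  # One archive per run; {hostname}-{now} is expanded by borg itself
  borg create --stats --compression zstd "$REPO::{hostname}-{now}" /home /etc

  # Retention, similar in spirit to restic's forget/prune
  borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 "$REPO"
fi
```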

merthyr1831@lemmy.ml on 11 May 16:43 next collapse

Rsync to a Hetzner storage box. I don’t do ALL my data, just the Nextcloud data. The rest is… Linux ISOs… so I can redownload at my convenience.
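For reference, Hetzner storage boxes accept rsync over SSH (port 23 for that service), so the whole job can be one command. A sketch with a placeholder account name:

```shell
# Set BOX to your storage box account to enable this sketch,
# e.g. 'u123456@u123456.your-storagebox.de' (hypothetical account).
BOX=''

if [ -n "$BOX" ]; then
  # -az: archive mode + compression; --delete mirrors deletions to the remote
  rsync -az --delete -e 'ssh -p 23' /srv/nextcloud-data/ "$BOX:nextcloud-data/"
fi
```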

corsicanguppy@lemmy.ca on 11 May 18:31 next collapse

  • wireguard
  • rsync
  • zfs
frozencow@lemmy.world on 11 May 20:01 next collapse

I had also been contemplating this for a while. The solution I implemented recently is:

The system itself is an RPi running NixOS. The system can be reproduced from the NixOS configuration, which is stored on GitHub. Since I can reproduce the SD card image (and the full system) from the configuration, I opted not to do any backup of the SD card/system itself.

I’ve also opted not to use RAID, as I can replace/add an RPi without too much hassle.

The real backups for me are the photos. Those are stored on M.2 storage. A second (similar) RPi is placed at my dad’s place. The RPis run Tailscale and Syncthing. Syncthing syncs using staggered versioning (it keeps one version for the last day/week/year), and the RPi at my dad’s is marked untrusted, so the backup files are sent and stored there encrypted.

This setup hasn’t run very long yet, so I won’t recommend it, but it seems to check quite a lot of boxes for me. Maybe it gives some ideas. I’m also interested what alternative solutions others came up with.

SpatchyIsOnline@lemmy.world on 11 May 20:11 next collapse

I built a near identical server for my parents and just sync my nextcloud folder to theirs using syncthing

glizzyguzzler@lemmy.blahaj.zone on 11 May 21:56 next collapse

I got my parents to get a NAS box and stuck it in their basement. They need to back up their stuff anyway. I put in two 18 TB drives (mirrored BTRFS RAID 1) from Server Part Deals (people have said that site has jacked up their prices, look for alternatives). They only need like 4 TB at most. I made a backup Samba share for myself. It’s the cheapest Synology box possible; their software makes a Samba share with a quota.

I then set up a WireGuard connection on an RPi, taped that to the NAS, and WireGuard into the local network with a batch script. I mount the Samba share and then use restic to back up my data. It works great. Restic is encrypted, I don’t have to pay for storage monthly, their electricity is cheap af, they have backups, I keep tabs on it, everyone wins.
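The backup cycle in that arrangement is roughly the following (interface name, share address, and repo path are all made up; needs root for the tunnel and mount):

```shell
WG_IFACE=parents-nas            # hypothetical wg-quick config name
SHARE=//192.168.178.50/backup   # hypothetical Samba share behind the tunnel
MNT=/mnt/parents

# Guard: only run if the WireGuard config actually exists on this machine
if command -v wg-quick >/dev/null 2>&1 && [ -f "/etc/wireguard/$WG_IFACE.conf" ]; then
  wg-quick up "$WG_IFACE"
  mount -t cifs "$SHARE" "$MNT" -o credentials=/root/.smb-parents
  restic -r "$MNT/restic-repo" backup /srv/data
  umount "$MNT"
  wg-quick down "$WG_IFACE"
fi
```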

Next step is to go the opposite way for them, but no rush on that goal, I don’t think their basement would get totaled in a fire and I don’t think their house (other than the basement) would get totaled in a flood.

If you don’t have a friend or relative to do a box-at-their-house (peeps might be enticed with reciprocal backups), restic still fits the bill. Destination is encrypted, has simple commands to check data for validity.

Rclone crypt is not good enough. Too many issues (path length limits, the password only “obscured” rather than properly protected, file structure preserved even if names are encrypted). On a VPS I use rclone as a pass-through for restic to back up a small amount of data to a Google Drive. Works great. Just don’t use rclone crypt for major stuff.

Lastly, I do use rclone crypt to upload a copy of the restic binary to the destination: the crypt means the binary can’t be fucked with, and having the binary there means it’s all you need to recover the data (in addition to the restic password you stored safely!).

Ulrich@feddit.org on 11 May 21:57 next collapse

I assume daily would be a bit much considering it’s gonna be a few TB worth of HDDs which aren’t exactly fast

What is the concern here?

dirtycrow@programming.dev on 11 May 22:02 next collapse

Amazon AWS Glacier

Edit: I was downvoted for this, but it’s genuinely a more affordable alternative to Backblaze, whose finances are questionable.

thecoffeehobbit@sopuli.xyz on 11 May 23:12 next collapse

I have an external storage unit a couple kilometers away and two 8TB hard drives with luks+btrfs. One of them is always in the box and after taking backups, when I feel like it, I detach the drive and bike to the box to switch. I’m currently researching btrbk for updating the backup drive on my pc automatically, it’s pretty manual atm. For most scenarios the automatic btrfs snapshots on my main disks are going to be enough anyway.

TrumpetX@programming.dev on 12 May 01:02 next collapse

Look into Storj and Tardigrade. It’s a crypto thing, but don’t get scared: you back up to S3-compatible endpoints and it’s super cheap (and you pay with a regular USD credit card).

Bassman1805@lemmy.world on 12 May 01:19 next collapse

The easiest offsite backup would be any cloud platform. Downside is that you aren’t gonna own your own data like if you deployed your own system.

Next option is an external SSD that you leave at your work desk and take home once a week or so to update.

The most robust solution would be to find a friend or relative willing to let you set up a server in their house. Might need to cover part of their electric bill if your machine is hungry.

randombullet@programming.dev on 12 May 06:42 next collapse

My friend has 1G/1G internet. I have an rsync cron job backing up there twice a week.
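A schedule like that is just one crontab line. A sketch (the remote host “friendbox” and all paths are hypothetical):

```shell
# Twice a week (Mon + Thu) at 03:30, mirror /srv/data to the friend's box.
CRON_LINE='30 3 * * 1,4 rsync -az --delete -e ssh /srv/data/ friendbox:/tank/backup/ >> /var/log/offsite-rsync.log 2>&1'
echo "$CRON_LINE"
# To install it:  (crontab -l; echo "$CRON_LINE") | crontab -
```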

It has an 8TB NVMe drive that I use for bulk data backup and a 2TB OS drive for VM stuff.

toe@lemmy.world on 12 May 08:03 next collapse

LTO-8 tapes in a box elsewhere.

The price per terabyte became viable when a drive was on sale for half off at a local retailer.

Works well and it was a fun learning experience.

AustralianSimon@lemmy.world on 12 May 08:25 next collapse

I have two large (8-bay) Synology NAS units. They back up certain data between each other, replicate internally, and push to Backblaze. $6/mo.

dieTasse@feddit.org on 14 May 16:02 next collapse

If you are gonna go for TrueNAS, try Storj with TrueNAS Cloud task. TrueNAS made a partnership with Storj and the price is very good. www.truenas.com/truecloud-backup/

TL;DR: The data is encrypted with restic and sent to Storj S3 storage, which further fragments it (and encrypts it too, so double encryption) into multiple pieces (with redundancy) and stores them on other people’s TrueNASes (you can also provide your own unused space, btw, and earn some small money back).

I am in the process of setting this up (I already ran a working test backup) and I didn’t find anything better than this integrated solution. Very cool!

fmstrat@lemmy.nowsci.com on 14 May 22:55 collapse

If you use ZFS this becomes easy, because you can do incremental backups at the block level.

I have my home lab server and do snapshots and sends to a server at my father’s house. Then I also have an external drive that I snapshot to as well.
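The block-level incremental part is `zfs send -i` between two snapshots. A sketch (pool/dataset names, the snapshot date, and the remote host are all hypothetical):

```shell
DATASET=tank/data
PREV=tank/data@2025-05-01              # snapshot already present on both sides
NOW="tank/data@$(date +%Y-%m-%d)"      # today's snapshot

if command -v zfs >/dev/null 2>&1; then
  zfs snapshot "$NOW"
  # Stream only the blocks changed since $PREV to the offsite pool
  zfs send -i "$PREV" "$NOW" | ssh dad-server zfs receive -F backup/data
fi
```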