Advice wanted: Making reliable private cloud backups with Kopia.
from neon_nova@lemmy.dbzer0.com to selfhosted@lemmy.world on 17 Apr 05:18
https://lemmy.dbzer0.com/post/42390019
I’ve been using Kopia to back up my Windows work machine, my Linux personal computer, and my wife’s MacBook.
Right now it just backs up to my NAS, but I would also like it to back up to a cloud solution.
I figured I would get some S3 storage somewhere and point Kopia at that. I do not need a lot of space — I think 500 GB would be enough — and I do not want costs to be too high.
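For context, pointing Kopia at an S3 bucket is just a couple of commands. A rough sketch — the bucket name and keys below are placeholders, not real values:

```shell
# Create a Kopia repository in an S3 bucket (run once, from any machine).
# my-backup-bucket and both keys are example placeholders.
kopia repository create s3 \
  --bucket=my-backup-bucket \
  --access-key=AKIAEXAMPLE \
  --secret-access-key=SECRETEXAMPLE

# Each additional machine then joins the same repository with:
kopia repository connect s3 \
  --bucket=my-backup-bucket \
  --access-key=AKIAEXAMPLE \
  --secret-access-key=SECRETEXAMPLE
```

Non-AWS S3 providers generally also need an `--endpoint` flag pointing at their API host.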
Do I have the right plan, or is there a better option?
Thanks in advance.
B2
Options if it’s to protect against local disasters such as fire:

1. A second NAS (or other small box) at a relative’s house that you sync to over the internet.
2. A USB drive you rotate offsite whenever you visit someone.
Personally I would probably go for option two and bring the USB drive with me for a weekly coffee with my parents; they’d enjoy the visit and I’d enjoy knowing that my backup isn’t in the hands of Amazon. I’d go for option 1 if my internet was better.
For option 1, the NAS could even be an old router flashed with OpenWRT or a cheap $80 mini PC that has a portable or internal 2.5” disk attached.
Seems like a good way to do it.
Keep in mind Kopia has some weirdness when it comes to transferring repos between filesystem and S3, so you’d probably want to only keep one repo.
kopia.discourse.group/t/…/3560
Backblaze B2 is a cheap S3 provider. A Hetzner Storage Box is even cheaper, but it doesn’t support S3 natively, so you’re likely to run into the Kopia repo compatibility issues I mentioned.
So, I would need three different S3 buckets? One for each system?
If that’s the case, I might go the Raspberry-Pi-in-a-relative’s-house route.
Ah, no, Kopia uses a shared bucket.
I am looking for a solution for a ~1TB collection, and the Glacier Deep Archive storage tier is barely above $1/mo for the lot. You may want to look into it! If I remember correctly, retrieval (if you one day need to get your data back) was around $20 to get the data in a few hours, or $2 to get it in a couple of days.
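The arithmetic checks out, assuming the commonly cited us-east-1 Deep Archive price of roughly $0.00099 per GB-month (verify against current AWS pricing before committing):

```python
# Rough monthly storage cost of 1 TB in S3 Glacier Deep Archive.
# The per-GB price below is an assumption based on published us-east-1
# pricing; AWS rates vary by region and change over time.
size_gb = 1024
price_per_gb_month = 0.00099

monthly_cost = size_gb * price_per_gb_month
print(f"~${monthly_cost:.2f}/month")  # → ~$1.01/month
```

Note this ignores request fees and the retrieval costs mentioned above, which is where Deep Archive gets expensive if you ever need the data back quickly.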
That is interesting. I was not aware of that.
I’m using Kopia with AWS S3 for about 400 GB and it runs a bit less than $4/mo. If you set up a .storageconfig file, it lets you assign a storage class based on file names. Kopia conveniently makes the less frequently accessed files begin with “p”, so you can put those on the “infrequent access” tier while files that are accessed more often stay in standard storage:
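Something along these lines — a sketch based on Kopia’s S3 storage-class documentation, so double-check the exact keys against the current docs before relying on it:

```json
{
  "blobOptions": [
    { "prefix": "p", "storageClass": "STANDARD_IA" },
    { "storageClass": "STANDARD" }
  ]
}
```

The file goes in the root of the bucket; entries are matched top to bottom, so the prefix rule catches the “p” pack blobs and the last entry is the catch-all default.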