What is your backup tool of choice?

dustyData@lemmy.world to Linux@lemmy.ml – 124 points –

I don't mean system files, but your personal and work files. I have been using Mint for a few years; I use Timeshift for system backups, but archived my personal files by hand. This got me curious to see what other people use. When you daily drive Linux, what are your preferred tools to keep backups? I have thousands of pictures, family movies, documents, personal PDFs, etc. that I don't want to lose. Some are backed up to the cloud, but rather haphazardly. I would like a more systematic approach and a tool that is user friendly and easy to set up and schedule.


Timeshift is nice to make things easy. I simply use good old-fashioned rsync tied to a cron job.

This is the way. A few test runs with non-critical files are always highly suggested to make sure you've got your syntax right.
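
For anyone new to this, a minimal sketch (paths and schedule are just examples): do a dry run first with -n, then wire the real command into cron.

    # dry run (-n) against non-critical files to verify the syntax
    rsync -avn --delete ~/Documents/ /mnt/backup/Documents/
    # the real run, e.g. nightly at 02:30 via crontab -e:
    # 30 2 * * * rsync -a --delete /home/me/Documents/ /mnt/backup/Documents/

Note the trailing slashes: with them, rsync copies the directory's contents rather than the directory itself.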

So, just today actually, I wiped Ubuntu and installed Pop!_OS with btrfs, basically using this walkthrough, and set up Timeshift to manage snapshots.

https://mutschler.dev/linux/pop-os-btrfs-22-04/

But that's not really a backup.

I have a backup box I use for files with rsync and the like. Need to figure out a full backup method to my backup location though.

Might just set up an Ansible deployment and call it a day.

I have to say that I used to be a Timeshift fan, but I've started moving to snapper instead. Both are very similar, but with snapper you can have multiple configs, one per subvolume, each with different settings. I like having separate root and home schedules set up. Means I can restore one or the other independently. Works a treat.

Nice. I'll check it out for sure. That post I followed also had a link to the author's scripts to run a btrfs snapshot before apt runs.

Frankly I just moved some configs over before I did the wipe. My Linux desktops aren’t too customized.

I had to work around his how-to a bit, since I use NVMe and a pre-partitioned disk that I had to pre-format for LVM myself (he used a default install run to pre-format the disks).

Borg Backup (specifically using the Vorta front end).

Syncthing. I don't want to invest in a NAS and add more load to my already greedy power bill, so I chose something decentralized. Syncthing really just works like torrents, but for your personal files: whatever happens on the computer also happens on the phone and on the laptop. Each device has about 1 TB of space and 3x redundancy? Hell yea buddy, dig in.

I just found out about Syncthing yesterday and it really is superb; it's so easy to use, even cross-platform. Unison is another syncing tool that I like; I find it better for bidirectional syncing.

But that's not really backup, is it? It just synchronizes folders.

Yes, but it can be an automated backup solution if you want it to be. I just put important stuff in the Syncthing folder and rest assured it's also on the phone in case the computer's SSD catches fire.

I think you are confusing synchronizing with backup. If you delete a file in your Syncthing folder and the deletion gets synchronized, that file is lost. If you do the same in a folder backed up by, say, Borg, you can roll back the deletion and restore the file.

I may be wrong about Syncthing, though. I haven't used it yet, but will probably use it in the future. Just not for backup :)

This is true if you leave it at the defaults, but I make use of file versioning. When you flick that one on, files that would otherwise be replaced or deleted actually move to an offline .stversions folder. That is very vital, I must say, in case a host catches some encryptor malware eheh

I didn't know that was a possibility. Still, it seems like that's not really what Syncthing is intended for. I mean, they even state it in their FAQ:

No. Syncthing is not a great backup application because all changes to your files (modifications, deletions, etc.) will be propagated to all your devices. You can enable versioning, but we encourage you to use other tools to keep your data safe from your (or our) mistakes.

Rsync

I use rsync personally, but for low-tech family, and especially for cross-platform backups to network locations, Carbon Copy Cloner is a nice interface and runs a series of rsyncs under the hood.

Kopia repo on a separate disk dedicated to backups. I have Kopia on my servers as well, sending to my local S3 gateway, with a second copy to Wasabi.

Wholly off topic.

I feel like you should know about this if you don't already.

Not trying to out myself, but I may be one of the few people that actually owned that shirt lol

YOU CANNOT DOXX WHAT YOU CANNOT SEE.

I may have one on now.

I have been using Borg for years. So far, the tool has not let me down. I store the backups on external hard drives that are only used for backups. In addition, I save really important data at rsync.net and at Hetzner in a storage box. That's not a problem, because Borg automatically encrypts locally, and for decryption in my case you need a password and a key file.
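
A rough sketch of the workflow (repo path and prune policy are just examples):

    # create an encrypted repo once
    borg init --encryption=repokey /mnt/backup/borg-repo
    # back up with compression; the archive name uses borg's placeholders
    borg create --stats --compression lz4 \
        /mnt/backup/borg-repo::'{hostname}-{now:%Y-%m-%d}' ~/Documents ~/Pictures
    # thin out old archives
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/backup/borg-repo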

Generally speaking, you should always test whether you can restore data from a backup, no matter which tool you use. Only then do you have a real backup. And an up-to-date backup should always additionally be stored off-site (cloud, at a friend's or relative's house, etc.), because if the house burns down, the external hard drive with the backups next to the computer is not much use.

By the way, I would advise against using just rsync because, as the name suggests, rsync only synchronizes, so you don't have multiple versions of a file. Which can be useful if you only notice later that a file got corrupted at some point.
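
That said, rsync can approximate versioning with its --backup options; a minimal sketch (the dated directory layout is just one way to do it):

    # replaced/deleted files are moved into a dated dir instead of being discarded
    rsync -a --delete --backup --backup-dir=/mnt/backup/old/$(date +%F) \
        /home/me/ /mnt/backup/current/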

Restic and borg are the best I’ve tried for remote, encrypted backups.

I personally use Restic for my remote backups and rsync for my local.

Restic beats out borg for me because there are a lot more compatible storage options.

Switched to Restic because then I don’t need any extra software on the server (Synology NAS in my case).

+1 rsync, to an external hard drive. Super fast. Useful also in case I need a backup of a single file that I changed or deleted by mistake. Work files are also backed up to the cloud on mega.nz, which is very useful also for cross-computer sync. But I don't trust personal files to the cloud.

Don't forget that a local backup is as bad as no backup at all in the case of a fire or other disaster. Not trusting the cloud is fine (though strong encryption can make this very safe), but looking into some kind of off site backup is important. Could be as simple as a second hard drive that you swap out weekly stored in a safe deposit, or a nas at a trusted friends house.

Completely agree! I didn't mention this, but I keep the back-up hard drive in another apartment.

This reminds me of a story that happened at some university in England: they had two backups of some server in two different locations. One day one backup drive failed, and the second failed the day after. Apparently they were the same brand and model. The moral: also use different backup hardware brands or media!

3-2-1:

  • 3 different backups
  • 2 different mediums
  • 1 off-site

Haven't seen that not be a good move yet.

I almost never see rdiff-backup in these threads, so I am bringing it up now. I really like how it works: it provides incremental backups while keeping folder structures and files directly accessible. Works well enough for me.

I love rdiff-backup.

I use it to back up a 30 TB array, and it completes in like 20 minutes if there are no changes.

There's dozens of us! I started using it while I wrote my thesis, running a backup like every hour while writing.

Absolutely - rdiff-backup onto a local mirror set of disks. As you say, the big advantage is that the last "current" entry in the backup is available just by browsing, but I have a full history just a command away. Backups are no use if you can't access them, and people really under-rate ease of access when evaluating their backup strategy.

It also works over ssh. :)
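
For the curious, it looks roughly like this (classic syntax; paths are placeholders):

    # local incremental backup; the latest version stays browsable as plain files
    rdiff-backup /home/me /mnt/backup/home
    # the same thing over ssh
    rdiff-backup /home/me user@backuphost::/backups/home
    # restore a file as it was 7 days ago
    rdiff-backup --restore-as-of 7D /mnt/backup/home/thesis.tex /tmp/thesis.tex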

I used to be mostly restic, but I've since moved over to Kopia. Having the central server on the NAS and shipping those files to B2 is easy enough for my level of laziness.

At work/for business, you can't beat Veeam. It's the gold standard and there is literally nothing better.

At home, Duplicity. Set it up once and then just let it go, and it supports a million different backup targets you can ship your backups off to, including the local filesystem. Has auto-aging/removal rules, easy restores, incrementals, etc. Encrypts by default too.
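
A minimal sketch of that kind of setup (target URL and retention are just examples):

    # incremental, encrypted backup to a remote over sftp
    duplicity /home/me sftp://user@nas//backups/home
    # age out old backup chains
    duplicity remove-older-than 6M --force sftp://user@nas//backups/home
    # restore (URL as the source means restore)
    duplicity sftp://user@nas//backups/home /tmp/restored-home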

My local backups are handled by rdiff-backup to a mirror set of disks. That means my data is versioned but easily accessible for immediate restore, and now on three disks (my SSD, and two rotating rust drives). It also makes restores as simple as copying a file if I want the latest version, or an easy command if I want an older version. And testing backups is as easy as a diff command to compare the backup version with the live version.

Having your files just be files in your backup solution is very handy. At work I don't mind having to use an application like Veeam, because I'm being paid to do that. At home I want to see my backups quickly and easily, because I'd rather be working on my files than wrestling with backup software...

Remote backups are handled by SpiderOak, who have been fine for me for almost a decade. I also use them to synchronise my desktop and laptop computer. On my desktop SpiderOak also backs up some files in an archive area on the rotating rust mirror set - stuff that's large and I don't access often, so don't need to put on my laptop but do want backed up.

I also have a USB thumbdrive that's encrypted and used when I'm travelling to back up changes on my laptop via a simple rsync copy - just in case I have limited internet access and SpiderOak can't do its thing...

I did also have a NAS in the mix once, but I realised that it was a waste of energy - both mine and electricity. In normal circumstances my data is in 5 places (desktop SSD, laptop SSD, the two disks of the desktop mirror set, SpiderOak's storage) and in the very worst case it's in two locations (laptop SSD, USB thumbdrive). Rdiff-backup to the NAS was simply overkill once I'd added the local mirror set into my desktop, so I retired it.

I'd added the local mirror set because I was working with large files - data sets and VM images - and backups over the network to the NAS were taking an age. A local set of cheap disks in my desktop tower was faster and yet still fairly cheap.

Here's my advice for your consideration:

  • Simple is better than complicated.
  • How you restore is more important than how you back up; perform test restores regularly.
  • Performance matters; backups that take ages are backups you won't run.
  • Look to meet the 3-2-1 criteria; 3 copies, on 2 different storage systems, with at least 1 in a different geographic location. Cloud storage helps with this.

Good luck with your backup strategy!

⬆️ for rdiff-backup since it keeps the last backup easily readable.

I used to have (and I think I'll implement it again) a snapshot-capable filesystem that I rsynced my stuff to, then once a day took a snapshot of the backups. It has the advantage that all the backups are easily readable, as long as your backup filesystem is intact and your kernel can mount it.
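
Assuming btrfs as the snapshot-capable filesystem, a rough sketch (paths are made up; the target must be a subvolume):

    # rsync into a btrfs subvolume, then freeze it with a read-only snapshot
    rsync -a --delete /home/ /mnt/backup/current/
    btrfs subvolume snapshot -r /mnt/backup/current /mnt/backup/snaps/$(date +%F)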

Timeshift and a USB drive.

I use this, and then every 2 weeks rsync to my cold storage. For some data I also use rclone bisync to back up to the cloud, in case I need it badly while I'm on the road.
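
If you haven't used bisync before, note that the first run needs --resync to establish a baseline (the remote name is a placeholder):

    rclone bisync ~/Documents remote:Documents --resync   # first run only
    rclone bisync ~/Documents remote:Documents            # later runs go both ways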

I almost never see FreeFileSync mentioned in these threads. It's the only GUI-based app I know that also gives you the option to not copy file deletions, for example. It can also be automated with crontab. Backups are not fragmented or repackaged, so you can browse them just fine. Encryption can be done with VeraCrypt.

Git for projects, NAS for 3D printing stuff, mods for games and unofficial game translations, Google Photos for photos (looking to migrate away from that when I have time). I don't much care about anything else.

Git as backup?! Ok...

Git for projects

I assume the original comment meant code-based projects, for which git, if the repo is pushed to a remote, is a very sane choice.

Yep, that's what I meant. If it's a public project, it's on my GitHub, if it's a private one, it's on my private GitLab instance.

Yeah, git without LFS isn't optimal for non-text files.

Meaning that as long as you're regularly committing your work to Github/Gitlab/wherever, you don't need to back up your source directory.
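
In practice that's just a habit like this (the LFS patterns are hypothetical, for big binaries):

    git lfs install
    git lfs track "*.blend" "*.png"    # route large binaries through LFS
    git add .gitattributes
    git add -A && git commit -m "WIP" && git push origin main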

I like Pika Backup. It's a frontend for borgbackup that also lets you mount and browse your archive with a few clicks. I think it's pretty handy on a desktop PC. And since it uses borgbackup, you also get encryption with it.

I use Timeshift for local backups, then Duplicati for backing up to Amazon Glacier monthly.

I use Dirvish, a text-based, cron-enabled rsync front end. Read dirvish.org for details about it.

I use this to clone and hold time-based backups on external disks, which I can verify or use off-site.

Rock solid for years.

@dustyData I have hundreds of thousands of files that need to be backed up locally and in the cloud. I use either Vorta or Pika. Both are interfaces for Borg. Easy to use, and their deduplication feature manages to save a lot of disk space. I tried so many backup solutions and none worked as reliably.

I do 2 backups:

Veeam system image daily; this is a fully bootable image of every drive on my system, kept for things like hardware failure or "oops" moments. It just goes to my NAS for fast local storage.

Online backup of important files daily; this has changed a few times: I was using Restic to B2, then Duplicati to Wasabi S3, and now I'm using iDrive to see how that is.

My favorite tools are definitely Veeam and Duplicati, because they both have a good UI and are easy to use, both automatically run in the background and handle scheduling entirely on their own. Browsing snapshots is easy and finding the files you want at a specific date/time is quick.

Restic and Kopia I've used as well; they're much harder to use, especially for restores, and finding files is a nightmare via CLI. Scheduling is a pretty involved step, and you have to figure out how to run them in the background yourself. Both also performed really slowly for me on my ~3 TB backup set of about 50k files, compared to Veeam and Duplicati, which are very fast.

+1 for Veeam. I am a backup administrator and this is our tool of choice. I use it for my home machines as well and it works great.

Just remember, you don’t have a backup unless you have tested it.

I’ve found Restic great once dialed in. I have a systemd service run backups automatically. Super fast thanks to only backing up diffs; only the initial backup is slow.

Yes, making a script and a service isn't for everyone.

Finding files in the backup is easy… you just mount the backup and search any way you want, just like any other directory. Not sure why that’s hard?
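
Roughly what my setup boils down to (the repo and the keyring lookup are placeholders for whatever you use):

    # environment the service provides
    export RESTIC_REPOSITORY=sftp:nas:/backups/restic
    export RESTIC_PASSWORD_COMMAND="secret-tool lookup service restic"
    # the backup itself, plus retention
    restic backup ~/Documents ~/Projects
    restic forget --keep-daily 7 --keep-weekly 5 --prune
    # browsing/restoring: mount the repo as a filesystem (needs FUSE)
    restic mount /mnt/restic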

I've found restores really slow mostly; initial backups are slow but not too bad.

As far as mounting the backup and searching it, mostly it's just a lot of steps to remember.

Ah. I also made another script where I type loadbackup in bash and everything is just there. I guess I’ve just made it easier for myself lol.

I also load Restic variables in bash so I'm not typing out paths etc. The password is kept in the GNOME keyring and is requested automatically.

I forget the annoying steps because I've had this for a while.

Yeah I mean it's all stuff that can be solved, but I just don't have the time or urge to deal with it lol

GNOME Disk Utility for backing up the whole hard drive. Otherwise, I use BackInTime.

KDE user, so for my personal files I back up with Kup and bup (install both); you get the choice of a cloned copy or only changed files, with go-back-in-time options. It integrates into the KDE taskbar/System Settings.

For redundancy, I back up my main sync folder on the desktop to my laptop using Syncthing over my WiFi/network.

Borg backup (via the Pika Backup frontend, a libadwaita GNOME app) to one of my physical drives and also to borgbase.com (free tier: 10 GB).

Restic in the homelab and Veeam at work. I’m pretty happy with both!

Restic (local repo), which I sync onto a Hetzner Storage Box using rclone.

Well, it was Duplicati, until it pulled this bullshit on me. I had a critical local failure of my data a month ago, 2.8 TB lost. Pulled the backup off AWS S3 with my Linux server, asked Duplicati to restore it, and it's failed 4 times for random reasons, taking a week to get there each time. Once I can get this backup to finally restore, I'm moving over to Duplicity.

I just map my entire documents, pictures and other important home folders to subfolders inside Dropbox. This propagates all of my files across all of my computers via the cloud and makes everything accessible from my phone as well.

I don't worry about backing up my operating system, though important configuration file locations are also mapped into Dropbox for easily setting things up again. Complete portable apps are also located in Dropbox.

Duplicity over SSH to my backup NAS, which then backs up to a cloud service iDrive weekly.

My phone and tablet are both Samsung, which use OneDrive for backups.

An external hard drive works 100%. And relying on dotfiles to set the whole thing back up.

...I mean, it takes like less than 3 minutes to redownload everything and 5 to reconfigure it manually, so eh.

I have no relevant data locally. My Documents is a symlink to a Nextcloud directory running on my Synology NAS on a RAID 1 that backs up to cloud storage via one of their tools (I forgot which one).

I never liked having to backup working machines. If it breaks I'm fine with having to install again. I won't lose data though.

I've used a combination of

  • Managing ZFS snapshots with pyznap
  • Plain old rsync to copy important files that happen not to be on ZFS filesystems to ZFS.

If I were doing this over today, I'd probably consider https://zrepl.github.io/ instead of pyznap, as pyznap is no longer receiving real active development.

In the past I've used rdiff-backup, which is great but it's hard to beat copy-on-write snapshots for speed and being lightweight.

I use boring old zfs snapshot + zfs send -i.
It's not pretty, but it's reliable.
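
In case anyone wants the shape of it (pool/dataset names are made up):

    # take today's snapshot
    zfs snapshot tank/home@2024-05-01
    # send only the delta since the previous snapshot to the backup pool
    zfs send -i tank/home@2024-04-30 tank/home@2024-05-01 | \
        ssh backuphost zfs receive backup/home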

I've recently started using proxmox-backup-client. Works well. Goes to my backup server along with my VM image backups. Works nicely with full deduplication and such. Quite good savings if you are backing up multiple machines.

I then rsync this up to the cloud once a day.
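
The client invocation is roughly this (the repository spec is a placeholder):

    # archive the root filesystem to a Proxmox Backup Server datastore
    proxmox-backup-client backup root.pxar:/ \
        --repository backupuser@pbs@pbs.example.org:mystore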

Timeshift with rsync, and on occasion I Clonezilla the drive and save it to my NAS.

At this moment I use too many tools.

For user data on my PC and on the home server I mostly use Duplicacy. It is fast and efficient. All data is backed up locally to a NAS box over SFTP, and a subset of that data is backed up to S3 cloud storage.

I have a Mac; this one is using Time Machine, storing data on the NAS, which is then synced to S3 cloud storage once a day.

And on top of that, VMs and containers from the home server are backed up to the NAS by Proxmox's built-in tool. These mostly exclude user data.

TrueNAS on an inexpensive server with RAID. I have several computers in different rooms of the house that I like to make music on, and on these PCs the network drives all have the same drive letters for the sample libraries, recordings, projects, and backups, so my projects can run from any computer without missing files. I always save locally and on the TrueNAS.

I just use MegaSync, which backs up my config folder and documents folder.

On my phone, I use Syncthing to back up to the home server (I never knew Syncthing could sync over WAN), which is then synced to MegaSync. I also keep all the files from MegaSync on my server, just in case MegaSync suddenly goes down one day.

rsync (laptop -> external HDD, workstation -> dedicated backup HDD)
Syncthing (laptop <-> desktop)

A hand-made combination of tar, rsync and rclone, to a set of portable drives and remote systems.

After suffering computer breakage since the '80s, I want the easiest possible way of restoring backups.
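
Mine looks roughly like this, simplified (paths and the remote name are made up):

    #!/bin/sh
    # archive configs, stage everything on a portable drive, mirror to a remote
    tar -czf /tmp/etc-$(date +%F).tar.gz /etc
    rsync -a /tmp/etc-*.tar.gz ~/Documents ~/Pictures /media/portable/backup/
    rclone sync /media/portable/backup remote:backup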

Déjà Dup is neat if you need a GUI. But TBH, you really don't need a GUI; restic will work just fine as long as you target a few folders. It mostly boils down to file/folder hygiene.
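
Targeting a few folders really is the whole job (the repo path is a placeholder):

    restic -r /mnt/backup/repo init
    restic -r /mnt/backup/repo backup ~/Documents ~/Pictures ~/.config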