The migration of large communities from Reddit to Lemmy is like a world-renowned band performing an acoustic set in a library for 50 people.

Favrion@lemmy.world to Fediverse@lemmy.world

The fanbase is still large, but the Lemmy community hasn't quite caught up yet, and now there is a transitional period where the audience is smaller.



This is a concern, but luckily this isn't required. I set up hobbit.world to host my Tolkien-related communities. It only costs $6 a month, plus $35/yr for the domain name, to host a tiny instance like this. I don't need to depend on anyone but my hosting provider.

To be safe I should download backups once a month or so.

But the point is that for big communities that people put a lot of time into, there should be an instance for each one owned by one of the mods.

Edit: Meant to reply to the person concerned about the centralization of communities.

To be safe I should download backups once a month or so.

Please do it more often if you have users other than yourself. One backup on the same server is barely a backup at all.

Fair enough. I'll look into automating it using some sort of storage from another provider.

Even just a cronjob or scheduled task to download the backups to a machine at another location would be a big improvement. Then you can do it far more often because it's automated.
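For example, a rough sketch of that (the hostname and paths here are placeholders, and it assumes SSH key access to the server):

#!/bin/bash
# pull-lemmy-backups.sh -- run on a machine at a different location
# Fetch any backup zips we don't already have; --ignore-existing skips known ones
rsync -av --ignore-existing 'lemmy.example:bak-lemmy-*.zip' ~/lemmy-backups/

with a crontab entry on that machine such as:

0 4 * * * ~/pull-lemmy-backups.sh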

But personally I like to have both a copy on a PC and a cloud backup, in addition to the server.

I'm using the Lemmy easy-deploy script to run the Docker instance. How do I take a backup of a running Docker instance?

The backups I've done so far are full server backups. But I don't have a way to automate that.

The page here explains getting a database dump on a running instance (and how to restore): https://join-lemmy.org/docs/administration/backup_and_restore.html

Then just back up the other files in the volumes directory where Lemmy is installed (everything except postgres, which is what the database dump does).
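Restoring goes the other way: feed the dump back through psql on the running postgres container. A rough sketch based on the linked page (it assumes the standard docker-compose setup, with the service and user names used below, and that Lemmy is stopped first so nothing writes mid-restore):

# stop the application while restoring
docker-compose stop lemmy
# replay the dump; connect to the default postgres database so the
# lemmy database itself can be dropped and recreated by the dump
cat dump.sql | docker-compose exec -T postgres psql -U lemmy -d postgres
# bring Lemmy back up
docker-compose start lemmy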

The pictrs volume includes both the uploaded images and the image cache. I have no idea how to separate out the uploaded images so you don't have to back up the cache, so I just back it all up.

This is the bash script I use to create backups:

#!/bin/bash
# https://join-lemmy.org/docs/administration/backup_and_restore.html#a-sample-backup-script
now=$(date +"%Y-%m-%d_%H.%M.%S")

# Dump the whole database to a plain SQL file (errors captured separately)
cd ~/lemmy && (docker-compose exec -T postgres pg_dumpall -c -U lemmy 1> dump.sql 2> dump.errors)
# Zip the install directory (including dump.sql), skipping the raw postgres
# volume since the SQL dump already covers it
cd ~/lemmy && zip -r9 ~/bak-lemmy-$now.zip ./ --exclude "volumes/postgres/*"
# The dump is now inside the zip, so remove the loose copy
rm -f ~/lemmy/dump.sql

It creates very small zip files as a result (the heavy postgres volume is excluded and the SQL dump compresses well), so it's very efficient.

I made a cron job for it to run every 3 hours:

0 */3 * * * ~/lemmy/backup.sh
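One cron gotcha to watch for (general cron behavior, nothing specific to this script): cron runs with a minimal PATH, so if docker-compose or zip lives in /usr/local/bin the script can work fine in a shell yet fail under cron. Setting PATH at the top of the crontab avoids that:

PATH=/usr/local/bin:/usr/bin:/bin
0 */3 * * * ~/lemmy/backup.sh

The script also needs the executable bit (chmod +x ~/lemmy/backup.sh) to be run that way.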

I figured out how to do this with docker container exec directly, but that's not ideal for a script.

Using docker compose, it just fails with: Service "postgres" is not running container #1

I can see lemmy-easy-deploy if I do: docker compose ls

The service name is postgres in the docker-compose.yml file. Any idea what the issue might be?
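One possible explanation (a guess on my part, not confirmed here): docker compose looks for a compose file in the current directory, so a script run from anywhere else won't see the lemmy-easy-deploy project's postgres service. Pointing compose at the project explicitly would test that, e.g.:

# address the running project by the name shown in `docker compose ls`
docker compose -p lemmy-easy-deploy exec -T postgres pg_dumpall -c -U lemmy > dump.sql

# or point at the generated compose file (this path is a guess)
docker compose -f ~/lemmy-easy-deploy/live/docker-compose.yml exec -T postgres pg_dumpall -c -U lemmy > dump.sql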