How will Lemmy handle illegal content?

jk47@lemmy.world to No Stupid Questions@lemmy.world – 138 points –

How will Lemmy handle illegal content like drug dealing, child porn, snuff movies, etc.? On Reddit, the corporation is accountable, so they make an effort to ensure none of this exists on their platform. And they would face legal trouble if they failed to act.

But as Lemmy is decentralised, this isn't really possible. Sure, the main instances can defederate from bad instances, but those instances could still operate and be accessible on the web, especially if they're hosted in countries outside the Western sphere of influence. Multiple bad instances could federate with each other and duplicate the illegal content pretty easily, making it difficult for the authorities to keep it shut down. Has this been thought about already?


"Lemmy" can't handle anything. That's by design.

"Lemmy" is really just a piece of software that people can use to run forums that federate with other forums, and so on. There is no central "Lemmy" authority that could do anything; that's by design, and a large part of the point. It means there can never be a Lemmy spez or Musk or Zuckerberg fucking things up for everyone.

The highest authorities are the individual instance owners, so it will fall on them to deal with illegal content as they see fit. Presumably they'll generally work to keep it off of their own instances through active moderation, and they'll block other instances that they have reason to believe do not maintain acceptable standards.

And like it or not, some share of responsibility will fall on individual users to manage their own activities in order to avoid problematic instances.

The trade-off for having no central authority that can fuck things up for everyone is that there's no big mommy/daddy to watch over you and protect you. The fediverse is better suited for people who are okay with that.

It's like asking how Apache will handle the sites hosted on an Apache server.

Lemmy is just the underlying software that runs an instance.

Authorities would track the illegal content the same way they do on any other website, so I wouldn't worry too much about it. Also, decentralised illegal content has existed for as long as P2P protocols have. I don't see anything new with Lemmy.

It just makes it easier to find. Instance owners/admins are legally bound by the host country's laws, so the smarter ones know which countries to host their stuff in.

My first guess would be that filtering out illegal content is something the operators/admins of a given instance have to take care of, according to the law of the country the server is hosted in.

This becomes a problem, though, when, as in my case at the moment, it's just me running my own instance. Even if other users join my instance, how many are going to want to moderate the content? Not many, I'd bet. There is so much content coming in right now: as of this moment I've received 405 new threads, 5652 new comments, and information like avatars (which could also be explicit) from 6058 users.

How does one person moderate all of that? I'm not sure (I haven't tested it, but I doubt it) whether a moderator deleting a comment or post on the owning instance is also propagated to the other federating instances. I think it should be (and maybe make it configurable whether you honour remote removals, but on by default). That way the moderation team on the hosting instance removes the content everywhere. Sure, reported content can also be deleted at the per-instance level, but that should be the outlier.

Reddit mods worked for free, there's no reason why it would be different here. The tricky part will be picking the competent and fair ones out of all the applicants.

Each Fediverse instance is just a server running software (e.g. Lemmy, Kbin) that uses the ActivityPub protocol to communicate with other instances.

There is no central authority; that's kinda the whole point of being federated and decentralised. Each instance is its own website, its own island. It'd be like asking how "email" or "HTTPS" would handle illegal content. It doesn't; it's up to the hosts themselves to do so.
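To make that concrete: under the hood, federation is just servers POSTing JSON activities to each other over HTTPS. A Lemmy post travels to other instances as something roughly like this ActivityPub `Create` activity (a simplified sketch; the domains, IDs, and content here are invented for illustration):

```json
{
  "@context": "https://www.w3.org/ns/activitystreams",
  "type": "Create",
  "actor": "https://example-instance.social/u/alice",
  "to": ["https://www.w3.org/ns/activitystreams#Public"],
  "object": {
    "type": "Page",
    "id": "https://example-instance.social/post/123",
    "attributedTo": "https://example-instance.social/u/alice",
    "name": "An example post title",
    "content": "<p>Post body</p>"
  }
}
```

Nothing in the protocol itself polices the payload; whether a receiving server accepts, rejects, or removes it is entirely up to that server's admins.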

On a fundamental level, each instance is just a website, so an illegal instance would be tracked down and prosecuted the same way as any other website/forum doing illegal stuff.

Best you can do is encourage your instance's admin to defederate from those illegal instances if they haven't already. Let the authorities handle the rest.

The responsibility would fall on the instance admin(s). Moderators could also share the responsibility, as well as any users involved in the illegal activity.

Images aren't federated. You're viewing them from their host server.

So they're not really duplicated anywhere.

I'm not so sure this is the case. The images folder on my single-user instance is over 1 GB, and I haven't posted any images.

Well actually you're caching thumbnails, but I think those only stick around for a month.

I'm going to have to disagree. I just remoted in to check it out. I have 2560x1440 resolution images weighing 2.9 MiB in there. It's pretty clear that these are full size images.

Well, that's interesting, as you'll never see those images actually served to users - try to find a remote post where the image URL points at your local server. Doesn't happen.

Interesting indeed. I spent some time combing through the DB and pictrs folder to try to figure it out, but I'm at a loss so far.
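In case anyone else wants to poke at their own media store, here's roughly how I checked (the `PICTRS_DIR` variable and the default path are guesses based on a typical Docker volume layout; adjust to wherever your pictrs volume actually lives):

```shell
# Hypothetical default path; override with PICTRS_DIR for your setup.
DIR="${PICTRS_DIR:-/var/lib/pictrs/files}"

# List the ten largest files in the media store, biggest first.
# Full-resolution originals stand out immediately against small thumbnails.
find "$DIR" -type f -printf '%s\t%p\n' 2>/dev/null | sort -rn | head -n 10
```

Note that `-printf` is GNU find; on BSD/macOS you'd need `stat` or `du` instead.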

I’d say moderation is up to each instance, but it does leave the question of whether anyone is legally responsible for ensuring no instance hosts illegal content - I’d guess no. Since the platform is open source, the perpetrators would be solely responsible for illegal content on their instance, as they used the open-source platform for nefarious purposes… I’m thinking individual instances have a legal obligation to keep that content blocked, but I’m not sure. I think this is a really good question.

The owners of the servers would be subject to the laws of whatever country the servers are located in.


With Mastodon, there are blocklists that make it fairly easy for moderators to block instances that post illegal content (or anything else, for that matter). I imagine something similar could be done with Lemmy.
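For reference, Mastodon's admin interface can export and import domain blocks as CSV, which is how those shared blocklists get passed around. A blocklist file looks roughly like this (the domains are invented for illustration, and the exact column set varies between Mastodon versions):

```csv
#domain,#severity,#reject_media,#reject_reports,#public_comment,#obfuscate
bad-instance.example,suspend,true,true,hosts illegal content,false
spammy.example,silence,true,false,spam,false
```

Lemmy admins currently manage blocked instances through the admin settings rather than a CSV import, but the same community-curated lists could feed that process.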

Who will the blocklist affect? Mods don't control what's being posted on another instance, and if there are instances where that's commonplace, they'll just post there.


I'm sure it would be easy for authorities to track - no harder than a large zip being downloaded and uploaded elsewhere. Authorities can recognise the unique characteristics of files and find them across the web.

As for actually stopping this: right now a lot of instances federate with many other instances, and I'd be worried about bad actors federating with big instances and then posting illegal shit, meaning big instances download it onto their servers by accident. More moderation tools need to be developed for the fediverse.

Illegal content.. well, it depends. Illegal information and piracy tricks? Already happening. But beyond that, illegal media needs to be stored and at large somewhere. It's still better to do illegal shit on WhatsApp or Telegram (as still happens), because messages there, unlike here, are end-to-end private. This isn't so novel, in fact, that authorities or admins wouldn't know what to do. Also, "illegal" is too generic; laws vary from country to country. I don't see Lemmy becoming any sort of terrorist hub anytime soon.

What's illegal information?

Stolen data: credit cards, phone numbers, etc. Illegal procedures (e.g. h0w to buiId a b#mb). Stolen accounts.

Ask the War Thunder forums, lol.

::: spoiler spoiler

they repeatedly leaked classified military documentation to win arguments

:::

Cloudflare offers abuse-imagery scanning, which seems ideal for Lemmy. If you're already using them as a CDN, it's something you can simply apply for.

Instances that don't want to take any action against such content would likely be prosecuted eventually.

The creators of Lemmy keep telling us about the "federation" of the social network. A federation is something that has a federal center that exercises a governing function.

Just because we haven't all been informed of the existence of such a center doesn't mean that it doesn't exist.

Don't go looking for illegal content? Then you won't see it. #unpopularopinion

There was a discussion a few weeks ago where the NSFW instance made it a rule that illegal material would not be removed.
There was then a clarification that if the material was obviously illegal it would be removed but if it was questionable it would be allowed to stay.
This makes everyone who allows NSFW material to show up on their feed open to seeing illegal content.
OP's question is a valid concern.