Authorized Fetch Circumvented by Alt-Right Developers

Sean Tilley@lemmy.world to Fediverse@lemmy.world
wedistribute.org

Authorized Fetch (also referred to as Secure Mode in Mastodon) was recently circumvented by a stupidly easy trick: just sign your fetch requests with a key from some other domain.
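To make the weakness concrete, here is a minimal sketch (Python, with made-up domain names) of the kind of check Authorized Fetch enables: the receiving instance only learns which actor signed the fetch, so if the decision is keyed on the signer's domain, a blocked operator passes it by signing with a key from any domain that isn't on the list.

```python
# Minimal sketch (hypothetical names) of a domain-keyed Authorized Fetch check.
# The server only learns which actor signed the request, so anyone blocked can
# simply sign with a key belonging to an actor on a fresh, unblocked domain.
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"blocked-instance.example"}  # instance-level blocklist

def fetch_allowed(signing_actor_id: str) -> bool:
    """Allow the fetch unless the signing actor's domain is blocked."""
    return urlparse(signing_actor_id).hostname not in BLOCKED_DOMAINS

print(fetch_allowed("https://blocked-instance.example/actor"))  # False
print(fetch_allowed("https://brand-new-domain.example/actor"))  # True: bypassed
```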

Repeat after me: anything I write on the internet should be treated as public information. If I want to keep any conversation private, I will not post it in a public website.

I agree with you, however there are issues with not just privacy but also authenticity. I should be able to post as me, even in public, and have a way to prove it. Nobody else should be posting information as me, if that makes sense.

For that, we should start bringing our own private keys to the server, instead of trusting the server to control everything.
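For illustration only, a rough sketch of what "bring your own keys" could look like, using the Python cryptography package: the keypair lives on the client, each post is signed locally, and a server (or any reader) only ever needs the public key to check authorship.

```python
# Sketch of client-held keys: the private key never leaves the user's device,
# so no server can post "as" them. Illustrative only.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives import serialization

private_key = Ed25519PrivateKey.generate()          # stays on the client
public_key_bytes = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PublicFormat.Raw,
)                                                    # published alongside the profile

post = b"Hello fediverse, this really is me."
signature = private_key.sign(post)                   # attached to the post

# Anyone holding the published public key can verify authorship;
# verify() raises InvalidSignature if the post or signature was altered.
private_key.public_key().verify(signature, post)
```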

And if we start doing that, pretty soon we will end up asking ourselves why we need the server in the first place, and we will evolve to something like what nostr is doing.

I'm all for it.

...evolve to something like what nostr is doing.

Giving places for cryptobros to wank without being pointed at and laughed at by their betters?

No. You are thinking of Discord.

I'm pretty sure that 99.44% of nostr is cryptobros.

My friend, you suck at trolling. Can you just let it go?

When I see nostr users that number more than, say, six who aren't also cryptobros, I'll drop the nostr disrespect. Until then ... 🤷‍♂️

You are doing nothing but building a strawman. Lemmy is developed by shit-for-brains tankies, yet there is no denying that their work has brought progress to the distributed web.

Same thing for nostr. Whether you like it or not, nostr "cryptobros" have shown a bunch of things that need improvement on the Fediverse and they are backing their words with actions and working code. You on the other hand have nothing but smug, pretentious bullshit to throw around.

Again, when you can show me a cryptobro concentration lower than 99.44%, I'll take nostr seriously. And when you can show it not turning into a Hellhole worse than Xhitter and Farcebook combined because of the very philosophy underpinning it, then I'll think it's actually worth looking at. (Hint: this is not possible.)

Until then I'll call it what it is: a place for cryptobros to wank to their faux-libertarian fantasies.

Sure, but that's already solved on the fediverse by using HTTP Signatures and isn't related to Authorized Fetch.

I meant it generally, for folks who might read this comment and think the problems surrounding the platform and security are solved.

Clear sign every post using a third-party application. Make your public keys known far and wide. Authenticity solved.

And now we're dealing with key management instead

You always need key management if you have decentralized authentication.

Seriously. Bobthenazi could just go to an aligned server and make an account Bobthenotzi and boom -- perfectly able to follow whoever he wants.

One more reason to argue that we should drop the idea of "aligned" servers and that we are moving to a future where it is better to charge (small) amounts from everyone instead of depending on (large) donations from a few.

Ideally, a distributed fediverse wouldn't need much in terms of donations because it's a bunch of small instances instead of a few huge ones.

Not the point. The point is that instances that are open to everyone will be open to bad actors as well.

If the mere act of signing up to an instance requires a small payment, you are automatically preventing the absolute majority of spammers, "spray and pray" scammers and channer trolls.

To add a bit of important nuance to this idea (particularly how this argument comes up with regard to Threads): this does not apply to legal rights over your content. That is to say, of course you should treat any information you put out there as out of your control with regard to access, but if somebody tries to claim legal rights over your content they are probably breaking the law.

Right. Publicly available does not mean in the public domain. But the issue here is not one of copyright, but merely of gated access.

Totally. I'm just trying to bring it up whenever I see folks having this discussion, because some people don't seem to make the distinction. It worries me that some are so willing to concede that big social will illegally hoover up our data and that there's nothing we can do about it.

anything I write on the internet should be treated as my private information. If I want to keep any conversation private, I will still post it in a public website.

EDIT: I'm so sorry that my stupid comment offended some people. Always forget how special some people can be on this website. Once again I'm sorry for my lack of better judgement.

Wow, why are you so triggered just because some people didn't think you were funny?

About as triggered as those who downvoted me.

No, you're right. Everyone who downvoted probably also went on an angry tirade first, but they just didn't type it out. Totally the same. 👍

?

He thought he was being funny; he repeated what the above poster said to repeat.

I don't think your comment was offensive per se. It was just ridiculously naive. If we are trying to build practical tools, they have to fit how things work in the real world, not how they work in anybody's dreams. If you want to have private conversations on a public website, use encryption.

And so, once again, people discover the unsolvable dilemma of DRM.

You can't both publish your data where it can be seen by computers that are not under your control and somehow keep control of that data. Anything that purports to do so is either a temporary bandaid soon to be bypassed or nothing but placebo to begin with.

What I think a lot of conversations about privacy and security on the Fediverse miss is that the Fediverse is radically public.

A protocol that sends everything you share to a long list of servers that haven't been pre-screened and could be anything from a professionally-managed instance of vanilla Mastodon to an ad hoc, informally-specified, bug-ridden, slow implementation of half of ActivityPub running on a jailbroken smart light bulb can only ever be radically public. It's possible to block most interactions with someone you don't want to talk to, but not to reliably prevent them from seeing content you share to anything more than a short list of vetted followers.

There probably isn't any reasonable way to change that while keeping the open federation model, though it's possible to build closed networks on top of ActivityPub for those who want the formats it supports for a curated group. This isn't a problem to be solved in my view, but an inherent reality: the Fediverse is for things you want to make public.

Exactly! The only way that we can make sure that the Internet is not controlled by anyone is to make it available for everyone. If we are fighting for an open internet, we need to understand that this type of thing will be part of the package.

We, by which I mean some loose group of people who want decentralized tools to thrive, should also be building things for secure, private communication, and we are. Matrix, for example, offers strongly end-to-end encrypted federated chat rooms and private messages. It also has a kind of rough UX and, IIRC, resource-intensive server software. We should work toward improving that.

I'm not advocating against privacy at all. I want people to understand as clearly as possible that Mastodon, Lemmy, and anything that works like them isn't private and can't be private as part of an open federated network, so they can decide whether that's a good fit for how they're using it. The block evasion described in the link is just "run a server on a domain that isn't blocked," and I imagine any other mitigations bolted onto Mastodon that don't break open federation will be little better.

tools to thrive should also be building things for secure, private communication.

Sure, but this should not be seen as the same class of software as "social media" or even "the web".

Matrix [...] has a kind of rough UX and, IIRC resource-intensive server software. We should work toward improving that.

Except that I get the vibes from the Matrix community that the shit UX is part of the attraction because it does a wonderful job of gatekeeping.

I don't hold out much hope for Matrix working out ever, but perhaps someday someone will use it as inspiration for making something that doesn't suck.

Matrix is the protocol. You can have whatever client you like. There are mobile apps that are similar to Discord and connect to Matrix servers.

I'm kind of tired of social networks offering even the pretense of privacy. Just loudly proclaim that everything is public but clients can filter out shit you don't wanna see.
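As a toy illustration of the "everything is public, clients filter" model (data shapes and names made up), the filtering lives entirely on the reader's side:

```python
# Toy client-side filter: the network stays fully public and each client
# drops muted authors and phrases before rendering. Illustrative only.
MUTED_AUTHORS = {"troll@bad.example"}
MUTED_PHRASES = {"crypto giveaway"}

def visible(post: dict) -> bool:
    if post["author"] in MUTED_AUTHORS:
        return False
    return not any(p in post["text"].lower() for p in MUTED_PHRASES)

timeline = [
    {"author": "friend@nice.example", "text": "Lunch photos!"},
    {"author": "troll@bad.example", "text": "Free crypto giveaway!!"},
]
print([post for post in timeline if visible(post)])  # only the first survives
```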

That doesn't work for vulnerable minorities. Manually filtering each shitty person after you step in their shit gets old. Coupled with the fact that not shutting down shitty people just means more shitty people are likely to turn up.

It's not sustainable

I think in this context it's meant on a technical level: as far as the fediverse is concerned, there's not a whole lot instances can do. Anyone can just spin up an instance and bypass blocks unless it works on an allowlist basis, which is kind of incompatible with the fediverse if we really want to achieve a reasonable amount of decentralization.

I agree that we shouldn't pretend it's safe for minorities: it's not. If you're a minority joining Mastodon or Lemmy or Mbin, you need to be aware that blocking people and instances has limitations. You can't make your profile entirely private like one would do on Twitter or any of Meta's products. It's all public.

You can hide the bad people from the users, but you can't really hide the users from the bad people. You can't even stop people from replying to you on another instance. You can refuse to accept the message on the user's instance, but the other instance can still add comments that don't federate out. Which is kind of worse, because it can lead to side discussions you have no way of seeing or participating in to defend yourself, and they can be saying a lot of awful things.
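To spell out the blocklist-versus-allowlist point from a couple of paragraphs up (domain names made up): a blocklist fails open for brand-new domains, while an allowlist fails closed at the cost of openness.

```python
# Sketch of the two federation models: blocklists fail open, allowlists fail
# closed. Hypothetical domains, illustrative only.
BLOCKLIST = {"known-bad.example"}
ALLOWLIST = {"vetted-a.example", "vetted-b.example"}

def blocklist_federates(domain: str) -> bool:
    return domain not in BLOCKLIST       # a freshly registered domain gets in

def allowlist_federates(domain: str) -> bool:
    return domain in ALLOWLIST           # new instances must be vetted first

print(blocklist_federates("fresh-troll.example"))   # True: block evaded
print(allowlist_federates("fresh-troll.example"))   # False, but less open
```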

You can’t make your profile entirely private like one would do on Twitter or any of Meta’s products.

Even those are not private.

It's the unfortunate reality. Social networks simply cannot offer privacy. If they were upfront about it, then people could make rational decisions about what they share.

But instead they (including Mastodon) pretend like they can offer privacy, when they in fact cannot, resulting in people sharing things that they would not otherwise share.

It's not as black and white as you make it. The options aren't "perfect security" and "no security".

The option that most people who experience regular harassment want is "enough security to minimise the shit we have to deal with to a level that is manageable, even if it's imperfect".

While you're theoretically right, we've seen in practice that nobody really offers even the imperfect privacy you describe, and on decentralized systems it only becomes harder to solve.

A Facebook-style centralized network, where you explicitly grant access to every single person who can see your content, is as close as we can get. But nobody is trying to make that kind of social network anymore, because there isn't much demand for it.

If you want a soapbox (Twitter/Mastodon/Bluesky, Reddit/Lemmy/kbin, Instagram/Pixelfed, YouTube/TikTok/PeerTube), then privacy is going to be a dream, especially if decentralized.

Vulnerable folk are looking for community, not a soap box. The goal is to connect with other folk whilst being as free as possible from harassment.

It's absolutely possible to achieve that without perfect privacy controls.

Privacy and being free of (in-context) harassment aren't the same thing. Your posts can all be public but your client can filter out any harassment, for example.

If the goal is privacy so that people who aren't in the community don't know that you're in the community, and don't know what the community is even talking about, I'm skeptical that it's practical. Especially for a decentralized network, I think that the sacrifices needed to make this happen would make the social network unappealing to users. For example, you'd need to make it invite only and restrict who can invite, or turn off any kind of discovery so that you can't find people who aren't already in your circle. At that point you might as well just use a group chat.

Privacy and being free of (in-context) harassment aren't the same thing.

They're related. Often, the ability to limit your audience is about making it non trivial for harassers to access your content rather than impossible.

If the goal is privacy so that people who aren't in the community don't know that you're in the community

That's not the goal. The goal is to make a community that lets vulnerable folk communicate whilst keeping the harassment to a manageable level and making the sensitive content non trivial to access for random trolls and harassers.

It's not about stopping dedicated individuals, because they can't be stopped in this sort of environment for all the reasons you point out. It's about minimising harassment from random drive-by bigots.

Hmmm I think I understand the intent. I'll have to think on it some more.

My gut tells me that protecting people from drive-by bigotry is antithetical to content/community discovery. And what is a social network without the ability to find new communities to join or new content to see?

Perhaps something like reddit where they can raise the bar for commenting/posting until you've built up karma within the community? That's not a privacy thing though.

What would this look like to you, and how does it relate to privacy? I've got my own biases that affect how I'm looking at the problem, so I'd be interested in getting another perspective.

You're thinking about this in an all or nothing way. A community in which everyone and everything they post is open to everyone isn't safe.

A community in which no one can find members or content unless they're already connected to that community stagnates and dies.

A community where some content and some people are public and where some content and some people are locked down is what we need, and though it's imperfect, things like authorised fetch bring us closer to that, and that's the niche that future security improvements on the Fediverse need to address.

No one is looking for perfect, at least not in this space.

I don't think I'm looking for perfect, I'm looking for "good enough" and while authorized fetch is better than nothing, it's nowhere near "good enough" to be calling anything "private".

I'm thinking that maybe we need to reevaluate or reconsider what it looks like to protect people from harassment, in the context of social media. Compare that to how we're currently using half-functional privacy tools to accomplish it.

I'm not saying existing features are good enough.

I'm saying that they're better than the alternative that started this conversation.

"Just loudly proclaim that everything is public but clients can filter out shit you don't wanna see"

That's what Twitter does right now. It's also a hate filled cesspit.

The Fediverse though, even though it has hate filled cesspits, gives us tools that put barriers between vulnerable groups and those spaces. The barriers are imperfect: they have holes poked in them and can be climbed over by people who put the effort in, but they still block the worst of it.

It’s not sustainable to keep offering poorly designed solutions. People need to understand some basic things about the system they're using. The fediverse isn't a private space and fediverse developers shouldn't be advertising pseudo-private features as private or secure.

I still could use an ELI5 about what this authorized fetch feature was supposed to do. Was it supposed to basically disengage the Mastodon network from Threads? To stop Threads crap from showing up on Mastodon? Or to stop Mastodon discussions from showing up in Threads? Or something different?

Authorised Fetch existed long before Instagram Threads. When it is turned on, an instance will require any other server to sign their request to fetch any post. This prevents "leaking" of posts via ActivityPub to blocked instances.

This setting is turned off by default because some software is incompatible with it (like /kbin, Pixelfed before June 2023, maybe Lemmy too), because it increases server load, and because it may cause some replies to go missing (at least on the microblogging side).
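For anyone curious what the signed request actually looks like, here is a rough sketch in the HTTP Signatures (draft-cavage) style Mastodon uses, with made-up actor and instance names; real implementations add more headers and a digest for POSTs, but the shape is the same.

```python
# Rough sketch of the signed GET that Authorized Fetch demands, in the
# draft-cavage "HTTP Signatures" style. Names are made up; illustrative only.
import base64
from datetime import datetime, timezone
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
key_id = "https://my-instance.example/actor#main-key"   # where the public key lives
target = "/users/alice/statuses/1"                      # post being fetched
host = "target-instance.example"
date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")

signing_string = (
    f"(request-target): get {target}\n"
    f"host: {host}\n"
    f"date: {date}"
)
signature = base64.b64encode(
    key.sign(signing_string.encode(), padding.PKCS1v15(), hashes.SHA256())
).decode()

headers = {
    "Host": host,
    "Date": date,
    "Signature": (
        f'keyId="{key_id}",algorithm="rsa-sha256",'
        f'headers="(request-target) host date",signature="{signature}"'
    ),
}
# The target instance fetches key_id, verifies the signature, then decides
# whether the signer's domain is blocked. Nothing ties key_id to the operator
# who was originally blocked, which is exactly what the circumvention exploits.
print(headers["Signature"][:60], "...")
```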

When it is turned on, an instance will require any other server to sign their request to fetch any post. This prevents “leaking” of posts via ActivityPub to blocked instances.

Oh I see. Yeah, that sounds pretty hopeless. Does it use the fetching site's domain-validated TLS certificate? Is the idea to permit fetching unless the fetching domain is on a blacklist? If yes, someone didn't have their thinking cap on. The whole concept is dumb, though: there is no way to prevent posts from leaking. The saying is that once 3 people know a secret, it is no longer a secret.

Stop asking for pseudo-privacy features. The Fediverse is public by nature. Any "measures" to control access to the public posts on it are just lying to users.

Server owners should be able to control who can access their servers - but that is NOT - and should NOT be - treated as a privacy feature.

“The Net interprets censorship as damage and routes around it.” - John Gilmore