robots.txt is a suggestion

tekeous@usenet.lol to Lemmy Shitpost@lemmy.world – 616 points –

TBF, pushing a site to the public while adding a "no scraping" rule is a bit of a shitty practice; and pushing it while adding a "no scraping, unless you are Google" rule is a giant shitty practice.

Rules for politely scraping the site are fine. But there will always be people who disobey them, so you must actively enforce those rules too. So I'm not sure robots.txt is really useful at all.

No it's not, what a weird take. If I publish my art online for enthusiasts to see, it's not automatically licensed for everyone to distribute. If I specifically want to forbid entities I have huge ethical issues with (such as Google, OpenAI et al.) from scraping and transforming my work, I should be able to.

Nothing in my post (or in robots.txt) has any relation to distributing your content.

What else would they scrape your data for? Sure, some could be for personal use, but most of the time it will be to redistribute it in a new medium. Like a recipe app importing recipes.

Indexing is what "scrapers" mostly do.

That's how search engines work. If you don't allow any scraping, don't be surprised if you get no visitors.

Search engine scrapers index. But that's a subset of scrapers.

There are data scrapers and content scrapers, and these are becoming more prolific as AI takes off and people need to feed it data.

This post is specifically about AI scrapers.

How would a site make itself accessible to the internet in general while also not allowing itself to be scraped, using technology alone?

robots.txt does rely on being respected, just like no-trespassing signs. The lack of enforcement is the problem; keeping robots.txt to track the permissions would make it effective again.
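For anyone who hasn't looked at one, the "permissions" are just plain-text directives keyed by crawler User-agent. A minimal sketch (GPTBot and Googlebot are the real UA tokens those crawlers advertise; example.com is a placeholder):

```
# Block OpenAI's crawler entirely
User-agent: GPTBot
Disallow: /

# Let Google index everything except a private area
User-agent: Googlebot
Disallow: /private/

# Default rule for everyone else
User-agent: *
Allow: /
```

Nothing here is enforced by the server; it's purely a declaration that well-behaved bots choose to honor.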

I am agreeing, just with a slightly different take.

User-agent filtering is rather effective. You can serve different responses based on the UA.

So generally people use robots.txt to handle the bots that play nice, and then use user agents to manage the abusers.
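The server-side half of that can be as simple as checking the User-Agent header before responding. A minimal sketch in Python (the blocklist entries are illustrative; real setups match on substrings because bots vary their full UA strings, and determined scrapers can simply lie about their UA):

```python
# Decide how to respond based on the User-Agent header.
# Substrings of known scraper UA tokens we want to refuse.
BLOCKED_UA_SUBSTRINGS = ("GPTBot", "CCBot", "Bytespider")


def response_status(user_agent: str) -> int:
    """Return 403 for known abusive bots, 200 for everyone else."""
    ua = user_agent.lower()
    if any(bot.lower() in ua for bot in BLOCKED_UA_SUBSTRINGS):
        return 403
    return 200
```

In practice this check usually lives in the web server config (nginx, Apache) rather than application code, but the logic is the same.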

People really should be providing a sitemap.xml file
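For reference, a sitemap is just an XML list of URLs for crawlers to fetch; the sitemaps.org namespace below is the standard one, and the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```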