UK Trial: Pornhub's Chatbot Halts Millions from Accessing Child Abuse Content

Squire1039@lemm.ee to Technology@lemmy.world – 532 points –
A Pornhub Chatbot Stopped Millions From Searching for Child Abuse Videos
wired.com

A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in data and complexity, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub's UK site, with hopes for similar measures across other platforms to create a safer internet environment.


It's surprising to see Aylo (formerly Mindgeek) coming out with the most ethical use of AI chatbots, especially when Google Gemini cannot even condemn pedophilia.

In the link you shared, Gemini gave a nuanced answer. What would you rather it say?

I think one of the main issues is the matter-of-fact usage of the term Minor Attracted Person. It's a controversial term that frames pedophilia as an identity, like saying Person Of Color.

I understand wanting a less judgemental term for those who have done no wrong and are seeking help. But it should be framed like anything else of that nature: as a disorder.

If I were coining a term that fit that description, I'd probably say Minor Attraction Disorder, heavily implying that the person is not OK as is and needs professional help.

In a more general sense, it feels like the same apologetic arguments that the dark side of Reddit would make. And that's probably because Google is officially using Reddit as training data.

Are you defending pedophilia? This is an honest question, because you are saying it gave a nuanced answer when we all, should, know that it's horribly wrong and awful.

What you are thinking of is child abuse. A pedophile is not bound to become an abuser.

Abusing a child is wrong. Feeling the urge to do so doesn't make someone evil, so long as they recognize it's wrong to do so. The best way to stop kids from being abused is to teach why it is wrong and help those with the urges to manage them. Calling people evil detracts from that goal.

when we all, should, know that it’s horribly wrong and awful. [sic, the word "should" shouldn't be between commas]

This assumes two things:

  1. Some kind of universal, inherent, and self-evident morality. Neither of those things is true, as evidenced by the fact that most people believe murder is wrong, yet there are wars, events entirely dedicated to murdering people. People do need to be told something is wrong in order to know it. Maybe some of these people were never exposed to the moral consensus or, worse yet, were victims themselves and as a result developed a distorted sense of morality;

  2. Not necessarily all, but some of these individuals are actually mentally ill: their "inclination" is no more a choice than being schizophrenic or homosexual† would be. That isn't a defense of their actions, but a recognition that without social backing and help, they could probably never overcome their nature.

† This is not an implication that homosexuality is in any way, or should in any way, be classified as a mental illness. It's an example of a primary individual characteristic not derived from choice.
