Are We Ready For This Site's Endless Feed of AI-Generated Porn?

stopthatgirl7@kbin.social to Technology@lemmy.world – 331 points –
futurism.com

A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

Could be interesting. I mean, for one thing, no real person is being exploited, and those with strange fetishes can be satisfied. But it could be very disturbing as well. I wonder how long until AI video porn?

I think it is just a matter of time for the disturbing stuff to circulate. If people can create their darkest desires they will.

Then cue debates and political campaigns about AI in general and whether we should allow anyone or anything to create depraved images, pornographic or not.

That’s so good, sissy. You got even better after I amputated your legs.

“We don’t intend to police the use of developing technologies at this time.”

That’s so good, sissy, blind that billionaire with your acidic piss.

“We cannot allow our children to be exposed to such grotesque videos.”

Yeah, although I think part of the missing nuance is that people already did that, the difference being that now anyone can, in theory, create what's inside their head, regardless of their actual artistic talent. Now that creation is accessible though, everyone's having another moral panic over what should be acceptable for people to create.

If anything, moving the more disturbing stuff from the real world to the digital seems like an absolute win. But I suppose there will always be the argument, much like video games making people violent, that digital content will become real.

People have been able to draw for eons. Imagine trying to ban the public from drawing because the odd few people are a little mixed up.

AI is just a tool, like the pencil, charcoal or paints.

I know you aren't suggesting we ban it from the public; that was just my side of the hypothetical debate that you're right in saying will arrive.

"eh I'll take a look"

first thing I see is a woman on her back with her legs behind her head, smooth skin where her genitals should be and nipples in the middle of her buttocks.

"alright then"

Once again pornography is setting unrealistic standards for women.

If xkcd was right about JPEG-y porn being niche, I'd bank on terrible AI porn becoming a niche in the future too.

Yeah, this site has nothing on some of the insane AI creators on Pixiv.

I prefer the pregnant woman with huge boobs instead of a pregnant stomach (and also less-huge boobs where they normally are).

I'm not sure if I have the strength to clutch these pearls any harder over these articles. I peaked 8 articles ago.

Have a sandwich and a nap then I'm sure you'll be ready to go again

Weakling.

I've been clutching so hard, the pearls fused into my flesh years ago. I've bankrupted myself buying more pearls, inserted one by one into my clenched fist.

Luckily the mere sight of me - a lurching pearlescent beast with glinting pearls for eyes - causes clams to voluntarily offer their own in reverence, my own unending supply.

I personally can't wait for AI to cause a thotmarket collapse.

But then who will you treat like shit?

The AI eGirl, but it'll be alright because I'll pay CeoGPT an extra $5/month for a model that's into that shit.

Gotcha. You're fine with paying someone to pretend they'd willingly fuck you, you're just not comfortable with the money for it going anywhere except into an old white billionaire's pocket.

I'm sure there's nothing to unpack there.

Technically, if you're versed enough, you can already do that, but it takes some effort.

People who insist on real flesh porn will ultimately be viewed as weirdos out of touch with reality, like people who insist everything sounds better on vinyl.

Fast forward 25 years past the first Ai war and a ragged but triumphant humanity must rediscover the lost art of waxing.

Why would I want to encourage the flesh trade where real women are hurt? And are limited to what humans are physically capable of?

When I can have AI generated people who are able to do anything imaginable and no one gets hurt?

There'll be arguments that "once people get used to the fantasies they'll want to try it in real life," but we all know that just isn't true from 40 years of video games. There hasn't been any uptick in people eating mushrooms and jumping on turtles, or whatever the fuck a Goomba is.

At what point was porn NOT graphic, but now this thing IS GRAPHIC? Are we talking all caps, or just a small difference between the live stuff and the AI shit? Inquiring minds want to know.

"Are we ready", in the sense that for now it's 95% garbage and 5% completely generic but passable looking stuff? Eh.

But as this will increase in quality, the answer would be… who cares. It would suffer from the same major issues as other large models: sourcing data, and how we decide the rights of the output. As for it being porn… maybe there's no point in focusing on that specific issue.

When I first heard Stable Diffusion was going open source, I knew this would happen. The only thing I'm surprised at is that it took almost 2 years.

Not sure how people will be so into this shit. It's all so generic looking

The actual scary use case for AI porn is that if you can get 50 or more photos of the same person's face (almost anyone with an Instagram account), you can train your own LoRA model to generate believable images of them, which means you can now make "generic looking" porn with pretty much any person you want to see in it. Basically the modern equivalent of gluing cutouts of your crush's face onto the Playboy centerfold, only with automated distribution over the Internet...

AI is still a brand new tech. It's like getting mad at AM radio for being staticky and low quality. It'll improve with time as we get better tech.

Personally I can't wait to see what the future holds for AI porn. I'm imagining being able to get exactly what you want with a single prompt, and it looks just as real as reality. No more opening 50 tabs until you find the perfect video. Sign me the fuck up.

Cam girls are going to lose their jobs.

The article mentioned that at least one OnlyFans creator reached out to make a model of their own content, and also that some OnlyFans creators outsource writers to chat with fans. I don't think this will meaningfully affect cam girls' jobs. Once we're able to generate live animated video in real time with convincing speech and phrases, then probably.

Does it say something about society that our automatons are better at creating simulated genitals than they are at hands?

It says that we are biologically predisposed to sex, which we are, like animals, which we are.

It doesn't say anything about society, it just confirms the human condition.

But... why are hands so difficult?

I'm sure it would have the same difficulty generating five properly articulating penises attached to a limb.

I'll admit I have some difficulty with that concept myself.

Ever watch the "Everything Everywhere All at Once" hotdog scenes?

On a visual level, we are more interested in genitals than hands? Also, faces.

Went and had a look, and it's some of the funniest stuff I've seen all day! A few images come close to realism, but a lot of them are the sort of AI fever-dream stuff that you could not make up.

AI has no fucks to give to people who are not ready.

Meh. It's all only women and so samey-samey. Not sexy IMO, but I don't think fake is necessarily not hot; art certainly can be.

You can change it to men, but most of the results are intersex(?) or outright women anyway. I guess the training data is heavily weighted toward examples of women.

AI porn for the longest time has just looked so off to me, idk what it is

Some things I was able to put my finger on after looking at a bunch of the images in the feed:

  • It doesn't do skin well, treating it more like a smooth plastic than a surface with pores, wrinkles, and fine hairs.
  • It doesn't understand lighting, so shadows don't agree with highlights or even each other.
  • It doesn't understand anatomy. A lot of the images were fine in this regard but others had misplaced muscles, bones, and impossible limb positioning/truncation.
  • It has no idea how to draw vaginas. Nipples are also not well understood, though it does better on average with those. They still look more like plastic than skin, but most of them were passable at least, while I didn't see a single vagina that looked even close to right.

To be fair, so many humans drawing porn don't understand anatomy or what vaginas look like. It's hard to train when your input data is already bad.

Yeah, a lot of the vaginas on that site look like hentai vaginas. I can understand it more with AIs since vaginas have a ton of variance to them, so trying to make an "average" vagina will end up not looking like any actual one.

But the artists that draw them like that just disappoint me (though with the caveat that I have no reason to believe I'd do any better if I were to draw that). There's a ton of inspiration out there and many do an amazing job with the rest of the body, why do they make one of the important parts (for porn) look like an afterthought?

Though the anime style ones aren't as bad as some others where the AI seems like it's trying to average boy and girl parts lol.

Tbh, if people don't know the difference between a vagina and a vulva, I don't expect AI to do a great job generating any good porn.

Using a term casually rather than medically won't affect the quality of AI porn. Though maybe ensuring it knows the difference between a labia, clit, urethra, vagina, and asshole will produce better results.

To be fair, the focus of this era's porn has hardly ever been the vagina, under any amount of scrutiny. A few aspects of sex in porn prioritized higher are, in no particular order: bounce, sounds, phrasing, setting, texture, animism/passion, power roles, etc. Hell, I'd rather that more effort has been made to visually hide prophylactic use than to focus on the vagina itself.

I'm not in any way saying I agree with this, simply pointing out the facts as they are these days.

I am a little surprised that no one has created a site like this for child pornography.

I am not a legal expert, but my layman's understanding of Ashcroft v. Free Speech Coalition (https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition) is that as long as no real person is harmed by it, simulated CSAM is legal.

Maybe later rulings have changed this. One can hope.

CivitAI is a pretty perverted site at the best of times. But there's a disturbing amount of age adjustment plugins to make images of children on the same site they have plugins to make sex acts. It's clear some people definitely are.

Some models also skew toward children for some reason, and then you have to put mature/adult in the positive prompt and child in the negative.

I remember reading that this may already be happening to some extent, e.g. people sharing tips on creating it on the deep web, maybe through prompt engineering, fine-tuning, or pretraining.

I don't know how those models are made, but I do wonder whether the ones that need retraining/fine-tuning using real CSAM can be classified as breaking the law.

Hentai, maybe. But realistic shit is 100% illegal; even just making such an AI would require breaking the law, as you'd have to use real CSAM to train it.

There was an article the other day about underage girls in France having AI nudes spread around based on photos as young as 12. Definitely harm there.

Surely we should know, right? Cartoons or hentai or whatever must have gone through this at some point?

Typically, the laws get amended so that anything that looks like CSAM is now CSAM. Expect porn generators tuned for minor characters to get outlawed very quickly.

You haven't been to 4chan lately because that's exactly what it's being used for

Well, to develop such a service, you need training data, i.e. lots of real child pornography in your possession.

Legality for your viewers will also differ massively around the world, so your target audience may not be very big.

And you probably need investors, which likely have less risky projects to invest into.

Well, and then there's also the factor of some humans just not wanting to work on disgusting, legal grey area stuff.

Yup, just like the AI needed lots of pictures of astronauts on horses to make pictures of those...

Exactly. Some of these engines are perfectly capable of combining differing concepts. In your example, it knows basically what a horse looks like, and what a human riding on horseback looks like. It also knows that an astronaut looks very much like a human without a space suit and can put the two together.

Saying nothing of the morality, in this case I suspect that an AI could be trained using pictures of clothed children, perhaps combined with nude images of people who are of age and just are very slim or otherwise have a youthful appearance.

While I think it's repugnant in concept, I also think that for those seeking this material, I'd much rather it be AI generated than an actual exploited child. Realistically, though, I doubt that this would actually have any notable impact on the prevalence of CSAM, and it might even make it more accessible.

Furthermore, if the generative AI gets good enough, it could make it difficult to determine whether an image is real or AI generated. That would make it more difficult for police to find the child and offender to try to remove them from that situation. So now we need an AI to help analyze and separate the two.

Yeah... I don't like living in 2023 and things are only getting worse. I've put way more thought into this than I ever wanted to.

Aren't AI-generated images pretty obvious to detect from noise analysis? I know there's no effective detection for AI-generated text, and it's not that there won't be projects to train AI to generate perfectly realistic images, but it'll be a while before it does fingers right, let alone invisible pixel artifacts.

As a counterpoint, won't the prevalence of AI generated CSAM collapse the organized abuse groups, since they rely on the funding from pedos? If genuine abuse material is swamped out by AI generated imagery, that would effectively collapse an entire dark web market. Not that it would end abuse, but it would at least undercut the financial motive, which is progress.

That's pretty good for 2023.

With Stable Diffusion you can intentionally leave an "invisible watermark" that machines can easily detect but humans cannot see. The idea being that in the future you don't accidentally train on already-AI-generated images. I'd hope most sites are doing that, but it can be turned off easily enough. Apart from that I'm not sure.

I could have sworn I saw an article talking about how there were noise artifacts that were fairly obvious, but now I can't turn anything up. The watermark should help things, but outside of that it looks like there's just a training dataset of pure generative AI images (GenImage) to train another AI to detect generated images. I guess we'll see what happens with that.
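For the curious, the basic idea of an invisible watermark can be sketched with a toy least-significant-bit scheme. To be clear, this is just an illustration I'm making up for the thread; Stable Diffusion's actual watermark uses a more robust frequency-domain (DWT-DCT) method via the invisible-watermark library, not raw LSBs:

```python
# Toy invisible watermark: hide a tag in the least-significant bits of
# pixel values. A +/-1 change in an 8-bit channel is invisible to a
# human, but a machine can read it back exactly. (Illustrative only;
# real schemes embed in the frequency domain to survive re-encoding.)

def embed(pixels: list[int], tag: bytes) -> list[int]:
    """Write each bit of `tag` into the LSB of successive pixel values."""
    bits = [(byte >> i) & 1 for byte in tag for i in range(8)]
    out = pixels[:]
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, set it to the tag bit
    return out

def extract(pixels: list[int], n_bytes: int) -> bytes:
    """Read the LSBs back and reassemble the tag, LSB-first per byte."""
    bits = [p & 1 for p in pixels[: n_bytes * 8]]
    return bytes(
        sum(bits[b * 8 + i] << i for i in range(8)) for b in range(n_bytes)
    )

fake_image = list(range(64))            # stand-in for 8-bit pixel data
marked = embed(fake_image, b"AIGEN")
assert extract(marked, 5) == b"AIGEN"   # the tag survives round-trip
# no pixel moved by more than 1, so the mark is invisible to the eye:
assert all(abs(a - b) <= 1 for a, b in zip(fake_image, marked))
```

Which also shows the commenter's point: anyone generating images locally can simply skip the embedding step, so the watermark only helps against images from well-behaved pipelines.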

Unfortunately, no, you just need training data on children in general and training data with legal porn, and these tools can combine it.

It's already being done, which is disgusting but not surprising.

People have worried about this for a long time. I remember a subplot of a sci-fi series that got into this. (I think it was The Lost Fleet, 15 years ago).

You'd also have to convince them that it's not real. It'll probably end up creating laws tbh. Then there are weird things like Japan where lolis are legal, but uncensored genitals aren't, even drawn.

While amazing, most of these are hilariously wrong. Let's just say the girl with 4 bellybuttons was one of the tamer mistakes I saw.

There's a sub in lemmynsfw for it so post away there.

Eh it's still very obvious.

I predict that small imperfections will get even hotter as time goes by

Welp, it's all humans being generated, so I'm safe. Realistic-looking humans ain't my jam.

Here is an alternative Piped link(s):

[https://www.youtube.com/watch?v=Fg6JzoCEWx8&t=24s&ab_channel=edit](https://piped.video/watch?v=Fg6JzoCEWx8&t=24s&ab_channel=edit)

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source, check me out at GitHub.

Are We Ready For This Site's Endless Feed of AI-Generated Piped Links?


I have been waiting for this day ever since I first heard about AI-generated images a decade or so ago.

As long as there are simps, there will be a need for this trash.

Super hot and spicy take incoming: AI will be able to make very realistic child porn, and we might actually see a huge drop in sexual child abuse.

I hate to even type that sentence, btw.

It already is being used to make CSAM. I work for a hosting provider and just the other day we closed an account because they were intentionally hosting AI generated CSAM.

Can I ask why AI-generated media is considered CSAM if there are no victims? I don't like furry porn, but it's not bestiality. I don't like loli shit, but it's not CP (well, technically it is, but it's not real kids is my point). How is it any different?

Is it gross? Obviously, but I'm biased as I don't like kiddie shit but no one is getting hurt and if it helps reduce sexual abuse cases against kids, why wouldn't you be in favor of it?

I don't understand how this is unreasonable. If AI generated CP increased the stats of kids being harmed then I'd be vehemently opposed. I know it's a touchy subject but you can't just write it off if it works for the greater good, no?

The report came from a (non-US) government agency. It wasn't reported as AI generated, that was what we discovered.

But it highlights the reality - while AI generated content may be considered fairly obvious for now, it won't be forever. Real CSAM could be mixed in at some point, or, hell, the characters generating it could be feeding it real CSAM to have it recreate it in a manner that makes it harder to detect.

So what does this mean for hosting providers? We continuously receive reports for a client and each time we have to review it and what, use our best judgement to decide if it's AI generated? We add the client to a list and ignore CSAM reports for them? We have to tell the government that it's not "real CSAM" and expect it to end there?

No legitimate hosting provider is going to knowingly host CSAM, AI generated or not. We aren't going to invest legal resources into defending that, nor are we going to jeopardize the mental well-being of our staff by increasing the frequency of those reports.

But it highlights the reality - while AI generated content may be considered fairly obvious for now, it won't be forever. Real CSAM could be mixed in at some point, or, hell, the characters generating it could be feeding it real CSAM to have it recreate it in a manner that makes it harder to detect.

Very true and I would like to look into it further. Being able to disguise real content with an AI label could make things harder for people that detect and report these types of issues.

So what does this mean for hosting providers? We continuously receive reports for a client and each time we have to review it and what, use our best judgement to decide if it's AI generated? We add the client to a list and ignore CSAM reports for them? We have to tell the government that it's not "real CSAM" and expect it to end there?

I don't understand the logic behind this. If it's your job to analyze and deduce whether certain content is or is not acceptable, why shouldn't you make assessments on a case by case basis? Even if you remove CSAM from the equation you still have to continuously sift through content and report any and all illegal activities - regardless of its frequency.

No legitimate hosting provider is going to knowingly host CSAM, AI generated or not. We aren't going to invest legal resources into defending that, nor are we going to jeopardize the mental well-being of our staff by increasing the frequency of those reports.

And it's the right of any website or hosting provider to not show any content they deem unsuitable for its viewers. But this is a non sequitur - illegal activities will never stop, and it's the duty of people like you to help combat the distribution of such materials. I appreciate all the work people like you do, and it's a job I couldn't handle. CP exists and will continue to exist. It's just an ugly truth. I'm just asking a very uncomfortable question that will hopefully result in a very positive answer: can AI-generated CP reduce the harm done to children?

Here's a very interesting article about the potential positive effects of AI-generated CP.

Btw I appreciate your input in all of this. It means a lot coming from someone actually involved with this sort of thing.

Edit: and to your point, the article ends with a very real warning:

"Of course, using AI-generated images as a form of rehabilitation, alongside existing forms of therapy and treatment, is not the same as allowing its unbridled proliferation on the web.

“There’s a world of difference between the potential use of this content in controlled psychiatric settings versus what we’re describing here, which is just, anybody can access these tools to create anything that they want in any setting,” said Portnoff, from Thorn."

I don’t understand the logic behind this. If it’s your job to analyze and deduce whether certain content is or is not acceptable, why shouldn’t you make assessments on a case by case basis?

The bit about "ignoring it" was more in jest. We do review each report and handle it on a case-by-case basis; my point with this statement is that someone hosting questionable content is going to generate a lot of reports, regardless of whether it is illegal or not, and we won't take an operating loss and let them keep hosting with us.

Usually we try to determine whether it was intentional or not; if someone is hosting CSAM and is quick and responsive with resolving the issue, we generally won't immediately terminate them for it. But even if they (our client) are a victim, we are not required to host for them, and after a certain point we will terminate them.

So when we receive a complaint about a user hosting CSAM, and we review it and see they are hosting a site advertising itself as a place for users to distribute AI-generated CP, we aren't going to let them continue hosting with us.

Even if you remove CSAM from the equation you still have to continuously sift through content and report any and all illegal activities - regardless of its frequency.

This is not an accurate statement, at least in the U.S. where we are based. We are not (yet) required to sift through any and all content uploaded on our servers (not to mention the complexity of such an undertaking making it virtually impossible at our level). There have been a few laws proposed that would have changed that, as we've seen in the news from time to time. We are required to handle reports we receive about our clients.

Keep in mind when I say we are a hosting provider, I'm referring to pretty high up the chain - we provide hosting to clients that would, say, host a Lemmy instance, or a Discord bot, or a personal NextCloud server, to name a few examples. A common dynamic is how much abuse your hosting provider is willing to put up with, and if you recall, with the CSAM attacks on Lemmy instances, part of the discussion was risking getting their servers shut down.

Which is valid, hosting providers will only put up with so much risk to their infrastructure, reputation, and / or staff. Which is why people who run sites like Lemmy or image hosting services do usually want to take an active role in preventing abuse - whether or not they are legally liable won't matter when we pull the plug because they are causing us an operating loss.

And it’s the right of any ... [continued]

I'm just going to reply to the rest of your statement down here, I think I did not make my intent/purpose clear enough. I originally replied to your statement talking about AI being used to make CP in the future by providing a personal anecdote about it already happening. To which you asked a question as to why I defined AI generated CP as CSAM, and I clarified. I wasn't actually responding to the rest of that message. I was not touching the topic or discussion of what impact it might have on the actual abuse of children, merely providing my opinion as to why, whether legal or not, hosting providers aren't ever going to host that content.

The content will be hosted either way, but whether it is merely relegated to "offshore" providers, still accessible via normal means and not treated as criminal content, or becomes another part of the dark web, will be determined at some point in the future. It hasn't become a huge issue yet, but it is rapidly approaching that point.

The fact that it can make CP at all is the reason why it needs to be banned outright.

EDIT: Counting 8 9 10 11 butthurt pedophiles afraid their new CP source will be banned

Nah, just hook it up to some predator drones and build a pedo hunting skynet.

No one is butthurt. I have no interest in CP (thank fucking god), but if it means people get their rocks off at home without hurting any kids, then I'm all for it.

What's interesting is you have a strong disdain for fake porn but no real argument against it other than "heeeyuck kiddy porn bad aaahheeeyuuck". 😂

Edit: no real arguments and just downvotes? Seems like a typical facts-vs-feelings argument ¯\\\_(ツ)_/¯


I think words like “thirsty” and “thirst trap” are self defeating. Actual thirst is your body’s biological response to a need for water. Without water, you’ll die. Calling porn sites “thirst traps” suggests that people have a critical need for porn - which is often contrary to the point the author is trying to make.

Arguing about the ethics or morality of something that’s “necessary” for survival is irrelevant. Anyone who’s against pornography is perfectly within their rights to share their opinion, but they should avoid the word “thirst”.

Edit: I know the issue of pornography can evoke strong feelings in some, but I’m only talking about word choice, folks.

I’m also not one of those tiresome people who refuses to admit they’re wrong and acts like a child when faced with sound opposition. If someone would actually put their opinion into words instead of just downvoting, I’d appreciate it.

Man, you really do sound like a redditor.

A redditor would try to hide their gross opinion behind a thin veneer of logic. I’m not getting into the ethics of porn at all.

I’m just talking about phrasing. I think “thirst” is a poor analogy.

I think words like “thirsty” and “thirst trap” are self defeating. Actual thirst is your body’s biological response to a need for water. Without water, you’ll die. Calling porn sites “thirst traps” suggests that people have a critical need for porn - which is often contrary to the point the author is trying to make.

a thin veneer of logic

He's right. You're not actually listening to what he's saying at all. You're the one who sounds like a redditor.

Consider another association with thirst: desperation. In the mind of the author, porn consumption is negative, so anyone consuming porn is doing so out of desperation, despite knowing better. It essentially describes people being controlled by their base instincts - and thus this site is a trap, luring people in against their will.

That is how I would interpret the word thirst in this context anyway. It's not about a critical need, it's about thirst being irrational and highly compulsive.

Awesome! Thank you! This is the kind of thought-provoking response I was hoping for.

Additionally, it’s a really good point. I totally missed this interpretation, and I think it’s better than mine.

Sure, no worries! I haven't been disappointed yet by responding to downvoted comments so I will keep doing it :)

Just another similar metaphor: Power hungry. Not the stove kind but the dictator kind. To be honest, there are quite a lot of body related metaphors, e.g. drowning in trouble, blinded by ambition. I guess it comes down to evoking some strong emotion.

I think a lot of people (myself included) are used to the combative, hostile environment of reddit. An actual friendly disagreement is a foreign concept to them. Personally, I enjoy it when other people present different perspectives. It’s both interesting and a great way to learn.

And yeah, “power hungry” has a similar vibe. Nice!

Totally agree, though I haven't really participated that much on Reddit. Seems like any disagreement is quickly framed as trolling over there.

Yeah. People are very defensive, because even the insinuation of disagreement is taken as a personal attack. It gets into your head, and takes a while to get over. You forget that reasonable, well-meaning conversations are even possible. It sucks. I’ll never go back.

That's hardly even an interpretation. Thirst and hunger as verbs have always meant to desire something.

It's slang. There are tons of words in every language that don't make sense but get used a certain way because that's how people use them. Next thing you're gonna tell me is that in the 1980s when people called things bad you were upset because they actually meant good.

There's also the fact that not all porn is exploitative

Make up your mind. Last week every porn was exploitative because reasons, and now it’s not because people could lose their jobs.

Little-known fact - there's more than one person out there, and those extra people have lots of differing opinions!

But it's funny how people never care about nuance when they're trying to make a point... only when someone else is trying to make a point.

Then why do they say it’s a fact?

Because it is a fact, and anyone who says “all porn is exploitative” is projecting their morals and idealism across a whole industry with zero regard for nuance.