A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.
“eh I’ll take a look”
first thing I see is a woman on her back with her legs behind her head, smooth skin where her genitals should be and nipples in the middle of her buttocks.
“alright then”
Once again pornography is setting unrealistic standards for women.
Great, more man made horrors beyond comprehension
Non-Euclidean anatomy
BBC Escher.
World’s first mandelboobs - fractal porn enthusiast community incoming
if xkcd was right about jpeggy porn being niche, i’d bank on terrible AI porn becoming a niche in the future too.
Yeh this site has nothing on some of the insane ai creators on pixiv
woman with a dinosaur and a woman without legs for me
i prefer the pregnant woman with huge boobs instead of a pregnant stomach (and also less huge boobs where they normally are)
I personally can’t wait for AI to cause a thotmarket collapse.
But then who will you treat like shit?
We’ve programmed a robot to be treated like shit.
I also gave it clinical depression!
The AI eGirl, but it’ll be alright because I’ll pay CeoGPT an extra $5/month for a model that’s into that shit.
Gotcha. You’re fine with paying someone to pretend they’d willingly fuck you, you’re just not comfortable with the money for it going anywhere except into an old white billionaire’s pocket.
I’m sure there’s nothing to unpack there.
Technically, if you’re versed enough, you can already do that, but it takes some effort.
At what point was porn NOT graphic, but now this thing IS GRAPHIC. Are we talking all caps, or just a small difference between the live stuff and the AI shit? Inquiring minds want to know.
deleted by creator
When I first heard Stable Diffusion was going open source, I knew this would happen. The only thing I’m surprised at is that it took almost 2 years.
deleted by creator
“Are we ready”, in the sense that for now it’s 95% garbage and 5% completely generic but passable looking stuff? Eh.
But as this increases in quality, the answer would be… who cares. It would suffer from the same major issues as other large models: sourcing data, and how we decide the rights of the output. As for it being porn… maybe there’s no point in focusing on that specific issue.
Not sure how people will be so into this shit. It’s all so generic looking
The actual scary use case for AI porn is that if you can get 50 or more photos of the same person’s face (almost anyone with an Instagram account), you can train your own LoRA model to generate believable images of them, which means you can now make “generic looking” porn with pretty much any person you want to see in it. Basically the modern equivalent of gluing cutouts of your crush’s face onto the Playboy centerfold, only with automated distribution over the Internet…
deleted by creator
So how will any progressive politician be able to be elected then? Because all the fascists would have to do is generate porn with their opponent’s likeness to smear them.
Or even worse, deepfake evidence of rape.
Or even worse than that, generate CSAM with their likeness portrayed abusing a child.
They could use that to imprison not only their political opponents, but anyone for anything, and people would think whoever is being disappeared this week actually is a pedophile or a rapist and think nothing of it.
Actual victims’ movements would be chopped off at the knee, because now there’s no definitive way to prove an actual rape happened since defendants could credibly claim real videos are just AI generated crap and get acquitted. No rape or abuse claims would ever be believed because there is now no way to establish objective truth.
This would leave the fascists open to do whatever they want to anybody with no serious consequences.
But no one cares because they want AI to do their homework for them so they don’t have to think, write, or learn to be creative on their own. They want to sit around on their asses and do nothing.
People will have to learn to stop believing everything they see. This has been possible with Photoshop for even more than a decade now. All that’s changed is that it takes less skill and time now.
That’s not possible when AI-generated images are impossible to distinguish from reality, or even from expertly done Photoshops. The practice, and generative AI as a whole, needs to be banned. They’re putting AI in Photoshop too, so ban that garbage as well.
It has to stop. We can’t allow the tech industry to enable fascism and propaganda.
This reads like satire.
You’re proposing to build a dam with duct tape.
Nah, that Thanos I-am-inevitable shit doesn’t work on me. They can ban AI, you all just don’t want it because generative AI allows you to steal other people’s talent so you can pretend you have your own
Can’t tell whether this is bait or if you are seriously that much of a Luddite.
Oh look at that, they just released pictures of you raping a 4-year-old, off to prison with you. Never mind they’re not real. That’s the world you wanted and those are the consequences you’re going to get if you don’t stop being lazy and learn to reject terrible things on ethical grounds.
We’re going to go back to the old model of trust, before videos and photos existed. Consistent, coherent stories from sources known to be trustworthy will be key. Physical evidence will be helpful as well.
But then people will say “Well how do we know they’re not lying?” and then it’s back to square 1.
Victims might not ever be able to get justice again if this garbage is allowed to continue. Society’s going so off-track.
How often does video evidence of rape exist, though? I don’t think this really changes anything for most victims.
See Steubenville, Ohio, where the dumb motherfuckers date-raped a girl and put the video on Facebook.
People do shit like that.
Because that’s called libel, and it’s very much illegal in practically any country on earth. Depending on the country it’s either easy or trivial to put forth and win a libel case in court, since the onus is on the defendant to prove that what they said was entirely true, and “just trust me and this actress I hired, bro” doesn’t cut it.
So what happens when evildoers give AI-generated deepfakes to news media so they can avoid liability?
The burden of liability will then fall on the media company, which can then be sued for not carrying out due diligence in reporting.
deleted by creator
AI is still a brand-new tech. It’s like getting mad at AM radio for being staticky and low quality. It’ll improve with time as we get better tech.
Personally I can’t wait to see what the future holds for AI porn. I’m imagining being able to get exactly what you want with a single prompt, and it looks just as real as reality. No more opening 50 tabs until you find the perfect video. Sign me the fuck up.
Cam girls are going to lose their jobs.
There will always be a market for the “real thing”
Sure, but a much smaller one.
The article mentioned that at least one OnlyFans creator reached out to make a model of their own content, and also that some OnlyFans creators outsource writers to chat with fans. I don’t think this will meaningfully affect cam girls’ jobs. Once we can generate live animated images in real time with convincing speech, then probably.
I really couldn’t care less.
Good
Does it say something about society that our automatons are better at creating simulated genitals than they are at hands?
It says that we are biologically predisposed to sex, which we are, like animals, which we are.
It doesn’t say anything about society, it just confirms the human condition.
But… why are hands so difficult?
I’m sure it would have the same difficulty generating five properly articulating penises attached to a limb.
I’ll admit I have some difficulty with that concept myself.
Ever watch the “Everything Everywhere All at Once” hotdog scenes?
Fully Articulated Hand Penises!
deleted by creator
They suck quite a lot at genitals
Are we still doing “phrasing”?
Said Ripley to the android Bishop.
On a visual level, we are more interested in genitals than hands? Also, faces.
Went and had a look, and it’s some of the funniest stuff I’ve seen all day! A few images come close to realism, but a lot of them are the sort of AI fever-dream stuff that you could not make up.
Meh. It’s all only women and so samey samey. Not sexy IMO, but I don’t think fake is necessarily not hot, art can be, certainly.
You can change it to men, but most of the results are intersex(?) or outright women anyway. I guess the training data is heavily weighted toward examples of women.
AI has no fucks to give for people who are not ready.
AI porn for the longest time has just looked so off to me, idk what it is
deleted by creator
This looks good, until you try to figure out the fingers: https://civitai.com/images/1967206?modelVersionId=138176&prioritizedUserIds=6357&period=AllTime&sort=Most+Reactions&limit=20
and the toes here… https://civitai.com/images/1967203?modelVersionId=138176&prioritizedUserIds=6357&period=AllTime&sort=Most+Reactions&limit=20
The second one is pretty good until you notice the non-Euclidean nightmare doors.
You can do multiple subjects, but you have to be more creative than just throwing a prompt into a generator. You have to be willing to do it in multiple steps with inpainting.
PixAI is better than pornpen IMO
deleted by creator
No, I’m not ready for this one: KC7ROYwDs2lwWwnSz1Ds
This is like a Trainwreck. I can’t look away!
I am a little surprised that no one has created a site like this for child pornography.
I am not a legal expert, but my layman’s understanding of Ashcroft v. Free Speech Coalition https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition is that, as long as no actual person is harmed by it, simulated CSAM is legal.
Maybe later rulings have changed this. One can hope.
Hentai, maybe. But realistic shit is 100% illegal; even just making such an AI would require breaking the law, as you’d have to use real CSAM to train it.
There was an article the other day about underage girls in France having AI nudes spread around based on photos as young as 12. Definitely harm there.
You haven’t been to 4chan lately because that’s exactly what it’s being used for
Surely we should know, right? Cartoons or hentai or whatever must have gone through this at some point?
Typically, the laws get amended so that anything that looks like CSAM is now CSAM. Expect porn generators tuned for minor characters to get outlawed very quickly.
You’d also have to convince them that it’s not real. It’ll probably end up creating laws tbh. Then there are weird things like Japan where lolis are legal, but uncensored genitals aren’t, even drawn.
I’m sure they’re out there on the deep web.
Well, to develop such a service, you need training data, i.e. lots of real child pornography in your possession.
Legality for your viewers will also differ massively around the world, so your target audience may not be very big.
And you probably need investors, which likely have less risky projects to invest into.
Well, and then there’s also the factor of some humans just not wanting to work on disgusting, legal grey area stuff.
yup, just like the AI needed lots of pictures of astronauts on horses to make pictures of those…
Exactly. Some of these engines are perfectly capable of combining differing concepts. In your example, it knows basically what a horse looks like, and what a human riding on horseback looks like. It also knows that an astronaut looks very much like a human without a space suit and can put the two together.
Saying nothing of the morality, in this case I suspect that an AI could be trained using pictures of clothed children, perhaps combined with nude images of people who are of age but are very slim or otherwise have a youthful appearance.
While I think it’s repugnant in concept, I also think that, for those seeking this material, I’d much rather it be AI-generated than an actual exploited child. Realistically though, I doubt this would have any notable impact on the prevalence of CSAM, and it might even make it more accessible.
Furthermore, if the generative AI gets good enough, it could make it difficult to determine whether an image is real or AI generated. That would make it more difficult for police to find the child and offender to try to remove them from that situation. So now we need an AI to help analyze and separate the two.
Yeah… I don’t like living in 2023 and things are only getting worse. I’ve put way more thought into this than I ever wanted to.
Aren’t AI-generated images pretty easy to detect via noise analysis? I know there’s no effective detection for AI-generated text, and it’s not that there won’t be projects to train AI to generate perfectly realistic images, but it’ll be a while before it does fingers right, let alone invisible pixel artifacts.
As a counterpoint, won’t the prevalence of AI generated CSAM collapse the organized abuse groups, since they rely on the funding from pedos? If genuine abuse material is swamped out by AI generated imagery, that would effectively collapse an entire dark web market. Not that it would end abuse, but it would at least undercut the financial motive, which is progress.
That’s pretty good for 2023.
With Stable Diffusion you can intentionally leave an “invisible watermark” that machines can easily detect but humans cannot see. The idea being that in the future you don’t accidentally train on already-AI-generated images. I’d hope most sites are doing that, but it can be turned off easily enough. Apart from that I’m not sure.
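For the curious: Stable Diffusion’s invisible watermark is applied by the `invisible-watermark` package using a frequency-domain (DWT-DCT) scheme. As a toy illustration of the general principle only (this is NOT that scheme), here is a least-significant-bit watermark: a machine recovers the embedded bits exactly, while each pixel value changes by at most 1 out of 255, far below what a human can see.

```python
def embed_bit(pixel, bit):
    """Overwrite the least-significant bit of one 0-255 pixel value."""
    return (pixel & ~1) | bit

def embed(pixels, bits):
    """Hide one watermark bit in each pixel's LSB."""
    return [embed_bit(p, b) for p, b in zip(pixels, bits)]

def extract(pixels):
    """Read the watermark back out: just the LSB of each pixel."""
    return [p & 1 for p in pixels]

img = [200, 131, 54, 77, 90, 16, 254, 3]   # made-up pixel values
mark = [1, 0, 1, 1, 0, 0, 1, 0]            # made-up watermark bits

stamped = embed(img, mark)

# The machine recovers the mark perfectly...
assert extract(stamped) == mark
# ...while no pixel moved by more than 1, invisible to a human.
assert all(abs(a - b) <= 1 for a, b in zip(img, stamped))
```

LSB marks are trivially destroyed by resaving as JPEG, which is exactly why the real watermark lives in the frequency domain instead; and as the comment above says, either kind can simply be turned off at generation time.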
I could have sworn I saw an article talking about how there were noise artifacts that were fairly obvious, but now I can’t turn anything up. The watermark should help things, but outside of that it looks like there’s just a training dataset of pure generative AI images (GenImage) to train another AI to detect generated images. I guess we’ll see what happens with that.
Unfortunately, no, you just need training data on children in general and training data with legal porn, and these tools can combine it.
It’s already being done, which is disgusting but not surprising.
People have worried about this for a long time. I remember a subplot of a sci-fi series that got into this. (I think it was The Lost Fleet, 15 years ago).
Eh it’s still very obvious.
I predict that small imperfections will get even hotter as time goes by