A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

  • funkless_eck@sh.itjust.works · 2 years ago

    “eh I’ll take a look”

    first thing I see is a woman on her back with her legs behind her head, smooth skin where her genitals should be and nipples in the middle of her buttocks.

    “alright then”

  • just_another_person@lemmy.world · 2 years ago

    At what point was porn NOT graphic, but now this thing IS GRAPHIC? Are we talking all caps, or just a small difference between the live stuff and the AI shit? Inquiring minds want to know.

  • randon31415@lemmy.world · 2 years ago

    When I first heard Stable Diffusion was going open source, I knew this would happen. The only thing I’m surprised at is that it took almost 2 years.

  • cley_faye@lemmy.world · 2 years ago

    “Are we ready”, in the sense that for now it’s 95% garbage and 5% completely generic but passable looking stuff? Eh.

    But as this increases in quality, the answer would be… who cares. It would suffer from the same major issues as other large models: sourcing the data, and deciding the rights over the output. As for it being porn… maybe there’s no point in focusing on that specific issue.

  • Sume@reddthat.com · 2 years ago

    Not sure how people can be so into this shit. It’s all so generic looking.

    • BreakDecks@lemmy.ml · 2 years ago

      The actual scary use case for AI porn is that if you can get 50 or more photos of the same person’s face (almost anyone with an Instagram account), you can train your own LoRA model to generate believable images of them, which means you can now make “generic looking” porn with pretty much any person you want to see in it. Basically the modern equivalent of gluing cutouts of your crush’s face onto the Playboy centerfold, only with automated distribution over the Internet…

        • pinkdrunkenelephants@sopuli.xyz · 2 years ago

          So how will any progressive politician be able to be elected then? Because all the fascists would have to do is generate porn with their opponent’s likeness to smear them.

          Or even worse, deepfake evidence of rape.

          Or even worse than that, generate CSAM with their likeness portrayed abusing a child.

          They could use that to imprison not only their political opponents, but anyone for anything, and people would think whoever is being disappeared this week actually is a pedophile or a rapist and think nothing of it.

          Actual victims’ movements would be chopped off at the knee, because now there’s no definitive way to prove an actual rape happened since defendants could credibly claim real videos are just AI generated crap and get acquitted. No rape or abuse claims would ever be believed because there is now no way to establish objective truth.

          This would leave the fascists open to do whatever they want to anybody with no serious consequences.

          But no one cares because they want AI to do their homework for them so they don’t have to think, write, or learn to be creative on their own. They want to sit around on their asses and do nothing.

          • hyperhopper@lemmy.ml · 2 years ago

            People will have to learn to stop believing everything they see. This has been possible with Photoshop for even more than a decade now. All that’s changed is that it takes less skill and time now.

            • pinkdrunkenelephants@sopuli.xyz · 2 years ago

              That’s not possible with AI-generated images that are impossible to distinguish from reality, or even with expertly done Photoshops. The practice, and generative AI as a whole, needs to be banned. They’re putting AI in Photoshop too, so ban that garbage as well.

              It has to stop. We can’t allow the tech industry to enable fascism and propaganda.

                • pinkdrunkenelephants@sopuli.xyz · 2 years ago

                  Nah, that Thanos I-am-inevitable shit doesn’t work on me. They can ban AI, you all just don’t want it because generative AI allows you to steal other people’s talent so you can pretend you have your own

              • CoolCat38@lemmy.world · 2 years ago

                Can’t tell whether this is bait or if you are seriously that much of a Luddite.

                • pinkdrunkenelephants@sopuli.xyz · 2 years ago

                  Oh look at that, they just released pictures of you raping a 4-year-old, off to prison with you. Never mind they’re not real. That’s the world you wanted and those are the consequences you’re going to get if you don’t stop being lazy and learn to reject terrible things on ethical grounds.

          • Liz@midwest.social · 2 years ago

            We’re going to go back to the old model of trust, before videos and photos existed. Consistent, coherent stories from sources known to be trustworthy will be key. Physical evidence will be helpful as well.

            • pinkdrunkenelephants@sopuli.xyz · 2 years ago

              But then people will say “Well how do we know they’re not lying?” and then it’s back to square 1.

              Victims might not ever be able to get justice again if this garbage is allowed to continue. Society’s going so off-track.

          • Silinde@lemmy.world · 2 years ago

            Because that’s called libel and is very much illegal in practically any country on earth - and depending on the country it’s either easy or trivial to put forth and win a libel case in court, since the onus is on the defendant to prove what they said was entirely true, and “just trust me and this actress I hired, bro” doesn’t cut it.

              • Silinde@lemmy.world · 2 years ago

                The burden of liability will then fall on the media company, which can then be sued for not carrying out due diligence in reporting.

    • Psythik@lemm.ee · 2 years ago

      AI is still a brand new tech. It’s like getting mad at AM radio for being staticky and low quality. It’ll improve with time as we get better tech.

      Personally I can’t wait to see what the future holds for AI porn. I’m imagining being able to get exactly what you want with a single prompt, and it looks just as real as reality. No more opening 50 tabs until you find the perfect video. Sign me the fuck up.

  • themeatbridge@lemmy.world · 2 years ago

    Does it say something about society that our automatons are better at creating simulated genitals than they are at hands?

  • joelfromaus@aussie.zone · 2 years ago

    Went and had a look and it’s some of the funniest stuff I’ve seen all day! A few images come close to realism, but a lot of them are the sort of AI fever-dream stuff that you could not make up.

  • RBWells@lemmy.world · 2 years ago

    Meh. It’s all only women and so samey samey. Not sexy IMO, but I don’t think fake is necessarily not hot; art certainly can be.

    • Zerfallen@lemmy.world · 2 years ago

      You can change it to men, but most of the results are intersex(?) or outright women anyway. I guess the training data is heavily weighted toward examples of women.

  • Armen12@lemm.ee · 2 years ago

    AI porn for the longest time has just looked so off to me, idk what it is

    • Rustmilian@lemmy.world · 2 years ago

      Hentai, maybe. But realistic shit is 100% illegal; even just making such an AI would require breaking the law, as you’d have to use real CSAM to train it.

    • mrnotoriousman@kbin.social · 2 years ago

      There was an article the other day about underage girls in France having AI nudes spread around based on photos as young as 12. Definitely harm there.

      • Jesus_666@feddit.de · 2 years ago

        Typically, the laws get amended so that anything that looks like CSAM is now CSAM. Expect porn generators tuned for minor characters to get outlawed very quickly.

    • 👁️👄👁️@lemm.ee · 2 years ago

      You’d also have to convince them that it’s not real. It’ll probably end up creating laws tbh. Then there are weird things like Japan where lolis are legal, but uncensored genitals aren’t, even drawn.

    • Knusper@feddit.de · 2 years ago

      Well, to develop such a service, you need training data, i.e. lots of real child pornography in your possession.

      Legality for your viewers will also differ massively around the world, so your target audience may not be very big.

      And you probably need investors, which likely have less risky projects to invest into.

      Well, and then there’s also the factor of some humans just not wanting to work on disgusting, legal grey area stuff.

      • Womble@lemmy.world · 2 years ago

        yup, just like the AI needed lots of pictures of astronauts on horses to make pictures of those…

        • JonEFive@midwest.social · 2 years ago

          Exactly. Some of these engines are perfectly capable of combining differing concepts. In your example, it knows basically what a horse looks like, and what a human riding on horseback looks like. It also knows that an astronaut looks very much like a human without a space suit and can put the two together.

          Saying nothing of the morality, in this case I suspect that an AI could be trained using pictures of clothed children, perhaps combined with nude images of people who are of age and just very slim or who otherwise have a youthful appearance.

          While I think it’s repugnant in concept, I also think that, for those seeking this material, I’d much rather it be AI generated than an actual exploited child. Realistically though, I doubt this would have any notable impact on the prevalence of CSAM, and it might even make it more accessible.

          Furthermore, if the generative AI gets good enough, it could make it difficult to determine whether an image is real or AI generated. That would make it more difficult for police to find the child and offender to try to remove them from that situation. So now we need an AI to help analyze and separate the two.

          Yeah… I don’t like living in 2023 and things are only getting worse. I’ve put way more thought into this than I ever wanted to.

          • Ryantific_theory@lemmy.world · 2 years ago

            Aren’t AI-generated images pretty obvious to detect from noise analysis? I know there’s no effective detection for AI-generated text, and it’s not that there won’t be projects to train AI to generate perfectly realistic images, but it’ll be a while before it does fingers right, let alone invisible pixel artifacts.

            As a counterpoint, won’t the prevalence of AI generated CSAM collapse the organized abuse groups, since they rely on the funding from pedos? If genuine abuse material is swamped out by AI generated imagery, that would effectively collapse an entire dark web market. Not that it would end abuse, but it would at least undercut the financial motive, which is progress.

            That’s pretty good for 2023.

            • JackbyDev@programming.dev · 2 years ago

              With StableDiffusion you can intentionally leave an “invisible watermark” that machines can easily detect but humans cannot see. The idea being that in the future you don’t accidentally train on already AI generated images. I’d hope most sites are doing that but it can be turned off easily enough. Apart from that I’m not sure.
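
              For reference, the Stable Diffusion release scripts do this with the open-source invisible-watermark library (a DWT/DCT watermark baked into the pixel data). A rough sketch of embedding and reading one back — the file names and the payload here are just made-up placeholders:

              import cv2
              from imwatermark import WatermarkDecoder, WatermarkEncoder

              payload = b"SDV2"  # example 4-byte payload = 32 bits

              # Embed the watermark into the pixels; the change is invisible to the eye.
              bgr = cv2.imread("image.png")
              encoder = WatermarkEncoder()
              encoder.set_watermark("bytes", payload)
              cv2.imwrite("image_wm.png", encoder.encode(bgr, "dwtDct"))

              # A machine can later read it back out, even though a human can't see it.
              decoder = WatermarkDecoder("bytes", 32)  # payload length in bits
              recovered = decoder.decode(cv2.imread("image_wm.png"), "dwtDct")
              print(recovered)  # b"SDV2" if the watermark survived

              Since it lives in the generation script, though, it really is just a courtesy marker: anyone can skip that step.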

              • Ryantific_theory@lemmy.world · 2 years ago

                I could have sworn I saw an article talking about how there were noise artifacts that were fairly obvious, but now I can’t turn anything up. The watermark should help things, but outside of that it looks like there’s just a training dataset of pure generative AI images (GenImage) to train another AI to detect generated images. I guess we’ll see what happens with that.
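
                For what it’s worth, the “another AI” part is fairly mundane: you fine-tune an ordinary image classifier on a real-vs-generated split. A minimal sketch with PyTorch/torchvision — the folder layout, model choice, and hyperparameters are placeholders, not a vetted detector:

                import torch
                import torch.nn as nn
                from torch.utils.data import DataLoader
                from torchvision import datasets, models, transforms

                # Assumed layout: data/train/real/*.png and data/train/fake/*.png
                transform = transforms.Compose([
                    transforms.Resize((224, 224)),
                    transforms.ToTensor(),
                ])
                train_set = datasets.ImageFolder("data/train", transform=transform)
                loader = DataLoader(train_set, batch_size=32, shuffle=True)

                device = "cuda" if torch.cuda.is_available() else "cpu"
                model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
                model.fc = nn.Linear(model.fc.in_features, 2)  # real vs. generated
                model = model.to(device)

                optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
                criterion = nn.CrossEntropyLoss()

                model.train()
                for epoch in range(3):
                    for images, labels in loader:
                        images, labels = images.to(device), labels.to(device)
                        optimizer.zero_grad()
                        loss = criterion(model(images), labels)
                        loss.backward()
                        optimizer.step()

                How well something like that holds up against generators it wasn’t trained on is exactly the open question.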

      • d13@programming.dev · 2 years ago

        Unfortunately, no, you just need training data on children in general and training data with legal porn, and these tools can combine the two.

        It’s already being done, which is disgusting but not surprising.

        People have worried about this for a long time. I remember a subplot of a sci-fi series that got into this. (I think it was The Lost Fleet, 15 years ago).

  • 👁️👄👁️@lemm.ee · 2 years ago

    Eh it’s still very obvious.

    I predict that small imperfections will get even hotter as time goes by