• @bleistift2@feddit.de · 26 points · 1 year ago

    Isn’t it a good thing for pedophiles to have an outlet for their desires that doesn’t involve harming children? Or am I missing an obvious downside?

    • PorkRollWobbly · 32 points · 1 year ago

      Pedophilia is not a sexuality, and CSAM, AI-generated or not, is not a healthy outlet. Pedophilia should be treated as a disease, and pedophiles should receive treatment for it instead.

      • idunnololz · 7 points · 1 year ago (edited)

        AFAIK you can’t “cure” pedophilia, just as you can’t cure homosexuality. The best you can do is teach people not to act on their desires.

      • @bleistift2@feddit.de · 2 points · 1 year ago

        “pedophiles should receive treatment for that instead”

        In a world where many people cannot afford basic healthcare, or where healthcare isn’t available in sufficient quantity even when they can, does your argument still hold?

    • @PotatoKat@lemmy.world · 12 points · 1 year ago

      If I’m not mistaken, I remember reading that consuming CSAM increases the likelihood of offending, since it normalizes the act and makes the fantasies more vivid. It makes them more inclined to act out what they see, rather than removing the desire.

    • @klingelstreich@feddit.de · 5 points · 1 year ago

      It depends on whether you hold a worldview where every person is valuable and needs help and understanding to become their best self, or one where there are good people and bad people, and the baddies need to be punished and locked away so everyone else can live in peace.

            • @Kusimulkku@lemm.ee · 1 point · 1 year ago

              I’m not saying it’s a better alternative; I’m saying it might not make sense to talk about it “involving minors”.

                • Norgur · 1 point · 1 year ago

                  That’s not being picky about wording.
                  While I agree that such material should not exist at all, in any way whatsoever, there is a vast difference between it existing because someone abused a child, recorded it, and thus scarred the child for life, and it existing because someone made a computer generate pixels in a disgusting way.

            • Norgur · 1 point · 1 year ago

              That’s a rather useless contribution to the discussion. The initial argument was a line of reasoning for why artificial CSAM might be a benefit: it could let people vent their otherwise harmful urges without harming actual people. You just flat-out responded “it is enabling and doesn’t stop distribution”. In other words, you responded with “no, u wrong”. Care to tell us the reasons behind your stance?

                • @bleistift2@feddit.de · 1 point · 1 year ago

                  “it is enabling and doesn’t stop distribution”

                  Norgur’s point is that you didn’t provide any reasoning why that should be the case.

  • @OsrsNeedsF2P@lemmy.ml · 9 points · 1 year ago (edited)

    On one hand, yes, but on the other, Stable Horde developed a model to detect CSAM thanks to Stable Diffusion, and that’s being used to combat pedos globally.

  • neuropean · 3 points · 1 year ago

    What’s interesting is that mammals from mice to dogs don’t draw a distinction between arbitrary ages before trying to copulate. On the other hand, they don’t try to fuck the equivalent of pre-pubescent members of their species either; there’s nothing natural about that.