• @bleistift2@feddit.de
    26 · 1 year ago

    Isn’t it a good thing for pedophiles to have an outlet for their desires that doesn’t involve harming children? Am I not seeing an obvious downside?

    • PorkRollWobbly
      32 · 1 year ago

      Pedophilia is not a sexuality and CSAM, AI generated or not, is not a healthy outlet. Pedophilia should be treated as a disease, and pedophiles should receive treatment for that instead.

      • idunnololz
        7 · 1 year ago (edited)

        AFAIK you can’t “cure” pedophilia, the same way you can’t “cure” homosexuality. The best you can do is teach people not to act on their desires.

      • @bleistift2@feddit.de
        2 · 1 year ago

        pedophiles should receive treatment for that instead

        In a world where many people cannot afford basic healthcare or – if they can afford it – where healthcare isn’t available in the required quantity, does your argument still hold?

    • @PotatoKat@lemmy.world
      12 · 1 year ago

      If I’m not mistaken, I remember reading that consuming CSAM increases the likelihood of offending, since it normalizes the act and makes the fantasies more vivid. It makes them want to act out what they see rather than removing the desire.

    • @klingelstreich@feddit.de
      5 · 1 year ago

      It depends on whether you hold a world view where every person is valuable and needs help and understanding to become their best self, or one where there are good and bad people, and the baddies need to be punished and locked away so everyone else can live their lives in peace.

            • @Kusimulkku@lemm.ee
              1 · 1 year ago

              I’m not saying it’s a better alternative; I’m saying it might not make sense to talk about it “involving minors”.

                • Norgur
                  1 · 1 year ago

                  That’s not being picky about wording.
                  While I agree that stuff like that should not exist at all, in any way whatsoever, there is a vast difference between it existing because someone abused a child, recorded it and thus scarred the child for life, and it existing because someone made a computer make up pixels in a way that is disgusting.

                  • @bleistift2@feddit.de
                    1 · 1 year ago (edited)

                    someone made a computer make up pixels in a way that is disgusting

                    I like that take. It lends itself to comparison: The Saw movies were well-received (at first), even though most people would abhor hurting others in this way.

            • Norgur
              1 · 1 year ago

              That’s a rather useless contribution to the discussion. The initial argument was a line of reasoning for why artificial CSAM might be a benefit: it could let people vent their otherwise harmful urges without harming actual people. You just flat-out responded that it is enabling and doesn’t stop distribution. In other words, you responded with “no, u wrong”. Care to tell us your reasons for your stance?