@Adequately_Insane@lemmy.world to Memes@lemmy.ml • 2 years ago

It will only go downhill from here

lemmy.world

  • BolexForSoup • 2 years ago

    deleted by creator

    • massive_bereavement • 2 years ago

      Based on this article, it seems that teens were using an app: https://www.msn.com/en-us/money/other/ai-generated-child-sexual-abuse-images-could-flood-the-internet-a-watchdog-is-calling-for-action/ar-AA1iMZj5

      Is that your reference?

      • BolexForSoup • 2 years ago

        deleted by creator

        • massive_bereavement • 2 years ago

          Stable Diffusion still has a steep learning curve and requires some monetary investment in hardware or cloud GPU access. That means they probably have several hours to rethink how stupid what they’re doing is.

          A simple app you can download onto your phone and do this shit with is a pretty easy and quick way of ruining two lives (probably).

          Then again, the hammer should fall on the developers and the app store that allowed it in the first place (IMO).

          • BolexForSoup • 2 years ago

            deleted by creator

    • @bleistift2@feddit.de • 2 years ago

      And what happens when they start making requests of real underage people?

      That’s the whole point of my argument. They don’t need to make requests for real people if they can get fake ones of equal quality. Your argument reads like “We can’t let people have meat. What if they start eating live cows?”

      • BolexForSoup • 2 years ago

        deleted by creator

        • Norgur • 2 years ago

          First of all: that is exactly how you treat addicts. https://harmreductionjournal.biomedcentral.com/articles/10.1186/s12954-019-0340-4

          Secondly: no, we don’t have evidence that this might decrease the risk of pedophiles acting on their desires, since the technology is rather new.

          Of course we should not enable urges like that. Yet, we have to be realistic: there will always be those who can’t be treated. Do you want those who cannot be stopped from indulging in their desires to do so with images of real, abused children, or do you want them to vent on made-up images?

        • @bleistift2@feddit.de • 2 years ago

          your argument is implying that if somehow we allow them to use AI generated child porn that it will somehow stop them from seeking the real stuff out or is somehow “better.” There is literally no evidence that suggests that in the slightest.

          Of course. How would you procure such evidence? Give a group of pedophiles access to AI generated content and check if they molest children significantly less than a control group?

          Pedophilia is an illness. […] You really need to […] take a macro view of what you are arguing in favor of.

          I’m not defending pedophilia. Given that access to pedophilia treatment and prevention of sexual abuse is often lacking, I was starting a discussion of whether AI-generated content might be part of the prevention of sexual abuse of minors. After all, there are similar programs for drug abusers. Take methadone substitution as an example. Or establishments that are called “Drückerstube” in German (a very lacking translation would be “injection rooms”) – clean rooms where drug addicts have access to clean utensils for consuming drugs.

          • BolexForSoup • 2 years ago

            deleted by creator

            • @bleistift2@feddit.de • 2 years ago

              […Methadone] relieves cravings and removes withdrawal symptoms. Withdrawal management using methadone can be accomplished […] or simply maintained for the rest of the patient’s life.

              https://en.wikipedia.org/wiki/Methadone (emphasis mine)

              You also ignored my other example.

              • BolexForSoup • 2 years ago

                deleted by creator

                • @bleistift2@feddit.de • 2 years ago

                  That’s unfortunate. I was hoping to find arguments in favor of your point of view that are more substantial than proof by assertion.

                  While I obviously don’t agree with your take on the subject, I’m still glad you voiced your opinions, so we could examine and evaluate them.

                  • BolexForSoup • 2 years ago

                    deleted by creator

    • TheEntity • 2 years ago

      It’s still fake. But if it looks like a person in real life, what difference does the distinction make?

      I’m pretty sure there is quite a difference between an actual human being abused and a victimless depiction of such an act. Not unlike watching a violent movie. Such people obviously still need help and treatment, but to me it seems vastly better than the alternative.

      • BolexForSoup • 2 years ago

        deleted by creator

        • TheEntity • 2 years ago

          It very much might be an either/or situation for many, even if it’s not in all cases.

          • BolexForSoup • 2 years ago

            deleted by creator

            • TheEntity • 2 years ago

              Back at you. We’re both speculating.

              • BolexForSoup • 2 years ago

                deleted by creator

                • TheEntity • 2 years ago

                  Previously you claimed it’s enabling the behavior. Is it? Or is it merely speculation?

              • LinkOpensChest.wav • 2 years ago

                Maras and Shapiro argue that VCSAM does not prevent the escalation of pedophilic behavior. Conversely, it can progress CSAM addiction. VCSAM can fuel the abuse of children by legitimizing and reinforcing one’s views of children. The material can also be used in the grooming of children, reducing the inhibitions of children, and normalizing and desensitizing the sexual demands

                I removed the parenthetical citations because I’m not good at markdown, but you can find them in the linked paper.

                • BolexForSoup • 2 years ago

                  Appreciate your linking this
