• hemko@lemmy.dbzer0.com
    10 months ago

    The only people who believe they’d benefit from regulating deepfakes are high-profile figures and/or internet narcissists.

    “Boohoo, someone made a video of Trump’s hemorrhoids and Biden licking them.” Everyone already knows you can easily fake a video without using “AI” for it; we have a whole fucking industry for it pumping out hundreds of movies every Saturday. We already know you shouldn’t believe everything you see.

    • 520@kbin.social
      10 months ago

      It goes a bit beyond that nowadays. Deepfakes can be used to create false evidence, for example.

      • DrownedRats@lemmy.world
        10 months ago

        Deepfakes are already being used on an industrial scale to scam and con people.

        It’s not a case of them needing regulation because they offend people’s sensibilities; it’s because they’re actively being used to harm people.

          • Dadd Volante@sh.itjust.works
            10 months ago

            The same way cracking down on CP makes it harder for pedos to access.

            Y’all are seriously looking creepy

            • General_Effort@lemmy.world
              10 months ago

              Good one. You want to lock people up, but people who believe in the First Amendment are creepy. Nice spoof of moral panic populism.

                • General_Effort@lemmy.world
                  10 months ago

                  True. Freedom of speech and of the press is a peculiarly American thing. In virtually all other countries… No, wait. That’s the 2nd amendment. What were we talking about?

              • Dadd Volante@sh.itjust.works
                10 months ago

                Good one. You want the freedom to create any porn you want, regardless of who it hurts, without any personal accountability.

                This is a weird hill to die on, but I’ve seen worse. Not really.

        • General_Effort@lemmy.world
          10 months ago

          Yeah, fraud used to be such a fun pastime for the whole family. Now we need to regulate it. Technology ruins everything.

        • loobkoob@kbin.social
          10 months ago

          Over the past month or so I’ve started encountering quite a few deepfakes on dating sites. I honestly can’t tell they’re deepfakes just by looking; the only reason I realised is that they were very obviously Instagram model photos. I reverse image searched them to find where they were taken from and to confirm my suspicion that the profile was using stolen photos, only to find that the original photos weren’t quite the same. It would be the exact same shot with the same body but a different face, with identifying tattoos removed, moles added, etc.

          If they hadn’t been obvious modelling shots that made me want to reverse image search them, I wouldn’t have known at all. It makes me wonder how many deepfaked images I’ve already encountered on dating sites and just not known about, because they were fairly innocuous-looking photos…

        • BakerBagel@midwest.social
          10 months ago

          So you have no issues with me distributing deepfakes of you burning crosses across your neighborhood?

          • FiskFisk33@startrek.website
            10 months ago

            I’m not saying deepfakes should not be regulated.

            I’m saying the examples are poor because scamming people is already illegal.

            • BakerBagel@midwest.social
              10 months ago

              So you aren’t actually saying anything at all. You’re just being contrarian for the sake of it.

              • FiskFisk33@startrek.website
                10 months ago

                Not exactly. Arguments like “they should be regulated because they can be used for illegal stuff” are moot, since those uses are already regulated. I’m on the fence about the whole regulation thing, and I’ve yet to see any realistic examples of what regulation would look like.

                Is it even logical to regulate AI images specifically, or should we lump them in with every other form of image manipulation?

      • hemko@lemmy.dbzer0.com
        10 months ago

        Okay, but can you tell the difference between real, legal evidence and false, illegal evidence?

        The technology to create this type of false evidence is here; it’s not going back into Pandora’s box. The truth is that you can’t trust a single videotape as 100% conclusive evidence on its own.