A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”

  • Merlin404@lemmy.world

    Tragic that it had to happen to a celebrity before they would do something. But when it happens to children or anyone else, they just shrug…

      • Viking_Hippie@lemmy.world

        Probably helps that she’s super white too.

        This has been happening to AOC constantly since before she was first sworn in and it’s been crickets.

        When it happens once to the media’s favourite white billionaire, though? THAT’S when they start to take it seriously.

  • Dr. Moose@lemmy.world

    What a weird populist law, tbh. There’s already an established legal framework that covers this: defamation. Not a lawyer, but it seems like this should be addressed through that instead of writing up some new memes.

    They’ll use this as an opportunity to sneak in more government spyware/control is my guess.

    • quindraco@lemm.ee

      It’s not defamation. And the new law will likely fail to hold up to 1A scrutiny, if the description of it is accurate (it often is not, for multiple reasons, including that these bills generally change over time). This is more of a free-speech issue than photoshopping someone’s head onto someone else’s nude body, because no real person’s head or body is involved, just an inhumanly good artist drawing a nude, and on top of that the law punishes possession, not just creation.

      An example question any judge is going to have for the prosecutor if this goes to trial is how the image the law bans is meaningfully different from writing a lurid description of what someone looks like naked without actually knowing. Can you imagine going to jail because you have in your pocket a note someone else wrote and handed you that describes Trump as having a small penis? Or a drawn image of Trump naked? Because that’s what’s being pitched here.

      • Dr. Moose@lemmy.world

        It actually proposes “possession with the intention to distribute”, which just shows what a meme law this is. How do you determine the intention to distribute for an image?

        And I disagree with your take that this can’t be defamation. Quick googling says the general consensus is that this would fall in the defamation family of laws, which makes absolute sense, since a deepfake is an intentional misrepresentation.

        • Sagifurius@lemm.ee

          I guess if you have AI generate the Speaker of the House fucking her in the ass in an alley full of trash while she holds money bags, it’s then political satire and protected?

    • General_Effort@lemmy.world

      Even better: Intentional infliction of emotional distress

      There are business interests behind this. There is a push to turn a likeness (and voice, etc.) into intellectual property. This bill is not about protecting anyone from emotional distress or harm to their reputation. It is about requiring “consent”, which can obviously be acquired with money (and commercial porn is an explicit exception). This bill would establish this new kind of IP in principle. It’s a baby step, but still a step.

      You can see in this thread that proposing to expand this to all deepfakes gets a lot of upvotes. Indeed, there are bills out there that go all the way and would even make “piracy” of this IP a federal crime.

      Taylor Swift could be out there making music or having fun, while also making money from “her consent”, i.e. by licensing her likeness. She could star in movies or make cameos by having herself deepfaked onto some no-name actor. She could license all sorts of YouTube channels. Or how about a webcam chat with Taylor? She could be an avatar for ChatGPT, or she could be deepfaked onto one of those Indian or Kenyan low-wage workers who do tech support now.

      We are not quite there yet, technologically, but we will obviously get there soonish. Fakes in the past were just some pervs making fan art of a sort. Now the smell of money is in the air.

      • Dr. Moose@lemmy.world

        This seems like the most likely scenario, tbh. I’m not sure whether personal likeness IP is a bad thing per se, but one thing is sure: it’s not being done to “protect the kids”.

        • General_Effort@lemmy.world

          personal likeness IP is a bad thing

          It is. It means that famous people (or their heirs, or maybe just the rights-owner) can make even more money from their fame without having to do extra work. That should be opposed on principle.

          The extra money for the licensing fees has to come from somewhere. The only place it can come from is working people.

          It would mean more inequality; more entrenchment of the current elite. I see no benefit to society.

          • Dr. Moose@lemmy.world

            Not necessarily. I’m optimistic that this could lead to empowering status and personality as the main resources and pushing money out of society.

            • General_Effort@lemmy.world

              How so? Fame is already a monetizable resource. The main changes that I see are that 1) no opportunity to show their face and make their voice heard needs to be missed for lack of time, and 2) age no longer needs to be a problem.

    • doctorcrimson@lemmy.world

      When you steal a person’s likeness for profit or defame them, then that’s a CIVIL matter.

      This bill will make AI sexualization a CRIMINAL matter.

      • Dr. Moose@lemmy.world

        Where do you see that?

        The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate “digital forgeries” depicting an identifiable person without their consent, letting victims collect financial damages from anyone who “knowingly produced or possessed” the image with the intent to spread it.

        • doctorcrimson@lemmy.world

          Here:

          A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence.

          • Dr. Moose@lemmy.world

            That doesn’t seem to be correct. More like a typo, as criminalize ≠ criminal law.

    • gapbetweenus@feddit.de

      Not a lawyer, but it seems like this should be addressed through that instead of writing up some new memes.

      Always interesting to see people who admit that they don’t know, but still have a rather strong opinion.

      • Dr. Moose@lemmy.world

        So only lawyers can have an opinion on law and be allowed public discourse? Lol

        • gapbetweenus@feddit.de

          Obviously not. Everyone is allowed to voice their opinion and has to accept that other people might find their opinion stupid and tell them so.

          My point is more that you seem, on the one hand, to realize that it’s a complex matter and you lack the expert knowledge (“I’m not a lawyer”), but on the other hand still feel the need to express your opinion. There is nothing inherently wrong with that. It’s extremely common. Just something I have fun pointing out.

        • Ledivin@lemmy.world

          Nobody’s saying you should be barred from participating, you just rightfully look like an idiot while you do it.

        • TigrisMorte@kbin.social

          When that opinion is about what a law does or does not cover? Yes, only a lawyer’s opinion should be involved. What a law should or shouldn’t cover, or how it should or shouldn’t work? There a layperson’s opinion is important.

          • Xhieron@lemmy.world

            Hi, lawyer here.

            Everyone’s opinion about the law matters, including what it covers, whether it’s vague, whether it applies, etc. This is Lemmy, not court. We’re in the town square here. Drinking yourself through three years of law school doesn’t imbue you with magical abilities to interpret laws as though they were religious texts. It’s just an education, not a miracle. If lawyers always knew what the law meant and laypeople always didn’t, no one would be fretting over hotly anticipated SCOTUS opinions, because everyone would already know the outcome.

            But wouldn’t you know it, reasonable people sometimes disagree, and among those reasonable people, quite often, are non-lawyers.

            As it turns out, non-lawyers often have an outsized influence on the law. Did you know that Donald Trump has never been to law school? Unbelievable, right? But hard to fathom though it may be, the big orange idiot hasn’t sat in on a single hour of 1L Torts. In fact, he may never even have seen the inside of a law library. Yet his opinion about the law has a tremendous impact, bigger even than Dr. Moose’s, because checking the “went to law school” box really doesn’t mean a hell of a lot outside of very limited situations.

            Personally, I’m much more interested in Dr. Moose’s opinion on this law than I am Rudy Giuliani’s, or even Clarence Thomas’s (and both those guys went to law school), and it’s no bother to me that he’s not a lawyer. In fact, it’s probably a mark in his favor.

            If you’re not interested in his opinion because he’s not a lawyer, well hey, that’s totally allowed, but you can easily ignore his comments without being pedantic. Or maybe you could just concede that there’s probably a bunch of strong opinions you also hold on subjects on which you’re not an expert. In fact, the whole lot of omg-not-a-lawyer! non-lawyers pitching little fits in this comment thread probably have strong feelings about war even though many of them have probably never put on a uniform. They might have strong feelings about healthcare despite never having darkened the door of a medical school. Shit, we might all even have strong feelings about politics despite never having gotten a single vote in a single election, ever. Can you believe it?!

            Yeah. It’s just an opinion. If you’re gatekeeping ‘having an unqualified opinion’ you should probably just lock yourself in your house and bar the windows, 'cause it’s gonna be an uphill battle for you.

            • TigrisMorte@kbin.social

              My dear self-declared law-involved personage. Nope. It matters not one whit what a layperson’s opinion is about what a law does or doesn’t cover, as the sole arbiters are the judiciary. Any layperson’s opinion involved is a matter of “should” or “shouldn’t”. They have no say in the final passed law; only the courts do. To claim otherwise is to pretend “sovereign citizen” is an actual thing.

              But the reality wasn’t important to you, was it, law boi?

              • Xhieron@lemmy.world

                Aww. You sound mad. Don’t be mad. Sorry if I got under your skin. Have a Coke and a smile.

                • TigrisMorte@kbin.social

                  A: high-sugar drinks are a leading cause of diabetes and should be avoided. I recommend rum instead.
                  B: don’t make me angry, you won’t like me when I’m angry.
                  C: you are not the boss of me.

            • gapbetweenus@feddit.de

              Everyone’s opinion about the law matters,

              Hard disagree: only the opinions of people who have actually read the law matter on the topic. Everything else just creates more confusion. We are on the internet; most people never bother to go and actually read what they are talking about, and that includes me.

      • sphericth0r@kbin.social

        I know, Congress should be ashamed of themselves. We would be hard-pressed to find a group with a worse understanding of technology.

  • sphericth0r@kbin.social

    I believe libel laws already exist, but when you’re in Congress you must make laws in a reactionary way; otherwise, considered thought and reason might begin to permeate the law. We wouldn’t want that.

    • FuglyDuck@lemmy.world

      There is a place for deepfakes in satire (albeit they should be clearly labeled as such).

      • AlternatePersonMan@lemmy.world

        I agree with the right to satire, but probably not as a deep fake. Comics, skits, etc., sure. Deep fakes are too convincing for an alarming number of folks.

        • FuglyDuck@lemmy.world

          so how do you feel about skilled impersonators?

          what if they’re convincing? or are we going to allow just the shitty ones? or only if they offend the subject?

          what you’re proposing is a very slippery slope.

          • ZILtoid1991@kbin.social

            I think the “too convincing skilled impersonator” problem is covered by defamation laws.

            • FuglyDuck@lemmy.world

              Nope. Defamation requires some malicious intent to be illegal. It also requires more or less blatant lies to be maintained.

              Particularly since most satire and most impersonators go to reasonable lengths to ensure that there is minimal confusion as to reality.

          • Zahille7@lemmy.world

            I’m so glad someone posted a link to Sassy Justice. I thought it was a hilarious little experiment from the South Park guys.

        • MagicShel@programming.dev

          An alarming number of folks think the world is flat and the moon is made of cheese. We need a better standard than that.

              • MagicShel@programming.dev

                The folks who think it’s made of cheese also think we faked the moon landing.

                Which raises a question… Could someone press for moon landing proof to be suppressed on the grounds that they believe it is a deep fake? I guess that depends on how sexy you find moon cheese.

  • leaky_shower_thought@feddit.nl

    individuals who produced or possessed the forgery with intent to distribute it

    This is going to be a wild ride.

    There’s a scenario where the creator is not the leaker, but angry people with pitchforks won’t even care about the distinction.

    • Asafum@lemmy.world

      This is exactly what has me irritated about this whole nonsense… People have been doing this since Photoshop existed, but big scary AI is in the news now, so we’re going to attack it full force, because people are using it the way they’ve used everything with similar capabilities…

      Still no action on our actual issues, though; just some performative bullshit to assist the truly needy of our society, billionaires…

    • General_Effort@lemmy.world

      Only if they do it badly. The bill defines as a “digital forgery” anything made with “technological means”:

      to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual

  • theneverfox@pawb.social

    Hot take, but I feel like this is entirely the wrong direction to take. I feel like this will go badly in one of many ways if passed, and that leaning into it would lead to a better world.

    Women, especially teachers, lose their jobs because their nudes leaked. This technology is in the wild; it can’t be put back in the box. It can be done at home by a technically gifted teenager with a gaming computer. While this is certainly true, I don’t think the common person will understand it until it’s everywhere.

    Yeah, I get that it must feel horribly violating, but imagine the world where we go the other direction - where nude pictures have no power, because anyone could have whipped them up.

    Where the response to seeing them is anger or disgust, not fear

    But my biggest concern is that most technical people don’t understand generative AI… There’s no way in hell Congress grasps the concept. I’m scared to read the full wording of this bill.

    • Dewded@lemmy.world

      I agree. There should be good laws already in place for this. Defamation should do it.

      A “technically gifted teenager” is someone with an attention span longer than 5 minutes and a computer with a decent GPU. While definitely a scarce resource, not super scarce.

      Running Stable Diffusion locally is getting easier and easier. It took me about 15 minutes last time; I just followed the readme. It won’t be long until it’s a one-click setup and everyone can do it.
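
      To give a sense of how low the bar is, here’s a minimal sketch of that kind of local setup, assuming Hugging Face’s diffusers library, a CUDA-capable GPU, and `pip install torch diffusers transformers` (the checkpoint name and prompt are just illustrative examples, not anything from the article):

      ```python
      # Minimal local Stable Diffusion sketch using Hugging Face's `diffusers`.
      # Assumes a CUDA GPU; the checkpoint and prompt are illustrative only.
      import torch
      from diffusers import StableDiffusionPipeline

      # Load a checkpoint in half precision so it fits on a consumer GPU.
      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",
          torch_dtype=torch.float16,
      )
      pipe = pipe.to("cuda")

      # One text prompt in, one generated image out, saved to disk.
      image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
      image.save("output.png")
      ```

      That’s the whole thing: a dozen lines, no special skills, which is the point being made above about how hard this will be to put back in the box.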

  • Serinus@lemmy.world

    I don’t get it. Why care? It’s not her.

    Maybe if they’re making money off of her likeness. But without a money trail, it just seems like chasing ghosts for not much reason.

    • gapbetweenus@feddit.de

      If you’re interested, you can search for interviews with people who have been deepfaked in a sexual way, where they explain how they feel and why they care.

    • shiroininja@lemmy.world

      Because it’s gross, and they do it to minors now. And all they need are pictures of your kids from your social media profile. They even use AI to undress them.

      • MagicShel@programming.dev

        Generating sexual images of minors is already illegal. And these images can be generated by anyone modestly technical on their own computer, so you can’t go after people for creating or possessing the images (except if they look too young), only for distribution.

        This is unfortunately theater and will do basically nothing. How does a person even know if they are deepfakes? Or consensual? Hell, what’s too close of a likeness? Some of those images didn’t look that much like her, and at least one wasn’t even realistic.

        I’m not saying it’s cool that people are doing this, just that enforcement of this law is going to be a mess. You wind up with weird standards, like how on Instagram you can show your labia, but only through sheer material. Are deep fakes fine if you run them through an oil painting filter?

        • yamanii@lemmy.world

          Are deep fakes fine if you run them through an oil painting filter?

          Probably, since nobody could mistake an oil painting for the real person; it’s not a deepfake anymore.

          • MagicShel@programming.dev

            I have about a 99% success rate at identifying AI full body images of people. People need to learn to look better. They look just as fake as the oil paintings.

              • MagicShel@programming.dev

                I think that’s relevant when the defense against oil paintings is that you can tell they aren’t real. The line can’t be “you can’t tell they are fake” because… well… you can identify AI artwork 99% of the time and the other 1% is basically when the pose is exactly so to conceal the telltale signs and the background is extremely simple so as to give nothing away.

            • gapbetweenus@feddit.de

              They look just as fake as the oil paintings.

              You can do photorealism or even hyperrealism with oil. And with AI you just need a bit of post-processing.

      • fishos@lemmy.world

        And here we have the real answer: prudishness. “It’s gross.” And of course, “think of the children”. You don’t have a real answer; you have fear-mongering.

        • MagicShel@programming.dev

          I agree the issue is one of puritan attitudes toward sex and nudity. If no one gave a fuck about nude images, they wouldn’t be humiliating, and if they weren’t humiliating then the victim wouldn’t really even be a victim.

          However we live in the world we live in and people do find it embarrassing and humiliating to have nude images of themselves made public, even fakes, and I don’t think it’s right to tell them they can’t feel that way.

          They shouldn’t ever have been made to feel their bodies are something to be embarrassed about, but they have been and it can’t be undone with wishful thinking. Societal change must come first. But that complication aside, I agree with you completely.

          • gapbetweenus@feddit.de

            Even without being puritan, there are just different levels of intimacy we are willing to share with different social circles - which might be different for everyone. It’s fundamental to our happiness (in my opinion) to be able to decide for ourselves what we share with whom.

            • MagicShel@programming.dev

              In this case I don’t feel fake images are intimate at all, but I don’t disagree with your point.

              • gapbetweenus@feddit.de

                You might not, but others do. People have rather different thresholds when it comes to what they consider intimate. I recommend just listening to interviews with victims; it becomes clear that to them the whole thing is very intimate and disturbing.

                • MagicShel@programming.dev

                  And I said their feelings are valid and should be respected regardless of how I might feel about them. I’m not sure if you are looking for something more from me here. Despite my personal feelings that nudity shouldn’t be a source of shame, the fact is that allowing nudity to be used to hurt folks on the premise that nudity is shameful is something I utterly oppose. Like, I don’t think you should be ashamed if someone has a picture of you naked, but the real enemy is the person saying, “haha! I have pictures of you naked!!!” Whether the pictures are AI, or photoshopped, or painted on a canvas, or even real photographs.

        • gapbetweenus@feddit.de

          So you would not mind if I sent AI sex videos of you to your parents and friends? How about a video where you are sexually degraded, playing in a public space - how would you feel about that? Maybe you performing sexual acts that you find gross yourself? You just need a bit of empathy to understand that not everyone is into exhibitionism and wants intimate things made public.

          • Serinus@lemmy.world

            I’d really prefer that people not send my parents any kind of porn.

            I look at it like someone took my face out of a Facebook picture, printed it, cut it out, pasted it over some porn, and did the same thing.

            It’d be a weird thing for them to do, but I don’t really need to send the law after them for it. Maybe for harassment?

            Laws have a cost, even good intentioned laws. I don’t believe we need new ones for this.

            • gapbetweenus@feddit.de

              Do you think people might change their opinion on you and act differently after seeing you performing in porn?

              Laws have a cost, even good intentioned laws.

              It causes distress to victims, arguably violates personal rights, and is morally and ethically at least questionable. What would be the downsides of criminal prosecution for non-consensual sexual deepfakes?

              • Montagge@kbin.social

                Yeah, but it’s happening mostly to women, so these commenters probably don’t really care.

                • gapbetweenus@feddit.de

                  I think a lot of men unfortunately have difficulty empathizing with women here, because they have a rather different experience when it comes to expressing their sexuality and the possible negative consequences.

              • Serinus@lemmy.world

                If they understand that this kind of porn exists? No.

                But that’s an education thing, not a legal thing.

                The downside is giving law enforcement yet another excuse to violate digital privacy. Laws that are difficult/impossible to enforce tend to do more harm than good.

                I don’t see this law removing any fake Taylor Swift porn from the Internet. Or really any other celebrity, for that matter.

                • gapbetweenus@feddit.de

                  If they understand that this kind of porn exists? No.

                  You know people form opinions on actors based on their roles in movies? So people will change what they think of you and how they act towards you based on media, even if it’s clearly fictional.

                  The downside is giving law enforcement yet another excuse to violate digital privacy. Laws that are difficult/impossible to enforce tend to do more harm than good.

                  How exactly? Which new abilities to violate digital privacy does this bill give the state?

          • Zellith@kbin.social

            “So you would not mind if I sent AI sex videos of you to your parents and friends?” Seems like sending them would be the dick move. My family and friends probably have no interest in seeing deepfakes of me naked.

            “How about a video where you are sexually degraded, playing in a public space - how would you feel about that?” Considering it’s not really me… meh. I don’t personally care. Because it’s not me.

            “Maybe you performing sexual acts that you find gross yourself?” If someone wants to make deepfakes of me eating poop or something, for whatever reason… oh well? It’s not really me.

            But you do you.

            • gapbetweenus@feddit.de

              My family and friends probably have no interest in seeing deepfakes of me naked.

                It most likely won’t say that they’re deepfakes of you. It’s just a WhatsApp message from someone who doesn’t like you, and now you have to explain a whole new technology to your parents.

              Considering its not really me… meh. I don’t personally care. Because it’s not me.

                You know that; others don’t. This will greatly change others’ perception of you and how they treat you.

              It’s not really me.

              Your boss and coworkers don’t know.

              But you do you.

                No, but I have empathy for other people.

        • shiroininja@lemmy.world

          Not at all. Think of the consequences if someone’s nudes were leaked, or if an OnlyFans account was made with images of them, and an employer saw it. They’re already firing teachers for being on there. And a lot of the time they’re used in extortion. Not to mention your image is your property. It is you. And nobody else has rights to that.

            • shiroininja@lemmy.world

              You don’t have to take nudes anymore to have nudes leaked. There are AIs that strip clothes from pictures. People have been making CSAM off of pictures of people’s kids on their Instagram profiles, etc.

      • TigrisMorte@kbin.social

        And if I feel that cooking carrots is gross and cooked carrots shouldn’t be fed to minors or miners? Should that be illegal as well?

        • gapbetweenus@feddit.de

          What should be illegal are things that harm individuals or society. Since the understanding of what is harmful or not might rather differ, we have to come up with compromises or a consensus on what actually becomes illegal.

    • Selkie@lemm.ee

      It’s like having your nudes leaked when you never sent any. Pretty fucked.

    • gila@lemm.ee

      There is a money trail when it’s legal. You get blatant advertising of services where you pay to upload your own photos to make deepfakes with them, on all kinds of sites (ahem, Pornhub). That’s a level of access that can’t be ignored, especially if it’s a US-based company providing the service and taking payment via Visa/Mastercard, etc. Relegate it to the underground where it belongs.

      • Serinus@lemmy.world

        I’d be more okay if the law were profit based, because that’s much easier to enforce.

        I don’t like laws that are near impossible to enforce unless they’re absolutely necessary. I don’t think this one is absolutely necessary.

        • gila@lemm.ee

          I don’t think general enforcement against deepfake porn consumption is a practical application of this proposed law in civil court. Practical applications are shutting down US-based deepfake porn sites and advertising. As far as possessors go, consider cases of non-celebrities being deepfaked by their IRL acquaintances. In a scenario where the victim is aware of the deepfake such that they’re able to bring the matter of possession to court, don’t you agree it’s tantamount to sexual harassment? All I’m seeing there is the law catching up to cover disruptive tech with established legal principle.

    • Deceptichum@kbin.social

      Because it’s her image?

      I’d be fucking furious if someone was sharing, say, a fake photo of me fucking a watermelon. It doesn’t matter if it’s physically me or not; people would think it was.

      • dont_lemmee_down@lemm.ee

        Would they, though? I’d argue nobody thinks those were pictures of Taylor Swift. I’d go further and say that it helps, in the sense that you can always deny even real pictures by arguing they were AI.

  • Copernican@lemmy.world

    So what happens if a person allows their likeness to be 3D-modeled and textured for something like a video game, and that 3D model is used to create explicit images? Is that not a problem (or maybe a different kind of problem) because it’s not a deepfake but a use of a digital asset?

    • doctorcrimson@lemmy.world

      Technically, the terms of use of a person’s likeness would be defined in a contract in the case of a product. But since unauthorized use is already not a legal or protected activity in any way, I believe the bill’s intention is to add potential fines or prison time for offenders on top of opening them up to legal liability.

      If the studio had an actor’s written consent, then it would be left up to the courts as a civil matter, only.

    • General_Effort@lemmy.world

      Copying the asset out of the game file might be a copyright violation, but you’re usually allowed to make private copies. IDK to what degree copyright extends to images made with such an asset. (Funny story: the WWE released a video game with likenesses of their wrestlers (performers? actors? artists? IDK). A tattooist sued because it showed a design of theirs on the skin of a wrestler, and won. So much for “my body belongs to me”.)

      As far as this bill is concerned, it defines as a “digital forgery” anything made with “technological means […] to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual”. IDK how good the reasonable person is at spotting CGI. Quick google says the average juror is about 50 years old. Make of that what you will.

  • alienanimals@lemmy.world

    It’s already impossible to stop.

    Also, doing something ONLY when a billionaire complains is a very bad look.