• Drusas@kbin.social

      Same thing happens if you cut your hair short as a woman. Suddenly you look much younger, apparently. Unless you are 50+. Then it’s considered normal.

  • Lvxferre@lemmy.ml

    I’m not directly impacted by this, for multiple reasons (different government/country, childless, hard to confuse with underage). Even then, this sounds like a blatant Bad Idea® to me.

    The idea is wrong on its very moral (and likely legal) premise: the ones who should ensure that children don’t access harmful content are their guardians. This is not the job of a gov agency like the FTC, and especially not of the ESRB - whose parent organisation is the ESA, ultimately controlled by Capcom, Electronic Arts, Konami, Bandai Namco Entertainment, Microsoft, Nintendo, Sony Interactive Entertainment, Square Enix, Activision Blizzard, Take-Two Interactive, Ubisoft, and Warner Bros. Games.

    “The data suggests that for those between 25 and 35, 15 out of 1,000 females vs 7 out of 1,000 males might be incorrectly classified as under-25 (and would have the option of verifying using another method),” the filing states. “The range of difference by skin tone is between 8 out of 1,000 vs 28 out of 1,000.”

    Let me rephrase this: if you got the “wrong” skin colour, there’s a 28/1k ≈ 1 in 35 chance that you’re assumed to be unrightfully trying to access entertainment above your assumed age range. And that’s according to the data from the filing, i.e. from the ones proposing the implementation of this system, so there’s a good chance that they’re lowering the numbers to make it look better (or rather, less bad) than it is. That’s fucking awful when you’re dealing with people; but those fuckers from the ESRB don’t care, right? “You’re a cash cow, not an actual human being.”

    And even if the numbers are accurate (yeah, sure, let’s all be a bunch of gullible morons), one detail that is not being mentioned here is that, if you’re black and a woman, you’re especially screwed - because you’re in both cohorts that increase the likelihood of false positives. I’m betting that, for black women, the false positive rate will be something between 50/1k = 1 in 20 and 100/1k = 1 in 10.
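    Just to spell out the arithmetic behind those “1 in N” figures, here’s a rough Python sketch. The per-1k rates are the ones quoted from the filing, plus my own guessed 50-100/1k range for black women; that last range is my bet, not a number from the filing:

    ```python
    # Arithmetic sketch for the "1 in N" figures above.
    # Filing numbers: 15/1k (female), 7/1k (male), 8-28/1k (by skin tone).
    # The 50/1k and 100/1k values are my own guess for black women, not filing data.

    def per_1k_to_odds(rate_per_1k: float) -> int:
        """Convert a rate per 1,000 people into a '1 in N' figure (rounded down)."""
        return int(1000 // rate_per_1k)

    rates_per_1k = {
        "female, 25-35 (filing)": 15,
        "male, 25-35 (filing)": 7,
        "darkest skin tone (filing, upper end)": 28,
        "black women (my guess, low end)": 50,
        "black women (my guess, high end)": 100,
    }

    for group, rate in rates_per_1k.items():
        print(f"{group}: {rate}/1k ≈ 1 in {per_1k_to_odds(rate)}")
    ```

    Nothing fancy - it just divides 1,000 by each per-1k rate.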

    There’s a legal principle around the world called “presumption of innocence”; in other words, unless it’s proved that you’re doing something wrong, you should be treated as if you’re doing something lawful. This legal principle applies in the USA too, right? Guess what, it won’t apply to those people incorrectly flagged as “I assume that you’re underage”, who’ll need to go out of their way and through stupid bureaucracy and delays to show that they have rightful access to the piece of entertainment in question.

    And let me guess something: once the system does stupid shit after stupid shit, the ones responsible for the system will find a thousand excuses to not take responsibility for it. Such as “it’s the system, not me!” (treating a tool as an agent).

    The ESRB dismissed concerns about the “fairness” of the system, however, saying that “the difference in rejection rates between gender and skin tone is very small.”

    The very data shows the opposite.

    [from the ESRB report] “While bias exists, as is inherent in any automated system, this is not material.”

    The name for this shitty argument is red herring - it distracts you from what matters. Whether the bias is “ackshyually, not material” is irrelevant; what matters is that the bias is there in the first place.

    the ESRB presented its facial recognition plan as “an additional, optional verification method”

    Slippery slope can be either a fallacy… or the acknowledgement that we humans tend to act like a bunch of boiled frogs.

    There’s a reasonable risk of any “optional” system or requirement becoming “obligatory”. Especially when legislation is involved.

    Ah, something that the article doesn’t mention: the risk of false negatives. Grab a picture of your dad/mum, move it a bit back and forth to pretend that it’s an actual person, and the dumb bot says “okay”.


    Here’s my question. See all those businesses that I mentioned at the top? How do they plan to profit from the potential implementation of this shitty idea?

    • taladar@sh.itjust.works

      All your percentages look like they are missing a decimal point before the last digit, but since your “1 in…” figures are correct, I assume that is just a typo or glitch of some kind.

      • octoperson@sh.itjust.works

        They’re ‰, not % - per 1,000, not per 100. Hardly ever see that symbol in the wild tho - probably because, yeah, they’re really easily confused.

      • Lvxferre@lemmy.ml

        As octoperson said, I used permille (‰), not percent (%). The article was already listing ratios per 1,000, so it was easier. (Plus old habits die hard - I used ‰ quite a bit in chemistry.)

        I’ll swap it for “/1k” to avoid confusion.
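
        In case it helps, here’s the conversion in one tiny snippet, using the 28/1k figure quoted from the filing as the example number:

        ```python
        # Tiny illustration of the ‰ vs % mix-up (example number from the filing above):
        rate_per_mille = 28                  # 28‰, i.e. 28 out of 1,000
        as_fraction = rate_per_mille / 1000  # 0.028
        as_percent = rate_per_mille / 10     # 2.8% - a tenth of what "28%" would mean
        print(f"{rate_per_mille}/1k = {as_fraction} = {as_percent}%")  # 28/1k = 0.028 = 2.8%
        ```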

        • taladar@sh.itjust.works

          I hadn’t even considered it might be that. I don’t think I have seen the permille symbol on the web before (as opposed to scientific papers or blackboards or similar places).

  • Rynelan@feddit.nl

    Hahaha nope, this will never happen. It’s a big privacy hit, running kids’ faces through a database to check their age.

  • carpelbridgesyndrome@sh.itjust.works

    Good luck getting the age ID system to work reliably. Particularly for telling apart people a day over 18 and a day under. Forget ethics and adversarial conditions where people fake photos. How is this supposed to even work in an ideal world where people don’t misrepresent themselves?

    Also collecting children’s biometrics is likely illegal.

  • Chaotic Entropy

    Public feedback is presumably a simple “qu’est-ce que le fuck?”

  • T156@lemmy.world

    Could it not just be defeated by using a stock photo, or one of the myriad filters that “age up” a person? It seems like a lot of trouble to go to, for a measure that can be easily circumvented.

  • another_lemming@lemmy.world

    Misleading title. They recognize pre-uploaded photos of parents, not age itself, as I understand it from the article.

    But they’d eventually want to extend it to adults too, for ‘security’. The ESRB kinda wants to spend some money on whatever sticks, but they aren’t interested in it long-term the way someone like Epic is. Wouldn’t they like to implement their own tools for that and require their usage?