• LemmyIsFantastic@lemmy.world
      1 year ago

First, the attacker needs to be within wireless proximity of the device and listen for MAC addresses with prefixes associated with Google. They can then send deauthentication packets to disconnect the device from the network and trigger setup mode. In setup mode, they request device info and use that information to link their own account to the device and - voila! - they can now spy on the device owner over the internet, without needing to stay near the WiFi.
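The first step of that walkthrough, picking targets by vendor prefix (OUI, the first three octets of a MAC address), could be sketched roughly like this. Note the prefix list below is a placeholder assumption, not a verified list of Google OUIs:

```python
# Hypothetical sketch: filter sniffed Wi-Fi MAC addresses by vendor OUI.
# The OUI set is a placeholder; a real attacker would use the published
# IEEE OUI registry entries assigned to Google.
PLACEHOLDER_GOOGLE_OUIS = {"f4:f5:d8", "1c:f2:9a"}  # assumed example prefixes

def matches_vendor(mac: str, ouis=PLACEHOLDER_GOOGLE_OUIS) -> bool:
    """Return True if the MAC's first three octets match a listed OUI."""
    prefix = mac.lower().replace("-", ":")[:8]  # normalize, keep "aa:bb:cc"
    return prefix in ouis

# Example: only the first address matches the placeholder vendor list.
observed = ["F4:F5:D8:12:34:56", "00:11:22:33:44:55"]
targets = [m for m in observed if matches_vendor(m)]
```

The rest of the chain (deauth frames, setup-mode API calls) needs radio access and the device's local API, so it isn't sketched here.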

Congrats, you found a single instance. It was patched via the security program, and it relied on physical proximity.

Then you link another scenario where an utterly insignificant portion of users' data was shared with partners.

It’s grasping at straws, and both of those incidents are unrelated to always-on recording. None of that shit you linked is related in the least. It’s slippery-slope bullshit you’re trying to pull.

Astroturfing 🤣🤣🤣 good lord, I wish I could get paid to argue with uninformed privacy zealots.

      • Calavera@lemm.ee
        1 year ago

So much for your “excellent track record of being secure,” right? Especially since this one took almost a year to patch. Now imagine the exploits that were found not by researchers, but by malicious parties…

I mean, if you were a paid astroturfer I could understand, because people have to make ends meet, right? But doing that for free? What a dystopian world we live in.

        • LemmyIsFantastic@lemmy.world
          1 year ago

Holy shit, you really are stuck on this 100% unrelated local-access hack 🤣

I guess you’ll never use TLS again because of its history, right?