When Microsoft CEO Satya Nadella revealed the new Windows AI tool that can answer questions about your web browsing and laptop use, he said one of the “magical” things about it was that the data doesn’t leave your laptop; the Windows Recall system takes screenshots of your activity every five seconds and saves them on the device. But security experts say that data may not stay there for long.

Two weeks ahead of Recall’s launch on new Copilot+ PCs on June 18, security researchers have demonstrated how preview versions of the tool store the screenshots in an unencrypted database. The researchers say the data could easily be hoovered up by an attacker. And now, in a warning about how Recall could be abused by criminal hackers, Alex Hagenah, a cybersecurity strategist and ethical hacker, has released a demo tool that can automatically extract and display everything Recall records on a laptop.

Dubbed TotalRecall (yes, after the 1990 sci-fi film), the tool can pull out everything that Recall saves to its main database on a Windows laptop. “The database is unencrypted. It’s all plain text,” Hagenah says. Since Microsoft revealed Recall in mid-May, security researchers have repeatedly compared it to spyware or stalkerware that can track everything you do on your device. “It’s a Trojan 2.0 really, built in,” Hagenah says, adding that he built TotalRecall, which he’s releasing on GitHub, to show what is possible and to encourage Microsoft to make changes before Recall fully launches.
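
To illustrate the researchers’ core point, here is a minimal sketch of how any process running as the logged-in user could read such a store, assuming Recall keeps its captures in an ordinary, unencrypted SQLite file under the user’s local AppData folder. The path, file name, and table/column names below are illustrative assumptions, not taken from Microsoft’s code or from TotalRecall:

```python
import sqlite3
from pathlib import Path

# Illustrative location only: the real store lives somewhere under the user's
# local AppData directory; this file name and schema are assumptions.
DB_PATH = Path.home() / "AppData" / "Local" / "ExampleRecallStore" / "recall.db"


def dump_captured_text(db_path: Path) -> None:
    """Print every captured text snippet from an unencrypted SQLite store."""
    # No key and no decryption step: if the file is plain SQLite,
    # the standard library can open and query it directly.
    conn = sqlite3.connect(db_path)
    try:
        # Hypothetical schema: one row per OCR'd snippet of on-screen text.
        for ts, app, text in conn.execute(
            "SELECT timestamp, app_name, captured_text FROM captures ORDER BY timestamp"
        ):
            print(f"[{ts}] {app}: {text}")
    finally:
        conn.close()


if __name__ == "__main__":
    dump_captured_text(DB_PATH)
```

The point of the sketch is simply that no privilege escalation or key material is needed; anything running in the user’s session could read the same data.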

  • NeoNachtwaechter@lemmy.world · 6 months ago

    So the next step is: M$ encrypts their local database.

    Later they want to upload it to their servers to further exploit your data. But then it is encrypted (and of course only M$ has the key), therefore the upload will be very hard to detect.

    Hmpf.

    • Morphit · 6 months ago (edited)

      So… how does the user actually ®ecall™ anything? Do they have to ask M$ Co-pilot™ AI to get it from The Azu®e™ Cloud? Because I’m pretty sure a hacker could do that just as easily.