> While it’s not the first technology to be able to translate brain signals into language, it’s the only one so far to require neither brain implants nor access to a full-on MRI machine.
I wonder whether, in a decade or two, if the sensor technology gets good enough that they don't even need you to wear a cap, there'll be people saying "obviously you don't have any reasonable expectation of not having your thoughts read in a public space, don't be ridiculous". What I mean is, we tend to normalize surveillance technology, and I wonder if there's any practical limit to how far that can go.
Maybe not by reading brain signals, but through aggregate data processing: most things about you can already be known by whoever does the centralized processing.
Over a decade ago there were stories about how Target's loyalty program algorithm had figured out a teen was pregnant before she'd told her family, based on correlative purchase changes (like switching from scented to unscented lotion).
If I could take your social media, face and eye tracking on CCTV, phone gyroscope data, purchase history, search history, and the same from all your associated contacts, with a broad enough comparative data set I could probably identify all kinds of skeletons from all kinds of closets.
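To make the point concrete: the kind of inference described above can be sketched as naive log-odds accumulation, where many individually weak signals combine into a confident prediction. Everything here is hypothetical and illustrative; the signal names and likelihood ratios are made up, not drawn from any real system.

```python
import math

# Hypothetical signals: each carries a likelihood ratio describing how much
# more likely it is to be observed given the hidden condition vs. without it.
# The numbers are invented purely to illustrate the mechanism.
signals = {
    "switched_to_unscented_products": 2.0,
    "new_supplement_purchases": 8.0,
    "search_history_topic_shift": 1.5,
    "contacts_made_similar_purchases": 1.2,
}

prior_odds = 0.02 / 0.98  # assumed 2% base rate for the hidden condition

# Naive-Bayes-style update: each weak signal nudges the log-odds a little.
log_odds = math.log(prior_odds)
for name, likelihood_ratio in signals.items():
    log_odds += math.log(likelihood_ratio)

posterior = 1 / (1 + math.exp(-log_odds))
print(f"posterior probability: {posterior:.2f}")
```

No single signal is damning on its own, but multiplied together they move a rare condition from a 2% prior to a substantial posterior; that's the whole trick behind "the algorithm knew before the family did".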
It's a bit like the "my phone is listening to my conversations" freak-out. It isn't, but the thing you should be much more concerned about is that it can build such an accurate picture of what you end up talking about without needing to listen in the first place.