Stanford University scientists have demonstrated a brain-computer interface that translates a person’s silently imagined words into text in real time, achieving up to 74% accuracy on a 125,000-word vocabulary. The findings, published in Cell, mark the first time inner speech (words thought but not physically attempted) has been reliably decoded by an implanted device.

The study enrolled four volunteers with severe paralysis caused by amyotrophic lateral sclerosis or a brainstem stroke. Microelectrode arrays implanted in the motor cortex captured neural activity patterns associated with individual phonemes, which an artificial-intelligence model assembled into words and sentences. Participants reported that the technique required less effort than existing systems that decode attempted speech.

To protect privacy, the researchers introduced a “password” mechanism that activates decoding only when users think of a preset phrase; the system recognized the cue with 98% accuracy and otherwise ignored inner chatter. The team also noted occasional detection of unintended thoughts, underscoring the need for robust safeguards.

Lead author Erin Kunz said the advance could restore “fluent, rapid and comfortable” communication for people who cannot speak, while co-author Frank Willett, an assistant professor of neurosurgery at Stanford, predicted higher accuracy as sensors and algorithms improve. The work extends BrainGate’s clinical program and comes amid a surge of private investment in neurotechnology, including Elon Musk’s Neuralink and a startup reportedly backed by OpenAI.
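The two mechanisms described above (per-time-bin phoneme probabilities assembled into vocabulary words, and a “password” gate that ignores inner speech until a preset cue is recognized) can be illustrated with a toy sketch. To be clear, this is not the authors’ code: the actual BrainGate decoder is a recurrent neural network paired with a large-vocabulary language model, and the phoneme set, two-word vocabulary, and every name below are invented purely for illustration.

```python
import numpy as np

# Toy phoneme inventory; "_" stands in for a CTC-style blank between sounds.
PHONEMES = ["_", "h", "eh", "l", "ow", "k", "ae", "t"]
# Invented two-word vocabulary mapping words to phoneme sequences.
VOCAB = {"hello": ["h", "eh", "l", "ow"], "cat": ["k", "ae", "t"]}

def collapse(frame_ids):
    """Merge repeated frame labels and drop blanks (CTC-style collapse)."""
    out, prev = [], None
    for i in frame_ids:
        if i != prev and PHONEMES[i] != "_":
            out.append(PHONEMES[i])
        prev = i
    return out

def decode_word(logits):
    """Greedy per-time-bin phoneme decode, matched against the vocabulary."""
    phones = collapse(list(logits.argmax(axis=-1)))
    for word, target in VOCAB.items():
        if phones == target:
            return word
    return None  # no vocabulary word matched

class GatedDecoder:
    """Keeps decoding disabled until the preset 'password' word is imagined."""
    def __init__(self, password="hello"):
        self.password = password
        self.unlocked = False

    def step(self, logits):
        word = decode_word(logits)
        if not self.unlocked:
            if word == self.password:
                self.unlocked = True  # cue recognized: enable decoding
            return None               # inner chatter is ignored while locked
        return word

def fake_logits(phones, frames_per_phone=3):
    """Simulate sharply peaked per-time-bin phoneme logits for one word."""
    T = len(phones) * frames_per_phone
    logits = np.full((T, len(PHONEMES)), -5.0)
    for t in range(T):
        logits[t, PHONEMES.index(phones[t // frames_per_phone])] = 5.0
    return logits

decoder = GatedDecoder(password="hello")
print(decoder.step(fake_logits(VOCAB["cat"])))    # None: still locked
print(decoder.step(fake_logits(VOCAB["hello"])))  # None: cue unlocks decoding
print(decoder.step(fake_logits(VOCAB["cat"])))    # "cat": decoding enabled
```

Running the sketch prints None for words imagined before the cue and decoded words afterward, mirroring the gating behavior the study reports, though the real system operates on neural recordings rather than simulated logits.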
Stanford scientists announce they’ve developed a brain implant capable of “decoding inner speech” in real time, bringing a world of “brain transparency” closer. The decoding can even be done against someone’s will https://t.co/xi6DVfbfh1
New Brain Interface Interprets Inner Monologues With Startling Accuracy https://t.co/eb0ymUEa7D