Meta unveils AI models that convert brain activity into text with unmatched accuracy
Working with international researchers, Meta has announced two studies marking major milestones in understanding human intelligence: the company has built AI models that read brain signals to reconstruct typed sentences, and has mapped the neural processes that transform thoughts into spoken or written words.
The first study, carried out by Meta's Fundamental Artificial Intelligence Research (FAIR) lab in Paris in collaboration with the Basque Center on Cognition, Brain and Language in San Sebastián, Spain, demonstrates that sentence production can be decoded from non-invasive brain recordings. Using magnetoencephalography (MEG) and electroencephalography (EEG), the researchers recorded brain activity from 35 healthy volunteers as they typed sentences.
The decoding system employs a three-stage architecture: a convolutional module extracts features from windows of the MEG or EEG signal, a transformer module contextualizes those features across the typed sentence, and a final language-model stage refines the output into predicted characters.
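The general brain-to-text pattern described above can be sketched at the shape level. The following NumPy sketch uses made-up dimensions throughout (the sensor count, window length, embedding size, and character vocabulary are illustrative assumptions, not figures from the studies): a temporal convolution embeds a window of multichannel brain signal, a single self-attention layer contextualizes the embeddings over time, and a linear head produces a probability distribution over characters. It illustrates the pipeline's data flow, not the published model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumptions for illustration only):
# 208 sensor channels, 100 time samples per window, 29 character classes.
n_sensors, n_time, n_classes = 208, 100, 29
d_model, kernel = 64, 5

def conv_module(x, w):
    """Temporal 1D convolution: (time, sensors) -> (time, d_model)."""
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))        # pad along the time axis
    w2 = w.reshape(-1, w.shape[-1])             # (k * sensors, d_model)
    return np.stack([xp[t:t + k].reshape(-1) @ w2 for t in range(x.shape[0])])

def self_attention(h, wq, wk, wv):
    """One self-attention layer contextualizing features across time."""
    q, k, v = h @ wq, h @ wk, h @ wv
    scores = (q @ k.T) / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    a = np.exp(scores)
    a /= a.sum(axis=-1, keepdims=True)
    return a @ v

# Random weights stand in for trained parameters.
x = rng.standard_normal((n_time, n_sensors))            # one MEG window
w_conv = rng.standard_normal((kernel, n_sensors, d_model)) * 0.01
w_q, w_k, w_v = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))
w_out = rng.standard_normal((d_model, n_classes)) * 0.1

h = conv_module(x, w_conv)            # (time, d_model)
h = self_attention(h, w_q, w_k, w_v)  # (time, d_model), contextualized
logits = h.mean(axis=0) @ w_out       # pool over time, project to characters
probs = np.exp(logits - logits.max())
probs /= probs.sum()                  # probability over the character vocabulary
```

In a trained system the weights would be learned, the pooling would happen per keystroke rather than per window, and a language model would correct the raw character predictions; this sketch only shows how multichannel signal windows become character probabilities.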
The results are impressive: the AI model can decode up to 80 percent of the characters participants typed when their brain activity was recorded with MEG, at least twice the accuracy achieved with traditional EEG recordings. This research opens up new possibilities for non-invasive brain-computer interfaces that could help restore communication for individuals who have lost the ability to speak.
https://www.techspot.com/news/106721-meta-researchers-unveil-ai-models-convert-brain-activity.html