AI-powered algorithm lets scientists ‘read’ your thoughts by decoding brain scans: First non-invasive technique could help people who can’t speak communicate for the first time

  • The system pulled data from three parts of the brain associated with natural language 
  • The model reconstructs stimuli that the person is hearing or thinking about into natural language
  • This allowed the system to produce plain text of the person’s thoughts 

Scientists can now ‘read’ your thoughts using an AI-powered model that is specifically designed to decode brain scans. 

The non-invasive breakthrough, developed by researchers at the University of Texas, could help those who are unable to speak or type to communicate for the first time. The method, however, does not decode language in real time.

The method works by feeding functional magnetic resonance imaging (fMRI) scans to the algorithm, which then reconstructs the arbitrary stimuli that the person is hearing or thinking about into natural language.
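
In rough outline, that pipeline can be pictured as a regression from brain activity into a semantic ‘embedding’ space. The sketch below, written against entirely synthetic data with a simple ridge regression, is only an illustration of the idea; none of the numbers, variable names or candidate sentences come from the study itself.

```python
# A toy sketch of the decoding idea: map fMRI voxel patterns into a
# sentence-embedding space with a linear model, then pick the candidate
# sentence whose embedding is closest. All data here is synthetic; the
# real system is far more sophisticated.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_scans, n_voxels, embed_dim = 200, 500, 64

# Hypothetical training set: fMRI scans paired with embeddings of the
# sentences the subject was hearing at the time.
train_scans = rng.normal(size=(n_scans, n_voxels))
train_embeddings = rng.normal(size=(n_scans, embed_dim))

# Fit a regularized linear map from brain activity to semantic space.
decoder = Ridge(alpha=1.0).fit(train_scans, train_embeddings)

# Decode a new scan: predict its embedding, then find the nearest
# candidate sentence among a pool of possibilities.
candidate_sentences = ["she was coming back", "the dog ran outside"]
candidate_embeddings = rng.normal(size=(len(candidate_sentences), embed_dim))

new_scan = rng.normal(size=(1, n_voxels))
predicted = decoder.predict(new_scan)

distances = np.linalg.norm(candidate_embeddings - predicted, axis=1)
print("Decoded gist:", candidate_sentences[int(np.argmin(distances))])
```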

For example, study subjects listened to narrated stories while scientists scanned areas of the brain associated with natural language. The scans were then fed into the AI-powered decoder, which returned a summary of what the individual was listening to.

Until now, this process has only been accomplished by implanting electrodes in the brain.

The new model produces an idea or summary of a patient’s thoughts by analyzing the scans, and cannot decode what they are thinking word-for-word. 

This is the first non-invasive technique used to read brain signals. Previously this was only possible by implanting electrodes in the brain

Our brains break down complex thoughts into smaller pieces, each of which corresponds to a different aspect of the entire thought, Popular Mechanics reports.

The thoughts can be as simple as a single word, such as ‘dog’, or as complex as ‘I must walk the dog.’

The brain also has its own alphabet composed of 42 different elements that refer to a specific concept like size, color or location, and combines all of this to form our complex thoughts.

Each ‘letter’ is handled by a different part of the brain, so by combining signals from all the different parts it is possible to read a person’s mind.
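
As a loose illustration of that ‘alphabet’ idea, a thought can be pictured as a vector over 42 conceptual elements, with complex thoughts built by combining simpler ones. In the toy sketch below, the element names and the encoding are invented purely for illustration; only the figure of 42 elements comes from the reporting.

```python
# Toy illustration of the 'alphabet of concepts' idea: represent a thought
# as a vector over 42 conceptual elements (size, color, location, ...) and
# build a complex thought by combining simpler ones. The element names and
# indices here are made up for the example.
import numpy as np

CONCEPTS = [f"concept_{i}" for i in range(42)]  # e.g. size, color, location

def encode(active_concepts):
    """Return a 42-dimensional vector with 1s for the active concepts."""
    vec = np.zeros(len(CONCEPTS))
    for name in active_concepts:
        vec[CONCEPTS.index(name)] = 1.0
    return vec

# A simple thought ('dog') activates a few elements; a complex thought
# ('I must walk the dog') combines the patterns of its parts.
dog = encode(["concept_3", "concept_7"])
walking = encode(["concept_12", "concept_7"])
complex_thought = np.clip(dog + walking, 0, 1)

print("Active elements:", [c for c, v in zip(CONCEPTS, complex_thought) if v])
```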

While the system cannot decode the brain scans word for word to match what the individual is thinking, it produces an idea of the thought

The system can also describe what a person was seeing in pictures while they were under the MRI machine

The team did this by recording fMRI data from three parts of the brain that are linked with natural language while a small group of people listened to 16 hours of podcasts.

The three brain regions analyzed were the prefrontal network, the classical language network and the parietal-temporal-occipital association network, New Scientist reports.

The algorithm was then given the scans and compared patterns in the audio to patterns in the recorded brain activity, according to The Scientist.
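
That comparison step is in the spirit of what neuroscientists call an encoding model: learn to predict brain activity from features of the stimulus, then score candidate stimuli by how closely their predicted activity matches a recorded scan. The sketch below demonstrates the principle on synthetic data; it is not the team’s actual code.

```python
# Sketch of the pattern-matching step described above, in the spirit of an
# 'encoding model': learn to predict brain activity from audio/text
# features, then score candidate stories by how well their predicted
# activity matches the recorded scan. Data and features are synthetic.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_samples, n_features, n_voxels = 300, 40, 200

# Hypothetical training data: stimulus features (from the podcasts) paired
# with the fMRI responses they evoked in the language regions.
stimulus_features = rng.normal(size=(n_samples, n_features))
true_weights = rng.normal(size=(n_features, n_voxels))
brain_activity = (stimulus_features @ true_weights
                  + 0.1 * rng.normal(size=(n_samples, n_voxels)))

encoding_model = Ridge(alpha=1.0).fit(stimulus_features, brain_activity)

# To decode, compare the recorded activity against the activity the model
# predicts for each candidate stimulus; the closest match wins.
recorded = brain_activity[0]
candidates = stimulus_features[[0, 5, 9]]   # three candidate stimuli
predicted = encoding_model.predict(candidates)
errors = np.linalg.norm(predicted - recorded, axis=1)
print("Best-matching candidate:", int(np.argmin(errors)))
```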

The system proved capable of taking a scan recording and transforming it into a story based on its content, which the team found matched the gist of the narrated stories.

Although the algorithm is not able to break down every ‘word’ in the individual’s thoughts, it is able to decipher the story each person heard.

The study, posted as a preprint on bioRxiv, gives an example. The original story read: ‘Look for a message from my wife saying that she had changed her mind and that she was coming back.’

The algorithm decoded it as: ‘To see her for some reason I thought maybe she would come to me and say she misses me.’

The system is unable to spit out word for word what a person is thinking, but is capable of providing an idea of their thoughts.
