Is AI getting better at mind-reading?

Artificial intelligence (AI) has advanced significantly in recent years, but AI does not possess a “mind” in the way humans do, so “mind-reading” is not, strictly speaking, an accurate description of its capabilities.

However, AI can predict and infer human behavior and thoughts to a certain extent by analyzing patterns in data. For example, machine learning models can be trained to recognize facial expressions and body language and to infer a person’s emotions and mental state.
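
As a rough illustration of that kind of inference, the sketch below trains a classifier to map numeric facial-landmark features to emotion labels. The data, feature layout, and model choice are all invented for the example; real systems extract landmarks from images with dedicated vision models first.

```python
# A minimal sketch of emotion inference, assuming facial-landmark features
# (e.g., mouth width, brow height) have already been extracted from images.
# All data here is synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic training set: 200 faces, 8 landmark features each,
# labeled with one of three emotion classes.
X_train = rng.normal(size=(200, 8))
y_train = rng.choice(["happy", "sad", "neutral"], size=200)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# Infer the emotion for a new, unseen face's feature vector.
new_face = rng.normal(size=(1, 8))
print(clf.predict(new_face))  # e.g. ['neutral']
```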

Consider the phrases running through your mind: that tasteless joke you wisely kept to yourself at dinner; your unspoken opinion of your closest friend’s new partner. Now picture someone listening in. On Monday, researchers at the University of Texas at Austin took another step in that direction.

In a study published in the journal Nature Neuroscience, the researchers describe an AI that can interpret people’s private thoughts by analyzing fMRI scans, which measure the flow of blood to different regions of the brain.

Researchers have already developed language-decoding techniques that recognize attempted speech in people who cannot speak and that let paralyzed people write simply by thinking about writing. The new language decoder, however, is among the first that does not rely on surgical implants.

When participants watched silent films as part of the study, the decoder was able to produce reasonably accurate descriptions of what was happening onscreen, and it could turn a person’s imagined speech into text.

The study centered on three volunteers, who spent 16 hours over several days in the lab of Dr. Alexander Huth, a neuroscientist at the University of Texas at Austin, listening to “The Moth” and other narrative podcasts. An fMRI scanner monitored the blood-oxygen levels in different regions of their brains as they listened.

The brain-activity patterns were then matched to the words and sentences the participants had heard using a large language model.

According to Osaka University neuroscientist Shinji Nishimoto, “Brain activity is a kind of encrypted signal, and language models provide ways to decipher it.” In their study, Dr. Huth and his colleagues effectively reversed that process, using another AI to convert participants’ fMRI scans back into words and sentences.
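
Conceptually, that reversal works by scoring guesses: a language model proposes plausible word sequences, an encoding model predicts the brain activity each sequence should evoke, and the decoder keeps whichever guess best matches the scan that was actually recorded. The following is a toy sketch of one such step; both helper functions are hypothetical stand-ins, not the researchers’ actual pipeline.

```python
# Toy sketch of one decoding step: propose candidate word sequences,
# predict the fMRI response each should evoke, and keep the candidate
# whose prediction is closest to the observed scan. Both helpers are
# hypothetical stand-ins, not the study's actual models.
import numpy as np

def propose_continuations(prefix: list[str]) -> list[list[str]]:
    """Stand-in for a language model suggesting plausible next words."""
    return [prefix + [w] for w in ("dog", "ran", "home", "quickly")]

def predict_fmri(words: list[str]) -> np.ndarray:
    """Stand-in for an encoding model mapping text to a predicted scan."""
    seed = abs(hash(" ".join(words))) % (2**32)
    return np.random.default_rng(seed).normal(size=64)

def decode_step(prefix: list[str], observed: np.ndarray) -> list[str]:
    # Smallest Euclidean distance = best match to the recorded activity.
    return min(propose_continuations(prefix),
               key=lambda cand: np.linalg.norm(predict_fmri(cand) - observed))

observed_scan = np.random.default_rng(1).normal(size=64)
print(decode_step(["the"], observed_scan))
```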

To evaluate the decoder, the researchers had the participants listen to new recordings and measured how closely the decoded text resembled the actual transcript. Although nearly every word in the decoded script was wrong, the meaning of the passage was frequently preserved: the decoder was, in effect, summarizing.
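
A toy check illustrates why exact-word accuracy understates this kind of result: a paraphrase can share almost no words in the same positions with the original yet carry nearly the same meaning. The sentence pair below is invented for illustration; the researchers used semantic measures rather than this naive position-by-position comparison.

```python
# Invented sentence pair: similar meaning, near-zero word-for-word agreement.
actual  = "I don't have my driver's license yet".split()
decoded = "she has not even started learning to drive yet".split()

# Naive position-by-position accuracy; paraphrases score near zero here
# even when the overall meaning is well preserved.
matches = sum(a == d for a, d in zip(actual, decoded))
print(f"exact-word accuracy: {matches}/{len(actual)}")  # 0/7
```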

Additionally, participants were able to mask their internal monologues by directing their attention elsewhere. AI may be learning to read our minds, but for the time being it will need our cooperation, and it will have to read them one person at a time.
