By Reading Brainwaves, an A.I. Aims to Predict What Words People Listened to – Smithsonian Magazine

Posted: September 15, 2022 at 10:08 pm

The artificial intelligence looked for patterns between audio recordings and the brain activity of people listening to those recordings. John M Lund Photography Inc / Getty Images

Scientists are trying to use artificial intelligence to translate brain activity into language.

An A.I. program analyzed snippets of brain activity from people who were listening to recorded speech. It tried to match these brainwaves to a long list of possible speech segments that the person may have heard, writes Science News' Jonathan Moens. The algorithm produced a list of the ten most likely possibilities, and more than 70 percent of the time, its top-ten list contained the correct answer.

The study, conducted by a team at Facebook's parent company, Meta, was posted in August to the preprint server arXiv and has not been peer reviewed yet.

In the past, much of the work to decode speech from brain activity has relied on invasive methods that require surgery, writes Jean-Rémi King, a Meta A.I. researcher and a neuroscientist at the École Normale Supérieure in France, in a blog post. In the new research, scientists used brain activity measured with non-invasive technology.

The findings currently have limited practical implications, per New Scientist's Matthew Sparkes. But the researchers hope to one day help people who can't communicate by talking, typing or gesturing, such as patients who have suffered severe brain injuries, King writes in the blog post. Most existing techniques to help these people communicate involve risky brain surgeries, per Science News.

In the experiment, the A.I. studied a pre-existing database of 169 people's brain activity, collected as they listened to recordings of others reading aloud. The brain waves were recorded using magnetoencephalography (MEG) or electroencephalography (EEG), which non-invasively measure the magnetic or electric component of brain signals, according to Science News.

The researchers gave the A.I. three-second segments of brain activity. Then, given a list of more than 1,000 possibilities, they asked the algorithm to pull the ten sound recordings it thought the person had most likely heard, per Science News. The A.I. wasn't very successful with the EEG data, but for the MEG data, its list contained the correct sound recording 73 percent of the time, according to Science News.
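In other words, the task is top-ten retrieval: score every candidate recording against a brain-activity segment, keep the ten best matches, and check whether the clip the person actually heard is among them. The sketch below is a minimal illustration of that ranking step using cosine similarity between pre-computed embeddings; the function name, dimensions and toy data are assumptions made for illustration, not details from the study, which learns how to map brain signals and audio into a comparable representation.

```python
import numpy as np

def top_k_matches(brain_embedding, audio_embeddings, k=10):
    """Rank candidate audio clips by cosine similarity to a brain-activity
    embedding and return the indices of the k best matches.

    brain_embedding: (d,) vector for one three-second MEG/EEG segment.
    audio_embeddings: (n, d) matrix, one row per candidate sound recording.
    All embeddings here are hypothetical stand-ins, not the study's features.
    """
    # Normalize so a plain dot product equals cosine similarity.
    b = brain_embedding / np.linalg.norm(brain_embedding)
    a = audio_embeddings / np.linalg.norm(audio_embeddings, axis=1, keepdims=True)
    scores = a @ b                        # one similarity score per candidate
    return np.argsort(scores)[::-1][:k]   # indices of the k highest scores

# Toy usage: 1,200 candidate clips, 256-dimensional embeddings (both invented).
rng = np.random.default_rng(0)
candidates = rng.standard_normal((1200, 256))
true_index = 42
# Simulate a brain segment whose embedding is noisily correlated with the
# audio embedding of the clip the listener actually heard.
segment = candidates[true_index] + 0.5 * rng.standard_normal(256)

top10 = top_k_matches(segment, candidates)
print("correct clip in top ten:", true_index in top10)  # prints True here
```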

The A.I.'s performance was "above what many people thought was possible at this stage," Giovanni Di Liberto, a computer scientist at Trinity College Dublin in Ireland who was not involved in the study, tells Science News. Of its practical use, though, he says, "What can we do with it? Nothing. Absolutely nothing."

That's because MEG machines are too costly and impractical for widespread use, he tells Science News. Plus, MEG scans might not ever be able to capture enough detail of the brain to improve upon the findings, says Thomas Knöpfel, a neuroscientist at Imperial College London in England, who didn't contribute to the research, to New Scientist. "It's like trying to stream an HD movie over old-fashioned analogue telephone modems," he tells the publication.

Another drawback, experts say, is that the A.I. required a finite list of possible sound snippets to choose from, rather than coming up with the correct answer from scratch. "With language, that's not going to cut it if we want to scale it to practical use, because language is infinite," says Jonathan Brennan, a linguist at the University of Michigan who didn't contribute to the research, to Science News.

King notes to Time's Megan McCluskey that the study has only examined speech perception, not production. In order to help people, future technology would need to figure out what people are trying to communicate, which King says will be extremely challenging. "We don't have any clue whether [decoding thought] is possible or not," he tells New Scientist.

The research, which is conducted by the Facebook Artificial Intelligence Research Lab rather than directed top-down by Meta, is not currently designed for a commercial purpose, King tells Time.

To the critics, he says there is still value in this research. "I take this more as a proof of principle," he tells Time. "There may be pretty rich representations in these [brain] signals, more than perhaps we would have thought."
