
Researchers develop device that can 'hear' your internal voice

editor

hiraethified
I can't even pretend to understand this but it could be life changing for some people who currently can't speak.

Researchers have created a wearable device that can read people’s minds when they use an internal voice, allowing them to control devices and ask queries without speaking.

The device, called AlterEgo, can transcribe words that wearers verbalise internally but do not say out loud, using electrodes attached to the skin.

“Our idea was: could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?” said Arnav Kapur, who led the development of the system at MIT’s Media Lab.

Kapur describes the headset as an “intelligence-augmentation”, or IA, device; it was presented at the Association for Computing Machinery’s Intelligent User Interface conference in Tokyo. It is worn around the jaw and chin, clipped over the top of the ear to hold it in place. Four electrodes under the white plastic device make contact with the skin and pick up the subtle neuromuscular signals that are triggered when a person verbalises internally. When someone says words inside their head, artificial intelligence within the device can match particular signals to particular words, feeding them into a computer.
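The pipeline the article describes - electrode signals in, matched words out - can be illustrated with a toy sketch. This is not the actual AlterEgo system: the vocabulary, the four-value "feature vectors" and the nearest-centroid matching are all invented here purely to show the shape of the idea.

```python
# Toy sketch (NOT the AlterEgo implementation): a nearest-centroid
# classifier mapping short electrode-signal feature vectors to words.
# The vocabulary and all numbers below are made up for illustration.
import math

# Hypothetical training data: an averaged feature vector per internally
# vocalised word, e.g. derived from the four jaw/chin electrodes.
CENTROIDS = {
    "yes":  (0.9, 0.1, 0.4, 0.2),
    "no":   (0.2, 0.8, 0.1, 0.5),
    "stop": (0.5, 0.5, 0.9, 0.7),
}

def classify(signal):
    """Return the vocabulary word whose centroid is closest to `signal`."""
    def dist(a, b):
        # Euclidean distance between two equal-length feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda word: dist(CENTROIDS[word], signal))

print(classify((0.85, 0.15, 0.35, 0.25)))  # prints "yes"
```

A real system would need per-user training data and a far more capable model, which is consistent with the comments below about these devices needing a lot of training.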

There have been a few devices out over recent years which 'read brainwaves' and enable 'mind control' of the computer; they all seem to be a bit hit and miss and take a lot of training. I suspect this one will need to be trained too, and the user will have to imagine themselves actually saying the word. I hope it works.
 

Yeah, I reckon this one might be reading some subtle physical movements or suchlike. I reckon the article is trying to make that bit sound fancier than it really is, and it's even further away from actually listening directly to 'the mind's voice' than the descriptions suggest.

I don't have detailed proof of this. What I do have is the shape of the device, and some text from an official promo video of the device that says 'simply by vocalizing internally (subtle internal movements)'. Likewise, the video description is careful to use the phrase 'without any voice or discernible movements', leaving room for indiscernible ones to be the source of the magic.

 
Looking like it does, it's a good thing this story didn't come out on April 1st and didn't include the first consumer implementation by Google, the Google Chinny Reckon.
 
Hmm, I should probably have clicked the link within the article labelled 'said Arnav Kapur' to get the actual detail in the first place. It's interesting, certainly worth exploring the potential, but given the limited vocabulary used in the tests I won't make any predictions about how soon this might come to productive fruition.

If I sound a bit sceptical about these topics, it's only because we don't really have a brain interface that comes anywhere close to sophisticated yet. The data obtained is extremely noisy, and our understanding is still very primitive compared to the realities of the brain. It is only thanks to enthusiasm and progress in the realms of AI and machine learning that people are able to take another look at this stuff and imagine quite fancy applications being possible despite the noisy data. I don't have a good sense of how far this will take us yet.
 
It looks like an interesting step to somewhere, but it's not reading the wearer's mind, is it? It's picking up on tiny movements in the throat and face/lower lip. That's clear from the device itself.
 
That's not quite what the article says - it works from electrical signals rather than movements.

Four electrodes under the white plastic device make contact with the skin and pick up the subtle neuromuscular signals that are triggered when a person verbalises internally. When someone says words inside their head, artificial intelligence within the device can match particular signals to particular words, feeding them into a computer.
 
Oh... fuck! Wow! This is bad.... i mean there are obviously creative and positive uses but .... Damn you Charlie Brooker .... Damn you and your genius!
 
It looks like an interesting step to somewhere, but it's not reading the wearer's mind, is it? It's picking up on tiny movements in the throat and face/lower lip. That's clear from the device itself.

Yeah. Subvocalisation. As seen in loads of sci-fi. As in you are speaking, but just not out loud. Consciously forming words to be said, as opposed to giving voice to the internal witterings.

It’s a pretty cool idea though.
 