MIT researchers have developed AlterEgo, a computer interface that can ‘listen’ to words that the user verbalises internally. The system consists of a wearable device with electrodes that pick up neuromuscular signals in the jaw and face, triggered by internal verbalisations, and an associated machine-learning system that correlates particular signals with particular words.
The AlterEgo device also includes a pair of bone-conduction headphones, which transmit vibrations through the bones of the face to the inner ear. Because they don’t obstruct the ear canal, the headphones enable the system to convey information to the user without interrupting conversation or otherwise interfering with the user’s auditory experience.
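The pipeline described above — electrodes capturing neuromuscular signals, and a machine-learning model mapping those signals to words — can be illustrated with a deliberately simplified sketch. Everything below is hypothetical: the vocabulary, the per-word electrode "signatures", and the nearest-centroid classifier are stand-ins for illustration only; AlterEgo's actual signal processing and model are not described here.

```python
# Illustrative sketch only: AlterEgo's real signal processing and model are
# not public. We simulate noisy electrode readings for a tiny vocabulary and
# classify each reading by nearest-centroid matching.
import random

VOCAB = ["yes", "no", "select", "cancel"]  # hypothetical command vocabulary
N_ELECTRODES = 8                           # hypothetical electrode count

def word_profile(word):
    """Deterministic per-word electrode signature (stand-in for the
    characteristic muscle-activation pattern of an internal verbalisation)."""
    rng = random.Random(word)  # seeding with a str is deterministic in Python 3
    return [rng.uniform(-1.0, 1.0) for _ in range(N_ELECTRODES)]

def simulate_window(word, rng, noise=0.1):
    """One window of noisy electrode readings for an internally spoken word."""
    return [p + rng.gauss(0.0, noise) for p in word_profile(word)]

def train(rng, examples_per_word=30):
    """Average many noisy windows per word into a centroid -- the 'model'."""
    model = {}
    for w in VOCAB:
        windows = [simulate_window(w, rng) for _ in range(examples_per_word)]
        model[w] = [sum(col) / len(windows) for col in zip(*windows)]
    return model

def classify(model, window):
    """Return the vocabulary word whose centroid is closest to the window."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda w: sq_dist(model[w], window))

rng = random.Random(0)
model = train(rng)
test_window = simulate_window("select", rng)
print(classify(model, test_window))  # prints the predicted word
```

A real system would of course work from raw time-series signals with a far larger vocabulary and a trained neural model, but the structure — signal in, learned correlation, word out — is the same one the researchers describe.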
“The motivation for this was to build an IA device — an intelligence-augmentation device,” says Arnav Kapur, a graduate student at the MIT Media Lab, who led the development of the new system. “Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?”
Although the device could have a number of uses in health or emergency services, there are rather more worrying possibilities too. Thad Starner, a professor in Georgia Tech’s College of Computing, adds: “The other thing where this is extremely useful is special ops. There’s a lot of places where it’s not a noisy environment but a silent environment. A lot of time, special-ops folks have hand gestures, but you can’t always see those. Wouldn’t it be great to have silent-speech for communication between these folks?”