Yesterday, Honda Research Institute revealed the latest trick from its Asimo robot: It can now respond to commands issued only as thoughts. The Japanese carmaker ran a video of a man imagining four simple movements – raising his right hand, raising his left hand, running and eating – that were then duplicated by Asimo, the company’s humanoid robot. Honda said the technology was not ready for a live demonstration because the test subject might get distracted. A previous demonstration in 2006 required the test subjects to lie motionless in an MRI scanner in order to pick up the signals [Financial Times].
The mind-reading system is non-invasive, meaning that the controller doesn’t have electrodes implanted in his head. Researchers used a specialized helmet instead, which is the first “brain-machine interface” to combine two different techniques for picking up activity in the brain. Sensors in the helmet detect electrical signals through the scalp in the same way as a standard EEG (electroencephalogram). The scientists combined this with another technique called near-infrared spectroscopy, which can be used to monitor changes in blood flow in the brain [The Guardian]. A software program then integrates the two signals and transmits a command to the robot.
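To make the idea concrete, here is a rough, entirely hypothetical sketch of what "integrating two signals and mapping them to a command" could look like in code. It is not Honda's software: the feature sizes, the classifier, and the command names are stand-ins, and the training data is synthetic.

```python
# Hypothetical sketch of signal fusion: EEG features (electrical activity) and
# NIRS features (blood flow) are concatenated into one vector, and a generic
# classifier maps that fused pattern to one of the four imagined movements.
import numpy as np
from sklearn.linear_model import LogisticRegression

COMMANDS = ["raise_right_hand", "raise_left_hand", "run", "eat"]  # the four imagined movements

def fuse_features(eeg_features: np.ndarray, nirs_features: np.ndarray) -> np.ndarray:
    """Combine the two measurement types into a single feature vector."""
    return np.concatenate([eeg_features, nirs_features])

# Stand-in calibration data: 200 trials, 64 EEG features and 16 NIRS features each.
rng = np.random.default_rng(0)
X = np.array([fuse_features(rng.normal(size=64), rng.normal(size=16)) for _ in range(200)])
y = rng.integers(0, len(COMMANDS), size=200)

classifier = LogisticRegression(max_iter=1000).fit(X, y)

def decode_command(eeg_features: np.ndarray, nirs_features: np.ndarray) -> str:
    """Map a new fused brain-signal reading to a robot command."""
    fused = fuse_features(eeg_features, nirs_features).reshape(1, -1)
    return COMMANDS[int(classifier.predict(fused)[0])]

# A controller loop would then pass decode_command(...) on to the robot
# once each trial's signals have been recorded and preprocessed.
```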
The bulky technology is hardly ready for the market. Besides the unwieldy size of the present system, researchers note that it currently takes about 7 seconds for the software to interpret the command and transmit it to Asimo. Also, because brain patterns vary from person to person, this early prototype must be trained to recognize each individual’s commands. After about three hours of training, however, Honda says tests of the system have produced results with 90 percent accuracy [CNET].
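The per-user training step can be pictured the same way: because no two people's brain patterns match, a recognizer has to be fit and scored on each individual's own recorded trials. The sketch below is purely illustrative and uses made-up numbers, not Honda's procedure or results.

```python
# Hedged illustration of per-user calibration: score a classifier separately on
# each person's trials, since a model trained on one user won't transfer cleanly.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def calibration_accuracy(trials: np.ndarray, labels: np.ndarray) -> float:
    """Estimate how reliably this user's imagined commands are recognized."""
    scores = cross_val_score(LogisticRegression(max_iter=1000), trials, labels, cv=5)
    return float(scores.mean())

# Hypothetical calibration sessions for two users (80 fused feature vectors each).
for user in ["user_a", "user_b"]:
    trials = rng.normal(size=(80, 80))    # fused EEG + NIRS features per trial
    labels = rng.integers(0, 4, size=80)  # which of the four movements was imagined
    print(user, f"estimated accuracy: {calibration_accuracy(trials, labels):.2f}")
```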
Brain-machine interfaces are a hot field of research. Some companies are implanting electrodes in the brains of paralyzed patients to help them communicate. Others are trying to commercialize non-invasive devices for use as video game controllers [Financial Times]. Honda, meanwhile, thinks that mind-controlled technology could have practical, everyday applications in cars and in the home. Researcher Tatsuya Okabe says he envisions a customer opening a car’s trunk while carrying shopping bags, operating air-conditioning units by thought, or instructing household robots to do chores. “When your hands are full doing the dishes, you could have a robot give you a hand watering the plants [just by thinking]” [Business Week]. Of course, Honda admits, commanding a robot to water the plants is significantly more complicated than telling Asimo to raise its right arm.
Related Content:
Discoblog: WonderVacuum Reads Your Mind, Cleans the House
80beats: Brain Scan Can Predict When You’re Going to Screw Up
80beats: Researchers Find Another Way to Read (a Little Bit of) Your Mind
80beats: Mind-Reading Infrared Device Knows If You Want a Milkshake
80beats: Mind-Controlled Video Game Gets a Tryout in Japan
DISCOVER: Man’s Best Friend looks at robots that help with surgery, house-cleaning, and space exploration
Image: Honda