ANDREW COOPER
Chris Zhang is building emotionally savvy “virtual partners” to help rehabilitate patients by teaching robots how to mimic and respond to human emotion.
The project started in 2005; Zhang, a professor of mechanical engineering at the University of Saskatchewan, has led it since 2007. The work is funded by the Natural Sciences and Engineering Research Council, and four other members currently work on the project.
The goal of the project is to design machines that can analyze human emotion. Cameras and sensors, working in conjunction with other hardware such as a joystick and an ocular movement tracker, record information such as blood pressure, heartbeat, skin conductivity and eye movement, from which emotion is inferred.
“The inference process is very much like how humans learn other humans’ emotions,” Zhang said.
“People have emotions, they react to them and base decisions upon them,” Zhang said in a U of S news release on the project. “If machines cannot understand human emotions, communications are compromised.”
Creating a machine with a technique such as this requires a pool of data, acquired by testing 20 to 30 volunteer subjects. Once all the information from a subject is collected, the machine can generalize about human emotional responses. The emotional analysis of the project’s prototype machines is correct about 90 per cent of the time.
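The article does not say what model the group uses, but the pipeline it describes (collect labelled physiological readings, let the machine generalize, check how often it is right) can be sketched in a few lines of Python. Everything below, from the feature list to the choice of classifier, is a hypothetical illustration rather than Zhang’s method.

```python
# Hypothetical sketch: train a classifier on labelled physiological
# readings and measure how often its emotional analysis is correct
# on held-out trials. Features, labels and model are illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in for real recordings: each row is one trial's
# [blood pressure, heart rate, skin conductivity, eye-movement speed].
calm = rng.normal([115, 70, 0.3, 1.0], 5, size=(100, 4))
stressed = rng.normal([140, 95, 0.8, 2.5], 5, size=(100, 4))
X = np.vstack([calm, stressed])
y = np.array(["calm"] * 100 + ["stressed"] * 100)

# Hold out a quarter of the trials to estimate accuracy.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = KNeighborsClassifier(n_neighbors=5)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

Scoring on trials the model never saw is how a figure like “correct about 90 per cent of the time” would be computed in practice.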
Before the era of automation, most day-to-day activities involved only human-to-human interaction. The advent of automation meant that machines could do jobs which otherwise would have been done by humans. Zhang said that for machines to perform certain jobs, they need to communicate with humans in some way.
“It is natural that the human will take emotion into account. Therefore, why should the machine be excluded from this emotional effect in cognition?” Zhang said.
The robots Zhang is designing learn emotional cues through interaction. Through this communication, the machine, call it A, learns a person B’s emotions and cues in various situations. This knowledge is then coded as an algorithm. With this formula, A can deduce B’s emotion when it sees a similar sign or cue from B, Zhang said.
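Zhang does not spell out the algorithm, but the deduction he describes (match a new sign from B against cues A has already seen) resembles a nearest-neighbour lookup. The sketch below is a minimal, assumed rendering of that idea; the cue vectors and emotion labels are invented for illustration.

```python
import math

# Hypothetical record of what A has learned about B: each observed cue
# (here a vector of physiological readings) paired with B's emotion.
learned_cues = [
    ((120.0, 72.0, 0.3), "calm"),
    ((145.0, 95.0, 0.8), "frustrated"),
]

def deduce_emotion(new_cue):
    """Return the emotion paired with the stored cue most similar to new_cue."""
    closest = min(learned_cues, key=lambda pair: math.dist(pair[0], new_cue))
    return closest[1]

# When A sees a similar sign from B, it deduces the matching emotion.
print(deduce_emotion((142.0, 91.0, 0.7)))  # prints "frustrated"
```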
The potential applications of this work are numerous, but the goal right now is to create virtual partners that can analyze patients at home and direct them towards self-rehabilitation.
“My plan is not only management of patient function and performance, but also that emotions become active in rehabilitation. We would have on screen an advisor — like a friend,” Zhang said in the press release.
One of Zhang’s machines is used for wrist rehabilitation. The patient exercises with a mechanical arm attached to a computer while software tracks their performance and deduces their emotional state.
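How the on-screen advisor would act on those deductions is not described; one plausible shape for the loop, with names and thresholds invented for the sketch, is to adjust the exercise and the message to the inferred emotion.

```python
# Hypothetical feedback loop for the wrist-rehabilitation machine:
# given a deduced emotion, pick the next difficulty and an
# encouraging on-screen message, the way a friendly advisor might.
def advise(difficulty, emotion):
    if emotion == "frustrated":
        return difficulty * 0.8, "Let's ease off a little. You're doing fine."
    if emotion == "bored":
        return difficulty * 1.2, "Nice work! Let's try something harder."
    return difficulty, "Keep going, steady pace."

next_difficulty, message = advise(1.0, "frustrated")
print(next_difficulty, message)
```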
“This type of technology will be used in health care and medicine in the next 20 years,” Zhang said. Because emotion affects the body and mind, he said, health care can be enhanced by the proper application of emotion.
“For this reason, I predict [that] health care and medicine would be the earliest area to accept this technology.”
This science would be most useful to patients who live far from hospitals, or who have trouble getting to the hospital or nursing stations in their community. One question, however, is how readily patients will trade human interaction for interaction with an intelligent and intuitive machine.
William Buschert, a philosophy instructor at the U of S, said the successful simulation of human emotions is the goal when creating robots for assistive care of elderly people. However, he warned that although the robots may be human-like, they are still not human.
“In those cases, while it might be overwhelmingly tempting for some users to treat emotion-simulating machines as if they are human, it would be a mistake to do so. After all, a landscape painting, no matter how detailed, isn’t an actual landscape; a map isn’t the territory it depicts.”
—
Graphic: Cody Schumacher/Graphics Editor