An artificial intelligence project aims to train companionship robots to feel empathy for humans.
The project involves students at Ontario Tech University, who are trying to program Zenbo, a robot produced by Asus, to read human emotions.
“This development of empathy is going to enable the robot to respond to a particular situation,” explained Miguel Vargas Martin, a computer science professor at the university.
“It learns your state of mind and all the situations you could be in.”
A number of students have been working on the project, including international student Eduardo Valle of Mexico. Valle is one of 1,200 students studying in Canada on three-month internships through Mitacs Globalink, an international internship program. He says the students’ hope is to train Zenbo to recognize facial expressions.
“We hope to have Zenbo recognize your mood,” the 22-year-old said. “He can reply to you and make you feel better if you are having a bad moment.”
The robot has been around since 2016 but was only recently introduced in Canada. It can already dance, tell jokes and show different facial expressions, becoming a companion to those who own it. Martin says that with empathy programming, which involves interpreting voice inflection, noticing changes in facial expression and reading other signals, the robots could be used to help those in need.
“Take, for example, you’re not being yourself so maybe, silently, it might make a call to the caregiver,” Martin said.
Products and services that use artificial intelligence have come under fire in the past over the collection and potential sharing of user data. Experts have also raised privacy concerns about devices like Amazon’s Alexa and the Google voice assistant. However, Martin says the research team hopes to design its empathetic robot so that it does not collect user data.
Martin says he hopes that technology like this will help propel us into the future.
“Far into the future, robots will be seamlessly integrated into society and help humans in ways we never even thought about.”