TOKYO -- Japan's robots do not want to be misunderstood. Researchers in the country are exploring ways to create chatty, expressive robots that use verbal and nonverbal skills to better communicate with humans.
People typically use two channels to communicate face-to-face: speech and body language, the latter including body motions and facial expressions.
While normal speech is good enough for getting basic messages across, people typically use movements and expressions to drive their point home and develop personal bonds.
For researchers, the challenge is to improve robots' ability to express emotions, a skill essential to better communication.
A team from Keio University is developing a robot that responds differently depending on a person's attitude. Associate professor Kazunori Takashio, head of the project, aims to create a robot that can adapt its personality to the person with whom it is interacting.
"I want to make robots a part of people's lives. For that, we need to instill actions that make them more agreeable to humans," Takashio said.
The team's robot can actually start a conversation. If its human counterpart asks questions and listens to the robot's replies, the robot thinks it has a friend. But if the human says something like, "Leave me alone," the conversation is over and the robot remembers this as an unpleasant experience.
Through such exchanges, the robot learns to adjust its attitude. Another example would be responding to abuse by shaking its head or looking down to express disappointment. Conversely, if someone showered the robot with affection, it might joyfully raise its arms. According to Takashio, users might begin to feel that the robot has a real personality.
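The mechanism described above can be illustrated with a toy sketch. This is not the Keio team's actual system; the class, keyword check, and scoring scheme are invented for the example, standing in for real language understanding and personality modeling.

```python
# Toy model (hypothetical, not the Keio team's implementation): a robot
# that logs each exchange as pleasant (+1) or unpleasant (-1) and lets
# the running balance of memories shape how it responds.

class CompanionRobot:
    def __init__(self):
        self.memory = []  # +1 for pleasant exchanges, -1 for unpleasant ones

    def interact(self, utterance: str) -> str:
        # A crude keyword check stands in for real language understanding.
        if "leave me alone" in utterance.lower():
            self.memory.append(-1)  # remembered as an unpleasant experience
            return "looks down and shakes head"
        self.memory.append(+1)      # remembered as a pleasant experience
        # A net-positive history makes the robot act more warmly.
        if self.attitude() > 0:
            return "raises arms joyfully"
        return "replies cautiously"

    def attitude(self) -> float:
        # Net sentiment of all remembered exchanges.
        return sum(self.memory) / max(len(self.memory), 1)


robot = CompanionRobot()
print(robot.interact("How are you today?"))  # raises arms joyfully
print(robot.interact("Leave me alone."))     # looks down and shakes head
print(robot.attitude())                      # 0.0 (one pleasant, one unpleasant)
```

The point of the sketch is only that behavior is a function of accumulated experience, which is why, as Takashio notes, users may come to read a consistent personality into the robot.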
To reach this stage, the team studied how children develop their personalities through interaction with parents and others.
Meanwhile, a team at Tokyo University of Agriculture and Technology is developing technology that allows robots to read emotional states by observing body language.
Gentiane Venture, an associate professor at the university, wants her robot to express itself through body movements. Using artificial intelligence and motion-capture technology, the robot analyzes the movements of five people to learn the link between emotions and body language, then mimics those movements.
According to Venture, smooth movements convey confidence and joy, while jerky, hesitant movements signal fear. Her robots can mimic both.
In an experiment, a commercially available robot learned to look up and wave its hands to express joy, but look down and hesitantly wave to express despondency.
Venture also discovered that people stay about 1.4 meters away from a robot that seems sad, but around 1.1 meters away from one that looks happy.
The professor has also gauged people's reactions to robots in various situations, even having a robot join a play at a kindergarten.
Venture said she wants to create robots that can be emotionally supportive and show the appropriate feelings depending on a given situation.
Interest in robots that can converse has grown in Japan since 2014, when SoftBank Group introduced Pepper. The humanoid robot can express nuanced emotions by adjusting its intonation and response speed, making it seem more lifelike.
Robots that can communicate with humans have long existed, but previous versions mainly served academic purposes or were intended for hobbyists. Today's robots are ready to be used in other areas, such as in customer service roles or at nursing care facilities.
According to a 2017 study by Tokyo-based Yano Research Institute, the market for robots with advanced communication skills will be worth about 8.7 billion yen ($81.6 million) in the year ending March 2021, nearly double the fiscal 2016 figure.