Speaker 1: Emotional expression has long been one of those things that separates man from machine, but new research using an expressive android head named Nikola aims to change that. We talked to one of the researchers working on the project about what they hope to achieve with Nikola and how emotional robotics might prove useful in the future, and we got a brand new video clip that shows off even more of Nikola's capabilities.
Speaker 2: [00:00:30] We are trying to create robots or androids having human minds, or at least that people feel are like that.
Speaker 1: Nikola is part of the Guardian Robot Project, a research and development effort backed by RIKEN, a Japanese government-funded research institute. According to their website, the Guardian Robot Project combines psychology, brain science, cognitive science, and AI research in order to further the goal of creating a robot that can, quote, "autonomously recognize its environment [00:01:00] and the state of the person it is supposed to support, and provide assistance without compromising that person's autonomy."
Speaker 2: Right. Social interaction is quite important for humans, and we think that androids having such abilities would be quite important in research and real-life applications.
Speaker 1: One potential use for emotionally expressive androids like Nikola could be in elder care. The six emotions Nikola expressed in the study are happiness, [00:01:30] sadness, fear, anger, surprise, and disgust.
Speaker 2: I conducted some psychological experiments validating his facial actions. So we have nice evidence showing more humanlike facial actions.
Speaker 1: Nikola's facial expressions are powered by 29 actuators controlled by air pressure.
Speaker 2: There are several different options to manipulate actuators: electric, oil, or air. [00:02:00] To create humanlike motions, air actuators are currently the best. Nikola has a large compressor outside of his body, and by using the compressor, he activates the air actuators.
Speaker 1: In a brand new video clip shared with CNET, the robot looks back and forth between two people using cameras in its eyes.
Speaker 2: So he analyzes the video data using his eyes, and in this demo [00:02:30] he detected the humans talking. So when different people talk, he changes his eye direction, and sometimes he shows a smile depending on which human is talking.
Speaker 1: Dr. Sato tells me that additional cameras, placed near where the robot's chest would be if it had one, are gathering depth-sensing data that might eventually be useful in helping Nikola move around. Of course, Nikola still has a long way to go before [00:03:00] we'll see it giving interviews like some other expressive robots we've covered, Ameca and Sophia to name a few. Dr. Sato tells me that for now they're working on improving and expanding Nikola's facial expressions, giving Nikola a voice and a body, and instilling all of these with an emotional expressiveness that can eventually work together for maximum humanlike emotional impact.
Speaker 2: Usually humans combine different modalities, like facial expressions and [00:03:30] voice, to communicate their emotions. So if he shows emotions in his face and also in his voice, then such emotional communication would be more humanlike, we expect.
Speaker 1: Of course, as a longtime sci-fi fan, I couldn't let the interview end without asking Dr. Sato about the possible downside of humans making robots in our own image.
Speaker 2: We have a plan to collaborate with ethics researchers [00:04:00] about this issue, right? What could be the negative side of creating humanlike androids or robots.
Speaker 1: I'm your host, Jesse. As always, thanks so much for watching. See you next time.