We are seeing increasingly advanced AI and robotics coming out of labs all around the world. When I first saw ASIMO in the late 1990s and QRIO in the early 2000s, I was amazed by robotic bipedalism. Twenty years on, locomotion is much better and facial expressions are starting to look a little less creepy. We are definitely in a robotic age, but I believe we are still at its dawn.
It’s hard to overcome the uncanny valley. I’m glad realistic human locomotion is being solved by robots like Cassie and Atlas, but I’m more interested in the emotivity of robots and AI. After all, humans relate to each other through words, body language, and facial expression. So it is not a purely technical problem but perhaps also an artistic and philosophical one. Projects confronting these dimensions excite me most.
Creoqode Nova is an Arduino-based artificial intelligence robot. Nova comes as an all-inclusive do-it-yourself kit allowing users to build their own artificial intelligence robot and to practice their coding and engineering skills by controlling it in various ways.
The official intro is a little boring. What I like about Nova is its linkage-driven neck, which moves and points a face with cameras for eyes in fluid, natural movements. It mimics the curiosity of an animal.
Perhaps it’s the design of the face. It just feels more alive than glowy eyes on a tin can. Although not as advanced as the other robots in this article, it’s much more accessible as a hobbyist kit.
If you are into robotics, you are probably no stranger to Professor Hiroshi Ishiguro of Osaka University. His latest humanoid project is Erica. This clip from a BBC documentary shows Erica’s conversational mechanisms as she mimics head movements and attempts to make eye contact. Much less creepy than his previous humanoids, I think Erica, and most definitely his future projects, will become increasingly lifelike.
Professor Ishiguro is the research director of the Symbiotic Human-Robot Interaction Project. Its website lists many interesting papers, including one that measured perceived human likeness via EEG-based mu-rhythm suppression.
Perhaps not trying to leap across the uncanny valley is a more feasible approach to making relatable robots. Consider endearing fictional robots like WALL-E and C-3PO. Maybe when we finally have androids walking and living all around us, they will look more like Nexi, which was designed by Xitome Design in collaboration with MIT and has been referred to by MIT as an “MDS” (Mobile-Dexterous-Social) class of robot.
Nexi was first introduced in 2008. I wonder how it has evolved since then. The Personal Robots Group (PRG) has since moved to robots with screens for faces.
How important is facial animation? Perhaps even more important than the uncanny valley is the meme valley. The facial animation in the game Mass Effect: Andromeda was so unrealistic that it became a widely ridiculed meme.
Watch the above for an introduction to facial/conversation systems including ones in Mass Effect Andromeda and The Witcher 3.
Like Erica, Sophia is an attempt at climbing to the summit beyond the uncanny valley, but with even more facial dexterity than Nexi. Sophia is Hanson Robotics’ latest and most advanced robot as of 2017. The team developing Sophia includes people from LucasArts, Apple, and Mattel: arts, technology, and toys. That makes a lot of sense because, as I said, it’s about more than just AI and technology.
The human face has about 40 muscles controlling facial expressions; Sophia has 36 motors. This explains the sophisticated lip movements seen in the clip. I believe this is a step toward human-to-robot empathy. While Nexi can simulate a wide range of emotions for us to form an emotional connection with, its appearance means we would still treat it as a machine, with less of the respect and empathy a human deserves. I’m excited to see how Sophia evolves into a more believable person.
The last one isn’t a robot but a class of machines. I thought Sophia was cool until I saw animatronics. Perhaps the best people to make realistic facial movements are the special-effects artists. There may be no AI in these machines, but compared to Erica and Sophia, their faces just feel much more believable.
Robotics and AI researchers ought to be working with Disney and Hollywood. They are the ones who really know the intricacies of the human face: which muscles pull on the eyelids, the corners of the lips, the brows. Looking at the last clip, the complex machinery involved makes it clear that the 36 motors in Sophia’s face do not come close to the 40 muscles in a human one.
There will come a time, soon, when animatronics and AI come together to create much more socially realistic androids and, hopefully, make it across the valley.