Robots Are Emotion Machines

Robots started out conceptually as automaton-servants, but instead of creating a modern-day butler, much robotics research today focuses on creating emotional machines.

Jules, the posh robot from the University of Bristol, U.K., is equipped with tiny motors under its skin, which let it accurately mimic human facial expressions. Jules is only a disembodied head, though, and while its copycat technique is impressive, robots need to do more than copy us to interact with us on an emotional level. A step up the emotional-adeptness scale, AIDA, the driving companion, uses its facial expressions to respond to the driver's mood: it looks sad if a seatbelt is undone, for instance, or senses that you are tense behind the wheel and helps you relax.