This humanoid robot can mimic facial expressions
A team of Chinese scientists, led by Professor Liu Xiaofeng from Hohai University in Jiangsu Province, has developed a humanoid robot capable of mimicking human facial expressions. The research was recently published in the journal IEEE Transactions on Robotics. The primary aim of the project was to enhance user interaction by enabling the robot to replicate intricate and authentic human expressions.
Innovative algorithm fuels robot's facial expressiveness
The team's work centered on a new algorithm designed to generate facial expressions on humanoid robots. The researchers introduced a two-stage methodology that equips their autonomous affective robot to display rich and natural facial expressions.
Two-stage methodology for expression synthesis
In the first stage, the method generates detailed robot facial expression images guided by facial Action Units (AUs). In the second stage, a robot with many degrees of freedom for facial movement reproduces the synthesized fine-grained expressions. To overcome the scarcity of paired training data, the researchers adopted a weakly supervised learning framework built around facial AUs.
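As a rough illustration of how AU-conditioned expression image generation might look in practice, the sketch below outlines a generator that takes a face image and a target AU intensity vector and produces an expression image. The network layout, class names, AU dimensionality, and image size are illustrative assumptions, not the authors' published architecture.

```python
# Minimal sketch of an AU-conditioned expression generator (assumptions: PyTorch,
# 17 AU intensities, 128x128 RGB images). Not the authors' published model.
import torch
import torch.nn as nn

class AUConditionedGenerator(nn.Module):
    """Maps an input face image plus a target AU vector to an expression image."""

    def __init__(self, num_aus: int = 17):
        super().__init__()
        # Encode the input face into a compact feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
        )
        # Decode image features concatenated with the spatially broadcast AU vector.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128 + num_aus, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, face: torch.Tensor, aus: torch.Tensor) -> torch.Tensor:
        feat = self.encoder(face)                             # (B, 128, H/4, W/4)
        b, _, h, w = feat.shape
        au_map = aus.view(b, -1, 1, 1).expand(-1, -1, h, w)   # broadcast AUs over space
        return self.decoder(torch.cat([feat, au_map], dim=1))

# Usage: synthesize expression images for a batch of faces and target AU intensities.
if __name__ == "__main__":
    gen = AUConditionedGenerator()
    faces = torch.randn(2, 3, 128, 128)   # placeholder face images
    target_aus = torch.rand(2, 17)        # target AU intensities in [0, 1]
    expr_images = gen(faces, target_aus)
    print(expr_images.shape)              # torch.Size([2, 3, 128, 128])
```

Conditioning on AU intensities rather than discrete emotion labels is what allows fine-grained control over individual facial movements.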
Motor command mapping network and evaluation
The research team also created a specialized motor command mapping network that bridges the generated expression images and the robot's physical facial responses. Using the robot's physical motor positions as constraints, the network is refined to predict precise motor commands from the generated facial expressions, so that the robot's facial movements reproduce accurate and natural expressions. The effectiveness of the proposed generation method was verified through qualitative and quantitative evaluations on the EmotioNet benchmark dataset.
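To make the idea of a motor command mapping network concrete, the sketch below shows a small regression model that maps a generated expression image to a vector of motor commands and clamps each prediction to a motor's physical range. The number of motors, the normalized position limits, and the loss formulation are assumptions for illustration, not details taken from the paper.

```python
# Sketch of a motor command mapping network with physical position constraints
# (assumptions: PyTorch, 20 facial motors, positions normalized to [0, 1]).
# An illustrative stand-in, not the authors' implementation.
import torch
import torch.nn as nn

class MotorCommandMapper(nn.Module):
    """Regresses motor commands from a generated expression image."""

    def __init__(self, num_motors: int = 20):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, num_motors)
        # Physical motor limits (assumed normalized per motor).
        self.register_buffer("pos_min", torch.zeros(num_motors))
        self.register_buffer("pos_max", torch.ones(num_motors))

    def forward(self, expr_image: torch.Tensor) -> torch.Tensor:
        raw = self.head(self.backbone(expr_image))
        # Squash predictions into each motor's physical range [pos_min, pos_max].
        return self.pos_min + torch.sigmoid(raw) * (self.pos_max - self.pos_min)

# Usage: fit the mapper against recorded motor positions for known expressions.
if __name__ == "__main__":
    mapper = MotorCommandMapper()
    images = torch.randn(4, 3, 128, 128)       # generated expression images
    recorded_positions = torch.rand(4, 20)     # ground-truth motor positions
    loss = nn.functional.mse_loss(mapper(images), recorded_positions)
    loss.backward()
    print(loss.item())
```

Constraining the output to the motors' physical positions keeps the predicted commands executable on the hardware, which is the role the position constraints play in the team's refinement step.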