Scientists Work to Give Robots Human-Like Facial Expressions

Scientists in Japan have found a way to make robots more expressive, paving the way for machines to show emotion and interact more deeply with people. Compared with robotic advances in industrial, healthcare, and other settings, capturing human expression in a robot's face has remained a challenge.

Researchers at Osaka University have developed a method for identifying facial movements on their child android robot head, named Affetto. The first-generation Affetto was introduced in 2011; the team has now devised a system to make the second-generation robot more expressive.

Their findings offer a path for androids to express a wider range of emotions and to interact more richly with humans. Surface deformation, however, remains the central issue in controlling android faces: the movement of their soft facial skin can create instability.

To address this challenge, the researchers sought a better way to measure and control the skin's motion. They investigated 116 facial points on Affetto and measured their three-dimensional movement. These facial points are underpinned by so-called deformation units, each comprising a set of mechanisms that create a distinct facial contortion, such as lowering or raising part of the lip or eyelid. The resulting measurements were then fed into a mathematical model to quantify the surface motion patterns.
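The study does not publish code, but the core idea of relating deformation-unit commands to measured surface movement can be sketched as a simple linear fit. This is a minimal illustration, not the authors' model: the unit count, the placeholder data, and all variable names below are assumptions for demonstration only.

```python
import numpy as np

# Illustrative sketch: model the 3D displacement of 116 facial points
# as a linear function of deformation-unit commands.
N_POINTS = 116   # facial measurement points on the android head
N_UNITS = 17     # assumed number of deformation units (hypothetical)
N_TRIALS = 200   # assumed number of measured command/response pairs

rng = np.random.default_rng(0)

# U: deformation-unit commands issued during measurement (placeholder data).
# X: measured 3D displacements, flattened to (N_TRIALS, 3 * N_POINTS).
U = rng.uniform(0.0, 1.0, size=(N_TRIALS, N_UNITS))
X = U @ rng.normal(size=(N_UNITS, 3 * N_POINTS))

# Least-squares fit of a linear deformation model X ≈ U @ W.
W, residuals, rank, _ = np.linalg.lstsq(U, X, rcond=None)

# Predict the surface motion for a new command, e.g. partially activating
# a hypothetical "upper eyelid" unit while the others stay at rest.
command = np.zeros(N_UNITS)
command[3] = 0.8
predicted = (command @ W).reshape(N_POINTS, 3)
print(predicted.shape)  # (116, 3): x, y, z displacement per facial point
```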

Although the researchers faced challenges in balancing the applied force and adjusting the synthetic skin, they were able to use their system to tune the deformation units for precise control of Affetto's facial surface motions. Hisashi Ishihara, first author of the study, says that android robot faces have persisted in being a black-box problem: they have been implemented, but judged only in vague and general terms. He concluded that these precise findings will let the team effectively control android facial movements to introduce more nuanced expressions, such as smiling and frowning.
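Once such a model is fitted, "precise control" can be framed as the inverse problem: choosing unit commands that best realize a target surface deformation. The sketch below is again hypothetical and self-contained; the stand-in model matrix, the target pattern, and the [0, 1] actuator range are all assumptions, not details from the study.

```python
import numpy as np

# Hypothetical inverse-control sketch: given a fitted linear model W
# (deformation-unit commands -> surface displacements), find the command
# vector that best reproduces a desired facial deformation.
N_POINTS, N_UNITS = 116, 17  # same illustrative sizes as the fit above
rng = np.random.default_rng(1)
W = rng.normal(size=(N_UNITS, 3 * N_POINTS))  # stand-in for a fitted model

# Target: a desired deformation pattern, e.g. a smile (placeholder values).
target = rng.normal(scale=0.5, size=3 * N_POINTS)

# Solve min ||u @ W - target|| via the pseudoinverse, then clip the
# commands to an assumed feasible actuator range of [0, 1].
u = target @ np.linalg.pinv(W)
u = np.clip(u, 0.0, 1.0)

achieved = u @ W
error = np.linalg.norm(achieved - target)
print(f"command vector: {np.round(u, 2)}")
print(f"residual surface error: {error:.3f}")
```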
