Human communication incorporates plenty of social cues – from facial expressions to body language. Detecting and reacting to such cues comes naturally to us humans. For example, we can get a good idea of how our fellow humans are feeling just by looking at their faces or listening to their voices. For machines, on the other hand, grasping such information is quite a challenge. However, combining computer vision and machine learning algorithms can help turn robots into decent interlocutors.
Seeing eye to eye
Humans use eye contact to initiate and control communication. Looking at someone’s eyes as they speak indicates that they have our attention and keeps the parties engaged.
Face tracking technology can help robots achieve a similar effect. Once a human face is detected, the robot can move its eyes accordingly or even approach that person in order to initiate and maintain eye contact. Such behavior helps grab people’s attention, making it easier for the robot to initiate interaction.
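As a rough illustration of how face tracking can drive a robot's gaze, the sketch below maps a detected face's position in the camera frame to pan/tilt offsets. The function name and the bounding-box convention are assumptions for illustration, not part of any particular face tracking API:

```python
def gaze_offsets(face_box, frame_size):
    """Map a detected face's position to pan/tilt offsets (illustrative).

    face_box:   (x, y, w, h) bounding box of the face, in pixels
    frame_size: (width, height) of the camera frame
    Returns (pan, tilt) in [-1, 1], where (0, 0) means the face is centered.
    """
    x, y, w, h = face_box
    fw, fh = frame_size
    # Center of the detected face
    cx, cy = x + w / 2, y + h / 2
    # Normalized offset from the frame center; each frame, the robot turns
    # its eyes (or head) by an amount proportional to this offset so the
    # face drifts toward the center of its view.
    pan = (cx - fw / 2) / (fw / 2)
    tilt = (cy - fh / 2) / (fh / 2)
    return pan, tilt

# A face in the right half of a 640x480 frame -> positive pan (look right)
print(gaze_offsets((400, 200, 80, 80), (640, 480)))  # (0.375, 0.0)
```

Running this in a loop over the tracker's per-frame output is what lets the robot keep "looking at" a person as they move.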
Another step towards a more natural human-robot interaction can be achieved with gaze tracking. By knowing where the human is looking, the robot can detect whether they have established eye contact. It can then use that information to start the conversation at the right time, greet the human with a friendly smile, and more.
Furthermore, gaze tracking lets the robot keep track of where the person is looking during the conversation. This can help measure engagement, better understand people’s needs, and, finally, provide more relevant information.
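One simple way to turn gaze data into an engagement measure is to count how often the estimated gaze falls on a region of interest, such as the robot's face. This is a minimal sketch under that assumption; the function and coordinate convention are made up for illustration:

```python
def engagement_ratio(gaze_points, target_box):
    """Fraction of gaze samples that land on a target region (illustrative).

    gaze_points: list of (x, y) estimated gaze coordinates, one per frame
    target_box:  (x_min, y_min, x_max, y_max) region of interest,
                 e.g. where the robot's face appears from the user's view
    """
    if not gaze_points:
        return 0.0
    x0, y0, x1, y1 = target_box
    # Count samples whose gaze point falls inside the target rectangle
    on_target = sum(1 for x, y in gaze_points
                    if x0 <= x <= x1 and y0 <= y <= y1)
    return on_target / len(gaze_points)

samples = [(10, 10), (12, 11), (90, 40), (11, 9)]   # mostly near the target
print(engagement_ratio(samples, (0, 0, 20, 20)))     # 0.75
```

A ratio that drops over time could prompt the robot to change topic or re-engage the person.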
However, too much intense eye contact can be off-putting, even amongst humans. To avoid that, robots can be designed to look friendly and display human-like behavior, such as blinking, looking away occasionally, displaying specific facial expressions, and more.
Reading human emotions
When it comes to successful conversations, eye contact goes hand in hand with facial gestures. We raise our eyebrows when we’re surprised, smile when we’re happy, and frown when we’re angry. Such gestures are great indicators of our current mood, and it’s important to take them into account during a conversation. If you’re wondering why, imagine someone laughing at you while you’re telling them about having a really bad day.
Robots designed to work with humans are bound to face a variety of emotions, often frustration and anger. So, in order to treat customers well, machines must take human emotions into account, especially when those emotions are directed at the machine itself.
Luckily, emotion estimation technology has come a long way. For example, FaceAnalysis developed by Visage Technologies estimates all basic human emotions – happiness, sadness, fear, surprise, anger, and disgust. Additionally, those basic emotions can be combined to detect more complex ones, such as worry or pride. In addition to emotion monitoring, FaceAnalysis estimates each person’s age and gender as well. Combining such information can help robots interact with people in a positive and effective way.
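To make the idea of combining basic emotions concrete, here is a toy sketch that blends per-frame scores for the six basic emotions into rough complex-emotion scores. The blends below (e.g. worry as a mix of fear and sadness) are illustrative assumptions, not FaceAnalysis's actual formulas:

```python
def complex_emotions(basic):
    """Derive illustrative complex-emotion scores from the six basic ones.

    `basic` maps each basic emotion name to a score in [0, 1]. The
    combinations below are assumptions chosen for illustration only.
    """
    return {
        # Worry sketched as a blend of fear and sadness
        "worry": (basic["fear"] + basic["sadness"]) / 2,
        # Delight sketched as a blend of happiness and surprise
        "delight": (basic["happiness"] + basic["surprise"]) / 2,
    }

scores = {"happiness": 0.1, "sadness": 0.6, "fear": 0.4,
          "surprise": 0.0, "anger": 0.2, "disgust": 0.1}
print(complex_emotions(scores))
```

In practice, such derived scores would feed into the robot's dialogue logic, e.g. choosing a reassuring response when worry is high.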
Emotionally intelligent robots can be a valuable asset in various industries. For example, they can be used as tutors for autistic children, caregivers for the elderly or for psychiatric patients in hospitals, customer service assistants in retail, and more.
Personalization through recognition
Humans have a natural ability to recognize and distinguish between faces, and we rely on it heavily in our daily communication. There’s a difference in how we talk to a friend and how we talk to a stranger; the information we share, the approach we take, and even our tone of voice may change depending on the person we’re talking to. This helps us maintain meaningful relationships and exchange information more efficiently.
Using face recognition technology, machines can distinguish between different faces, too. While it comes naturally to us humans, for machines it’s all about math. Every face has specific landmarks that can be measured, such as the distance between the eyes, the length of the jawline, the shape of the eyebrows, and more. Together, these measurements create a unique faceprint that can later be compared against other faceprints until a match is found.
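The matching step can be sketched as a nearest-neighbor search over faceprint vectors: compare the probe face against every enrolled template and accept the closest one if it is close enough. The vectors, names, and threshold below are made up for illustration; real systems use learned embeddings and tuned thresholds:

```python
import math

def match_faceprint(probe, gallery, threshold=0.6):
    """Find the closest enrolled faceprint to `probe` (illustrative sketch).

    probe:    feature vector of the face to identify
    gallery:  dict mapping person name -> enrolled feature vector
    Returns the best-matching name, or None if no enrolled face is within
    `threshold` (all numbers here are invented for illustration).
    """
    best_name, best_dist = None, float("inf")
    for name, template in gallery.items():
        dist = math.dist(probe, template)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

gallery = {"ana": [0.1, 0.9, 0.3], "marko": [0.8, 0.2, 0.5]}
print(match_faceprint([0.12, 0.88, 0.31], gallery))  # 'ana'
```

The threshold is what separates "same person" from "unknown visitor" and directly trades off false accepts against false rejects.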
Face recognition gives robots a way to personalize interactions with people. A robot can memorize valuable information gathered during previous interactions, such as a person’s favorite topics or preferred tone of communication, and use it to tailor each new interaction. For example, a robot that cares for the elderly can remind them to take their specific medication, bring up their favorite topics when they seem sad, and more.
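A minimal way to sketch such personalization is a profile store keyed by the recognized person's ID. Everything here, from the function name to the profile fields, is a hypothetical example of the pattern, not any product's data model:

```python
def personalize_greeting(person_id, profiles):
    """Build a greeting from a stored interaction profile (hypothetical).

    `profiles` maps a recognized person's ID to facts remembered from
    earlier interactions, such as their name and a favorite topic.
    """
    profile = profiles.get(person_id)
    if profile is None:
        # Unrecognized face: fall back to a generic first-contact greeting
        return "Hello! I don't think we've met yet."
    return (f"Welcome back, {profile['name']}! "
            f"Shall we talk about {profile['favorite_topic']}?")

profiles = {"id-042": {"name": "Ana", "favorite_topic": "gardening"}}
print(personalize_greeting("id-042", profiles))
print(personalize_greeting("id-999", profiles))
```

Each finished conversation would update the profile, so the robot's behavior gradually adapts to the individual.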
User experience comes first
Plenty of complex and sophisticated machines have failed on the market because they were not user-friendly. Any robot that will be in direct contact with people should be designed for natural, intuitive interaction.
User experience is one of the main reasons why visage|SDK has always been the first choice for our robotics clients. Since it’s extremely lightweight, it runs smoothly and quickly without overloading the system. For users, this means the robot can provide timely responses, keeping the conversation flowing smoothly.
The robot’s performance should also be consistently reliable. Studies have shown that witnessing a robot error lowers people’s trust in the robot and its reliability, even if it happens only once. Besides losing confidence in the robot, frequent errors can lead to frustration and a loss of interest in any further interactions. Using proven technology helps keep the probability of errors to a minimum.
Finally, it’s important to keep the interaction flowing in real time, without any significant interruptions. However, if the machine depends on an internet connection, its loss directly affects that interaction. That is why visage|SDK works both online and offline, keeping the machine’s services available at any time.
To sum up, software that is lightweight, efficient, and customizable can make a huge difference in user experience and help the robot achieve a human-robot interaction that feels as natural as possible.
Creating a robot with emotional intelligence
Over time, robots keep taking on more sophisticated tasks. One such task, and definitely one of the most challenging, is achieving a meaningful interaction with humans. Creating socially adept robots that can become our caregivers, tutors, assistants, and companions requires plenty of intertwined, well-orchestrated technologies.
Visage|SDK is a lightweight, customizable solution that helps robots detect human faces, track their eyes and gaze, monitor emotions, recognize people, and more, which is fundamental for initiating and maintaining quality interactions. If you’d like to give it a try for free or discuss custom development with computer vision experts, get in touch.