Visage Technologies
Face Tracking & Analysis


Gaining structural visual knowledge

December 4, 2015

Sherlock: modeling structured knowledge from images

Researchers from Rutgers University have built a machine-learning method that can continuously gain structured visual knowledge by learning structured facts.

Continue reading →
Irrelevant search ranking examples

November 25, 2015

Images don’t lie: rank learning

The task of ranking search results automatically is a multibillion-dollar machine-learning problem. Traditional models optimize over a few hand-crafted features based on the item’s text. Recently, researchers have created a multimodal learning method that combines these traditional features with visual semantic features transferred from a deep convolutional neural network. They have tested their model […]

Continue reading →
Microexpressions magnified at different levels

November 13, 2015

Lie to me: reading hidden emotions

IEEE researchers have studied micro-expressions: rapid, involuntary facial expressions which reveal emotions that people do not intend to show. This is of extreme importance in forensic science and psychotherapy, and we all remember the TV show Lie to me, which focused on these micro-expressions (MEs). Unlike in the pompous TV shows, real-life analysis of spontaneous MEs […]

Continue reading →
Comparison of face-aging methods

October 27, 2015

When I’m sixty-four: face aging

Age progression or age synthesis (face aging) is defined as aesthetically rendering an individual’s face image with natural aging or rejuvenating effects. This can be used in cross-age face analysis, various authentication systems, entertainment, and even in finding lost children after a couple of years or more. Researchers from […]

Continue reading →
Pointing gestures are commonly used in everyday human interaction

October 21, 2015

What’s the point? Pointing-gesture recognition

Pointing gestures are a fundamental aspect of non-verbal human interaction. They are often used to direct the conversation partner’s attention towards objects and regions of interest. Therefore, a reliable detection and interpretation of pointing gestures is an important part of not only human-human interaction, but human-robot interaction as well. Researchers from the Karlsruhe Institute of […]

Continue reading →