Virtual Characters More Like Real Humans than Animations through Emotions
We tend to assume that video games are popular mainly with teens and young adults, but recent studies have found that the average habitual gamer is 35 years old. Video games are only about three decades old, yet they have become more popular than older forms of entertainment such as movies and television, perhaps because of the speed at which the technology is evolving. Video game news brings word of new technological advances almost daily, and one of the latest is realistic emotion on the faces of virtual characters.
Researchers at the Autonomous University of the State of Mexico (UAEM) have succeeded in imitating human facial expressions in virtual models, thereby creating a more natural environment for virtual communication.
Until now, virtual characters have copied human behavior through programmed commands, which produces a "robotic" result that users find uninteresting. The new research aims to produce emotions and expressions like those of real people by taking into account the 43 muscles involved in facial movement, whose activity varies with a person's psychological state.
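A muscle-based approach like the one described can be pictured as driving a face rig with a per-muscle activation vector. The following is a minimal, purely illustrative sketch; the muscle indices, emotion profiles, and numbers are hypothetical and are not taken from the UAEM system.

```python
# Hypothetical sketch: driving a face rig with per-muscle activation
# weights, in the spirit of the muscle-based approach described above.
# Muscle ids and profile values are illustrative only.

NUM_MUSCLES = 43  # facial muscles involved in expression

# Example activation profiles (0.0 = relaxed, 1.0 = fully contracted),
# indexed by muscle id; unlisted muscles stay relaxed at 0.0.
EMOTION_PROFILES = {
    "joy":    {11: 0.9, 12: 0.8, 26: 0.4},  # e.g. cheek raiser, lip corner puller
    "sorrow": {1: 0.7, 4: 0.6, 15: 0.8},    # e.g. brow raiser, lip corner depressor
}

def activation_vector(emotion, intensity=1.0):
    """Return a length-43 list of muscle activations scaled by intensity."""
    profile = EMOTION_PROFILES.get(emotion, {})
    return [min(1.0, profile.get(i, 0.0) * intensity) for i in range(NUM_MUSCLES)]

vec = activation_vector("joy", intensity=0.5)
print(len(vec), max(vec))  # 43 0.45
```

An animation system would then map each activation value onto the mesh deformation that the corresponding muscle produces, letting one emotion label drive many coordinated movements at once.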
To accomplish this, tactile sensors were placed on human models; these emitted minute electrical pulses to evoke various gestures, which a 3D camera captured as personality traits.
With the data thus gathered, several virtual characters were incorporated into a project called a "serious game," which, unlike computer or console games, aims not at entertainment but at carrying out educational, civic, or scientific strategies, the UAEM researchers explained.
Human conduct is strongly influenced by feelings, moods, attitudes, and intentions that differ according to social context. Once these factors are captured by the 3D camera, they are converted into numerical data and fed into a kinesic model created by UAEM, which sorts them and produces the animated gestures and expressions of the virtual characters in states of joy, sorrow, surprise, annoyance, fear, and distress.
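The step from numerical data to one of the six emotion states can be sketched as a simple nearest-prototype classification. This is a toy illustration under assumed features (mouth curvature, brow height, eye openness) and made-up prototype values; it is not the actual UAEM kinesic model.

```python
import math

# Hypothetical sketch of the capture-to-classification step: a numerical
# feature vector derived from the 3D camera is matched to the nearest
# emotion prototype. All values here are illustrative.

# Toy 3-feature prototypes: (mouth curvature, brow height, eye openness)
PROTOTYPES = {
    "joy":       (0.9, 0.5, 0.6),
    "sorrow":    (0.1, 0.3, 0.4),
    "surprise":  (0.5, 0.9, 1.0),
    "annoyance": (0.2, 0.2, 0.5),
    "fear":      (0.3, 0.8, 0.9),
    "distress":  (0.1, 0.6, 0.7),
}

def classify(features):
    """Return the emotion whose prototype is closest in Euclidean distance."""
    return min(PROTOTYPES, key=lambda e: math.dist(features, PROTOTYPES[e]))

print(classify((0.85, 0.45, 0.65)))  # joy
```

The classified state would then select the corresponding expression animation, closing the loop from captured behavior to on-screen emotion.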
UAEM teamed up with the University of Guadalajara (UdeG) and CINVESTAV GDL, where students acted as models to provide the physical and psychological features the application needed to build a temperament and psychological profile. From the numerical measurement of sensations and emotions, various facial expressions were created.
The purpose of the project is to promote attitudes of self-improvement, use the dynamics of context to improve the learning process, and foster cooperative environments and communication for solving problems and puzzles.
Though the studies were done for scientific purposes, it is quite possible that video games will adopt these advances. Soon, in-game characters may show facial expressions that accurately reflect their emotions, resembling real humans more than animations.