When devices can read human emotions without a camera: Study
Tokyo [Japan], November 30 (ANI): Researchers at Tokyo Metropolitan University used long-term skin conductance measurements to distinguish between emotions. Volunteers were shown videos depicting frightening scenarios, family bonding, and humour while their skin conductance was recorded.
The team's analysis revealed that the conductance traces could be used to make accurate estimates of which emotions were being experienced. Advances like this help reduce overreliance on facial data, bringing emotionally sensitive technology closer to home.
A new frontier is being pioneered in consumer electronics: one day, digital devices may be able to offer services based on your emotional state. While this sounds amazing, it depends on whether devices can correctly tell what people are feeling. The most common methods rely on facial expressions; while these have had some success, such data may not always be available. This has led researchers to look for other biological signals that could be interpreted to access emotional states, such as brain wave measurements or cardiograms.
A team of scientists led by Professor Shogo Okamoto of Tokyo Metropolitan University has been using skin conductance as a doorway to human emotions. When people feel different things, the electrical properties of their skin change drastically due to perspiration, with signals appearing within one to three seconds of the original stimulus. Previous research has already shown that measurements of peak conductance, for example, can be correlated with certain emotions. In their most recent work, the team focused on the dynamics of the response, i.e. how quickly the conductance trace reaches a peak following a stimulus and how it decays back to baseline.
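To make the idea of "dynamics of the response" concrete, here is a minimal illustrative sketch, not the team's actual analysis pipeline: given a conductance trace sampled after a stimulus, it extracts simple dynamic features such as time to peak and the time for the response to decay halfway back toward baseline. The feature names, sampling rate, and trace values below are all assumptions for illustration.

```python
# Illustrative sketch (not the study's actual method): extract simple
# dynamic features from a skin conductance trace after a stimulus.

def scr_features(trace, fs):
    """trace: conductance samples from stimulus onset; fs: sampling rate (Hz)."""
    baseline = trace[0]
    peak_idx = max(range(len(trace)), key=lambda i: trace[i])
    peak = trace[peak_idx]
    rise_time = peak_idx / fs  # seconds from onset to peak
    # Half-recovery: first time after the peak that the trace falls
    # below baseline plus half the peak amplitude.
    half_level = baseline + 0.5 * (peak - baseline)
    half_recovery = None
    for i in range(peak_idx, len(trace)):
        if trace[i] <= half_level:
            half_recovery = (i - peak_idx) / fs
            break
    return {"peak_amp": peak - baseline,
            "rise_time_s": rise_time,
            "half_recovery_s": half_recovery}

# Synthetic example: conductance rises for 2 s, then decays.
fs = 10  # Hz (assumed)
trace = ([1.0 + 0.05 * i for i in range(20)] +
         [2.0 - 0.04 * i for i in range(40)])
print(scr_features(trace, fs))
# → {'peak_amp': 1.0, 'rise_time_s': 2.0, 'half_recovery_s': 1.3}
```

Features like these, one number per trial per property of the trace's shape, are the kind of quantities the team could then compare across emotion conditions.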
In their experiment, volunteers were asked to wear probes on their skin while watching videos: scary scenes from horror movies, emotional scenes of family bonding, or funny acts performed by comedians. Importantly, each scene had well-defined points at which a particular emotional stimulus was expected. Analyzing the traces, the team found several interesting and significant trends. For example, the response to fear lasted the longest. This may be a biologically evolved trait, since there are benefits to perceptions of danger persisting longer. Comparing responses to humor and to family bonding, they found that responses to family bonding rose more slowly. The emotions evoked were most likely a mixture of sadness and happiness, so these may interfere with each other, leading to a slower change.
Importantly, the team's statistical analysis revealed that different features extracted from the dynamics of the trace could be used to discriminate the emotional state of an individual. Though the emotions cannot yet be told apart perfectly, the data could, for example, be used to make statistically significant predictions of whether a subject was experiencing fear or feeling the warmth of a family bond. Combined with other signals, the team believes we are one step closer to devices that know how we are feeling, with scope for a better understanding of human emotions. (ANI)
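As a toy illustration of how one such feature might separate two emotions, the sketch below uses a single assumed feature (half-recovery time, since fear responses reportedly last longer) and a simple midpoint threshold. The trial values are synthetic, not data from the study, and a real analysis would use proper statistical tests across many features.

```python
# Hypothetical sketch: separating two emotion conditions with one
# dynamic feature (half-recovery time in seconds). Values are synthetic.
from statistics import mean

fear_trials = [4.1, 3.8, 4.5, 4.0, 3.9]      # fear responses decay slowly
bonding_trials = [2.2, 2.6, 2.0, 2.4, 2.3]   # bonding responses decay faster

# Decision threshold: midpoint between the two condition means.
threshold = (mean(fear_trials) + mean(bonding_trials)) / 2

def classify(half_recovery_s):
    """Label a trial by which side of the midpoint threshold it falls on."""
    return "fear" if half_recovery_s > threshold else "family bonding"

print(classify(4.2))  # → fear
print(classify(2.1))  # → family bonding
```

In practice, combining several such features, and signals beyond skin conductance, is what would push this kind of discrimination toward reliability.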