Artificial Neural Networks Can Distinguish Genuine and Acted Anger by Synthesizing Pupillary Dilation Signals from Different Participants

Date

2018

Authors

Qin, Zhenyue
Gedeon, Tom
Chen, Lu
Zhu, Xuanying
Hossain, Md Zakir

Publisher

Springer

Abstract

Previous research has shown that people are generally poor at distinguishing genuine from acted anger in facial expressions, with verbal answers reaching a mere 65% accuracy. We investigate whether a group of feedforward neural networks can perform better using raw pupillary dilation signals from individual observers. Our results show that a single neural network cannot reliably discern the veracity of the emotion from raw physiological signals, achieving only 50.5% accuracy. However, distinct neural networks trained on pupillary dilation signals from different individuals vary widely in their ability to discern whether the anger is genuine, with accuracies ranging from 27.8% to 83.3%. By leveraging these differences, our novel Misaka neural networks compose predictions derived from different individuals' pupillary dilation signals into an overall prediction that is more accurate than that of even the highest-performing single individual, reaching 88.9% accuracy. Further research will investigate the correlation between the two groups of high-performing predictors: those based on verbal answers and those based on pupillary dilation signals.
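The sketch below illustrates the composition idea described in the abstract, not the authors' actual implementation: it assumes one small feedforward classifier per participant, fed the raw pupil-dilation signal directly, and assumes a simple majority vote as the composition rule (the paper's Misaka combination scheme is not specified here). The participant count, signal length, network sizes, and synthetic data are hypothetical placeholders.

```python
# Illustrative sketch only: per-participant feedforward classifiers whose
# predictions are composed into one group-level decision by majority vote.
# All sizes and the synthetic data below are assumptions, not the paper's setup.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

N_PARTICIPANTS = 5   # assumed number of observers contributing pupil signals
N_TRIALS = 40        # assumed number of watched anger videos (trials)
SIGNAL_LEN = 100     # assumed length of each raw pupillary-dilation signal

# Synthetic stand-in data: one pupil-diameter time series per participant per
# trial, and one genuine (1) / acted (0) label per trial.
labels = rng.integers(0, 2, size=N_TRIALS)
signals = rng.normal(size=(N_PARTICIPANTS, N_TRIALS, SIGNAL_LEN)) + labels[None, :, None] * 0.3

train, test = np.arange(0, 30), np.arange(30, N_TRIALS)

# Train one feedforward network per participant on that participant's raw signals.
models = []
for p in range(N_PARTICIPANTS):
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=p)
    net.fit(signals[p][train], labels[train])
    models.append(net)

# Compose the individual predictions into a single group prediction (majority vote).
votes = np.stack([m.predict(signals[p][test]) for p, m in enumerate(models)])
composed = (votes.mean(axis=0) >= 0.5).astype(int)

print("per-participant accuracy:", [float((v == labels[test]).mean()) for v in votes])
print("composed accuracy:", float((composed == labels[test]).mean()))
```

A vote (or a weighted average of per-participant probabilities) is one straightforward way a group of heterogeneous predictors can outperform its best member, which mirrors the reported improvement from 83.3% (best single individual) to 88.9% (composed prediction).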

Source

Neural Information Processing: 25th International Conference, ICONIP 2018, Proceedings

Type

Conference paper

Restricted until

2099-12-31
