Understanding the emotional dynamics within social interactions is crucial for meaningful interpretation. Despite progress in emotion recognition systems, recognizing the collective emotional climate among peers remains understudied. Addressing this gap, we propose EmoNet, an AI model that transcends traditional emotion identification. EmoNet employs Mel-frequency cepstral coefficients and a Temporal Convolutional Network to extract deep features from speech signals. It uniquely integrates affect dynamics and physiological inputs (heart rate, electrodermal activity), providing a holistic view of emotion climates. Tested on the K-EmoCon dataset, EmoNet excels in arousal and valence classification, achieving 87.82% and 83.79% accuracy, respectively. These results position EmoNet as a valuable tool for understanding and influencing emotion climates in real-world conversations, with applications in healthcare and human-computer interaction.
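
The abstract describes the pipeline only at a high level. The sketch below is a hypothetical illustration, not the authors' implementation: it assumes a standard MFCC front end (via librosa) feeding a small dilated-causal Temporal Convolutional Network, with heart-rate and electrodermal-activity values concatenated before separate arousal and valence classifier heads. The class name EmoNetSketch, the layer sizes, the late-fusion strategy, and all hyperparameters are assumptions made for illustration.

```python
# Minimal sketch of an MFCC + TCN + physiological-fusion pipeline,
# loosely following the abstract. Architecture details are guesses.

import librosa
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


class TemporalBlock(nn.Module):
    """One dilated causal convolution block, the basic unit of a TCN."""

    def __init__(self, in_ch, out_ch, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # left-pad so the convolution stays causal
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)
        self.relu = nn.ReLU()
        self.downsample = nn.Conv1d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x):
        out = F.pad(x, (self.pad, 0))            # pad only on the left of the time axis
        out = self.relu(self.conv(out))
        return self.relu(out + self.downsample(x))  # residual connection


class EmoNetSketch(nn.Module):
    """Speech TCN plus physiological features, with arousal and valence heads."""

    def __init__(self, n_mfcc=13, n_physio=2, hidden=64):
        super().__init__()
        self.tcn = nn.Sequential(
            TemporalBlock(n_mfcc, hidden, dilation=1),
            TemporalBlock(hidden, hidden, dilation=2),
            TemporalBlock(hidden, hidden, dilation=4),
        )
        # Fuse pooled speech features with physiological summaries (heart rate, EDA).
        self.arousal_head = nn.Linear(hidden + n_physio, 2)
        self.valence_head = nn.Linear(hidden + n_physio, 2)

    def forward(self, mfcc, physio):
        # mfcc: (batch, n_mfcc, frames); physio: (batch, n_physio)
        speech = self.tcn(mfcc).mean(dim=-1)     # global average pool over time
        fused = torch.cat([speech, physio], dim=-1)
        return self.arousal_head(fused), self.valence_head(fused)


# Example: one 5-second utterance at 16 kHz plus mean heart-rate and EDA values.
sr = 16000
waveform = np.random.randn(sr * 5).astype(np.float32)     # stand-in for a real speech segment
mfcc = librosa.feature.mfcc(y=waveform, sr=sr, n_mfcc=13)  # shape (13, frames)
mfcc_t = torch.from_numpy(mfcc).float().unsqueeze(0)       # shape (1, 13, frames)
physio = torch.tensor([[72.0, 0.35]])                      # [heart rate, EDA], illustrative values

model = EmoNetSketch()
arousal_logits, valence_logits = model(mfcc_t, physio)
print(arousal_logits.shape, valence_logits.shape)          # torch.Size([1, 2]) each
```

Dilated causal convolutions let the receptive field grow exponentially with depth, which is the usual motivation for choosing a TCN over a recurrent model when extracting temporal features from speech.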

Original publication

DOI: 10.1109/EMBC53108.2024.10782421

Type:

Publication Date: 01/01/2024