Development and Validation of an Experimental Protocol for Facial Emotion Recognition
EL SOUDANY, ABDALLA
2023/2024
Abstract
Emotions are integral to human behavior, yet they are susceptible to external influences, such as visual stimuli, which can introduce biases and misperceptions of reality. For these reasons, in fields like bio-engineering and neuroscience, understanding emotion dynamics is crucial. However, before investigating how emotions can be manipulated, an accurate measurement of emotional responses is essential. This thesis focuses on Facial Emotion Recognition (FER) as a tool to objectively quantify emotional reactions based on facial expressions. The aim is to develop and validate an experimental protocol for FER. The study is divided into two phases. The first phase involved developing and validating an Online Task to present a visual stimulus and collect quiz responses on the emotions experienced. The images used as stimuli were selected from the Open Affective Standardized Image Set (OASIS) with custom MATLAB code, so as to capture the most impactful emotional states. The validation of the Online Task was conducted in two stages. In the reduced version, 51 students (24 female, 24 male, 3 other) were asked to classify their emotions as positive, negative, or neutral after viewing gender-specific stimuli. Accuracy rates were high for both positive (F 92.31%, M 81.30%) and negative emotions (F 83.62%, M 94.23%), while neutral emotions were less accurately detected (F 50%, M 42.70%). In the complete version, 34 participants (16 female, 17 male, 1 other) categorized their emotions into Ekman's seven basic emotions. The results showed strong accuracy for happiness (F 83.58%, M 67.82%), sadness (F 60.87%, M 86.36%), and disgust (F 100%, M 76.67%). Neutral emotions were also detected with moderate accuracy (F 57.58%, M 74.07%), while other emotions such as surprise, anger, and fear had lower accuracy rates.
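The stimulus-selection step described above could be sketched as follows. This is a hypothetical Python illustration, not the thesis code (which was written in MATLAB): it assumes OASIS-style records carrying mean valence ratings, and treats the images at the valence extremes as the "most impactful" positive and negative stimuli, with the images closest to mid-scale valence as neutral stimuli.

```python
# Hypothetical sketch of stimulus selection from OASIS-style ratings.
# Field names, the 1-7 valence scale, and the selection criterion are
# assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class OasisImage:
    name: str
    valence: float  # mean rated valence (assumed 1 = very negative, 7 = very positive)
    arousal: float  # mean rated arousal

def select_impactful(images, n_per_class=2, neutral_valence=4.0):
    """Return (positive, negative, neutral) stimulus lists: the highest- and
    lowest-valence images, plus the images closest to mid-scale valence."""
    by_valence = sorted(images, key=lambda im: im.valence)
    negative = by_valence[:n_per_class]    # lowest valence
    positive = by_valence[-n_per_class:]   # highest valence
    neutral = sorted(images,
                     key=lambda im: abs(im.valence - neutral_valence))[:n_per_class]
    return positive, negative, neutral

# Toy ratings (invented values, not actual OASIS data):
catalog = [
    OasisImage("Dog 1", 6.3, 4.1),
    OasisImage("Garbage 2", 1.8, 4.6),
    OasisImage("Cup 1", 4.1, 2.2),
    OasisImage("Beach 3", 6.7, 4.9),
    OasisImage("Injury 4", 1.4, 5.8),
    OasisImage("Desk 2", 3.9, 2.0),
]
pos, neg, neu = select_impactful(catalog)
```

A real selection would also balance arousal and semantic category across classes; the sketch only shows the valence-extremes idea.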
In the second phase, an Offline FER Algorithm was developed using a modified version of the face-api.js library, which analyzed and classified facial expressions into emotional categories by extracting timestamps and probability values for each emotion from the video recordings. The Area Under the Curve (AUC) of the emotional signals, which track the probability evolution of each basic emotion over time, was calculated to determine the predominant emotion after each stimulus. The validation of the algorithm revealed significant accuracy for detecting happiness, while other emotions showed more inconsistent results. The findings validate FER as a biomarker for emotional responses, establishing a reliable experimental protocol. Future work integrating the online task with the FER algorithm would allow real-time emotion monitoring in various applications, such as healthcare, education, privacy, law, and human-computer interaction.
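The AUC-based decision step can be illustrated with a minimal sketch. The function name and the data layout are assumptions, not the thesis implementation: it takes the per-frame timestamps and per-emotion probability series that an FER model such as face-api.js produces, integrates each emotional signal with the trapezoidal rule, and reports the emotion with the largest area under the curve.

```python
# Hedged sketch of the AUC-based predominant-emotion decision.
# Input format is an assumption: one timestamp per analyzed frame, and a
# probability series per basic emotion aligned with those timestamps.

def predominant_emotion(timestamps, prob_series):
    """Return the emotion whose probability signal has the largest
    area under the curve (trapezoidal rule) over the recording."""
    def auc(ts, ps):
        return sum((ts[i + 1] - ts[i]) * (ps[i] + ps[i + 1]) / 2.0
                   for i in range(len(ts) - 1))
    return max(prob_series, key=lambda emo: auc(timestamps, prob_series[emo]))

# Toy post-stimulus window (invented values): 5 frames over 2 seconds.
ts = [0.0, 0.5, 1.0, 1.5, 2.0]
probs = {
    "happy":   [0.1, 0.6, 0.8, 0.7, 0.5],
    "sad":     [0.2, 0.1, 0.05, 0.1, 0.2],
    "neutral": [0.7, 0.3, 0.15, 0.2, 0.3],
}
winner = predominant_emotion(ts, probs)  # → "happy"
```

Integrating over the whole post-stimulus window, rather than taking the single highest-probability frame, makes the decision robust to momentary detection noise.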
File: ElSoudany_Abdalla.pdf (open access, 3.25 MB, Adobe PDF)
https://hdl.handle.net/20.500.12608/75157