
dc.contributor.advisor: Maillard, Patrícia Augustin Jaques
dc.contributor.author: Werlang, Pablo Santos
dc.date.accessioned: 2023-06-01T14:47:46Z
dc.date.accessioned: 2024-02-28T18:54:47Z
dc.date.available: 2023-06-01T14:47:46Z
dc.date.available: 2024-02-28T18:54:47Z
dc.date.issued: 2022-10-31
dc.identifier.uri: https://hdl.handle.net/20.500.12032/126253
dc.description.abstract: Affective computing aims to improve human-machine interaction by developing tools and techniques that let a system's decision-making adjust to human affective states. Automatic facial recognition of emotions is a relatively recent area with the potential to make human-computer interaction an increasingly natural experience. In intelligent learning environments especially, emotion detection benefits students: their affective information can be used directly to perceive their difficulties, adapt the pedagogical intervention, and keep them engaged. The present work built a model capable of recognizing, from the face, the emotions students commonly experience while interacting with learning environments: engagement, confusion, frustration, and boredom. The proposed model uses deep neural networks to classify these emotions, extracting statistical, temporal, and spatial features, including eye movement and Action Units, from the videos provided for training. Building on the psychological model of affect dynamics proposed by D'Mello, which states that in learning situations the emotions experienced are tied to one another and that their presence is determined by the order in which they appear, this work's main contribution is to take the flow of emotions, as well as the learner's personality traits, into account as a means of increasing emotion-detection accuracy. We tested several model configurations and compared their efficiency to recently developed models. Results show that adding the sequence of learning emotions and the personality as model inputs improves those algorithms' effectiveness. Training the model on the DAiSEE dataset, we achieved a 26.27% relative F1 improvement (from 0.5122 to 0.6468) when including the emotion history in the model, while we achieved a 1.48% relative F1 improvement (from 0.8741 to 0.8871) on the model trained on the PAT2Math dataset when including the subject's personality traits. Compared to the state of the art, the model's F1 score was 5.6% higher; however, its accuracy was 4.7% lower.
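The percentage gains quoted in the abstract are relative improvements over the baseline F1 scores. A minimal sketch (in Python, which is our choice for illustration, not code from the thesis) reproduces them from the four F1 values the abstract reports:

```python
def relative_improvement(baseline: float, improved: float) -> float:
    """Relative gain of `improved` over `baseline`, as a percentage."""
    return (improved - baseline) / baseline * 100.0

# F1 values taken directly from the abstract:
# DAiSEE:   0.5122 -> 0.6468 after adding the emotion history
# PAT2Math: 0.8741 -> 0.8871 after adding personality traits
daisee_gain = relative_improvement(0.5122, 0.6468)
pat2math_gain = relative_improvement(0.8741, 0.8871)

print(f"DAiSEE F1 gain:   {daisee_gain:.2f}%")    # about 26.3%
print(f"PAT2Math F1 gain: {pat2math_gain:.2f}%")  # about 1.5%
```

These match the abstract's 26.27% and 1.48% figures up to rounding of the reported F1 scores.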
dc.description.sponsorship: None (Nenhuma)
dc.language: pt_BR
dc.publisher: Universidade do Vale do Rio dos Sinos
dc.rights: openAccess
dc.subject: Reconhecimento de emoções
dc.subject: Emotion recognition
dc.title: Reconhecimento de emoções acadêmicas por face através de aprendizagem profunda: considerando a sequência de emoções e a personalidade do estudante (Facial recognition of academic emotions through deep learning: considering the student's emotion sequence and personality)
dc.type: Tese (thesis)


Files in this item

Pablo Santos Werlang_.pdf (3.122 MB, application/pdf)



