IEEE Access (Jan 2024)

Attach-SwiNet: Multimodal Attachment Style Classification Model Based on Non-Verbal Signals

  • Tusty Nadia Maghfira,
  • T. Basaruddin,
  • Adila Alfa Krisnadhi,
  • Sri Redatin Retno Pudjiati

DOI
https://doi.org/10.1109/ACCESS.2024.3397608
Journal volume & issue
Vol. 12
pp. 79151–79165

Abstract


Attachment systems, which signify emotional bonds with significant others, play a crucial role in shaping self-development and social interactions. Research on adult attachment in psychology has predominantly relied on questionnaires and interviews, focusing mainly on romantic relationships and parent-child interactions during childhood. Although machine learning approaches in adult attachment research have begun to assess non-verbal behaviors objectively, the connection between these behaviors and attachment styles has not yet been fully explored. This paper presents Attach-SwiNet, a new multimodal model for classifying attachment styles in close relationships among young adults. Our model combines representations of emotions derived from non-verbal behaviors with subjective responses to categorize attachment styles. It uses pre-trained Swin Transformers to extract emotional cues from facial expression videos and a pre-trained ResNet50 to analyze speech responses. By integrating the most effective emotion representations from both datasets with rating data from the Experiences in Close Relationships-Relationship Structures (ECR-RS) questionnaire, our model significantly improves the accuracy of attachment style classification. Experimental results show that our approach improves performance by 1.13% over approaches that rely solely on unimodal behavioral data or subjective questionnaire responses.
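
The abstract describes a multimodal fusion architecture (Swin Transformer for facial expression video, ResNet50 for speech, plus ECR-RS questionnaire ratings) but no implementation details. The snippet below is only a minimal sketch of how such a fusion could be wired up in PyTorch; the class name AttachmentFusionNet, the frame-averaging strategy, the spectrogram input for the audio branch, the ECR-RS feature dimension, the concatenation-based fusion head, and the number of classes are all illustrative assumptions, not the authors' actual design.

```python
# Minimal sketch of a multimodal attachment-style classifier in the spirit of
# Attach-SwiNet. All architectural specifics below are assumptions: frame-wise
# Swin features averaged over time, ResNet50 over spectrogram "images",
# simple concatenation fusion, and 4 output classes.
import torch
import torch.nn as nn
from torchvision.models import swin_t, resnet50


class AttachmentFusionNet(nn.Module):
    def __init__(self, num_classes: int = 4, ecr_rs_dim: int = 9):
        super().__init__()
        # Visual branch: pre-trained Swin Transformer with its head removed,
        # yielding 768-d embeddings per frame.
        self.visual = swin_t(weights="IMAGENET1K_V1")
        self.visual.head = nn.Identity()
        # Audio branch: pre-trained ResNet50 over 3-channel spectrogram images,
        # yielding 2048-d embeddings per clip.
        self.audio = resnet50(weights="IMAGENET1K_V2")
        self.audio.fc = nn.Identity()
        # Fusion head: concatenated video + audio + ECR-RS rating features.
        self.classifier = nn.Sequential(
            nn.Linear(768 + 2048 + ecr_rs_dim, 256),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(256, num_classes),
        )

    def forward(self, frames, spectrogram, ecr_rs):
        # frames: (B, T, 3, 224, 224) video frames; pool Swin features over time.
        b, t = frames.shape[:2]
        v = self.visual(frames.flatten(0, 1)).view(b, t, -1).mean(dim=1)
        # spectrogram: (B, 3, 224, 224) log-mel spectrogram rendered as an image.
        a = self.audio(spectrogram)
        # ecr_rs: (B, ecr_rs_dim) questionnaire ratings (dimension is an assumption).
        return self.classifier(torch.cat([v, a, ecr_rs], dim=1))


# Example usage with dummy tensors:
# model = AttachmentFusionNet()
# logits = model(torch.randn(2, 8, 3, 224, 224),
#                torch.randn(2, 3, 224, 224),
#                torch.randn(2, 9))
```

The design choice sketched here (late fusion by concatenating per-modality embeddings with the questionnaire scores) is one common way to combine behavioral and self-report signals; the paper's actual fusion strategy may differ.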

Keywords