Heliyon (May 2024)
Analyzing the packaging design evaluation based on image emotion perception computing
Abstract
A wide variety of product labels is now available, and consumption is increasingly tailored to individual needs, so many businesses are focusing on improving the functionality of modern packaging. Sensory paradigms and emotional reactions can change over the user-product interaction lifecycle. Conventional product packaging design rests on the designer's emotional imagination and past experience, which is limited by unmanageable content and a lack of professional guidance. Most previous research on emotional image analysis has aimed to forecast the most common viewer emotion; because the feelings an image evokes are highly individual and vary from viewer to viewer, this aggregate emotion is often insufficient for practical use. This research presents an approach to packaging design evaluation based on image emotion perception computing (PDE-IEPC), which combines emotion perception technology with a deep Long Short-Term Memory (LSTM) network to deliver an immersive and dynamic experience for the human senses. The Dynamic Multi-task Hypergraph Learning (DMHL) method for emotion perception computing considers visual content, social context, spatial evolution, and location, among other criteria, to evaluate packaging designs efficiently by their emotional impact. Image-Emotion-Social-Net, a large dataset sourced from Flickr containing over 1 million images uploaded by more than 9000 users, is used to evaluate both dimensional and categorical emotion representations. Experiments on this dataset show that the proposed strategy outperforms many state-of-the-art techniques for personalized emotion classification. The experimental results show that the proposed method achieves a packaging design quality rate of 94.1 %, a performance success rate of 97.5 %, and a mean square error of 2 %, improving on existing methods.
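To make the LSTM component of the abstract concrete, the sketch below implements a single LSTM step in pure Python with scalar toy dimensions and runs it over a short feature sequence standing in for extracted image features. This is an illustrative assumption, not the paper's implementation: the weight values, the sequence, and the final `emotion_score` readout (a sigmoid of the hidden state, interpreted as a hypothetical valence score) are invented for clarity.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM step with scalar input and state (toy sizes for readability).

    W maps a gate name to its (input weight, recurrent weight, bias) triple.
    """
    def gate(name, act):
        w_x, w_h, b = W[name]
        return act(w_x * x + w_h * h_prev + b)

    i = gate("input", sigmoid)    # how much of the candidate to write
    f = gate("forget", sigmoid)   # how much old cell state to keep
    o = gate("output", sigmoid)   # how much of the cell state to expose
    g = gate("cell", math.tanh)   # candidate cell-state update
    c = f * c_prev + i * g        # new cell state
    h = o * math.tanh(c)          # new hidden state
    return h, c

# Hypothetical shared weights and a toy "image feature" sequence.
W = {name: (0.5, 0.5, 0.0) for name in ("input", "forget", "output", "cell")}
h, c = 0.0, 0.0
for x in [0.2, 0.8, 0.5]:
    h, c = lstm_step(x, h, c, W)

# Hypothetical readout: squash the final hidden state into (0, 1)
# as a stand-in for a per-viewer emotion (valence) score.
emotion_score = sigmoid(h)
```

In the full method, the per-timestep inputs would be high-dimensional image and context features rather than scalars, and the readout would feed the DMHL evaluation rather than a single sigmoid.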