Applied Mathematics and Nonlinear Sciences (Jan 2024)
Emotional Characterization Mining in Music Performance and Its Guiding Role
Abstract
Emotional attributes are crucial in music performance, serving a pivotal role in guiding interpretation and execution. This study employs a Wavenet layer within a Waveform-Deep Neural Network (WLDNN) to extract emotional features from musical performances. An activation function is then applied to process and refine these features. Additionally, a Generative Adversarial Network (GAN) is utilized to identify and remove irrelevant features, thereby enhancing the quality of the extracted emotional attributes. These emotional features are evaluated using both music emotion language values and an emotion vector model. Our analysis demonstrates that this methodology achieves a 90% accuracy rate in identifying and extracting emotional features from music performances. Based on these findings, a stage lighting control system was developed, tailored to respond to the emotional cues within the music. The system was tested across ten different performances with a sampling interval of 5 seconds, achieving an average synchronization rate of 94.01% with the emotional content of the music. This approach not only proves effective for stage lighting design but also offers valuable insights for enhancing the emotional expressiveness of musical performances.
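To illustrate the kind of synchronization metric the abstract reports, the following is a minimal sketch in Python. The function name, the emotion labels, and the exact-match rule are illustrative assumptions, not details taken from the paper; it simply computes the fraction of 5-second samples in which the lighting state agrees with the music's emotional label.

```python
# Hypothetical sketch of a lighting/emotion synchronization rate.
# Label names, the 5 s sampling, and the exact-match rule are assumptions.

def synchronization_rate(music_emotions, lighting_states):
    """Fraction of samples where the lighting state matches
    the emotional label of the music at that moment."""
    assert len(music_emotions) == len(lighting_states)
    matches = sum(m == s for m, s in zip(music_emotions, lighting_states))
    return matches / len(music_emotions)

# One performance sampled every 5 seconds (illustrative labels only).
music = ["calm", "calm", "tense", "tense", "joyful", "joyful"]
lights = ["calm", "calm", "tense", "calm", "joyful", "joyful"]
print(round(synchronization_rate(music, lights), 4))  # -> 0.8333
```

Averaging this rate over the ten test performances would yield an overall figure comparable to the 94.01% reported above.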
Keywords