Music & Science (May 2019)
Semantic Crosstalk in Timbre Perception
Abstract
Many adjectives for musical timbre reflect cross-modal correspondence, particularly with vision and touch (e.g., "dark–bright," "smooth–rough"). Although multisensory integration between visual/tactile processing and hearing has been demonstrated for pitch and loudness, timbre remains poorly understood as a locus of cross-modal mappings. Are people consistent in these semantic associations? Do cross-modal terms reflect dimensional interactions in timbre processing? Here I designed two experiments to investigate crosstalk between timbre semantics and perception using Stroop-type speeded classification. Experiment 1 found that incongruent pairings of instrument timbres and written instrument names caused significant Stroop-type interference relative to congruent pairings, indicating bidirectional crosstalk between semantic and auditory processing. Pre-Experiment 2 asked participants to rate natural and synthesized timbres on semantic differential scales capturing luminance (brightness) and texture (roughness) associations, and found substantial consistency for a number of timbres. Acoustic correlates of these associations were also assessed, indicating an important role for high-frequency energy in the intensity of cross-modal ratings. Experiment 2 used the timbre adjectives and sound stimuli validated in Pre-Experiment 2 in two variants of a semantic–auditory Stroop-type task. Linear mixed-effects modeling of reaction time and accuracy showed slight interference in semantic processing when adjectives were paired with cross-modally incongruent instrument timbres (e.g., the word "smooth" with a "rough" timbre). Taken together, these findings suggest that semantic crosstalk in timbre processing may be partially automatic and could reflect weak synesthetic congruency between interconnected sensory domains.