PLoS Computational Biology (Apr 2025)

Perceptual learning improves discrimination but does not reduce distortions in appearance.

  • Sarit F A Szpiro,
  • Charlie S Burlingham,
  • Eero P Simoncelli,
  • Marisa Carrasco

DOI
https://doi.org/10.1371/journal.pcbi.1012980
Journal volume & issue
Vol. 21, no. 4
p. e1012980

Abstract


Human perceptual sensitivity often improves with training, a phenomenon known as "perceptual learning." Another important perceptual dimension is appearance, the subjective sense of stimulus magnitude. Are training-induced improvements in sensitivity accompanied by more accurate appearance? Here, we examined this question by measuring both discrimination (sensitivity) and estimation (appearance) responses to near-horizontal motion directions, whose perceived directions are known to be repulsed away from horizontal. Participants performed discrimination and estimation tasks before and after training in the discrimination task, the estimation task, or neither (control group). Observers who trained in either discrimination or estimation exhibited improvements in discrimination accuracy, but estimation repulsion did not decrease; it either persisted or increased. Hence, distortions in perception can be exacerbated after perceptual learning. We developed a computational observer model in which perceptual learning arises from increases in the precision of underlying neural representations; this model explains the counterintuitive finding. For each observer, the fitted model accounted for discrimination performance, the distribution of estimates, and their changes with training. Our empirical findings and modeling suggest that learning enhances distinctions between categories, a potentially important aspect of real-world perception and perceptual learning.
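
The fitted observer model is specified in the paper itself; as a loose intuition only, the toy simulation below (not the authors' model; the Gaussian-measurement assumption, the fixed category-repulsion term, and all parameter values are illustrative) shows how increasing internal precision can improve discrimination accuracy while the measured estimation repulsion persists or even grows.

```python
# Illustrative toy sketch (not the published observer model): a noisy measurement is
# categorized relative to horizontal, and the reported estimate is repulsed away from
# horizontal by a fixed amount. Reducing the internal noise (higher precision) improves
# discrimination accuracy, yet the average measured repulsion does not shrink.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta_deg, sigma_deg, repulsion_deg=10.0, n_trials=100_000):
    """One condition: true direction theta (deg from horizontal), internal noise sigma,
    and a fixed repulsion applied away from the chosen category (all values illustrative)."""
    m = theta_deg + rng.normal(0.0, sigma_deg, n_trials)  # noisy internal measurement
    choice = np.sign(m)                                    # discrimination: CW vs. CCW of horizontal
    accuracy = np.mean(choice == np.sign(theta_deg))
    estimate = m + repulsion_deg * choice                  # estimate repulsed away from horizontal
    bias = np.mean(estimate) - theta_deg                   # mean estimation bias (repulsion)
    return accuracy, bias

theta = 4.0  # degrees away from horizontal (illustrative)
for label, sigma in [("pre-training ", 8.0), ("post-training", 4.0)]:
    acc, bias = simulate(theta, sigma)
    print(f"{label}: sigma={sigma:4.1f} deg  "
          f"discrimination accuracy={acc:.2f}  estimation repulsion={bias:+.1f} deg")
```

In this sketch, halving the noise raises discrimination accuracy (roughly 0.69 to 0.84 for a 4° offset) while the average repulsion grows, because estimates fall on the correct side of horizontal more consistently; it is meant only to illustrate why improved precision need not reduce appearance distortions.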