Frontiers in Psychology (May 2021)

A Dual Simple Recurrent Network Model for Chunking and Abstract Processes in Sequence Learning

  • Lituan Wang,
  • Yangqin Feng,
  • Qiufang Fu,
  • Jianyong Wang,
  • Xunwei Sun,
  • Xiaolan Fu,
  • Lei Zhang,
  • Zhang Yi

DOI
https://doi.org/10.3389/fpsyg.2021.587405
Journal volume & issue
Vol. 12

Abstract


Although many studies have provided evidence that abstract knowledge can be acquired in artificial grammar learning, it remains unclear how abstract knowledge can be attained in sequence learning. To address this issue, we proposed a dual simple recurrent network (DSRN) model that includes a surface SRN encoding and predicting the surface properties of stimuli and an abstract SRN encoding and predicting the abstract properties of stimuli. The results of Simulations 1 and 2 showed that the DSRN model can account for learning effects in the serial reaction time (SRT) task under different conditions, and that manipulating the contribution weight of each SRN captures the respective contributions of conscious and unconscious processes in the inclusion and exclusion tests of previous studies. The human performance results in Simulation 3 provided further evidence that people can implicitly learn both chunking and abstract knowledge in sequence learning, and the simulation results confirmed that the DSRN model can account for how people implicitly acquire these two types of knowledge. These findings extend the learning ability of the SRN model and help explain how different types of knowledge can be acquired implicitly in sequence learning.
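To make the architecture described above concrete, the following is a minimal sketch of the dual-SRN idea, assuming a standard Elman-style simple recurrent network for each pathway and a simple weighted mixture of the two networks' output distributions. All names (SurfaceSRN/AbstractSRN inputs, the contribution weights w_surface and w_abstract) and layer sizes are illustrative assumptions, not the authors' published implementation.

# Hedged sketch of a dual simple recurrent network (DSRN).
# Assumption: each pathway is an Elman SRN and the two prediction
# distributions are combined by contribution weights; details beyond the
# abstract (sizes, input codings, combination rule) are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

class SRN:
    """Elman SRN: h_t = tanh(W_xh x_t + W_hh h_{t-1}); y_t = softmax(W_hy h_t)."""
    def __init__(self, n_in, n_hidden, n_out):
        s = 0.1
        self.W_xh = rng.normal(0, s, (n_hidden, n_in))
        self.W_hh = rng.normal(0, s, (n_hidden, n_hidden))
        self.W_hy = rng.normal(0, s, (n_out, n_hidden))
        self.h = np.zeros(n_hidden)

    def step(self, x):
        # Update the context (hidden) layer and predict the next element.
        self.h = np.tanh(self.W_xh @ x + self.W_hh @ self.h)
        return softmax(self.W_hy @ self.h)

class DualSRN:
    """Two SRNs run in parallel; their predictions are mixed by contribution weights."""
    def __init__(self, n_surface, n_abstract, n_hidden, n_out,
                 w_surface=0.5, w_abstract=0.5):
        self.surface = SRN(n_surface, n_hidden, n_out)    # surface properties of stimuli
        self.abstract = SRN(n_abstract, n_hidden, n_out)  # abstract (relational) properties
        self.w_surface, self.w_abstract = w_surface, w_abstract

    def predict(self, x_surface, x_abstract):
        p_s = self.surface.step(x_surface)
        p_a = self.abstract.step(x_abstract)
        return self.w_surface * p_s + self.w_abstract * p_a

# Usage: predict the next of 6 possible stimulus locations from one-hot inputs.
model = DualSRN(n_surface=6, n_abstract=3, n_hidden=20, n_out=6)
x_s = np.eye(6)[2]   # current stimulus location (one-hot)
x_a = np.eye(3)[1]   # current abstract category (one-hot, hypothetical coding)
print(model.predict(x_s, x_a))

In this sketch, varying w_surface and w_abstract is the analogue of manipulating each SRN's contribution weight, which the abstract relates to the balance of conscious and unconscious processes in inclusion and exclusion tests.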
