Australian Journal of Psychology (Jun 2020)
A systematic narrative review of psychological literacy measurement
Abstract
Objective: This study aimed to identify studies measuring psychological literacy and to analyse their methodological quality. We also aimed to determine the conceptual consistency of psychological literacy across the included studies. Method: PsycINFO, Web of Science, Scopus, ProQuest, and Google Scholar were searched for relevant literature. The Joanna Briggs Institute systematic review methodology was then followed to assess the methodological quality of identified studies. We then analysed whether psychological literacy research used a consistent conceptualisation, as inconsistency is detrimental to the generalisability and validity of results. Results: Six relevant articles met the inclusion criteria. Assessment of methodological quality revealed confounds and statistical concerns. Most studies relied on self‐reported skill development and used a single definition of psychological literacy to select measures. Measures were diverse, suggesting inconsistent operationalisation. Conclusion: Longitudinal studies are needed to avoid the confounds of age and of skill development prior to university. The definition used in most studies requires interpretation, as it contains broad attribute descriptions. Psychological literacy needs a more concise definition to standardise assessment. Varied conceptualisation and operationalisation suggest that a construct validity assessment is needed. As psychological literacy is understood in diverse ways in the literature, there is a need to know what psychology educators understand by the term and how it is applied in curricula.
Keywords