Big Data & Society (Mar 2018)
Understanding perception of algorithmic decisions: Fairness, trust, and emotion in response to algorithmic management
Abstract
Algorithms increasingly make managerial decisions that people used to make. Perceptions of algorithms, regardless of the algorithms' actual performance, can significantly influence their adoption, yet we do not fully understand how people perceive decisions made by algorithms as compared with decisions made by humans. To explore perceptions of algorithmic management, we conducted an online experiment using four managerial decisions that required either mechanical or human skills. We manipulated the decision-maker (algorithmic or human) and measured perceived fairness, trust, and emotional response. With the mechanical tasks, algorithmic and human-made decisions were perceived as equally fair and trustworthy and evoked similar emotions; however, human managers' fairness and trustworthiness were attributed to the manager's authority, whereas algorithms' fairness and trustworthiness were attributed to their perceived efficiency and objectivity. Human decisions evoked some positive emotion due to the possibility of social recognition, whereas algorithmic decisions generated a more mixed response: algorithms were seen as helpful tools but also as possible tracking mechanisms. With the human tasks, algorithmic decisions were perceived as less fair and trustworthy and evoked more negative emotion than human decisions. Algorithms' perceived lack of intuition and subjective judgment capabilities contributed to the lower fairness and trustworthiness judgments. Positive emotion from human decisions was attributed to social recognition, while negative emotion from algorithmic decisions was attributed to the dehumanizing experience of being evaluated by machines. This work reveals people's lay concepts of algorithmic versus human decisions in a management context and suggests that task characteristics matter in understanding people's experiences with algorithmic technologies.