IEEE Access (Jan 2021)

An Empirical Study on Group Fairness Metrics of Judicial Data

  • Yanjun Li
  • Huan Huang
  • Xinwei Guo
  • Yuyu Yuan

DOI
https://doi.org/10.1109/ACCESS.2021.3122443
Journal volume & issue
Vol. 9, pp. 149043–149049

Abstract

Group fairness means that different groups have an equal probability of receiving a given prediction. It is a significant fairness definition and is conducive to maintaining social harmony and stability. Fairness is a vital issue when an artificial intelligence software system is used to make judicial decisions, and either the data or the algorithm alone may lead to unfair results. Determining the fairness of a dataset is therefore a prerequisite for studying the fairness of algorithms. This paper focuses on the dataset, studying group fairness from both micro and macro views. We propose a framework for determining the sensitive attributes of a dataset, together with metrics for measuring the degree of fairness of those attributes. We conducted experiments and statistical analysis on judicial data to better demonstrate the framework and the metric approach. The framework and metrics can be applied to datasets in other domains, providing persuasive evidence for the effectiveness and applicability of algorithmic fairness research. This opens up a new avenue for research on dataset fairness.
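As an illustration of the group-fairness notion the abstract describes, the sketch below computes the statistical parity difference, one standard group fairness metric, for a binary sensitive attribute. This is only a minimal example of the general concept, not the paper's own framework or metrics; the dataset, the column names ("gender", "granted_parole"), and the function are hypothetical.

```python
import pandas as pd

def statistical_parity_difference(df: pd.DataFrame,
                                  sensitive_attr: str,
                                  outcome: str,
                                  favorable: int = 1) -> float:
    """Difference in favorable-outcome rates between the two groups
    defined by a binary sensitive attribute (0 = unprivileged,
    1 = privileged). A value near 0 suggests group fairness."""
    rates = (df[outcome] == favorable).groupby(df[sensitive_attr]).mean()
    return rates[1] - rates[0]

# Hypothetical judicial-style toy data: 'gender' is the candidate
# sensitive attribute, 'granted_parole' the recorded outcome.
data = pd.DataFrame({
    "gender":         [0, 0, 0, 1, 1, 1, 1, 0],
    "granted_parole": [0, 1, 0, 1, 1, 0, 1, 0],
})

# 0.75 - 0.25 = 0.5: the favorable rates differ markedly, flagging
# 'gender' as a potentially unfair sensitive attribute in this sample.
print(statistical_parity_difference(data, "gender", "granted_parole"))
```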

Keywords