Software Fairness: An Analysis and Survey
Published in | ACM Computing Surveys
---|---
Main Authors | , , ,
Format | Journal Article
Language | English
Published | 08.09.2025
ISSN | 0360-0300; 1557-7341
DOI | 10.1145/3762170
Summary: In the last decade, researchers have studied fairness as a software property, in particular how to engineer fair software systems. This includes specifying, designing, and validating fairness properties. However, the landscape of works addressing bias as a software engineering concern, i.e., techniques and studies that analyze the fairness properties of learning-based software, is unclear. In this work, we provide a clear view of the state of the art in software fairness analysis. To this end, we collect, categorize, and conduct an in-depth analysis of 164 publications investigating the fairness of learning-based software systems. Specifically, we study the evaluated fairness measures, the studied tasks, the type of fairness analysis, the main idea of the proposed approaches, and the access level (e.g., black, white, or grey box). Our findings include the following: (1) fairness concerns (such as fairness specification and requirements engineering) are under-studied; (2) fairness measures such as conditional, sequential, and intersectional fairness are under-explored; (3) semi-structured datasets (e.g., audio, image, code, and text) are barely studied for fairness analysis in the SE community; and (4) software fairness analysis techniques hardly employ white-box, in-processing machine learning (ML) analysis methods. In summary, we observed several open challenges, including the need to study intersectional/sequential bias, policy-based bias handling, and human-in-the-loop, socio-technical bias mitigation.
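As a purely illustrative aside (not taken from the surveyed paper), the sketch below shows the kind of group fairness measure the abstract refers to: statistical parity difference between two groups, plus a simple worst-case variant over intersectional subgroups. The function names, data layout, and toy data are all hypothetical.

```python
import numpy as np

def statistical_parity_difference(y_pred, group):
    """Difference in positive-prediction rates between two groups.

    y_pred: binary predictions (0/1); group: binary protected attribute (0/1).
    A value near 0 indicates parity under this particular fairness measure.
    """
    return y_pred[group == 1].mean() - y_pred[group == 0].mean()

def intersectional_parity_gap(y_pred, groups):
    """Worst-case gap in positive-prediction rates over intersectional subgroups.

    groups: 2-D array with one column per protected attribute (e.g., sex, race).
    Each unique combination of attribute values defines one subgroup.
    """
    keys = [tuple(row) for row in groups]
    rates = {k: y_pred[[kk == k for kk in keys]].mean() for k in set(keys)}
    return max(rates.values()) - min(rates.values())

# Hypothetical usage on toy data.
y_pred = np.array([1, 0, 1, 1, 0, 1, 0, 0])
sex    = np.array([1, 1, 0, 0, 1, 0, 1, 0])
race   = np.array([0, 1, 0, 1, 0, 1, 0, 1])
print(statistical_parity_difference(y_pred, sex))
print(intersectional_parity_gap(y_pred, np.column_stack([sex, race])))
```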