Towards Automating Disambiguation of Regulations: Using the Wisdom of Crowds

Bibliographic Details
Published in: 2018 33rd IEEE/ACM International Conference on Automated Software Engineering (ASE), pp. 850–855
Main Authors: Patwardhan, Manasi; Sainani, Abhishek; Sharma, Richa; Karande, Shirish; Ghaisas, Smita
Format: Conference Proceeding
Language: English
Published: ACM, 01.09.2018
Summary: Compliant software is a critical need of all modern businesses. Disambiguating regulations to derive requirements is therefore an important software engineering activity. Regulations, however, are riddled with ambiguities that make their comprehension a challenge, seemingly surmountable only by legal experts. Since involving legal experts in every project is expensive, approaches to automate the disambiguation need to be explored. These approaches, however, require a large amount of annotated data, and collecting that data exclusively from experts is neither scalable nor affordable. In this paper, we present the results of a crowdsourcing experiment to collect annotations on ambiguities in regulations from professional software engineers. We discuss an approach to automate the arduous and critical step of identifying ground-truth labels by computing crowd consensus with Expectation Maximization (EM). We demonstrate that the annotations reaching a consensus match those of experts with an accuracy of 87%.
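The abstract does not disclose the authors' exact EM formulation. As a point of reference, the sketch below (Python/NumPy) shows one standard way crowd consensus is computed with EM: a Dawid-Skene-style model in which each annotator is represented by a confusion matrix and each item by a posterior over candidate labels. The function name em_consensus and all parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def em_consensus(labels, n_classes, n_iter=50, missing=-1):
    """Infer consensus labels from noisy crowd annotations via EM
    (Dawid-Skene-style: one confusion matrix per worker).

    labels: (n_items, n_workers) int array; `missing` marks no annotation.
    Returns the per-item consensus labels and the full posteriors.
    """
    n_items, n_workers = labels.shape

    # Initialise item posteriors with per-item vote counts
    # (assumes every item has at least one annotation).
    post = np.zeros((n_items, n_classes))
    for i, w in zip(*np.nonzero(labels != missing)):
        post[i, labels[i, w]] += 1
    post /= post.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # M-step: class priors and per-worker confusion matrices,
        # lightly smoothed to avoid log(0).
        priors = post.mean(axis=0) + 1e-6
        priors /= priors.sum()
        conf = np.full((n_workers, n_classes, n_classes), 1e-6)
        for i, w in zip(*np.nonzero(labels != missing)):
            conf[w, :, labels[i, w]] += post[i]
        conf /= conf.sum(axis=2, keepdims=True)

        # E-step: posterior over each item's true label given the
        # current priors and confusion matrices.
        log_post = np.tile(np.log(priors), (n_items, 1))
        for i, w in zip(*np.nonzero(labels != missing)):
            log_post[i] += np.log(conf[w, :, labels[i, w]])
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)

    return post.argmax(axis=1), post

# Toy usage: 4 items, 3 workers, binary labels; worker 2 is noisy.
votes = np.array([[0, 0, 1],
                  [1, 1, 1],
                  [0, 0, 0],
                  [1, 1, 0]])
consensus, posteriors = em_consensus(votes, n_classes=2)
print(consensus)  # e.g. [0 1 0 1]
```

The argmax of each item's posterior serves as the candidate ground-truth label, while the posterior itself indicates how confidently the crowd agrees; any thresholding or expert escalation policy on top of that is outside what the abstract describes.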
ISSN: 2643-1572
DOI: 10.1145/3238147.3240727