Evaluating probabilistic software development effort estimates: Maximizing informativeness subject to calibration

Bibliographic Details
Published in: Information and Software Technology, Vol. 115, pp. 93–96
Main Author: Jørgensen, Magne
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.11.2019

More Information
Summary: Probabilistic effort estimates convey the uncertainty of the estimates and can give useful input to plans, budgets, and investment analyses. This paper introduces, motivates, and illustrates two principles for evaluating the accuracy and other performance criteria of probabilistic effort estimates in software development contexts. The first principle emphasizes consistency between the estimation error measure and the loss function of the chosen type of probabilistic single point effort estimate. The second principle points to the importance of measuring not just the calibration, but also the informativeness, of estimated prediction intervals and distributions. The relevance of the evaluation principles is illustrated by a performance evaluation of estimates from twenty-eight software professionals who used two different uncertainty assessment methods to estimate the effort of the same thirty software maintenance tasks.
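To make the two principles concrete, the sketch below shows one way they could be operationalized. The data, the 90% confidence level, and all variable names are hypothetical illustrations, not taken from the paper. Principle 1 appears as matching the error measure to the estimate type (absolute error for median-type point estimates, squared error for mean-type estimates, a standard statistical correspondence); Principle 2 appears as reporting both the hit rate (calibration) and the mean relative interval width (informativeness) of the prediction intervals.

```python
import numpy as np

# Hypothetical data: point estimates, 90% prediction intervals, and actual
# effort (hours) for five tasks. All numbers are invented for illustration.
estimates = np.array([40.0, 25.0, 60.0, 10.0, 35.0])   # single point estimates
lower     = np.array([30.0, 15.0, 40.0,  6.0, 25.0])   # interval lower bounds
upper     = np.array([70.0, 50.0, 110.0, 20.0, 60.0])  # interval upper bounds
actual    = np.array([75.0, 22.0, 95.0, 14.0, 30.0])   # actual effort

# Principle 1: use the error measure consistent with the loss function implied
# by the type of point estimate. If the estimate is meant as the median of the
# effort distribution, absolute error is the consistent measure; if it is
# meant as the mean, squared error is.
mae = np.mean(np.abs(actual - estimates))   # consistent with median estimates
mse = np.mean((actual - estimates) ** 2)    # consistent with mean estimates

# Principle 2: evaluate calibration AND informativeness of the intervals.
# Calibration: the hit rate should be close to the stated confidence level.
hit_rate = np.mean((actual >= lower) & (actual <= upper))  # target ~0.90

# Informativeness: among similarly calibrated intervals, narrower is more
# informative. Relative width normalizes for task size.
relative_width = np.mean((upper - lower) / estimates)

print(f"MAE={mae:.1f}  MSE={mse:.1f}  hit rate={hit_rate:.2f}  "
      f"mean relative width={relative_width:.2f}")
```

Under this reading, two uncertainty assessment methods with similar hit rates can be ranked by interval width: among equally calibrated intervals, the narrower ones carry more information, which is the trade-off the title refers to (maximizing informativeness subject to calibration).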
ISSN: 0950-5849, 1873-6025
DOI: 10.1016/j.infsof.2019.08.006