Effect of Requirements Analyst Experience on Elicitation Effectiveness: A Family of Quasi-Experiments


Bibliographic Details
Published in: IEEE Transactions on Software Engineering, Vol. 49, No. 4, pp. 1-20
Main Authors: Aranda, Alejandrina M.; Dieste, Oscar; Panach, Jose Ignacio; Juristo, Natalia
Format: Journal Article
Language: English
Published: New York, IEEE Computer Society, 01.04.2023
Summary: Context. In software engineering there is a widespread assumption that experience improves requirements analyst effectiveness, although empirical studies demonstrate the opposite. Aim. Determine whether experience (interview, elicitation, development, professional) influences requirements elicitation using interviews. Method. We ran 12 quasi-experiments with 124 subjects, measuring analyst effectiveness as the number of items (i.e., concepts, rules, processes) correctly elicited. The experimental task was to elicit requirements using the open interview technique and then consolidate the elicited information, in domains with which the analysts were and were not familiar. Results. In unfamiliar domains, interview experience, requirements experience, development experience, and professional experience have no relationship with analyst effectiveness. In familiar domains, effectiveness varies depending on the type of experience: interview experience has a positive effect, whereas professional experience has a moderate negative effect. Requirements experience appears to have a moderately positive effect; however, the statistical power of the analysis is insufficient to confirm this point. Development experience has no effect. Conclusion. Experience affects analyst effectiveness differently depending on the problem domain type (familiar, unfamiliar). In general, experience does not account for all of the observed variability in effectiveness, so other factors must also be influential.
ISSN: 0098-5589; 1939-3520
DOI: 10.1109/TSE.2022.3210076