Enhancing Content Validity Assessment With Item Response Theory Modeling


Bibliographic Details
Published in: Psicothema, Vol. 36, no. 2, pp. 145-153
Main Authors: Schames Kreitchmann, Rodrigo; Nájera, Pablo; Sanz, Susana; Sorrel, Miguel Ángel
Format: Journal Article
Language: English
Published: Spain, 01.01.2024
Summary: Background: Ensuring the validity of assessments requires a thorough examination of the test content. Subject matter experts (SMEs) are commonly employed to evaluate the relevance, representativeness, and appropriateness of the items. This article proposes incorporating item response theory (IRT) to model assessments conducted by SMEs. Using IRT allows for the estimation of discrimination and threshold parameters for each SME, providing evidence of their performance in differentiating relevant from irrelevant items, thus facilitating the detection of suboptimal SME performance while improving item relevance scores. Method: The use of IRT was compared to traditional validity indices (the content validity index and Aiken's V) in the evaluation of items. The aim was to assess the SMEs' accuracy in identifying whether items were designed to measure conscientiousness, and in predicting their factor loadings. Results: The IRT-based scores effectively identified conscientiousness items ( = 0.57) and accurately predicted their factor loadings ( = 0.45). These scores demonstrated incremental validity, explaining 11% more variance than Aiken's V and up to 17% more than the content validity index. Conclusions: Modeling SME assessments with IRT improves item alignment and provides better predictions of factor loadings, enabling improvement of the content validity of measurement instruments.
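The traditional indices the summary compares against have simple closed forms: Aiken's V for an item is V = Σ(r_i − low) / (n(c − 1)) over n SME ratings on a c-category scale, and the item-level content validity index is the proportion of SMEs rating the item as relevant (conventionally 3 or 4 on a 1-4 scale). A minimal sketch of both, plus a 2PL-style endorsement probability of the kind an IRT model of SME ratings would use, is below; all function names and the example ratings are hypothetical, and the 2PL form is an illustrative assumption, not the article's exact model.

```python
import math

def aikens_v(ratings, c, low=1):
    """Aiken's V for one item: sum(r_i - low) / (n * (c - 1)).

    ratings: SME relevance ratings on a scale from `low` to `low + c - 1`
    c: number of rating categories
    """
    n = len(ratings)
    s = sum(r - low for r in ratings)
    return s / (n * (c - 1))

def item_cvi(ratings, relevant_from=3):
    """Item-level content validity index: proportion of SMEs rating the
    item at or above `relevant_from` (conventionally 3 on a 1-4 scale)."""
    return sum(r >= relevant_from for r in ratings) / len(ratings)

def p_relevant(theta, a, b):
    """2PL-style probability that an SME with discrimination `a` and
    threshold `b` endorses an item whose latent relevance is `theta`."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Four hypothetical SMEs rate one item on a 1-4 relevance scale.
ratings = [4, 4, 3, 2]
v = aikens_v(ratings, c=4)   # (3 + 3 + 2 + 1) / (4 * 3) = 0.75
cvi = item_cvi(ratings)      # 3 of 4 ratings are >= 3 -> 0.75
```

In the IRT reformulation described in the summary, the roles are inverted relative to ordinary testing: each SME plays the part of an "item" with its own discrimination and threshold parameters, while the latent trait θ is the item's relevance, which is what allows poorly discriminating SMEs to be detected and down-weighted.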
ISSN: 0214-9915, 1886-144X
DOI: 10.7334/psicothema2023.208