Reducing "I Don't Know" Responses and Missing Survey Data: Implications for Measurement

"I don't know" (DK) responses are common in health behavior research. Yet analytic approaches to managing DK responses may undermine survey validity and researchers' ability to interpret findings. Compare the usefulness of a methodological strategy for reducing DK responses to 3...

Full description

Saved in:
Bibliographic Details
Published in: Medical Decision Making, Vol. 38, No. 6, p. 673
Main Authors: Denman, Deanna C.; Baldwin, Austin S.; Betts, Andrea C.; McQueen, Amy; Tiro, Jasmin A.
Format: Journal Article
Language: English
Published: United States, 01.08.2018
Summary:"I don't know" (DK) responses are common in health behavior research. Yet analytic approaches to managing DK responses may undermine survey validity and researchers' ability to interpret findings. Compare the usefulness of a methodological strategy for reducing DK responses to 3 analytic approaches: 1) excluding DKs as missing data, 2) recoding them to the neutral point of the response scale, and 3) recoding DKs with the mean. We used a 4-group design to compare a methodological strategy, which encourages use of the response scale after an initial DK response, to 3 methods of analytically treating DK responses. We examined 1) whether this methodological strategy reduced the frequency of DK responses, and 2) how the methodological strategy compared to common analytic treatments in terms of factor structure and strength of correlations between measures of constructs. The prompt reduced DK response frequency (55.7% of 164 unprompted participants vs. 19.6% of 102 prompted participants). Factorial invariance analyses suggested equivalence in factor loadings for all constructs throughout the groups. Compared to excluding DKs, recoding strategies and use of the prompt improved the strength of correlations between constructs, with the prompt resulting in the strongest correlations (.589 for benefits and intentions, .446 for perceived susceptibility and intentions, and .329 for benefits and perceived susceptibility). This study was not designed a priori to test methods for addressing DK responses. Our analysis was limited to an interviewer-administered survey, and interviewers did not probe about reasons for DK responses. Findings suggest that use of a prompt to reduce DK responses is preferable to analytic approaches to treating DK responses. Use of such prompts may improve the validity of health behavior survey research.
ISSN: 1552-681X
DOI: 10.1177/0272989X18785159
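
The 3 analytic treatments of DK responses described in the summary can be illustrated with a minimal sketch. This is not the authors' analysis code; it assumes a pandas DataFrame of 1-5 Likert items in which DK answers are recorded with a hypothetical sentinel value of 9, and all column names and values are illustrative only.

```python
import numpy as np
import pandas as pd

# Hypothetical data: Likert items scored 1-5, with "I don't know"
# recorded as the sentinel value 9. Column names are illustrative.
df = pd.DataFrame({
    "benefits_1": [1, 3, 9, 5, 4],
    "suscept_1":  [9, 2, 4, 9, 3],
})
DK = 9
NEUTRAL = 3  # midpoint of a 1-5 response scale

# 1) Exclude DKs as missing data
excluded = df.replace(DK, np.nan)

# 2) Recode DKs to the neutral point of the response scale
neutral = df.replace(DK, NEUTRAL)

# 3) Recode DKs with the item mean of the non-DK responses
mean_imputed = excluded.fillna(excluded.mean())

print(excluded, neutral, mean_imputed, sep="\n\n")
```

Under this sketch, option 1 simply discards the DK responses, which is consistent with the summary's finding that excluding DKs weakened correlations between constructs relative to the recoding strategies or the prompt.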