Pre-Trained Nonresponse Prediction in Panel Surveys with Machine Learning
Published in | Survey Research Methods, Vol. 19, No. 2
---|---
Format | Journal Article
Language | English
Published | European Survey Research Association, 08.08.2025
Summary | While predictive modeling for unit nonresponse in panel surveys has been explored in various contexts, how practitioners can best adopt these techniques remains under-researched. Currently, practitioners need to wait until they accumulate enough data in their panel to train and evaluate their own modeling options. This paper presents a novel “cross-training” technique: we show that the indicators of nonresponse are so ubiquitous across studies that it is viable to train a model on one panel study and apply it to a different one. The practical benefit of this approach is that newly commencing panels can potentially make better nonresponse predictions in the early waves, because these pre-trained models make use of more data. We demonstrate this technique with five panel surveys that encompass a variety of survey designs: the Socio-Economic Panel (SOEP), the German Internet Usage Panel (GIP), the GESIS Panel, the Mannheim Corona Study (MCS), and the Family Demographic Panel (FREDA). We demonstrate that nonresponse history and demographics, paired with tree-based modeling methods, make highly accurate and generalizable predictions across studies, despite differences in panel design. We show how cross-training can effectively predict nonresponse in early panel waves, where attrition is typically highest.
ISSN | 1864-3361
DOI | 10.18148/srm/2025.v19i2.8473
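The cross-training idea described in the summary can be sketched as follows. This is a minimal illustration on simulated data, not the authors' implementation: the feature set (prior nonresponse history plus demographics), the tree-based learner, and the simulated panels are all assumptions chosen to mirror the setup the abstract describes.

```python
# Sketch of "cross-training" for nonresponse prediction: fit a tree-based
# model on an established source panel, then score a new target panel.
# All data here are simulated; features and panels are illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def simulate_panel(n):
    """Simulate panelists: nonresponse history + demographics -> dropout."""
    prior_nonresponse = rng.integers(0, 4, n)   # waves missed so far
    age = rng.integers(18, 80, n)
    female = rng.integers(0, 2, n)
    # Dropout driven mainly by nonresponse history, weakly by age
    # (an assumed data-generating process, not an empirical result).
    logit = -2.0 + 1.5 * prior_nonresponse - 0.01 * (age - 40)
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)
    X = np.column_stack([prior_nonresponse, age, female])
    return X, y

# Train on an established "source" panel with ample accumulated data ...
X_src, y_src = simulate_panel(5000)
model = GradientBoostingClassifier(random_state=0).fit(X_src, y_src)

# ... then apply the pre-trained model to a newly commencing "target"
# panel that shares the same generic features.
X_tgt, y_tgt = simulate_panel(1000)
auc = roc_auc_score(y_tgt, model.predict_proba(X_tgt)[:, 1])
print(f"cross-trained AUC on target panel: {auc:.2f}")
```

Because the predictors (nonresponse history, demographics) are available in essentially any panel, the fitted model transfers across studies without retraining, which is the practical point the paper makes for early waves of new panels.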