Estimating response propensities in nonprobability surveys using machine learning weighted models
| Published in | Mathematics and Computers in Simulation, Vol. 225, pp. 779–793 |
|---|---|
| Main Authors | Ferri-García, Ramón; Rueda-Sánchez, Jorge L.; Rueda, María del Mar; Cobo, Beatriz |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 01.11.2024 |
| Subjects | Propensity score adjustment; Design weights; Nonprobability samples |
| ISSN | 0378-4754 |
| DOI | 10.1016/j.matcom.2024.06.012 |
Abstract | Propensity Score Adjustment (PSA) is a widely accepted method to reduce selection bias in nonprobability samples. In this approach, the (unknown) response probability of each individual in a nonprobability sample is estimated using a reference probability sample. Thus, the researcher obtains a representation of the target population, reflecting the differences (for a set of auxiliary variables) between the population and the nonprobability sample, from which response probabilities can be estimated.
Auxiliary probability samples are usually produced by surveys with complex sampling designs, meaning that the use of design weights is crucial to accurately calculate response probabilities. When a linear model is used for this task, maximising a pseudo log-likelihood function which involves design weights provides consistent estimates for the inverse probability weighting estimator. However, little is known about how design weights may benefit the estimates when techniques such as machine learning classifiers are used.
This study aims to investigate the behaviour of Propensity Score Adjustment with machine learning classifiers, subject to the use of weights in the modelling step. A theoretical approximation to the problem is presented, together with a simulation study highlighting the properties of estimators using different types of weights in the propensity modelling step.
Highlights:
- Machine learning methods with design weights can be used for propensity estimation.
- Using design weights in propensity estimation is effective under complex designs.
- All modelling approaches performed well, but machine learning ones are preferable.
- Design weights can be used in propensity transformations based on correction factors.
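The weighted propensity-modelling step described in the abstract can be sketched numerically. The following is a minimal, self-contained illustration, not the authors' implementation: all data, parameter values, and function names are hypothetical, and a plain Newton-fitted logistic model stands in for the machine learning classifiers studied in the paper. The reference probability sample enters the pooled model with its design weights, the fitted membership probability is converted to a response propensity through the odds correction factor, and the propensities feed a Hájek-type inverse probability weighting estimator.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Simulated setting (all values are illustrative assumptions) ---
N = 100_000                                    # population size
x = rng.normal(0.0, 1.0, N)                    # auxiliary variable, observed in both samples
y = 2.0 + 1.5 * x + rng.normal(0.0, 1.0, N)    # study variable, observed only for volunteers

# Nonprobability (volunteer) sample s_v: self-selection depends on x, causing bias
true_pi = 1.0 / (1.0 + np.exp(-(-4.0 + 1.2 * x)))
in_sv = rng.random(N) < true_pi
x_v, y_v = x[in_sv], y[in_sv]

# Reference probability sample s_r: simple random sample with design weight N / n_r
n_r = 2_000
x_r = x[rng.choice(N, n_r, replace=False)]
d_r = np.full(n_r, N / n_r)

# --- Weighted logistic regression (pseudo maximum likelihood via Newton steps) ---
def weighted_logit(X, z, w, n_iter=25):
    """Fit P(z = 1 | X) by maximising the w-weighted log-likelihood."""
    X1 = np.column_stack([np.ones(len(z)), X])
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ beta))
        grad = X1.T @ (w * (z - p))                       # weighted score
        hess = (X1 * (w * p * (1.0 - p))[:, None]).T @ X1  # weighted information
        beta += np.linalg.solve(hess, grad)
    return beta

# Pool the samples: z = 1 for volunteer units, z = 0 for reference units.
# Reference units carry their design weights; volunteer units carry weight 1.
X_all = np.concatenate([x_v, x_r])
z_all = np.concatenate([np.ones(len(x_v)), np.zeros(n_r)])
w_all = np.concatenate([np.ones(len(x_v)), d_r])
beta = weighted_logit(X_all, z_all, w_all)

# The fitted p(x) is a membership probability in the pooled sample; the response
# propensity is recovered with the odds correction factor pi(x) = p(x)/(1 - p(x)).
p_v = 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * x_v)))
p_v = np.clip(p_v, 1e-6, 1.0 - 1e-6)
pi_hat = p_v / (1.0 - p_v)

# Hájek-type inverse probability weighting estimator of the population mean of y
ipw_mean = np.sum(y_v / pi_hat) / np.sum(1.0 / pi_hat)
naive_mean = y_v.mean()
print(f"naive: {naive_mean:.2f}  IPW: {ipw_mean:.2f}  population: {y.mean():.2f}")
```

Swapping the logistic fit for any classifier that accepts per-unit weights (for example, a `sample_weight` argument in common gradient boosting or random forest implementations) leaves the rest of the pipeline unchanged, which is the kind of setting the simulation study examines.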
Author | Cobo, Beatriz; Ferri-García, Ramón; Rueda-Sánchez, Jorge L.; Rueda, María del Mar |
Author details |
1. Ramón Ferri-García (ORCID 0000-0002-9655-933X, rferri@ugr.es), Department of Statistics and Operations Research, University of Granada, Avenida Fuentenueva, s/n, Granada, 18017, Spain
2. Jorge L. Rueda-Sánchez, Mathematics Institute of the University of Granada (IMAG), Calle Ventanilla, 11, 18001, Granada, Spain
3. María del Mar Rueda (ORCID 0000-0002-2903-8745), Department of Statistics and Operations Research, University of Granada, Avenida Fuentenueva, s/n, Granada, 18017, Spain
4. Beatriz Cobo (ORCID 0000-0003-2654-0032), Department of Quantitative Methods for Economics and Business, University of Granada, Campus Universitario de Cartuja, Granada, 18071, Spain |
ContentType | Journal Article |
Copyright | 2024 The Authors |
Discipline | Computer Science |
EndPage | 793 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Keywords | Propensity score adjustment; Design weights; Nonprobability samples |
License | This is an open access article under the CC BY license. |
ORCID | 0000-0002-9655-933X 0000-0002-2903-8745 0000-0003-2654-0032 |
OpenAccessLink | https://www.sciencedirect.com/science/article/pii/S0378475424002374 |
PageCount | 15 |
PublicationDate | November 2024 |
PublicationTitle | Mathematics and computers in simulation |
PublicationYear | 2024 |
Publisher | Elsevier B.V |
StartPage | 779 |
SubjectTerms | Design weights; Nonprobability samples; Propensity score adjustment |
Title | Estimating response propensities in nonprobability surveys using machine learning weighted models |
URI | https://dx.doi.org/10.1016/j.matcom.2024.06.012 |
Volume | 225 |