予測モデルの可逆性について (On the Invertibility of Prediction Models)

A function is said to be invertible when its inputs and outputs are in one-to-one correspondence. This paper considers invertibility constraints and surveys recent work on invertible prediction models. Further, as an introduction to our study, Okuno and Imaizumi (2024), we discuss, from the viewpoint of minimax rates, how strong the invertibility constraint is compared with existing constraints such as Lipschitz constraints. As the proposed method, we present a nonparametric invertible estimator and explain that it attains minimax optimality.
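
As a rough sketch of the framework the abstract refers to (the notation below is generic and chosen here for illustration, not taken from the paper): a function f is invertible when it is a bijection, and the strength of a constraint class \(\mathcal{F}\) is measured through the minimax risk

\[
  \mathcal{R}_n(\mathcal{F}) \;=\; \inf_{\hat{f}_n}\, \sup_{f \in \mathcal{F}} \mathbb{E}_f\!\left[\, d(\hat{f}_n, f)^2 \,\right],
\]

where the infimum runs over all estimators \(\hat{f}_n\) built from n observations and d is a loss such as the L^2 distance. The question raised in the abstract is how the rate at which \(\mathcal{R}_n(\mathcal{F})\) decays changes when \(\mathcal{F}\) is restricted to invertible functions rather than, say, Lipschitz ones; see Tsybakov (2009) in the references for this general framework.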

Bibliographic Details
Published in: 日本統計学会誌 (alternate title: 日本統計学会和文誌), Vol. 54, No. 2, pp. 205–220
Main Author: 奥野, 彰文 (Okuno, Akifumi), 統計数理研究所 統計基盤数理研究系 (The Institute of Statistical Mathematics)
Format: Journal Article
Language: Japanese
Published: 一般社団法人 日本統計学会 (Japan Statistical Society), 2025-03-04
Copyright: 2025 日本統計学会
Discipline: Statistics
Subjects: Invertibility; Minimax rate; Nonparametric estimation
Online Access: https://www.jstage.jst.go.jp/article/jjssj/54/2/54_205/_article/-char/ja
ISSN: 0389-5602
EISSN: 2189-1478
DOI: 10.11329/jjssj.54.205

References
Sill, J. (1997). Monotonic networks, In Advances in Neural Information Processing Systems, volume 10, MIT Press.
Teshima, T., Ishikawa, I., Tojo, K., Oono, K., Ikeda, M. and Sugiyama, M. (2020). Coupling-based invertible neural networks are universal diffeomorphism approximators, In Advances in Neural Information Processing Systems, volume 33, pages 3362–3373, Curran Associates, Inc.
Dinh, L., Sohl-Dickstein, J. and Bengio, S. (2017). Density estimation using real NVP, In International Conference on Learning Representations.
Ramsay, J. (1988). Monotone regression splines in action, Statistical Science, 3(4), 425–441.
Wright, F. (1981). The asymptotic behavior of monotone regression estimates, The Annals of Statistics, 9(2), 443–448.
Tomczak, J. M. and Welling, M. (2016). Improving variational auto-encoders using householder flow, arXiv preprint arXiv:1611.09630.
Jaini, P., Selby, K. A. and Yu, Y. (2019). Sum-of-squares polynomial flow, In Proceedings of the 36th International Conference on Machine Learning, volume 97 of Proceedings of Machine Learning Research, pages 3009–3018, PMLR.
Huang, C.-W., Krueger, D., Lacoste, A. and Courville, A. (2018). Neural autoregressive flows, In Proceedings of the 35th International Conference on Machine Learning, volume 80 of Proceedings of Machine Learning Research, pages 2078–2087, PMLR.
Tsybakov, A. B. (2009). Introduction to Nonparametric Estimation, Springer.
Ishikawa, I., Teshima, T., Tojo, K., Oono, K., Ikeda, M. and Sugiyama, M. (2023). Universal approximation property of invertible neural networks, Journal of Machine Learning Research, 24(287), 1–68.
Dinh, L., Krueger, D. and Bengio, Y. (2015). NICE: Non-linear independent components estimation, In International Conference in Learning Representations Workshop Track, 2015.
Ayer, M., Brunk, H., Ewing, G., Reid, W. and Silverman, E. (1955). An empirical distribution function for sampling with incomplete information, The Annals of Mathematical Statistics, 26(4), 641–647.
Daneri, S. and Pratelli, A. (2014). Smooth approximation of bi-Lipschitz orientation-preserving homeomorphisms, Annales de l'I.H.P. Analyse Non Lineaire, 31(3), 567–589.
Okuno, A. and Imaizumi, M. (2024). Minimax Analysis for Inverse Risk in Nonparametric Planer Invertible Regression, Electronic Journal of Statistics, 18(1), 355–394.
Okuno, A. and Yano, K. (2023). A generalization gap estimation for overparameterized models via the langevin functional variance, Journal of Computational and Graphical Statistics, 32(4), 1287–1295.
Robertson, T., Wright, F. and Dykstra, R. (1988). Order Restricted Statistical Inference, John Wiley & Sons.
Cybenko, G. (1989). Approximation by superpositions of a sigmoidal function, Mathematics of Control, Signals and Systems, 2(4), 303–314.
Okuno, A. and Harada, K. (2024). An interpretable neural network-based nonproportional odds model for ordinal regression, Journal of Computational and Graphical Statistics, to appear.
Kong, Z. and Chaudhuri, K. (2020). The expressive power of a class of normalizing flow models, In Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, volume 108 of Proceedings of Machine Learning Research, pages 3599–3609, PMLR.
Yarotsky, D. (2017). Error bounds for approximations with deep ReLU networks, Neural Networks, 94, 103–114.
Rezende, D. J., Papamakarios, G., Racaniere, S., Albergo, M., Kanwar, G., Shanahan, P. and Cranmer, K. (2020). Normalizing flows on tori and spheres, In Proceedings of the 37th International Conference on Machine Learning, volume 119 of Proceedings of Machine Learning Research, pages 8083–8092, PMLR.
Kingma, D. P. and Dhariwal, P. (2018). Glow: Generative flow with invertible 1x1 convolutions, In Advances in Neural Information Processing Systems, volume 31, Curran Associates, Inc.
Kobyzev, I., Prince, S. J. and Brubaker, M. A. (2021). Normalizing flows: An introduction and review of current methods, IEEE Transactions on Pattern Analysis & Machine Intelligence, 43(11), 3964–3979.
Hall, P. and Huang, L.-S. (2001). Nonparametric kernel regression subject to monotonicity constraints, The Annals of Statistics, 29(3), 624–647.
Rezende, D. and Mohamed, S. (2015). Variational inference with normalizing flows, In Proceedings of the 32nd International Conference on Machine Learning, volume 37 of Proceedings of Machine Learning Research, pages 1530–1538, PMLR.
Barlow, R., Bartholomew, D., Bremner, J. and Brunk, H. (1972). Statistical Inference under Order Restrictions, John Wiley & Sons.