Intuitive physics learning in a deep-learning model inspired by developmental psychology

Bibliographic Details
Published in: Nature Human Behaviour, Vol. 6, No. 9, pp. 1257–1267
Main Authors: Piloto, Luis S.; Weinstein, Ari; Battaglia, Peter; Botvinick, Matthew
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 01.09.2022

Summary: ‘Intuitive physics’ enables our pragmatic engagement with the physical world and forms a key component of ‘common sense’ aspects of thought. Current artificial intelligence systems pale in their understanding of intuitive physics, in comparison to even very young children. Here we address this gap between humans and machines by drawing on the field of developmental psychology. First, we introduce and open-source a machine-learning dataset designed to evaluate conceptual understanding of intuitive physics, adopting the violation-of-expectation (VoE) paradigm from developmental psychology. Second, we build a deep-learning system that learns intuitive physics directly from visual data, inspired by studies of visual cognition in children. We demonstrate that our model can learn a diverse set of physical concepts, which depends critically on object-level representations, consistent with findings from developmental psychology. We consider the implications of these results both for AI and for research on human cognition. Piloto et al. introduce a deep-learning system which is able to learn basic rules of the physical world, such as object solidity and persistence.
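The violation-of-expectation (VoE) paradigm described in the summary compares a model's "surprise" on physically possible versus impossible probe videos: a model that has learned a physical concept should be more surprised by the impossible video. A minimal sketch of this readout, using squared next-frame prediction error as a stand-in surprise measure (the paper's model uses its own learned likelihood; function names here are illustrative, not from the paper's code):

```python
import numpy as np

def surprise(predicted, observed):
    """Per-frame surprise as mean squared prediction error.

    This is a simple proxy; the actual system derives surprise from
    its learned predictive model.
    """
    return float(np.mean((np.asarray(predicted) - np.asarray(observed)) ** 2))

def voe_passes(predict_next, possible_video, impossible_video):
    """VoE readout: True when the model is more surprised, in total,
    by the physically impossible video than by the matched possible one.

    `predict_next` maps the current frame to a predicted next frame;
    each video is a list of frames (2D arrays).
    """
    def total_surprise(video):
        return sum(surprise(predict_next(f), f_next)
                   for f, f_next in zip(video, video[1:]))
    return total_surprise(impossible_video) > total_surprise(possible_video)

# Toy illustration: an identity predictor embodies naive object persistence.
# A video where the scene abruptly changes (an "object" vanishes) violates
# that expectation and yields higher surprise than a static scene.
identity = lambda frame: frame
possible = [np.ones((4, 4)) for _ in range(5)]            # nothing changes
impossible = [np.ones((4, 4))] * 3 + [np.zeros((4, 4))] * 2  # object vanishes
print(voe_passes(identity, possible, impossible))
```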
ISSN: 2397-3374
DOI: 10.1038/s41562-022-01394-8