Scaffolding learning: From specific to generic with large language models


Bibliographic Details
Published in: PLoS ONE, Vol. 19, No. 9, p. e0310409
Main Authors: Yin, David S.; Yin, Xiaoxin
Format: Journal Article
Language: English
Published: United States: Public Library of Science, 20.09.2024

Summary: Large language models such as ChatGPT have been shown to excel at solving complex math problems. However, they cannot solve basic arithmetic problems such as 758*639 = 484,362. This makes us ponder whether LLMs have been trained to solve math and science problems in the right way. When a student learns math at school, she or he starts with arithmetic, then moves to word problems, polynomials, and calculus. Each skill she or he acquires is used in the next stage to solve more advanced problems. In this paper we propose Scaffolding Learning for LLMs, which imitates how a student learns a subject in a step-by-step manner. For example, we first train an LLM to perform highly specific operations such as multiplication and division, and then apply such "skills" in a more generic task such as solving word problems. This is related to Curriculum Training, which trains a model on tasks in a specific order, such as training on easy tasks first and then gradually increasing the difficulty. Our proposed approach goes from specific tasks to generic ones, which can be considered a special case of Curriculum Training. Our empirical studies show that when an LLM has "mastered" a specific skill, only a small amount of training is required to teach it to apply that skill to a more generic application.
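The staged training described in the summary can be sketched as a data-ordering step: tag each training example with a stage (specific skills first, generic tasks later) and fine-tune on the stages in order. The stage names and example records below are hypothetical illustrations, not taken from the paper; a minimal sketch, assuming two stages:

```python
# Hypothetical sketch of a scaffolding schedule: training examples are tagged
# by task, and specific-skill tasks (stage 0) are ordered before the generic
# task (stage 1). A real pipeline would fine-tune an LLM stage by stage;
# here we only build the ordered curriculum.

def scaffold_schedule(examples):
    """Order examples so specific skills come before generic applications."""
    stage_order = {"multiplication": 0, "division": 0, "word_problem": 1}
    # sorted() is stable, so examples within a stage keep their input order.
    return sorted(examples, key=lambda ex: stage_order[ex["task"]])

examples = [
    {"task": "word_problem",
     "text": "A shop sells 758 boxes with 639 pens each. How many pens?"},
    {"task": "multiplication", "text": "758*639 = 484362"},
    {"task": "division", "text": "484362/639 = 758"},
]
schedule = scaffold_schedule(examples)
# Stage 0 (multiplication, division) now precedes stage 1 (word_problem);
# training would run one fine-tuning pass per stage in this order.
```

The design choice mirrors the paper's claim: once the stage-0 skills are mastered, only a small amount of stage-1 training is needed, because the generic task reuses the specific operations.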
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0310409