Comparing Collaborative Problem Solving Profiles Derived from Human and Semi-Automated Annotation
Published in | Grantee Submission
---|---
Main Authors | , , ,
Format | Report
Language | English
Published | 2022
Subjects |
Summary: New challenges in today's world have contributed to increased attention toward evaluating individuals' collaborative problem solving (CPS) skills. One difficulty with this work is identifying evidence of individuals' CPS capabilities, particularly when they interact in digital spaces. Human-driven annotation approaches are often used, but they are limited in scale. Machine-driven approaches can save time and money, but their reliability relative to human approaches can be a challenge. In the current study, we compare CPS skill profiles derived from human and semi-automated annotation methods across two tasks. Results showed that the same clusters emerged for both tasks and annotation methods, with the two annotation methods agreeing on the profile membership of most students. Additionally, validation of cluster results using external survey measures yielded similar results across annotation methods. [This paper was published in: "CSCL2022 Proceedings," International Society of the Learning Sciences, 2022, pp. 363-366.]
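The paper itself does not include code. As a rough illustration of the kind of comparison the abstract describes, the sketch below clusters per-student CPS skill indicators derived from two annotation sources and measures how often the two cluster solutions assign students to the same profile. The synthetic data, feature construction, cluster count, and agreement measures (adjusted Rand index, Hungarian alignment) are assumptions for illustration, not the authors' actual procedure.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Hypothetical data: per-student frequencies of CPS skill codes as produced
# by human annotation and by a semi-automated annotator (not the study's data).
rng = np.random.default_rng(0)
n_students, n_skills = 100, 4
human = rng.random((n_students, n_skills))
auto = np.clip(human + rng.normal(0, 0.1, human.shape), 0, 1)  # noisy variant

k = 3  # assumed number of CPS profiles
labels_h = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(human)
labels_a = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(auto)

# Chance-corrected similarity between the two cluster solutions.
print("Adjusted Rand index:", adjusted_rand_score(labels_h, labels_a))

# Cluster IDs are arbitrary, so align them before computing raw agreement.
conf = np.zeros((k, k), dtype=int)
for h, a in zip(labels_h, labels_a):
    conf[h, a] += 1
row, col = linear_sum_assignment(-conf)  # matching that maximizes overlap
agreement = conf[row, col].sum() / n_students
print(f"Students assigned to the same profile by both methods: {agreement:.0%}")
```

A high adjusted Rand index and a high aligned-agreement rate would correspond to the abstract's finding that both annotation methods place most students in the same profile.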