Challenges to Informed Peer Review Matching Algorithms
Published in | Journal of Engineering Education (Washington, D.C.), Vol. 99, No. 4, pp. 397–408 |
---|---|
Main Authors | Verleger, Matthew A.; Diefes-Dux, Heidi A.; Ohland, Matthew W.; Besterfield-Sacre, Mary; Brophy, Sean P. |
Format | Journal Article |
Language | English |
Published | Oxford, UK: Blackwell Publishing Ltd, 01.10.2010 |
Summary: | Background
Peer review is a beneficial pedagogical tool. Despite the abundance of data instructors often have about their students, most peer review matching is by simple random assignment. In fall 2008, a study was conducted to investigate the impact of an informed algorithmic assignment method, called Un‐weighted Overall Need (UON), in a course involving Model‐Eliciting Activities (MEAs). The algorithm showed no statistically significant impact on the MEA Final Response scores. A study was then conducted to examine the assumptions underlying the algorithm.
Purpose (Hypothesis)
This research addressed the question: To what extent do the assumptions used in making informed peer review matches (using the Un‐weighted Overall Need algorithm) for the peer review of solutions to Model‐Eliciting Activities decay?
Design/method
An expert rater evaluated 147 teams' responses to a particular MEA implementation in a first‐year engineering course at a large Midwestern research university. The evaluation was then used to examine the UON algorithm's assumptions relative to a randomly assigned control group.
Results
Weak correlations were found for the UON algorithm's five assumptions: students complete assigned work, teaching assistants can grade MEAs accurately, accurate feedback in peer review is perceived by the reviewed team as being more helpful than inaccurate feedback, teaching assistant scores on the first draft of an MEA can be used to accurately predict where teams will need assistance on their second draft, and the error a peer reviewer makes in evaluating a sample MEA solution is an accurate indicator of the error they will make when subsequently evaluating a real team's MEA solution.
Conclusions
Conducting informed peer review matching requires significant alignment between evaluators and experts to minimize deviations from the algorithm's designed purpose. |
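The record names the Un‐weighted Overall Need (UON) approach but does not spell out its mechanics. As a rough illustration of the general idea of informed, need‐based peer review matching, the Python sketch below pairs the teams with the largest un‐weighted first‐draft score deficits with the reviewers who were most accurate on a sample solution. The rubric dimensions, score scale, `calibration_error` field, and greedy pairing rule are illustrative assumptions, not the authors' published UON specification.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

# Hypothetical rubric dimensions and score ceiling; the study's actual MEA
# rubric is not described in this record.
DIMENSIONS = ["math_model", "reusability", "modifiability", "documentation"]
MAX_SCORE = 4

@dataclass
class Team:
    team_id: str
    first_draft_scores: Dict[str, int]  # TA scores on the MEA first draft

@dataclass
class Reviewer:
    reviewer_id: str
    calibration_error: float  # |reviewer - expert| gap on a sample MEA solution

def unweighted_overall_need(team: Team) -> int:
    """Sum the first-draft score deficits across dimensions with no weighting."""
    return sum(MAX_SCORE - team.first_draft_scores.get(d, 0) for d in DIMENSIONS)

def match_reviewers(teams: List[Team], reviewers: List[Reviewer]) -> List[Tuple[str, str]]:
    """Greedy pairing: the neediest teams get the best-calibrated reviewers."""
    teams_by_need = sorted(teams, key=unweighted_overall_need, reverse=True)
    reviewers_by_accuracy = sorted(reviewers, key=lambda r: r.calibration_error)
    return [(t.team_id, r.reviewer_id) for t, r in zip(teams_by_need, reviewers_by_accuracy)]

if __name__ == "__main__":
    teams = [
        Team("T1", {"math_model": 1, "reusability": 2, "modifiability": 3, "documentation": 2}),
        Team("T2", {"math_model": 4, "reusability": 4, "modifiability": 3, "documentation": 4}),
    ]
    reviewers = [Reviewer("R1", 0.5), Reviewer("R2", 1.5)]
    print(match_reviewers(teams, reviewers))  # neediest team T1 -> most accurate reviewer R1
```

Any scheme of this kind inherits the assumption failures reported above; in particular, if TA first‐draft scores only weakly predict where a team will need help on its second draft, the computed "need" ranking carries little information.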
Bibliography: | istex:BD481DB99136F500D954CDD48F37833865C21F78 ArticleID:JEE1070 ark:/67375/WNG-3S298NMP-8 Sean P. Brophy is an assistant professor in the School of Engineering Education at Purdue University and research director for INSPIRE P-12 Engineering Education. Dr. Brophy is currently conducting research on precursors to engineering thinking in young children. This work aligns well with his other research interests, which relate to using simulations and models to facilitate students' understanding of difficult concepts within engineering. Mary Besterfield-Sacre is an associate professor and Fulton C. Noss Faculty Fellow in Industrial Engineering at the University of Pittsburgh. Her principal research interests are in engineering education evaluation methodologies and in planning and modeling the K-12 educational system. Her current focus areas lie in measuring aspects of technical entrepreneurship, problem solving, and design. She received her B.S. in Engineering Management from the University of Missouri-Rolla, her M.S. in Industrial Engineering from Purdue University, and her Ph.D. in Industrial Engineering from the University of Pittsburgh. She is a former associate editor for the Journal of Engineering Education. Matthew A. Verleger is an assistant professor in the Department of Engineering and Technology Education at Utah State University. Prior to joining that faculty, he was a post-doctoral researcher in Purdue University's School of Engineering Education. He received his B.S. in Computer Engineering in 2002, his M.S. in Agricultural and Biological Engineering in 2005, and his Ph.D. in Engineering Education in 2009, all from Purdue. Throughout that time, he was a teaching assistant in Purdue's first-year engineering problem solving and computer tools courses. His research focuses on Model-Eliciting Activities and the first-year experience, and he is a current associate editor for Advances in Engineering Education. Heidi A. Diefes-Dux is an associate professor in the School of Engineering Education at Purdue University. She received her B.S. and M.S. in Food Science from Cornell University and her Ph.D. in Food Process Engineering from the Department of Agricultural and Biological Engineering at Purdue University. Since 1999, she has been a faculty member within the First-Year Engineering Program at Purdue, the gateway for all first-year students entering the College of Engineering. She coordinated (2000–2006) and continues to teach in the required first-year engineering problem solving and computer tools course, which engages students in open-ended problem solving and design. Her research focuses on the development, implementation, and assessment of model-eliciting activities with realistic engineering contexts. She is currently the Director of Teacher Professional Development for the Institute for P-12 Engineering Research and Learning (INSPIRE). Matthew W. Ohland is an associate professor in Purdue University's School of Engineering Education. He received his Ph.D. in Civil Engineering from the University of Florida in 1996. Dr. Ohland is the Past President of Tau Beta Pi and has delivered over 100 volunteer seminars as a facilitator in the society's award-winning Engineering Futures program. He is Chair of the Educational Research and Methods division of the American Society for Engineering Education and a member of the Administrative Committee of the IEEE Education Society. His research on the longitudinal study of engineering student development, peer evaluation, and high-engagement teaching methods has been supported by over $11.4 million in funding from NSF and Sloan. |
ISSN: | 1069-4730 (print); 2168-9830 (online) |
DOI: | 10.1002/j.2168-9830.2010.tb01070.x |