Supporting scientific writing and evaluation in a conceptual physics course with Calibrated Peer Review

written by Edward Price, Fred Goldberg, Scott Patterson, and Paul Heft

Writing tasks are one way students can apply science concepts, yet evaluating students' writing can be difficult in large classes. With the web-based Calibrated Peer Review (CPR) system, students submit written work and evaluate each other's work. Students write a response to a prompt, read and evaluate responses prepared by the curriculum developers, and receive feedback on their evaluations, allowing them to "calibrate" their evaluation skills. Students then evaluate their peers' work and their own work. We have used CPR for two semesters in conceptual physics courses with enrollments of ~100 students. By independently assessing students' responses, we evaluated the CPR calibration process and compared students' peer reviews with expert evaluations. Students' scores on their essays correlate with our independent evaluations. This poster describes these findings and our experiences with implementing CPR assignments.

Published January 24, 2013
Last Modified June 28, 2013