PERC 2011 Abstract Detail Page
|Abstract Title:||Multiple Assessments of Multiple-choice Assessments|
|Abstract:||Multiple-choice tests are used frequently in physics education as tools for assessing student learning. These assessments are often used to evaluate research-based curricula/pedagogies because they can be administered to a large student population, and are easy to grade, compare, and analyze quantitatively. However, even with research-based multiple-choice tests, students' thought processes are not well revealed by their answer choices alone. This session will showcase posters with examples of research highlighting advantages and limitations of multiple-choice assessments. These issues include the correlation between students' performance on carefully designed multiple-choice tests and the same problems administered in a free-response format and graded on a rubric, gender effects, and other factors.|
|Abstract Type:||Poster Gallery Session|
University of Pittsburgh
3941 O'Hara Street
Pittsburgh, PA 15260
|Jing Li, University of Pittsburgh|
Poster Gallery Session Specific Information
|Poster 1 Title:||Comparing Students' Performance on Research-based Conceptual Assessments and Traditional Classroom Assessments: An Example from Second-semester Calculus-based Physics|
|Poster 1 Authors:||N. Sanjay Rebello
Physics Department, Kansas State University|
|Poster 1 Abstract:||The use of concept inventories to investigate students' learning gains is common in physics education research. However, comparatively little research has compared students' learning gains on concept inventories with more traditional classroom assessments. We present a study comparing second-semester calculus-based physics students' performance on traditional classroom assessments, including exams and homework, with learning gains on SEMCO (Survey of Electricity, Magnetism, Circuits and Optics), which was previously created by combining questions from other conceptual surveys such as CSEM and DIRECT. We report on students' performance on specific items on SEMCO and corresponding traditional classroom assessments based on the same topic. Our results raise potentially interesting questions about the validity and usefulness of the traditional classroom assessments and conceptual assessments that are often used to measure student learning in introductory physics.
This work is supported in part by U.S. National Science Foundation grant 0816207.|
|Poster 2 Title:||Time-dependent Interpretation of Correct Responses to Multiple-Choice Questions|
|Poster 2 Authors:||David E. Meltzer
Arizona State University|
|Poster 2 Abstract:||Students' reasoning regarding electric field concepts was analyzed using pre- and post-instruction responses to two related questions on the Conceptual Survey of Electricity and Magnetism; both multiple-choice responses and written explanations were examined. Although nearly half of all students gave correct pre-instruction responses on one of the questions, their written explanations made it clear that many either based their correct answers on vague or inconsistent "intuitive" thinking, or simply guessed. These explanations, along with inconsistent responses on the related question, showed that most correct pretest answers on this item were not genuinely indicative of conceptual understanding. By contrast, most correct post-instruction responses to the same question were accompanied by acceptable explanations, as well as by matching correct responses on the related item. Thus the same (correct) response on a particular multiple-choice question had dramatically different interpretations regarding students' thinking depending on whether it was given pre- or post-instruction.|
|Poster 3 Title:||Correlation Between Students' Performance on Free-response and Multiple-choice Questions|
|Poster 3 Authors:||Shih-Yin Lin and Chandralekha Singh
University of Pittsburgh|
|Poster 3 Abstract:||While a multiple-choice test provides an efficient tool for assessment, instructors are often concerned that a free-response format facilitates a more accurate understanding of students' thought processes. Moreover, free-response questions allow students to receive partial credit for displaying different extents of understanding of the subject tested. Here, we discuss a study in which two carefully designed research-based multiple-choice questions were transformed into free-response format and implemented on an exam in a calculus-based introductory physics course. Students' performance on the free-response questions was graded twice: first using a rubric, and second by converting each answer back to one of the choices in the original multiple-choice format. There was an excellent match between the different free-response answers and the original choices in the multiple-choice questions. The strong correlation between the two scores graded using different methods suggests that carefully designed multiple-choice assessments can mirror relative performance on free-response questions.
This work is supported by NSF.|
|Poster 4 Title:||FCI Normalized Gain, Scientific Reasoning Ability, Thinking in Physics, and Gender Effects|
|Poster 4 Authors:||Vincent Coletta (1), Jeff Phillips (1), Raquel Sena (1), and Jeff Steinert (2)
(1) Loyola Marymount University, (2) Arizona School for the Arts|
|Poster 4 Abstract:||We observe no significant effect of gender on grades in our interactive engagement (IE) introductory mechanics courses at Loyola Marymount University, but we do observe a significant gender gap on Force Concept Inventory (FCI) normalized gains, with males achieving higher gains than females. Over the past three years FCI gains have improved for both male and female students in IE classes taught with the Thinking in Physics (TIP) pedagogy. However, a gender gap on FCI gains remains, even when scientific reasoning abilities are taken into account. Indeed, the gap appears much greater for students with the strongest scientific reasoning skills and the highest FCI gains. Data from IE introductory physics courses using modeling at Edward Little High School in Maine show a similar result, with some additional data showing a reverse gender gap for those students with very weak scientific reasoning skills.|
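[Editor's note: the "normalized gain" in Poster 4 is not defined in the submitted abstract; readers outside physics education research may find the standard Hake convention useful, in which a class's normalized gain g is the ratio of its actual average pretest-to-posttest gain to the maximum gain possible:]

```latex
g = \frac{\langle \mathrm{post} \rangle - \langle \mathrm{pre} \rangle}{100\% - \langle \mathrm{pre} \rangle}
```

Here ⟨pre⟩ and ⟨post⟩ are class-average FCI scores expressed as percentages; for example, a class moving from a 40% pretest average to a 70% posttest average would have g = (70 − 40)/(100 − 40) = 0.5.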
|Poster 5 Title:||Assessing Gender Differences in Students' Understanding of Magnetism|
|Poster 5 Authors:||Jing Li and Chandralekha Singh
Department of Physics and Astronomy, University of Pittsburgh|
|Poster 5 Abstract:||We investigate gender differences in students' difficulties with concepts related to magnetism. Our research uses a multiple-choice test whose reliability and validity have been substantiated earlier. We also conduct individual interviews with a subset of students to get a better understanding of the rationale behind their responses. We find that females performed significantly worse than males when the test was given both as a pre-test and post-test in traditionally taught calculus-based introductory physics courses. In the algebra-based courses, the performance of females was significantly worse in the pre-test but there was no statistical difference in the post-test performance of males and females. We discuss possible reasons for these differences. Supported by NSF.|