PERC 2011 Abstract Detail Page
|Abstract Title:||Moving Beyond Conceptual Inventories|
|Abstract:||Many instructors across the country, particularly those teaching by non-traditional methods or interested in modifying their teaching techniques or curricula, find themselves in need of a more comprehensive assessment instrument. They would like to evaluate their instructional methods through a broader assessment of their students' skills, including laboratory, modeling (conceptual and mathematical), problem-solving, and critical-thinking skills, assessed in the context of physics. They would also like an instrument that can be used to compare their students' performance to that of other students nationwide.
In this targeted poster session, we will discuss the pros and cons of existing assessment instruments, including conceptual inventories and (mostly local) pre- and post-testing; present work on assessments that go beyond conceptual inventories; and address the need for broader, more comprehensive assessments, including examples of instruments that might serve as prototypes.
|Abstract Type:||Poster Gallery Session|
Texas Tech University
Physics Dept., MS 41051
Lubbock, TX 79409
Poster Gallery Session Specific Information
|Poster 1 Title:||Analysis of a Large-scale Assessment Project Using Available Assessment Instruments|
|Poster 1 Authors:||Beth Thacker and Keith West, Texas Tech University|
|Poster 1 Abstract:||We analyze the results of a large-scale assessment project that used available conceptual inventories and locally written pre- and post-tests in the laboratories to assess the effectiveness of our laboratory implementation. We found that the instruments were limited in scope and not sufficient to generate an accurate assessment of all aspects of the laboratory implementation. The conceptual inventories assessed both lecture and lab, and they did not assess problem solving, modeling, aspects of critical thinking, or laboratory skills. The locally written pre- and post-tests were limited in coverage; however, they did give information on students' thinking processes, laboratory skills, and problem-solving skills. We discuss the need for a more comprehensive assessment instrument that addresses not just conceptual understanding but also modeling, problem solving, aspects of critical thinking, and laboratory and other skills, and that can be used across courses and universities to evaluate the effectiveness of our instruction.
This project is supported by NIH grant 5RC1GM090897-02.|
|Poster 2 Title:||Assessing Creativity and Innovation in Physics Students|
|Poster 2 Authors:||Patrick B. Kohl, H. Vincent Kuo, Susan Kowalski, Frank Kowalski, Colorado School of Mines|
|Poster 2 Abstract:||Creative thought and the ability to innovate are critical skills in industrial and academic careers alike. There have been attempts to foster creative skills in the business world, but little such work has been documented in a physics context. In particular, few tools are available for those who want to assess the creativity of their physics students, making it difficult to tell whether instruction is having any effect. In this poster we outline a new course in the Colorado School of Mines physics department designed to develop creativity and innovation in physics majors. We present our efforts to assess this course formatively, using tablet PCs and InkSurvey software, and summatively, using the discipline-independent Torrance Tests of Creative Thinking. We also describe early work toward developing a physics-specific instrument.|
|Poster 3 Title:||Physics Learning Identity: Survey Development and Validation|
|Poster 3 Authors:||Dedra Demaree and Sissi Li, Oregon State|
|Poster 3 Abstract:||It is becoming common for physics courses to utilize active engagement and social learning. Moreover, the newest innovative curricula aim not only to improve content knowledge but also to help students develop the practices and skills of authentic scientists. To students, this is often very different from their previous learning experiences in terms of behavioral expectations, attitude, and what learning means. Consequently, students must modify their identity as learners to participate productively in this learning environment. Current assessments are very good at measuring the development of conceptual understanding, basic scientific reasoning, and attitudes toward science, but they do not address issues specific to these innovative courses. We developed a 49-item survey to assess students' 1) expectations of student and teacher roles, 2) self-efficacy toward the skills supported in the Investigative Science Learning Environment, and 3) attitudes toward social learning. Using principal-components exploratory factor analysis, we established eight factors that measure these three characteristics.|
|Poster 4 Title:||Matching the Goals of Your Class with Assessment|
|Poster 4 Authors:||Eugenia Etkina, Rutgers University|
|Poster 4 Abstract:||One of the major goals of the ISLE learning system is to help students develop scientific abilities – the tools and procedures that scientists use when constructing and applying new knowledge. To achieve this goal, ISLE students learn to devise and test their own explanations, to design experiments to investigate new phenomena and to solve practical problems, and to pose their own questions. How do we help students succeed, and how do we assess their progress, in such a course? This poster will describe new approaches to formative assessment and new types of traditional paper-and-pencil exam questions and laboratory practical exams, and it will share a possible course structure that allows instructors to invest time and energy in both formative and summative assessment.|
|Poster 5 Title:||Assessments that Analyze Students' Reasoning on Written Exam Questions|
|Poster 5 Authors:||Mojgan Matloob Haghanikar, Sytil Murphy, Dean Zollman, Kansas State University|
|Poster 5 Abstract:||As part of a study on the science preparation of elementary school teachers, we investigated students' reasoning skills in courses with inquiry-oriented teaching strategies and in their traditional counterparts. Inspired by a revision of Bloom's taxonomy and by Neiswandt and Bellomo's conceptual structure classification method, we developed assessments that classify different levels and qualities of indications of students' reasoning ability. The assessment tools are methods of analyzing extended written questions, based on a protocol that defines each question's level of abstraction, knowledge types, and cognitive processes. Along with the protocol, we developed a rubric that allowed us to examine students' responses analytically through several different lenses. This approach is being tested with a large body of data collected nationwide from 20 different universities. Our initial findings indicate varying results about students' reasoning abilities.|