Comparing Unprompted and Prompted Student-Generated Diagrams

Diagrams are ubiquitous in physics, especially in physics education and problem solving. Students might generate diagrams to orient themselves to a scenario, to organize information to aid in solving a problem, or as a tool of communication to demonstrate their understanding of a physical scenario. By asking 19 undergraduate and graduate physics majors to solve a number of multiple-choice physics problems—with no prompting regarding diagrams—and then explicitly asking them to generate diagrams of similar physical scenarios, we are able to compare which elements of a scenario students externalize on their own with those they externalize when prompted. We found that different physical contexts affect how critical it is to draw an accurate diagram, and we explore implications for teaching and research.


I. INTRODUCTION
Diagramming physical scenarios, such as with a sketch or graph, is an integral part of physics problem solving. In physics education, diagrams may appear in problem statements, as required portions of student solutions, or as tools that aid in the problem-solving process. Instruction around constructing diagrams is standard in physics education (e.g., free body, ray, and field-line diagrams) [1-4], and problem solvers may generate diagrams to orient themselves to a problem [5,6], to aid in problem solving [7,8], or as a tool of communication [9,10].
While many researchers have looked into how students interpret and use common or professional representations [7,8,11-17], few have looked at student-generated diagrams to inform our understanding of students' problem solving strategies [7,8,10]. Some researchers have found that unprompted, accurate force diagrams may help students solve force problems [7,10], but also that prompting for diagrams may interfere with student problem solving [10]. Additionally, researchers have found the characterization of such diagrams difficult, especially while trying to avoid a deficit framing of student work.
In this study, we investigate student-generated diagrams in order to answer the following question: How do spontaneously-generated student diagrams used in problem solving compare with similar, prompted student diagrams? To answer this question, we interviewed 19 undergraduate and graduate physics majors to gather unprompted and prompted student diagrams. Development of our interview prompts is discussed in Sec. II. Results from our interviews are described in Sec. III, and Sec. IV contains an overarching discussion of patterns across our results, as well as implications for teaching and research.

II. METHODS
To study and characterize diagrams generated by students, we interviewed 19 physics majors (4 sophomores, 5 juniors, 5 seniors, and 5 graduate students) and asked them to complete 18 multiple-choice physics problems followed by 6 diagramming tasks. The 18 problems were mostly introductory level content and could be solved entirely or in part with a diagram. To avoid cuing students to generate diagrams, question statements were text-only and did not ask for explanations or illustrations. The 6 diagramming tasks that followed asked students to carefully sketch/draw/graph and label a physical scenario resembling one of the multiple-choice problems. This gave us six problem-task pairs for comparing unprompted and prompted diagrams of similar situations (Fig. 1).
Questions and tasks were piloted with two faculty (including author BRW) and one graduate student. Interview participants responded to email solicitations sent to physics majors at the University of Colorado Boulder and were financially compensated for their time.
Since students chose whether to draw unprompted diagrams but were explicitly asked to draw prompted diagrams, we could compare the elements of a scenario students externalized on their own with those they externalized on request.

FIG. 1. The six paired problems (P) and tasks (T) from our interviews. The tasks were the final 6 items in the interview.
Using the distributed cognition framework [6,9,19,20], we identified elements of physical scenarios students externalized for their own problem solving, then compared these elements to those externalized with prompting. Starting with a list of expected diagram elements (e.g., particular objects, arrows, labels, etc.), we iteratively coded these student diagrams, revising our coding until it accounted for virtually every mark a student made as part of a diagram [21,22]. As the coded marks required little qualitative interpretation, the authors examined a subset of diagrams to reach consensus after each round of coding, rather than completing a full IRR process.


III. RESULTS AND INTERPRETATIONS

TABLE I. Student performance on problem-task pairs. If part of a problem-task pair was skipped due to time constraints, those results are omitted from the table. Numbers are numbers of students. Columns indicate correctness of student answers to problems. Sub-columns categorize unprompted diagram content: more than given information (G+); only given information (G); no diagram (ND); or blank (B) if no work was shown. Rows describe accuracy of prompted student diagrams: Accurate or containing small errors (e.g., a 400 m length drawn longer than a 500 m length); Inaccurate (e.g., forces missing or going in the wrong direction); or No Diagram if the student chose not to diagram the scenario.

Table I shows how students performed on problem-task pairs across three axes: answer correctness, unprompted diagram detail, and prompted diagram accuracy. For unprompted diagrams, we distinguished between diagrams depicting only information given in the prompt (G) and diagrams depicting additional information, such as calculated values or simplifications (G+). This distinction helped us determine whether a diagram was used for orientation or as a dynamic representation used throughout the problem-solving process (discussed more in Sec. IV and also the focus of future work). We now discuss student answers (Table I) and diagrams (Fig. 2) for each problem-task pair and provide initial interpretations; Sec. IV discusses patterns across the 6 problem-task pairs.

The Maps problem-task pair required students to consider the addition of four spatial vectors. Every student drew an unprompted diagram and completed the diagramming task. Only 3 students answered the problem incorrectly: 2 because of algebra mistakes and 1 who drew only 3 of the 4 segments of the path. No students attempted to answer the problem by drawing a diagram to scale and measuring the desired distance. Of the 6 problems discussed in this paper, the Maps problem is the only one where every student actively referred to their unprompted diagram when answering the question, and no student struggled (despite a few small errors) to generate unprompted or prompted diagrams.
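The underlying computation in the Maps problem, summing displacement vectors and taking the magnitude of the result, can be sketched numerically. The legs below are illustrative placeholders, since the problem's actual distances and directions are not reproduced in this paper:

```python
import numpy as np

# Illustrative four-leg walk: (distance in meters, heading in degrees
# counterclockwise from east); NOT the values from the interview problem
legs = [(500, 90), (400, 0), (300, 270), (300, 180)]

displacement = np.zeros(2)
for dist, heading in legs:
    theta = np.radians(heading)
    displacement += dist * np.array([np.cos(theta), np.sin(theta)])

# Straight-line distance from start to finish: the quantity a to-scale
# diagram and a ruler could also provide without any algebra
straight_line = np.linalg.norm(displacement)
```

A student measuring a carefully scaled diagram would read off the same distance that this algebra produces, which is why the problem could, in principle, be answered graphically.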
The Blocks problem-task pair asked students to consider forces (including friction) acting on two blocks; it is a canonical but challenging force problem. Every student drew an unprompted diagram, though 8 students included in their diagrams only information given in the problem statement (i.e., they did not add to the diagram during problem solving). The presence of (and detail in) unprompted diagrams is not strongly correlated with selecting the correct answer (Table I), with the exception that 5 of 6 students who drew non-given forces on the block in question got the correct answer.
In the unprompted diagrams, 11 of 14 students who represented forces with arrows drew those arrows at the locations where the forces act (Fig. 2b), rather than at the center of mass, and 5 students also did this during the prompted diagramming task. We note this is consistent with student examples shown in Heckler [10], though this feature is not discussed at length in that text.
The Decay scenarios asked students to consider the amplitude of a damped oscillator. The 11 students who drew unprompted diagrams all sketched a mass on a spring (e.g., Fig. 2c left), and only one of these students drew a graph. No student referred to their unprompted diagram to answer the problem, and only 2 students included more than just given information in their diagram.
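The prompted Decay task amounts to drawing a decaying sinusoid inside an exponential envelope. A minimal sketch of that target shape, with purely illustrative parameters rather than values from the interview problem, is:

```python
import numpy as np

# Damped oscillation x(t) = A0 * exp(-gamma*t) * cos(omega*t);
# A0, gamma, and omega are illustrative choices, not from the problem
A0, gamma, omega = 1.0, 0.5, 2.0 * np.pi
t = np.linspace(0.0, 5.0, 501)
x = A0 * np.exp(-gamma * t) * np.cos(omega * t)

# The amplitude students were asked about is the monotonically
# decaying envelope, which bounds the oscillation at all times
envelope = A0 * np.exp(-gamma * t)
assert np.all(np.abs(x) <= envelope + 1e-12)
```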
We believe that students who sketched this scenario primarily used their diagram to orient themselves to the problem. As 13 of 17 students who answered the question selected the correct answer, and as students were largely successful in drawing a decaying trig function when prompted (e.g., Fig. 2c right), we believe students productively chose tools other than diagramming (e.g., algebra, sensemaking) to solve the problem.

The Mirrors problem-task pair asked students to consider a ray of light that reflects off two plane mirrors with a given angle between them. Every student drew unprompted diagrams for this problem. An equal number of students (8) answered the problem correctly and incorrectly, with 3 students drawing unprompted diagrams but not selecting an answer. There seemed to be little correlation between the amount of detail in the unprompted diagrams and getting the correct answer, except that the 4 students who selected the incorrect answer "Not enough information" drew very little. Interestingly, the 3 students who did not select an answer drew some of the most detailed unprompted diagrams.
Additionally, we found no correlation between students who selected the right or wrong answer with students who did or did not struggle to draw the similar situation when prompted. This indicates that drawing this particular ray diagram is challenging and, by itself, insufficient to answer the Mirrors problem correctly.
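The geometry behind the Mirrors problem can be checked numerically: composing reflections off two mirrors rotates a ray's direction by twice the angle between the mirrors, so the angle between incoming and outgoing rays does not depend on the incident direction. The coordinates below are an illustrative choice:

```python
import numpy as np

def reflect(d, n):
    """Reflect direction vector d off a mirror with normal n."""
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

# Mirror 1 along the x axis; mirror 2 oriented 135 degrees from it,
# matching the angle given in the Mirrors task
n1 = np.array([0.0, 1.0])
n2 = np.array([np.cos(np.radians(45.0)), np.sin(np.radians(45.0))])

# An arbitrary incident direction; the final angle is the same for any
# ray that strikes both mirrors
d_in = np.array([np.cos(np.radians(-30.0)), np.sin(np.radians(-30.0))])
d_out = reflect(reflect(d_in, n1), n2)

# Angle between incoming and outgoing rays: 90 degrees for mirrors at 135
angle = np.degrees(np.arccos(np.clip(np.dot(d_in, d_out), -1.0, 1.0)))
```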
The E-Field scenarios asked students to consider the electric field at a point noncollinear with two point charges. Six students selected the correct answer for the electric field, and 5 students selected an answer with the correct magnitude but a sign error in the direction. None of these 5 students labeled axes or directions, whereas 4 of 7 students who did not have a sign error (including 1 who had a magnitude error) labeled axes or directions.
When tasked with carefully drawing a similar situation, 3 of the 5 students who had a sign error in the problem correctly drew the direction of the electric field and included directional labels (e.g., the student whose work is shown in Fig. 2e). These students were capable of drawing an accurate diagram with prompting, so it seems that while generating a diagram with directional labels is necessary (but not sufficient) to solve problems of this type, many students did not recognize or do this.
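The sign errors discussed above are easy to make when the vector from each charge to the field point is not tracked explicitly. A small superposition sketch, in units where 1/(4πε₀) = 1 and with illustrative charge positions rather than the problem's coordinates (which are not reproduced here):

```python
import numpy as np

def e_field(charges, point):
    """Superpose Coulomb fields from (q, position) pairs at `point`."""
    E = np.zeros(3)
    for q, pos in charges:
        r = point - np.asarray(pos, dtype=float)  # from charge TO field point
        E += q * r / np.linalg.norm(r) ** 3       # flipping r flips the sign
    return E

# Illustrative setup: -q on the x axis, +2q on the y axis, field point
# on the z axis (unit distances chosen for simplicity)
q = 1.0
charges = [(-q, (1.0, 0.0, 0.0)), (2.0 * q, (0.0, 1.0, 0.0))]
E = e_field(charges, np.array([0.0, 0.0, 1.0]))
```

Keeping explicit track of which way each r points, akin to the directional labels discussed above, is exactly what guards against the sign slip.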
The Deltas problem-task pair asked students to consider a 2-dimensional charge distribution (σ) containing delta functions to represent two point charges (A & B) and a line charge (C). Only 3 of 18 students correctly answered the problem (how much charge is in the enclosed region?), and only 1 of these 3 students drew an accurate unprompted diagram. Six students stated or implied that C was a point, and 4 drew it as such (e.g., Fig. 2f). Four of these 6 students (and 1 other student) asserted the total charge in the enclosed region was B + C, which is noteworthy as B + C was not a provided option.
One of these students drew C correctly as a line for the paired task, indicating that they might have simply misread the expression for σ in the problem. Eleven students did not complete the task as it was the last item in the interview (though only 1 student was asked by the interviewer to skip this task). However, the high number of students who selected an incorrect answer for the problem and drew a diagram consistent with their incorrect answer suggests that many students who did not complete the diagramming task would not have been able to draw an accurate diagram given more time. The difficulties we observed with this problem-task pair are consistent with research indicating that graphical interpretations of delta functions are challenging for upper-division physics students [23].
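The distinction students missed, that the C δ(x + 2) term in σ(x, y) = A δ(x − 1)δ(y + 1) + B δ(x + 1)δ(y − 1) + C δ(x + 2) describes a line charge while the A and B terms describe points, can be made concrete by integrating σ over a region. This sketch uses sympy with an illustrative rectangle, since the problem's actual region is not reproduced here:

```python
from sympy import symbols, DiracDelta, integrate

x, y, A, B, C = symbols("x y A B C")

# The Deltas charge distribution: two point charges (A, B) and a line charge (C)
sigma = (A * DiracDelta(x - 1) * DiracDelta(y + 1)
         + B * DiracDelta(x + 1) * DiracDelta(y - 1)
         + C * DiracDelta(x + 2))

# Total charge in an illustrative rectangle -3 <= x <= 0, -2 <= y <= 2:
# it contains the B point charge and a length-4 segment of the C line
Q = integrate(integrate(sigma, (x, -3, 0)), (y, -2, 2))
# Q == B + 4*C: the C term contributes charge proportional to the
# y-extent of the region, unlike a point charge
```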

IV. SYNTHESIS AND CONCLUSION
In this study, we presented 19 physics majors with 6 problem-task pairs in which students were asked to answer a multiple-choice question and then later diagram a similar physical situation. By comparing students' unprompted diagrams from the multiple-choice problems with the paired prompted diagrams, we hoped to learn which elements of diagramming students use in problem solving compared with when a diagram is requested. The previous section detailed these 6 problem-task pairs and discussed the unprompted and prompted student diagrams for each pair. In this section, we discuss overarching trends, as well as implications for instruction and research.
Overall, and unsurprisingly, we found that students' prompted diagrams contained more detail than their unprompted diagrams, including double the frequency of drawing axes (49% vs. 26%) and of including units on numerical labels (74% vs. 37%), indicating that students do not generally value including these details while problem solving. Furthermore, as can be seen from the density of page lines across Fig. 2, prompted diagrams were often drawn much larger than unprompted diagrams, though students did have a little more space for these tasks and were not (generally) trying to do algebra in the same space.
We also found that students' prompted diagrams were generally more accurate to scale. For example, we scanned and traced the vertices of student diagrams for the Maps problem and task to digitally reconstruct the paths students drew. We then scaled, aligned, and overlaid these paths to generate the images shown in Fig. 3. We can see qualitatively, by how closely the majority of the student-drawn paths match the accurate red, bolded path, that students drew more accurate paths during the diagramming task than when the diagram was drawn without prompting; in fact, the accurate path is the outlier among unprompted student-generated paths. Quantitatively, the end points of the unprompted diagrams are, on average, 2.3 times further from the desired ending position than are the end points of the prompted diagrams.

While the Maps problem had the most correct answers of the problems discussed here, we found a parallel between it and the E-Field problem, which had the second lowest number of correct student answers. For both of these problems, students who drew and labeled vectors accurately had a much higher chance of getting the correct answer than students who did not. This stands in contrast to problems, such as the Blocks and Decay problems, where accurate diagrams seem to have been less critical to getting the correct answer (though, for the Blocks problem, accurate and thorough diagrams did reduce the chance of students selecting an incorrect answer). Finally, we found that for the Mirrors and Deltas problems, the presence of an unprompted diagram did not correspond to a higher likelihood of getting the correct answer. For the Mirrors problem, it may be that the complicated diagram and algebra necessary to solve the problem symbolically overwhelmed students, whereas for the Deltas problem (the only non-introductory level problem), student difficulties likely stemmed from less exposure to this challenging physical context [23].
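One way to implement the scale-align-overlay comparison used for the Maps paths (Fig. 3) is sketched below. The normalization choice (translate to a common start and scale by the first leg) and the vertex arrays are illustrative assumptions, not our actual digitization pipeline:

```python
import numpy as np

def normalize_path(vertices):
    """Translate a digitized path to start at the origin and scale it by
    its first leg's length, so paths can be overlaid on common axes."""
    v = np.asarray(vertices, dtype=float)
    v = v - v[0]                      # align every path's starting point
    return v / np.linalg.norm(v[1])   # common scale across students

def endpoint_error(path, target_end):
    """Distance from a normalized path's final vertex to the correct,
    to-scale ending position."""
    return np.linalg.norm(normalize_path(path)[-1] - np.asarray(target_end))

# Illustrative data: a to-scale 4-segment path and a freehand attempt
correct = [(0, 0), (0, 5), (4, 5), (4, 2), (1, 2)]
drawn = [(0, 0), (0, 5), (4.5, 5), (4.5, 1.5), (0.5, 1.5)]
err = endpoint_error(drawn, normalize_path(correct)[-1])
```

Averaging such endpoint errors separately over the unprompted and prompted sets yields the kind of ratio reported above.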
Three students used rulers at some point during the diagramming tasks, while no student used a ruler during the multiple-choice problems. As 4 of the 6 problems (Maps, Decay, Mirrors, and Deltas) could be solved without any computation given a moderately accurate diagram (and a ruler for the Maps problem and a protractor for the Mirrors problem), it is worth noting that, of our 19 students, only on the Deltas problem did any students (7) attempt to answer a problem using just a diagram, and only 1 student was successful.
The findings in this study have implications for instruction. We found many instances (such as with the Decay problem) where diagrams did not seem to aid students in solving the problem. Taken together with Heckler's finding that prompting for diagrams might negatively impact student problem solving [10], we encourage instructors to evaluate when they require students to provide diagrams. While diagrams can be useful for orienting to a problem, as a tool to help solve the problem, and/or as a tool of communication, we suggest instructors strive for transparency and alignment in how they teach, require, and assess diagrams in their courses: automatically requiring diagrams in all solutions might not help students develop productive diagramming habits and intuitions. Furthermore, our findings show just one way in which doing physics is messy. Yet if students never see experts' messy work (including messy diagrams), and if students' messy work is not valued in assessment, students may believe that they are not good at physics.
The study design detailed in this paper can be used by other researchers to study elements of student problem solving, not just diagrams, while avoiding a deficit framing of student work. Future work includes using data collected in these interviews to compare diagramming across lower-division, upper-division, and graduate physics majors, and to further explore how students use diagrams to orient themselves to a problem.