Visualizing student engagement with simulations: a dashboard to characterize and differentiate instructional approaches

A central idea behind educational interactive simulations (sims) is that students' learning and experience are shaped through their interactions with the simulation. Prior work has established that student interaction and engagement with sims is influenced by the instructional strategies used in sim-based lessons. This finding motivates the need for tools to record, analyze, and report on students' interactions with sims during their learning experiences. In this paper, we investigate the capabilities of a new teacher dashboard for sims, specifically examining the dashboard's ability to characterize and differentiate students' engagement with two different instructional approaches in homework activities. One instructional approach invited students to discover essential variables via challenge-style questions, while the second asked students to make predictions and observations given specific actions in the sim. The experiment was conducted in college introductory physics courses and repeated for two sims, PhET's Energy Skate Park: Basics sim and Forces and Motion: Basics sim. The results demonstrate that the new teacher dashboard can successfully capture students' interactions and help identify the differences in engagement across these activities. In this case, the dashboard showed that students' exploration of the sim elements and their total interaction time increased with the challenge-style questions. We reflect on the capabilities of the dashboard and its role in instructional design.


I. INTRODUCTION
Student engagement with simulations (sims) in science classes can support conceptual learning [1,2] and promote students' development of science practices [3]. Within the classroom, the instructional design of the lesson and the teacher facilitation are important factors in addressing these goals [1,4]. Prior research has established that both the level and type of guidance used in a sim-based activity impact student engagement with sims and student learning [4][5][6][7].
However, how to design and iteratively improve sim-based activities to support these goals remains a challenge for teachers and an active area of research [8,9]. A significant impediment is that currently teachers' primary way to know how their students are interacting with a sim during an activity is through direct observation. This approach misses information in a classroom full of students and is not available in sim-based homework. Advancements in how teachers access students' interaction with sims during sim-based assignments are needed to help teachers review and improve their instructional design of sim-based lessons.
Dashboards in education are used to represent student data visually, helping teachers to understand how learners engage with different elements of the educational material and providing information for future activity design [10] and teacher support. In prior studies, we reported on the design and development of a teacher dashboard for sims based on challenges that teachers faced in sim-centered activities in their classrooms [11], and we used the dashboard to characterize student interaction during a sim-based activity [12].
In this paper, we build on this research to examine the potential for this new dashboard to provide teachers with insights into student engagement that can inform activity design, comparison, and iterative improvement. Specifically, we focus on the question: What are the capabilities and limitations of the dashboard for characterizing and comparing student engagement across two different instructional approaches for sims? We compare instructional approaches known to elicit differential student engagement and interaction with sims [5]. For a dashboard to provide teachers with useful insight and feedback, it must be able to meaningfully characterize and differentiate between instructional approaches.

II. METHODS
In this study, we compared students' sim engagement and interaction during two sim-based homework assignments. These assignments shared the same learning goals, but used different instructional approaches: either a Guided Activity or a Challenge Activity. We conducted this research in the algebra- and calculus-based introductory physics courses at a large public research university. Students were asked to complete an approximately 20-minute homework activity. Each homework used a sim from the University of Colorado Boulder's PhET Interactive Simulations project [13], enhanced with the PhET-iO extension that allows recording all student-sim interactions [14].
Students in the algebra-based introductory physics course (n=235) used a version of the Forces and Motion: Basics (FAMB) sim that was constrained to have just two screens: 'Motion' and 'Friction'. In the 'Motion' screen, students can explore force and its relationship to the movement of objects. Users can push different objects (up to three stacked objects), modifying the applied force. Representations of the force vectors and other numeric data can be displayed. The 'Friction' screen adds the option to modify the friction and explore how this variable affects various aspects of motion.
Students in the calculus-based introductory physics course (n=653) used a version of the sim Energy Skate Park: Basics (ESPB) that was also constrained to have just two screens, 'Intro' and 'Friction'. In the 'Intro' screen, students can explore the conservation of energy while a skater moves on different tracks. Students can move the skater to different positions on the track and change the skater's mass. Different representations of energy, speed, and a grid can be displayed. The 'Friction' screen adds the option to modify the friction and explore how energy changes depend on this variable.
Briefly, the learning goals of the FAMB activities were to identify the variables that affect motion, to describe the motion of an object in situations with and without friction, and to explore the factors that affect the friction force. The learning goals of the ESPB activities were to identify the variables that affect energy, to describe the transformation of energy in situations with and without friction, and to explore the factors that affect thermal energy.
In each course, the students were divided into two groups and assigned one of two possible homework activity versions, either a Guided Activity or a Challenge Activity:
• Guided Activity: This activity had more direct student instructions, providing more specific direction about how to control the sim before answering conceptual questions. An example prompt is: "Apply a constant force to the box for a few seconds and then stop the force. What happens to the speed when you are applying the force? After you stop applying a force?"
• Challenge Activity: This activity posed challenges for the students but did not explicitly explain how to accomplish the challenge in the sim, providing opportunities for more student agency in the activity. An example prompt is: "Find and describe a situation where the speed decreases."
For the FAMB sim, 115 and 120 students completed the Guided and Challenge Activities, respectively. For the ESPB sim, 323 and 330 students completed the Guided and Challenge Activities, respectively. Students' answers to the assignments were not considered in the analysis, as the focus was on evaluating the dashboard's ability to characterize student engagement with the sim.
In the analysis, we use four main graphs from the teacher dashboard to visualize student interaction and engagement with the sim [11,12] and to compare the two activities. These graphs are described in the results section. They use data recorded for each student, including every interaction event (clicks, mouse-down, and mouse-up), its time and position, and the sim elements used during interaction.
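To make the data pipeline concrete, the sketch below shows one way the recorded event stream could be aggregated into the per-student metrics the dashboard displays (interaction time, event count, and percentage of sim elements used). The record schema, element names, and values are our own illustrative assumptions; the paper does not specify the PhET-iO data format.

```python
# Hypothetical sketch: aggregating a recorded event stream into the
# per-student summary metrics shown in the dashboard's Time, Events and
# Elements graph. Field names and sample data are assumptions.
from statistics import median

# Each record: (student_id, element_id, event_type, t_seconds, x, y)
events = [
    ("s1", "forceSlider",    "mouse-down", 12.0, 310, 420),
    ("s1", "forceSlider",    "mouse-up",   13.5, 355, 420),
    ("s1", "frictionSlider", "click",      95.2, 500, 480),
    ("s2", "forceSlider",    "click",       8.1, 320, 421),
]
ALL_ELEMENTS = {"forceSlider", "frictionSlider", "massButtons", "resetButton"}

def summarize(events, all_elements):
    """Per-student totals: interaction time (min), event count,
    and percentage of available sim elements used."""
    per_student = {}
    for sid, elem, _etype, t, _x, _y in events:
        rec = per_student.setdefault(sid, {"t0": t, "t1": t, "n": 0, "elems": set()})
        rec["t0"] = min(rec["t0"], t)      # first event time
        rec["t1"] = max(rec["t1"], t)      # last event time
        rec["n"] += 1                      # total interaction events
        rec["elems"].add(elem)             # distinct elements touched
    return {
        sid: {
            "minutes": (r["t1"] - r["t0"]) / 60,
            "events": r["n"],
            "pct_elements": 100 * len(r["elems"]) / len(all_elements),
        }
        for sid, r in per_student.items()
    }

summary = summarize(events, ALL_ELEMENTS)
median_events = median(s["events"] for s in summary.values())
```

From such per-student summaries, the dashboard's reported medians (e.g., 21.0 min and 261 events for the Challenge Activity) would follow directly.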

III. RESULTS
In order to be a valuable tool for teachers, a dashboard for sims needs to provide teachers with sufficient insight into how students engage with the sim, such that they can evaluate and compare different sim-based activities. If the dashboard can capture differences in student experience, there is opportunity to inform a teacher's iterative improvement of her activity design.

A. Characterizing duration and level of engagement
The comparison of the dashboards for the two homework activities, Challenge and Guided, shows significant differences in student engagement across several dimensions, including time, interaction level, and elements used. The Time, Events and Elements graph for the FAMB activities (Fig. 1), for example, shows that students in the Challenge Activity used the sim for a longer time and generated many more interaction events than students in the Guided Activity. The medians for interaction time and events are distinctly different between these activities (21.0 min and 261 events for the Challenge, versus 15.9 min and 162 events for the Guided, Fig. 1-A and B). Furthermore, students in the Challenge Activity condition used a greater percentage of the available control elements in the sim, as indicated by the greater abundance of darker red dots in Fig. 1.
The Percentage of Students vs. Time graph (Fig. 2) provides another perspective on students' interaction time and the pattern of when students complete their use of the sim. Both activities were intended to take about 20 minutes. In the Challenge Activity, we find that nearly 100% of the students are still interacting with the sim after 10 minutes. In the Guided Activity, after 10 minutes over 10% of students have already left the sim. Similar differences were captured by the dashboard for the ESPB activities. In the Guided Activity, the median duration was 8.6 min, with a median of 52 events and 42% of sim elements used. In the Challenge Activity, the median time was 12.5 min, with a median of 100 events and 50% of sim elements used.
FIG. 2. This graph shows the percentage of students that are running the sim as a function of the elapsed time. It shows when students finish using the sim and provides a measure of activity duration.

These dashboard graphs allow teachers to access metrics that provide a general overview of student engagement. With this access, teachers can evaluate the duration and level of interaction with the sim generated by an activity, comparing it to their expectations for that activity, to data from a previous version of the activity, or to data from an activity with a different instructional pedagogy. For instance, for teachers who are concerned about students giving up early in the face of challenge-style homework questions, the dashboard can provide data to inform their instructional choices. In this case, the dashboard shows that students completing the Challenge Activity actually engaged for a longer time and interacted more with the sim and its elements than students in the Guided Activity.
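The Percentage of Students vs. Time curve is essentially a survival curve over students' last recorded sim events. A minimal sketch, assuming each student's end time is taken from their final interaction event (the end times below are hypothetical):

```python
# Sketch of the "Percentage of Students vs. Time" curve: the share of
# students still interacting with the sim at each elapsed time.
def percent_active(end_times_min, t):
    """Percentage of students whose last sim event occurs at or after time t (min)."""
    active = sum(1 for end in end_times_min if end >= t)
    return 100 * active / len(end_times_min)

end_times = [15.9, 21.0, 10.2, 25.4, 8.7]   # minutes, hypothetical per-student end times
curve = [(t, percent_active(end_times, t)) for t in range(0, 30, 5)]
```

Plotting `curve` for each activity condition side by side would reproduce the comparison described above, e.g., the fraction of students who have already left the sim by the 10-minute mark.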

B. Characterizing use of specific sim elements
Beyond the general overview, the dashboard graphs also capture more detailed student sim interaction, including information about student use of specific sim controls and how that use compares across the two activities. The dashboard's Elements Used Map (Fig. 3, for the FAMB sim) visualizes this information for teachers. The most important action in the sim is applying a force to move the objects. Almost all students (99% and 98%) in both activities applied a force in the sim (Fig. 3-A).
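The aggregation behind the Elements Used Map can be sketched simply: count, for each interactive element, the fraction of students who touched it, and map that fraction to a light-to-dark-red shade for the overlay. The function names, element identifiers, and gradient formula below are illustrative assumptions, not the dashboard's actual implementation.

```python
# Sketch of the Elements Used Map aggregation: percentage of students who
# used each interactive element, plus a light-to-dark-red overlay shade.
def element_usage(used_by_student):
    """used_by_student: {student_id: set of element ids} -> {element id: % of students}."""
    n = len(used_by_student)
    counts = {}
    for elems in used_by_student.values():
        for e in elems:
            counts[e] = counts.get(e, 0) + 1
    return {e: 100 * c / n for e, c in counts.items()}

def shade(pct):
    """Map a usage percentage to an RGB color: light (low use) to dark red (high use)."""
    fade = int(230 * (1 - pct / 100))          # green/blue channels fade out
    return (200 + int(55 * pct / 100), fade, fade)

usage = element_usage({
    "s1": {"robotPush", "forceSlider"},   # hypothetical element ids
    "s2": {"robotPush"},
    "s3": {"robotPush", "fridge"},
})
```

Overlaying these percentages and colors on a screenshot of the sim yields the kind of at-a-glance comparison shown in Fig. 3.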
The sim has several different ways to modify the force, each with different affordances. We see both activities generated similar interactions with the force slider (Fig. 3-B), but the Challenge Activity shows more students using the buttons to change the applied force (Fig. 3-B, 53-75%) compared to the Guided Activity (38-65%). Compared to the force slider, the buttons provide a more systematic and controlled way to increase or decrease the force. These buttons also create a constant force. More students in the Challenge Activity interacted with this element, which suggests that they may have noticed the difference between these controls.
In our prior work that identified teachers' challenges with sims and sim-based instruction [12], some teachers expressed a desire to know whether students follow their activity's instructions and whether they go beyond the instructions and explore the sim more deeply. We find that the dashboard can help answer these teachers' questions and shows differences between the two activities.
In the Guided Activity for FAMB, we see an example of students not following instructions. The activity gave students specific instructions to first test one box, then test two boxes, and describe the difference. The learning goal of this instruction was to help students observe the effect of mass on the motion. However, the dashboard shows that only 15% of students in the Guided Activity group interacted with the second box (Fig. 3-C, top picture). While most of these students did not follow the instructions to move the second box, 94% of them did move the fridge (Fig. 3-C, top picture). Moving the fridge was not included in the instructions. Students' use of the fridge suggests that they had the opportunity to explore the relationships between mass, force, and motion, although with a different experience than instructed in the activity.
While knowing that a student interacted with a sim element does not conclusively imply what the student is learning or understanding, this knowledge is useful for teachers, forming a communication bridge. The teacher knows students had interaction experiences with certain sim elements and can use this information to support discussion. For example, in leading a group discussion, the teacher could leverage the knowledge that most students interacted with the fridge and ask students to share their observations about moving the fridge, or how it compares with other objects.
In comparison to the Guided Activity, the Challenge Activity shows more interaction with many of the mass elements (box, people, garbage can, and presents) and almost universal interaction with the fridge (Fig. 3-D). Teachers could use these data to infer that students are responding as desired to the Challenge Activity's prompt, "Find and describe a situation where the speed increases as slowly as possible in the simulation", which is designed to promote student exploration to find a configuration involving the maximum mass and a small applied force.

FIG. 3. Student engagement comparison using the Elements Used Map visualization for the FAMB sim. This visualization is a screenshot of the sim that displays the percentage of students that used each interactive element. It shows the numerical value and also uses a color gradient, with light colors for the elements that fewer students used and dark red for the most-used elements.

C. Characterizing patterns of interaction
The Events Map (Fig. 4) provides further detail and insight into student exploration patterns, showing the location of each user mouse event. In particular, in Fig. 4-A, the dashboard visualizes student interactions in the area above the box, showing whether students are stacking objects one on top of another. The Challenge Activity shows more dots across all vertical stacking positions. This accumulation indicates that more students explored changing the objects pushed by the robot and that more students tried stacking up to three objects in the play area. The Guided Activity shows fewer interaction events over the box, capturing a difference in student interactions between these two activities.
As another example, the Events Map overlay on the friction slider (Fig. 4-B) shows that while students in both activity conditions explored lower friction values (the left half of the slider), students in the Challenge Activity more often explored higher values of friction. Teachers could leverage this information to revise the Guided Activity or to inform their facilitation of a follow-up group discussion in the classroom.
The dashboard graphs showed similar capabilities to capture interaction patterns and discern differences between the two ESPB activities. For example:
• In the Guided Activity, no students interacted with the mass slider in the friction screen (Fig. 5-A).
• In the Guided Activity, students tested friction values more uniformly across the slider, while the Challenge Activity shows more interaction at the extreme values (Fig. 5-A).
• In both activities, students mostly started the skater at the top of the track (Fig. 5-B), but in the Challenge Activity, some students explored starting the skater at intermediate track positions.
• The Challenge Activity motivated more students to explore the other tracks (the ramp and the double well), as visualized with the event clicks and the values from the Elements Used Map (Fig. 5-C). Less than 10% of students changed the track in the Guided Activity.

IV. CONCLUSIONS
This case study demonstrates that a teacher dashboard for sims can help characterize student interaction with a sim and meaningfully differentiate student interaction in activities that use the same sim and learning goals, but different instructional approaches. From prior research [4], we expect higher student engagement and interaction with sims during Challenge Activities. This work establishes that a dashboard can capture these differences and make this information accessible to teachers. Every metric of engagement considered in this work (total interaction time, elements used, and elements' values tested) shows higher engagement with the Challenge Activity condition.
Knowing how students interact with the sim during an activity may help iteratively improve sim-based activity design and teacher facilitation techniques. Even in the case of explicit step-by-step instructions, where students generally follow directions, the dashboard captured instances in which directions were not followed (e.g., only 15% of students in the Guided Activity group interacted with the second box). Furthermore, the dashboard can identify when students interact with elements that are important for addressing the learning goals, and when they do not, allowing teachers both to modify the instructions in the future and to address the missed interaction immediately with demonstrations or discussions in the following class period.
While useful for teachers, the dashboard has limitations. Notably, it does not provide information about student thinking, understanding, or learning. While some research is examining the connection between sim interaction patterns and learning [15], this remains a significant challenge.