Capture, Code, Compare: Integrating computational modeling with video analysis
Computational modeling has clearly had a significant impact on undergraduate physics education. In particular, at the introductory level, students can now study problems that were previously inaccessible (such as a damped driven pendulum or a multi-body gravitational system) and watch the solutions to physics problems unfold in easy-to-create animations (using tools such as GlowScript). But in my experience (and, I suspect, in the experience of other physics educators), there remains a gap in student sensemaking between the results they see in a computational model and their observations of the physical world. Even after spending a week creating an animated solar system in GlowScript, an introductory student can still shrug and say, “Well, that’s nice, but it’s still just a theory,” or, “That looked cool, but it doesn’t apply to my real life.”
Now that physics educators have access to a robust foundation of computational modeling materials, we need to (1) help our students see that the models they implement on the computer really do match the physical behavior of the universe around them, and (2) help our students learn to discuss quantitatively where a model matches reality and where the two diverge. I believe that working toward (1) can build students’ confidence in physics as a field and their sense of its relevance to their own lives, and that working toward (2) can help prepare them for the increasingly complex research and engineering projects they will be tasked with. I’ve spent the last year pursuing these goals at the introductory level through an activity structure I’ve dubbed Capture, Code, Compare (CCC). In this article, I’ll describe the structure of a CCC activity, share examples of the activities I’ve found most successful, explore a few ways of assessing CCC activities, recommend tools one might use, and suggest some next steps for developing this structure.
Structure of a CCC activity
The basic outline of CCC is for students to capture the motion of a physical system on video and study that motion with video analysis software, develop a code that reproduces the physical system in a computer animation, and compare the behavior and quantitative results of the video analysis and the computational model. In my introductory class, we spend most of each week (4 of our 6 contact hours) on a CCC activity, with the remaining time available to discuss the week’s concept, conduct course-related business, and help students catch up on missed work. We conduct a total of 11 CCC activities, with the lowest score dropped and the remaining weeks of the semester devoted to an end-of-term project.
During the capture process, students work in groups to video-record an experiment with their cell phones. I provide each student with an adjustable universal cell phone tripod (available for less than $10 each on amazon.com) to help them take as steady a video as possible (although video analysis software packages can usually correct for a moving camera). Each time they record a video, in addition to setting up and using the lab equipment properly, they must consider the angle they’re recording from (to eliminate motion toward or away from the camera), ensure that the object they intend to track is clearly distinguishable from its surroundings (to help the software’s automated pixel-tracking system), and include a length scale in the frame of the video (usually a meter stick at the same distance from the camera as the object to be tracked). During the first couple of activities, the students need frequent reminders about these considerations, but after working with a deficient video, they usually learn to make the appropriate arrangements proactively.
Once the video is transferred from their cell phones to a computer (a process many recent high school grads have never needed to undertake), the students begin studying the motion with video tracking software (I’ll leave the description here system-agnostic and list possible tracking software packages below). Students must orient a set of axes, which allows for corrections to the tilt of the camera, and set a length scale based on the meter stick in the video, which is a great opportunity to start talking about error propagation during the analysis. Then they try the autotracking system, which works most of the time; when the video is too blurry or the object to be tracked is too indistinct for autotracking, I offer them the choice of rerecording the experiment (which usually takes only 5 minutes) or manually tracking the object (which also takes about 5 minutes). Students opt for each with about equal frequency.
With the position (x, y) of the object recorded in a table for each video frame, students can begin to examine graphs of the position, velocity, acceleration, and any other quantities the software can calculate. They quickly notice that the graphs get noisier as the plotted quantity involves more layers of calculation, which is another great opportunity to talk about error propagation. I usually provide instructions on which graphs are most salient to examine (such as velocity and momentum in a collision experiment) and which quantities they’ll need to define with a formula for the software to display (such as potential energy or power). There’s usually some set of physical properties the students need to extract from these graphs (average acceleration, time-of-travel, oscillation frequency) that will be important in the coding process.
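To see where that extra noise comes from, it helps to look at what the software is doing behind the scenes. The sketch below is not Tracker’s actual code, just a minimal reconstruction of the finite-difference step, assuming the (t, x, y) table has been exported to a CSV file; the file name, column layout, and mass are all placeholder values:

```python
# A minimal reconstruction of the finite-difference step a video tracker
# performs; the CSV name, column layout, and mass below are hypothetical.
import numpy as np

t, x, y = np.loadtxt("tracked_cart.csv", delimiter=",", skiprows=1, unpack=True)

# Each numerical derivative divides small frame-to-frame position errors
# by a small time step, which is why the velocity and acceleration graphs
# look progressively noisier than the position graph.
vx, vy = np.gradient(x, t), np.gradient(y, t)
ax, ay = np.gradient(vx, t), np.gradient(vy, t)

m = 0.25                            # example cart mass (kg)
KE = 0.5 * m * (vx**2 + vy**2)      # a derived quantity entered as a formula
print("mean speed:", np.sqrt(vx**2 + vy**2).mean(), "m/s")
```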
During the coding process, I provide students with a starter code that they must modify to match the motion in the video they’ve analyzed. This process emphasizes that they are recreating the experiment in the computer by providing the same initial conditions as the experiment (the initial position and velocity of the moving object, the object’s mass, the stiffness of a spring, etc.), but that they have no control over the results of the model (i.e., they can’t guarantee that it will produce the same time-of-travel or oscillation frequency).
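As a concrete illustration, a starter code for the spring activity might look something like the sketch below (written in GlowScript/VPython; all numbers are placeholders the students replace with values measured from their video). The students set only the physical parameters and initial conditions; the resulting motion is out of their hands:

```python
# Sketch of a GlowScript/VPython starter code for a mass on a spring.
# In Trinket/GlowScript the import line is replaced by the
# "GlowScript 3.0 VPython" header; all numbers are placeholders.
from vpython import sphere, vector, rate, color

m = 0.50            # measured mass (kg)
k = 12.0            # measured spring stiffness (N/m)
x0, v0 = 0.08, 0.0  # initial stretch (m) and velocity (m/s) read from the video

ball = sphere(pos=vector(x0, 0, 0), radius=0.02, color=color.red)
v = vector(v0, 0, 0)

t, dt = 0, 0.001
while t < 5:
    rate(500)                        # keep the animation near real time
    F = -k * ball.pos                # spring force toward equilibrium
    v = v + (F / m) * dt             # Euler-Cromer: update velocity, then position
    ball.pos = ball.pos + v * dt
    t = t + dt
```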
In terms of output, I usually provide students with only a position-versus-time graph and ask them to modify the code to recreate the graphs they saw in the tracking software. Having students add graphs of additional physical properties is a great opportunity for them to wrestle with how the code calculates and stores physical properties, and it helps them learn the difference between a scalar and a vector. They know from the animation that the core of the code is working, so if something doesn’t look right on the graphs, they can conclude that they need to refine their additions to the code. As recommended by several PICUPers, I tend to directly tell students how to address syntax errors and typos but provide more question-based guidance on physical errors in their additions to the code.
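For example, in GlowScript/VPython the graphing additions might look like the sketch below (again, just an illustrative sketch built on the spring starter code above, not the exact code my students receive):

```python
# Sketch of graph lines a student might add to the spring starter code above.
from vpython import graph, gcurve, color, mag

results = graph(title="Position and kinetic energy", xtitle="t (s)")
xcurve = gcurve(graph=results, color=color.blue, label="x (m)")
KEcurve = gcurve(graph=results, color=color.orange, label="KE (J)")

# ...then, inside the while loop, after updating ball.pos and v:
#     xcurve.plot(t, ball.pos.x)              # one component of a vector
#     KEcurve.plot(t, 0.5 * m * mag(v)**2)    # a scalar built from the vector
```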
Once the graphs are in place, the students record the same quantities they extracted from the video analysis (time-of-travel, oscillation frequency). Although comparison is listed as a separate step in each week’s instructions, after a week or two they begin comparing these results with those from the experiment immediately, to see how they need to adjust the video analysis, the code, or both.
Finally, students compare the animation, graphs, and quantitative results of the video and computational model. If the code’s animation and the video’s motion clearly don’t match, the students might need to add an additional physical feature (such as drag force or friction) to the code, which is when we discuss how scientists revise their models based on empirical data. If the motions match qualitatively but there are some significant quantitative differences (such as location or height of a maximum, or oscillation frequency), the students go back over their work to explore whether the error might be in the video analysis, the code, or (as is usually the case) both. It’s at this point the students have to wrestle with the conundrum of which of these results is “actually correct.” If they discover a serious deficiency in the video, they might need to rerecord; if they discover a flaw in the code, I help them review it line by line.
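When the mismatch points to a missing force, the change to the code is usually only a line or two. A sketch of adding quadratic air drag to the force calculation (with a placeholder drag coefficient that students tune against their video) might look like this:

```python
# Sketch of adding quadratic air drag to the force calculation in the loop;
# the drag factor C is a placeholder the students tune against their video.
from vpython import vector, mag, norm

C = 0.004               # combined drag factor (kg/m), found by trial
g = vector(0, -9.8, 0)  # gravitational field

# ...inside the while loop, replacing the original force line:
#     F = m * g - C * mag(v)**2 * norm(v)   # gravity plus drag opposing the motion
#     v = v + (F / m) * dt
#     ball.pos = ball.pos + v * dt
```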
Example CCC activities
Clearly, not every introductory-level lab activity can be used in the CCC framework. The beloved force table designed to help students learn vector addition doesn’t make for an exciting video, but video analysis can lend new life to the often maligned (including by me) cart on an inclined ramp.
Inclined plane experiment showing parabolic time-dependence.
CCC activities must be designed around a dynamic problem, which students tend to find more interesting than static problems. You can find my bank of activities on my Trinket course page.
My semester of CCC activities starts with students walking at constant velocity, which makes them active participants in the first activity and lets them find their own average walking speeds. This activity is also helpful when we set up the first code, since I can make an analogy between the code’s time step and the steps they took on their walk (a stripped-down version of that first code appears after the figure below).
Student walking at roughly constant velocity.
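Here is roughly what that first constant-velocity code looks like, pared down to its essentials (the walking speed is just an example value, not a prescribed one):

```python
# Pared-down sketch of the constant-velocity walker: each pass through the
# loop is one "step," just like the student's strides in the video.
from vpython import box, vector, rate, color

walker = box(pos=vector(-2, 0, 0), size=vector(0.2, 0.5, 0.2), color=color.cyan)
v = vector(1.3, 0, 0)   # example average walking speed from the video (m/s)

t, dt = 0, 0.1          # dt plays the role of the time between strides
while t < 3:
    rate(10)
    walker.pos = walker.pos + v * dt   # one step forward per loop pass
    t = t + dt
```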
We then move on to collisions and constant-force motion with carts, and then by week 4 we can start talking about non-constant forces with a coffee filter drop.
Energy versus time results for a simulation and video analysis of a coffee filter drop.
In week 5, we work with spring oscillations; for whatever reason, this seems to be the point in the semester when the repetitive process of Capture, Code, Compare finally sticks with the students, and they start to become more self-directed. (This is all anecdotal observation; we can talk about formal PER assessment at the end.)
Students realizing that their spring model’s energy is not conserved.
Then, in the second half of the semester, we move on to two-dimensional activities. Projectile motion suddenly becomes more real to the students when they have to evaluate whether their projectile is significantly affected by air resistance (which it usually is), and students can see where the small-angle approximation for a pendulum starts to break down as they try larger amplitudes (a sketch of such a pendulum model appears after the figure below).
Projectile video analysis and computer model incorporating drag force.
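To make the small-angle comparison concrete, the pendulum code only needs to keep the full sin(θ) restoring term. The sketch below (with placeholder length and amplitude) prints the quarter period so students can compare it to the small-angle prediction:

```python
# Sketch of a pendulum model that keeps the full sin(theta) restoring term,
# so students can watch the period drift away from the small-angle value
# as the amplitude grows. Length and amplitude are placeholder values.
from math import sin, pi, sqrt

g, L = 9.8, 0.75       # gravitational field (m/s^2) and string length (m)
theta, omega = 0.9, 0  # initial angle (rad) and angular velocity (rad/s)
t, dt = 0, 0.001

while theta > 0:                        # integrate through the first quarter period
    alpha = -(g / L) * sin(theta)       # exact, not the small-angle -(g/L)*theta
    omega = omega + alpha * dt
    theta = theta + omega * dt
    t = t + dt

print("quarter period:", t, "s")
print("small-angle value:", 0.5 * pi * sqrt(L / g), "s")
```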
Having students track and model the motion of different points on a rotating or rolling object helps drive home the point that those points have different linear velocities but the same angular velocity. Finally, I end these activities with a popular video of stars orbiting the black hole at the center of our galaxy:
https://www.universetoday.com/wp-content/uploads/2017/03/Black-hole-workable.gif
By this point in the semester, I hardly have to give instructions. The students take a gravitational force simulation and modify the mass of the black hole and the initial conditions of the star until they achieve an orbital path and period that match the video:
https://trinket.io/glowscript/2aff48ffd8
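The linked Trinket is the real starting point; as a rough sketch of the kind of loop the students are tuning (every numerical value below is a placeholder, not taken from the Trinket or the video), it boils down to a Newtonian gravity update:

```python
# Rough sketch of the kind of gravitational loop the students tune; every
# number here is a placeholder, not a value from the Trinket or the video.
from vpython import sphere, vector, rate, color, mag, norm

G = 6.67e-11                   # universal gravitational constant
M = 8e36                       # black hole mass (kg): the main knob students turn
star = sphere(pos=vector(1.4e14, 0, 0), radius=3e12,
              color=color.yellow, make_trail=True)
v = vector(0, 2.0e6, 0)        # initial star velocity (m/s), also tuned

t, dt = 0, 1e4
while t < 5e8:
    rate(2000)
    a = -G * M * norm(star.pos) / mag(star.pos)**2   # acceleration toward the hole
    v = v + a * dt
    star.pos = star.pos + v * dt
    t = t + dt
```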
Possible assessments of a CCC activity
With the richness of a CCC activity, there is a lot an instructor can do to assess student learning. One could develop a problem set based directly on the elements of the activity, or require a written lab report at the end of each activity. I find the CCC structure pairs well with the Letter Home (Lane, The Physics Teacher 52, 397 (2014)), in which students describe their activity to someone outside the class (usually their parents), since the CCC structure provides a natural narrative for their writing to follow. If one can develop enough variety within an activity, CCC also makes for great student presentations in which students can learn from each other. If one is pressed for time in a semester, it can even suffice to use a weekly exit ticket, grading the activity on the quality of agreement between the video analysis and the computer model as students wrap up.
Recommended tools for CCC
In my implementation of CCC, I use universal cell phone tripods available from Acuvar. These are flexible enough to enable students to capture any physical behavior and sturdy enough to secure students’ mobile devices. Depending on your department’s budgetary flexibility, you can probably buy a new set each year and allow the students to keep the tripods. (You never know when they might find some physics in the wild worth analyzing!)
For the video analysis, I use the popular (and free) Tracker software. Tracker is well known in the physics education community and offers a variety of tools students can use. Vernier has also released video analysis software.
For the computer simulations, I use GlowScript (usually hosted via Trinket, where my course is also hosted), which makes the animation process simple. If you want your students to use more advanced graphing tools, matplotlib or MATLAB are good options as well.
Next steps for development
I’ve enjoyed teaching with the CCC framework for the last year, and my students enjoy it as well. I think they’re learning plenty of physics from it, but CCC needs some formal PER assessment to answer a few questions:
- Does CCC offer any advantages in developing students’ conceptual understanding? Does it bring about any detriments?
- How does CCC affect students’ attitudes and beliefs about physics (particularly its reliability at making predictions)?
- How does CCC help students understand the interplay between model-making and experimental testing? Can it help students grow out of thinking in terms of “the scientific method”?
- How does CCC affect the development of students’ quantitative research skills in contexts beyond the introductory level?
I also see both the potential and the need for CCC to expand beyond traditional physics problems like projectile motion and the inclined plane. I’m working on incorporating richer activities like weight lifting and non-simple harmonic oscillators.