Interactive Video-Enhanced Tutorials Research

Problem-Solving Skills Assessment

Selected IVETs were evaluated against comparison groups to measure their impact on students' development of the problem-solving strategies targeted within each IVET. The evaluations took place in algebra-based and calculus-based first-semester introductory physics courses at the University of Cincinnati. For each study, two large lecture sections (up to 132 students each) of the same course, taught by the same instructor, were used: one section was assigned the IVET as homework, and the other was assigned a non-interactive video summary of the solution from the IVET. Students in both sections who opted not to complete the assignment comprised the "no treatment" group. In the class meeting following the homework assignment, students were given a similar problem to solve on paper. This follow-up problem was designed to assess the specific learning outcomes around which the IVET was built, and students' written solutions were scored with a rubric that awarded points for meeting those outcomes. For the analysis, the mean score on the graded follow-up problem was determined for each group (IVET, Video-only, no treatment), along with each group's mean course exam score as a check of the groups' similarity. The groups' patterns of problem-solving behavior were also compared to determine which group engaged in more expert-like approaches.

The two-dimensional motion IVET is showcased here as an example of this process. The problem presented to students in this IVET is shown in Figure 1. This problem was selected for the IVET because its solution is complex: as Figure 2 shows, the solver must work through seven identified decision points as part of the solution process. These decision points become the set of learning outcomes for the IVET and inform the multiple-choice questions used to guide the student through an expert-like solution.

Fig. 1. Problem students solve as part of two-dimensional motion IVET.

Fig. 2. Decision points for solution to IVET problem.

The follow-up problem used to measure the transferability of problem-solving skills is written around the same identified learning outcomes: its solution requires an approach similar to that of the IVET problem, but in a different context (see Figures 3 and 4).

Fig. 3. Follow-up problem to measure IVET impact on development of problem-solving approaches.

Fig. 4. Decision points for solution to follow-up problem.

The items in Figure 4 are turned into a 7-point rubric for scoring students' solutions to the follow-up problem. Figure 5 shows the distribution of scores (out of 7 points possible), along with the mean scores, for the IVET and Video-only treatment groups. Although the groups' follow-up scores are not statistically different (p = 0.076, effect size = 0.259), the Video-only group scored almost 10% higher on the first course exam; that the ostensibly weaker IVET group kept pace on the follow-up problem suggests the IVET was a better promoter of the targeted problem-solving abilities than watching a non-interactive video solution of the same problem. In addition, Table 1 shows the frequency with which a range of solution approaches appeared in each of the three treatment groups.
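For readers who wish to run a similar comparison on their own rubric data, the group difference can be summarized with a pooled-standard-deviation effect size (Cohen's d). The sketch below is illustrative only: the rubric scores in it are hypothetical placeholders, not data from this study.

```python
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    # statistics.variance gives the sample (n-1) variance for each group.
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    # Pool the variances, weighting each by its degrees of freedom.
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Hypothetical 0-7 rubric scores for two small groups:
ivet_scores = [5, 6, 4, 7, 5, 6]
video_scores = [4, 5, 5, 6, 4, 5]
d = cohens_d(ivet_scores, video_scores)
```

A positive d would indicate the first group's mean is higher; conventionally, values near 0.2 are considered small effects and values near 0.5 moderate, which is how an effect size such as the 0.259 reported above is typically interpreted.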

Table 1. Range and frequency of solutions to the follow-up problem.

Fig. 5. Impact of IVET and Video-only treatments on students' abilities in problem-solving.

Similar results for several of the other IVETs can be found on the publications page of this website, as well as in an upcoming publication.

Feasibility of an Affect-Adaptive Component

Part of the IVET project is a feasibility study of adding an affect-adaptive component to the IVETs. With the advice of an outside consultant, we formulated a single question, placed near the middle of the IVET, asking students how they were feeling at that moment. They were presented with multiple-choice answers such as "I feel fine: I'm engaged with the problem and feeling good," "I feel confused: The tutorial is moving too quickly," "I feel frustrated: I keep getting the questions wrong, but I understand the concepts," or "I feel worried: I feel that I don't have enough background to follow the tutorial." After the student makes a choice, the student is shown a short video in which a sympathetic narrator offers advice and encouragement related to that choice. Based on the results we have seen (described in the consultant's report), we believe that including an affect-adaptive question in an IVET is useful and will be accepted by nearly all students. The final IVETs that we are disseminating all contain the affect-adaptive question.

Student Feedback

During the research phase of the project, over a thousand students at the University of Cincinnati and the Rochester Institute of Technology used IVETs in their classes. At the end of each IVET was a free-response question where students could offer feedback, including their evaluation of the activity and of the affect-adaptive question. A sample of the submissions was coded and analyzed in an internal report.