Understanding the student experience with emergency remote teaching

In response to the COVID-19 pandemic, colleges and universities transitioned in-person instruction to a new modality we refer to as 'emergency remote teaching' (ERT). As many instructors may face this same format in future semesters, and in response to future emergency events, it is important to understand the student experience with ERT in order to inform recommendations and best practices that can be used to improve instruction. In this manuscript, we report on preliminary findings from a survey administered to physics students at a large research institution to gain both qualitative and quantitative feedback on which approaches to ERT were used and which were perceived as most effective at supporting student learning. Here, we present four initial themes relating to: interactivity and student motivation; lecture format; exam format; and new challenges experienced by students as a result of ERT. These findings have significant implications for instructors with respect to optimizing ERT.


I. INTRODUCTION & BACKGROUND
In the Spring of 2020, the COVID-19 pandemic forced colleges and universities across the country to cease in-person classes and move instruction to a remote format. This transition, by necessity, happened very quickly, typically on the scale of several days to a week. Remote instruction is not a new phenomenon, and there are a number of possible, well-developed modalities. For example, "distance learning" or "remote learning" encompasses any approach to learning from afar, and long predates the invention of the internet [1,2]. "Online learning," on the other hand, is intentionally designed to exploit the affordances that an online platform offers [2]; one common example of online learning is the Massive Open Online Course [3]. However, while the sudden transition away from in-person classes that occurred in Spring 2020 also exploited an online format, it was driven by necessity rather than intentional design and is thus distinct from the established "online learning" format. To mark this distinction, we adopt the name "emergency remote teaching" (ERT) for the new modality adopted as a result of COVID-19, consistent with the suggestion of others [4].
During the transition to ERT, instructors were required to make an unprecedented shift in their instruction, in many cases in the middle of their course(s). Instructors utilized a variety of approaches to accomplish the switch, many of them quite innovative. However, unlike with traditional in-person instruction, many instructors did not have the advantage of personal experience with online instruction, either as a student or as an instructor. As such, these instructors generally had minimal insight into the student experience with these different approaches. As many schools face uncertainty about when they will return to in-person instruction, it is important to understand which approaches to ERT students have found effective, which they have found ineffective, and what challenges they have faced.
This work explores the student experience with ERT through a survey regarding which approaches they experienced as well as how effective these approaches were in supporting their learning. The goal of the survey was to provide an opportunity for students to directly inform recommendations and best practices for instructors that take into account the students' experiences and concerns. In this paper, we discuss the context and methods used, as well as the design of the survey instrument (Sec. II). We then present key results and implications for instructors faced with ERT (Sec. III). Finally, we end with a summary and discussion of limitations and future work (Sec. IV).

II. CONTEXT & METHODS
We designed the survey to elicit both qualitative and quantitative insight into students' experience with ERT. In order to elicit students' off-the-cuff ideas before they were prompted with the more targeted survey questions, the survey began with four open-response prompts; here, we will focus only on the first two prompts (given below).
1) Following the transition to emergency remote teaching, what was the most effective thing that an instructor did to support your learning in the new online format and why was it so effective?
2) Following the transition to emergency remote teaching, what was the least effective thing that an instructor did to support your learning in the new online format and what might have been a more effective approach?
After the open-ended text boxes was a page asking students to select from a list of possible approaches to ERT that they had experienced. Options were organized into sections regarding: lectures, labs, help sessions, recitations, homework, exams, and projects/presentations. Students' selections here determined conditional follow-up questions in which students could report on how effective those approaches were at supporting their learning. This structure meant students only saw relevant questions and was designed to reduce survey fatigue.
The final portion of the survey discussed in this paper consists of a page of Likert-style questions targeting changes in the level of interaction in students' courses and the impact of this change on their learning, as well as changes in their motivation and workload. These questions were followed by a further two open-ended text boxes addressing students' motivation (prompts given below).
3) What, if anything, was the most effective thing done by an instructor to help you stay motivated and engaged in their course?
4) What, if anything, was the most effective thing you did to help yourself stay motivated and engaged in your courses?
The survey ended with a series of demographic questions and an item related to extra challenges associated with completing schoolwork from home.
The survey was distributed at the end of the Spring 2020 semester to students enrolled in at least one physics course at the University of Colorado Boulder (CU). The student population of CU is predominantly white and relatively affluent; this has implications for the generalizability of our findings (see Sec. IV). Physics courses at CU are generally taught using significant active engagement, including concept tests, tutorials, and group help sessions. We received 112 partial or complete responses. Of the 106 students who completed the demographics section at the end of the survey, 44% were freshmen, 23% were sophomores, 18% were juniors, and 11% were seniors or above. Additionally, 78% identified as white, with the majority of the remainder split evenly between Asian and Hispanic/Latinx, and 43% and 52% identified as women and men respectively. Since it is not clear how many students received the survey, it is not possible to determine an exact response rate; however, this institution has 750 physics majors, so the response rate has an upper bound of 15%, and the true response rate is likely smaller. It is worth noting that, while the survey was long, 106 of the 112 students completed all sections, and 107 of the 112 students completed all of the initial four open-ended text boxes. Their responses to the text boxes were detailed and often long, suggesting that these students were eager for the opportunity to provide feedback.
In the next section, we report findings from both the Likert-style questions and the open-ended prompts. We began emergent coding [5] of students' open-ended responses; these statements are not the primary source of claims in this paper, but rather are presented here to elaborate on the quantitative findings. We anticipate conducting and presenting a more robust analysis of this qualitative data in future work.

III. RESULTS & DISCUSSION
In this section, we report on four preliminary themes from the survey data that we believe are particularly crucial for instructors to understand; these themes relate to 1) interactivity and motivation in ERT courses, 2) ERT lecture structure, 3) ERT exams, and 4) new challenges stemming from ERT. Data here are pulled from the questions described in Sec. II and are presented as they relate to these themes rather than by question.

Interactivity & Motivation:
We asked students to report on how the level of interactivity in their classes changed as a result of the transition to ERT, as well as whether the change in interactivity impacted their learning. Additionally, we asked students to report any change in the amount of time they needed to spend on their schoolwork after the transition and any change in their motivation. Figure 1 shows students' responses to these four questions.
Taken together, the results in Fig. 1 paint a somewhat bleak, though not unexpected, picture. Students perceived a significant reduction in the level of interaction in their courses, and they also perceived that this reduction in interaction had a negative impact on their learning. Simultaneously, nearly three-quarters of students reported spending more time on their schoolwork and a significant reduction in their motivation to do that work. However, students' responses to open-ended prompts 3) and 4) (see Sec. II for prompts) provide insight into strategies, both personal and instructional, that supported their learning and motivation. For example, 18 students spontaneously brought up embedded clicker questions or participation points for 'attending' lecture as helping them stay engaged and motivated.
-Phys, with the required lectures with clicker questions, definitely helped, classes that suddenly had optional lectures, I was much less motivated for.
-Assigned points to coming to lecture at the scheduled time. It made me get up at reasonable hours and have a semblance of structure to my day.
-Actually embedding the clicker questions in the lecture helped me listen to the lecture and actually be involved because I needed to get the points.
-Having us actively participate in live classes kept my focus and attention on the lecture as well as gave me confidence and motivation after the lecture to complete homework and to study.
Additionally, 6 students spontaneously brought up that they found synchronous instruction more motivating than watching recorded lectures.
-Using zoom and real time lectures seemed more motivating than recordings at time of choice.
-Having a set time for live lectures helped me stay on track and motivated me to learn new material.
With respect to strategies that students used to keep themselves motivated, the single most common strategy was to create, and hold themselves to, a schedule during the day.
-Stick (more or less) with my academic/class schedule I used when I was attending school in person. The routine helped almost 'force' or 'coerce' me into doing my work on time.
-Made a new schedule, included time for self-care things like exercise and breaks, tried to stick to it as much as possible and keep school related activities to one area/time.
-I tried to establish a schedule. Watch lectures during their respective Schedule time and prepare as if I was going to class like getting dressed and ready to go to make myself feel I was in class.
It is worth noting (as many students did) that this type of time scheduling is usually facilitated by the structure of an in-person class, and thus is a skill many students may not have fully developed. This suggests that instructors may need to provide additional explicit guidance to encourage their students to do this type of intentional scheduling.

ERT Lecture Structure:
Lectures are often a significant fraction of the face-to-face instructional time a student experiences. As such, we anticipated that students would have strong opinions about the format and structure of ERT lectures. For example, ERT lectures can be delivered synchronously, asynchronously, or via a combination of both. Among the students in our survey who had experienced both synchronous and asynchronous lectures (N = 75), there was no difference in their rating of the effectiveness of these approaches, with 70% of the students rating them as 'very effective' or 'somewhat effective' in both cases. However, students' responses to open-ended prompts 1) and 2) (see Sec. II for prompts) suggest that students saw advantages and disadvantages to both synchronous and asynchronous instruction.
-I found it most useful to keep everything business as usual, to the best of their ability of course. Now, that being said, I know that for international students having a more asynchronous formatted class probably works better. But the classes I was able to keep up in were synchronous and during live lecture we could maintain normal interaction which was useful.
-...up from being indoors so long. It was nice to be able to watch lectures if I slept through the day and was up at night. Another reason is my internet is really poor. Zoom will boot me off for no reason, or pause entirely because of the poor internet.
-I appreciated that [my instructor] made his lectures available for participation points within a window of time, rather than asking students to "go to class" at the same time each MWF. I felt like I had more flexibility in my schedule thanks to him. My life is just to crazy and my mental health is too unstable for me to be able to stick to a strict schedule.
The quotes above, along with many of the quotes from the previous section, suggest that both synchronous and asynchronous instruction have significant affordances and significant constraints, and thus that the most effective strategy might be to provide both. Moreover, these quotes suggest that opportunities for students to interactively engage with the material, their peers, and the instructor need to be built in regardless of whether instruction is synchronous or asynchronous.
Additionally, across both synchronous and asynchronous lectures, students complained that many lectures:
-Ended up moving way too fast since the instructor was not slowed down by needing to write, and there were a lot of things on the page that we weren't actively talking about.
This suggests that, in either lecture format, instructors need to present the information (either written or with slide animations) in a way that supports students in being able to meaningfully process and engage with that material in the moment.

ERT Exams:
One component of an in-person course that caused significant stress and anxiety for both students and instructors was exams; many of the concerns centered around test fairness and security. We asked students to report which exam approaches they experienced and how effective these approaches were at supporting their learning. We found that the majority of students in our sample experienced timed, single-sitting exams (94%), and the majority of exams were open book (78%). In rating the effectiveness of ERT exam formats, all students who experienced take-home exams rated them as 'very effective' or 'somewhat effective' at supporting their learning. This was in contrast to timed open-response and timed multiple-choice exams, which 40% and 41% of students, respectively, rated as either 'somewhat ineffective' or 'very ineffective.' We also asked students to report whether they knew of students who had used unauthorized resources on exams, and whether they were personally concerned about the fairness of ERT exams. Of the 106 students who finished the survey, 36% reported knowing of students (themselves or others) who used unauthorized resources on exams, while 40% reported not knowing of any students who did so. Additionally, 38% reported being personally concerned about the fairness of remote exams, while 31% reported not being concerned.
Students were also provided an optional text box asking them to describe any additional concerns they had regarding ERT exams, and 70 students responded. Of these, 42 students (60%) brought up issues around cheating in remote exams.
-The open note ones made it easier to cheat, but I feel were the fairest.
-The point of an exam is to determine whether students have a proficient understanding of the content.... It is impossible to ensure that nobody cheats on [on-line] exams...and yet, they are still being used as a major component of grades.... Remote exams should simply not be allowed.
-The remote exams were made to be one question at a time to control [cheating].... With one question at a time, you have to hope you think of it perfectly your first time and if you get stuck in a small rut you are just simply ruined. You move onto the next question with a real iffy answer and then boom while working on another problem you think of a different approach to that previous problem.
These responses suggest that students were indeed concerned about the increased potential for cheating associated with remote exams. However, they also show that students felt some instructors' attempts to curb cheating put students at a disadvantage, especially those who were genuinely trying to engage appropriately.
Additionally, 26 students (37%) brought up open- vs. closed-book exams. Most of these comments focused on the idea that, since the use of outside resources cannot be prevented, the only fair approach is to simply allow their use. Some students also noted that tests should be crafted in such a way as to make the use of unauthorized resources ineffective.
-Closed book exams were never that effective.... Write harder exams, make them open book, and...give students a valuable learning experience.... I mean this for in person class as well.
-Most students I've spoken with have talked about take home exams being explicitly open-book and open-note because instructors are aware that use of unauthorized resources will happen. I think this is probably the only "fair" way to conduct remote exams.
-If an exam is being offered with answers that can be googled online, in such a way that a student can get full credit for copying the answer down, I consider that to be a poorly designed test.
Only 4 students spontaneously brought up take-home exams in the follow-up text box. Consistent with the quantitative findings, all comments regarding take-home exams were positive and indicated a preference for this format over a timed one.
-I really like take home exams, because they are hard enough no one can cheat. I also feel like they are much for reflective of a persons ability to problem solve.
-Take home exams allow you to think deeper almost like a research project.

New challenges stemming from ERT:
For many students, the disruption associated with the transition to ERT was not limited to the change in instruction alone. To investigate this, students could select from a list any additional challenges they encountered as a result of the transition. The distribution of students' responses is given in Fig. 2. While all of these challenges are important for instructors to be aware of, several are particularly noteworthy because they could directly impact students' ability to access and engage with the course material. While only a small number of students reported having unreliable access to a device for schoolwork, just over a quarter of students (28%) reported having an unreliable internet connection. This has implications for any instructor considering synchronous-only instruction. Additionally, more than half (60%) of students reported not having access to a quiet study space for schoolwork, and a fifth (20%) reported difficulties associated with being in a different time zone. Both of these issues are of particular concern for ERT exams, particularly if the exams are only available in a specific, narrow time window.

IV. CONCLUSIONS & LIMITATIONS
Here, we report preliminary findings from a survey investigating the student experience with the emergency remote teaching that emerged in response to the COVID-19 pandemic. The survey, which included questions targeting both qualitative and quantitative insight, was distributed to physics students at a large research university in the central United States. Preliminary survey analysis focused on identifying actionable recommendations for instructors who might face ERT in the immediate or distant future. Our findings have implications particularly for the delivery of ERT lectures (e.g., synchronous vs. asynchronous or a combination of both) and exams (with respect to format, security, and fairness). Instructors should consider the realities of students' situations when attempting to learn from home as they structure their courses, rather than assuming students will be able to engage with ERT in the same way that they previously engaged with in-person instruction. Our results also highlight the need for instructors to be sensitive to reduced student motivation and the new challenges students face as a result of ERT, and to adjust their assumptions and instruction accordingly.
There are several important limitations to this work, primarily related to the narrow population of students who completed the survey. The survey population came from a single institution with a primarily white and generally affluent student body. This is particularly important when interpreting the extra challenges students encountered in the transition to ERT, especially regarding access to technology, the internet, and quiet study and test-taking spaces. In a different student body, particularly one serving economically disadvantaged groups, the difficulties encountered may be more common and of a different nature. Additionally, the survey was only delivered to students enrolled in physics courses; our findings may not generalize well to other science disciplines or to the arts and humanities. Analysis of the full suite of survey questions is ongoing and will inform further publications. Future work will include follow-up interviews with survey respondents who indicated interest, in order to gain additional qualitative insight into students' experiences with ERT.