The potential of online education is uniquely compelling. Its most zealous advocates imagine a world in which there is exceptionally broad access to carefully refined courses delivered with consistent fidelity, including some taught by leading instructors from the world’s most selective colleges and universities.
But integral to promises of wide democratization are questions of equity within online classrooms, an issue that has received very little attention. A large body of research has found that female and minority students face a variety of discriminatory practices in traditional classrooms. For example, white teachers rate black students' misbehavior more harshly than the similar behavior of white students, and teachers direct more positive and neutral speech toward white students than toward Latino and black students. Postsecondary faculty also discriminate against female applicants for lab positions and against female and racial-minority applicants for doctoral programs.
Whether similar biases exist in online learning environments is an open question. A student’s social identity may be less apparent or relevant in the absence of face-to-face engagement, possibly leading to a lower level of bias. Alternatively, digital environments may reduce the social inhibition that attenuates biased behavior.
The study design
These questions motivated a field experiment we recently conducted in online classrooms where we tested for evidence of bias within 124 massive open online courses. These courses are free to access, taught by faculty from selective four-year universities, and represent a wide range of subjects including computer science, history, music, and health care.
The experiment involved posting eight comments, one from each of eight different fictive students, in the discussion forums of each course. Discussion forums are important contexts in online courses: they serve as the primary, and often sole, venue for student-to-instructor and student-to-student interaction. Within such forums, the poster's name is typically the only source of information about a student's gender and race, in contrast to in-person classes, where participants have access to many visual cues.
In our study, we created a pool of fictive students with first and last names evocative of a specific race and gender (e.g., black male or Chinese female). We used a set of names that had been used in previous studies examining the effects of bias in non-educational settings. In each course, eight of our fictive students (one from each of the eight race-gender combinations we employed: black/Chinese/Indian/white crossed with female/male) posted a randomly selected comment in the main course discussion forum. The comments were drawn from a set of 32 that were similar to comments posted by real students in course forums.
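The posting design described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual materials or protocol: the names and comment texts are placeholders, and drawing each course's eight comments without replacement from the pool of 32 is our assumption.

```python
import random

# Illustrative sketch of the experimental design: 8 fictive identities
# each post one comment in every course. Names and comment texts are
# placeholders; sampling comments without replacement within a course
# is an assumption, not a detail stated in the study.
RACES = ["black", "Chinese", "Indian", "white"]
GENDERS = ["female", "male"]
IDENTITIES = [(r, g) for r in RACES for g in GENDERS]  # 8 combinations

COMMENT_POOL = [f"comment {i}" for i in range(32)]     # stand-ins for the 32 comments

def assign_posts(courses, seed=0):
    """Build a posting plan: in each course, every fictive identity
    posts one randomly selected comment."""
    rng = random.Random(seed)
    plan = []
    for course in courses:
        comments = rng.sample(COMMENT_POOL, len(IDENTITIES))  # without replacement
        for identity, comment in zip(IDENTITIES, comments):
            plan.append({"course": course, "identity": identity, "comment": comment})
    return plan

plan = assign_posts([f"course {i}" for i in range(124)])
# 124 courses x 8 identities = 992 posted comments in total
```

Randomizing which comment each identity posts, and crossing all eight identities within every course, is what lets differences in response rates be attributed to the name alone rather than to comment content or course.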
Evidence of race and gender bias
To identify the existence of bias, we observed the number of responses by instructors and students to each of our posted comments and compared the number of responses across the eight race-gender combinations. Figures 1 and 2 show the probability of instructor or student response, respectively, across the eight race-gender combinations. The horizontal red line in each figure shows the average probability of response across all eight groups.
Figure 1: Unconditional probability of an instructor response by student identity
Figure 2: Unconditional probability of a peer response by student identity
Figure 1 shows that about 7 percent of our comments received a response from the course instructor, and Figure 2 shows that about 70 percent received a response from at least one student. The pattern that stands out is the higher probability of an instructor response when a comment is posted under a white male name. Among students, by contrast, there is little evidence of bias: comments from all race-gender combinations received similar response rates.
When we fully control for comments, courses, and timing of the postings, a comment posted under a white male name is 6 percentage points more likely to receive an instructor response than comments from all other groups combined, a 94 percent increase in the likelihood of response. In other words, instructors are nearly twice as likely to respond to a comment posted by a white male as to one posted by any of the other race-gender combinations we examined.
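A quick consistency check ties these figures together; note that the non-white-male baseline below is inferred from the reported numbers, not quoted from the study's tables.

```python
# Back-of-the-envelope check of the reported effect size.
# The baseline response rate is inferred, not taken from the study.
effect_pp = 6.0                    # white-male advantage, percentage points
relative_increase = 0.94           # reported 94 percent relative increase

implied_baseline = effect_pp / relative_increase   # response rate for other groups
white_male_rate = implied_baseline + effect_pp

# Average across all eight groups (seven near the baseline, one elevated)
overall = (7 * implied_baseline + white_male_rate) / 8

print(round(implied_baseline, 1))  # ~6.4 percent
print(round(white_male_rate, 1))   # ~12.4 percent
print(round(overall, 1))           # ~7.1 percent, matching the reported ~7 percent
```

An implied baseline of roughly 6.4 percent, plus the 6-point white-male effect, averages out to about 7 percent across the eight groups, consistent with the overall instructor response rate reported above.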
Our results clearly suggest that the more anonymous context of online discussion forums does not deter instructors from exhibiting bias in favor of white males. Even in the absence of visual cues, female and minority students appear disadvantaged in this rapidly growing educational context.
Potential causes of bias
We consider several potential causes of the observed bias. First, we argue that our findings are not consistent with the notion that instructors are purposefully discriminatory. We do observe a larger effect on instructor response in favor of white male students in courses taught by a white male; however, it is not statistically significantly different from the effect in courses not taught by a white male. We also find that the instructor bias in favor of white males is concentrated among comments involving social pleasantries or general academic advising, not among comments dealing explicitly with completing the course in question, a finding that is not consistent with intentional bias.
Second, we consider whether the bias is due to “statistical discrimination,” the process by which people discriminate against an individual by assigning to that individual the average attributes of his or her group. If academic stereotypes, which typically favor Asian and white students, were the source of the bias, we would expect larger effects in STEM courses and larger effects for Chinese men than for black men. Empirically, we observe neither of these results.
Based on this and other evidence, we argue that our findings are most consistent with what is called implicit bias. This study was not designed to test this mechanism directly, but future lines of research have an opportunity to examine the levels of implicit bias in online instructors.
Implications of the observed race and gender bias
How should we consider these findings in the context of the expansion of online education? There is evidence that persistence and performance are already lower in online learning environments than in traditional postsecondary education. Although our study cannot identify the effects of the observed bias on students' academic outcomes, any bias is an additional obstacle to the academic success of female and minority students in an already challenging online environment. If we care about improving equity in this rapidly expanding educational context, instructors, institutions, and online platforms must collaborate to eliminate bias.
Fortunately, online classrooms are uniquely amenable to the design, implementation, and evaluation of potential strategies for promoting equitable learning environments. For example, one possible approach would be to collect real-time data on instructors' response behaviors and make it available to them. If the root cause is unconscious bias, then simply making instructors aware of their implicit bias may alter behavior in favor of equity. Similarly, the online context may make it particularly easy to field other instructor-facing interventions that reduce bias.
Alternatively, discussion forums can be made more anonymous, such that even students' names are hidden from instructors' view. While there may still be signals about students' background characteristics in the writing and language choices of their posted comments, there would be less information that instructors could unconsciously use to infer race and gender. There may also be other classroom design features (e.g., single-gender subforums) that promote equity. Which design interventions are effective will be an important question for researchers to take up as we seek to develop online education as a viable pathway for human capital development for students of all backgrounds.
The Brown Center Chalkboard launched in January 2013 as a weekly series of new analyses of policy, research, and practice relevant to U.S. education.
In July 2015, the Chalkboard was re-launched as a Brookings blog in order to offer more frequent, timely, and diverse content. Contributors to both the original paper series and current blog are committed to bringing evidence to bear on the debates around education policy in America.