Commentary

Is college choice impacted by data in the College Scorecard?

National Decision Day, when thousands of high school students finalize their college plans, is around the corner. Colleges want to know whether students will attend their institution, and, after the dust settles, they also want to know how students arrived at those decisions. Getting inside a teenager’s head is both scary and critical for education policy. Research has shown that, nationally, approximately 41 percent of prospective college students enroll in a college of lower academic quality than they could attend (they “under-match”). This is a concern because attending a lower-quality institution may decrease the likelihood that students persist to graduation and, as one well-known study points out, the benefits of a higher-quality education do not necessarily come at an increased cost. So why do high schoolers make seemingly sub-optimal educational decisions?

One potential explanation for under-matching is that students lack the information they need to make thoughtful decisions about where to apply to college. In an effort to address this issue head-on, the U.S. Department of Education released a new version of the College Scorecard on September 12, 2015. The College Scorecard highlights each college’s annual cost, average graduation rate, and median earnings among students 10 years after enrolling. With the exception of median earnings, this information was previously available to students from sources such as U.S. News & World Report’s college rankings. The appeal of the median earnings data, the easy-to-read format of the Department of Education’s College Scorecard website, and the surrounding media attention may have influenced how some students made college application decisions this year.

Since its release in the fall, scholars have critiqued the content of the College Scorecard and debated whether it will hold colleges accountable for their students’ successes and failures, but, until now, there has been no evidence about how this information affects high school students’ college choices. In a paper released last week, two researchers at the College Board, Michael Hurwitz and Jonathan Smith, provide the first evidence of how the information contained in the College Scorecard has influenced high school seniors.

To conduct their analysis, the authors use data on SAT “score sends” to a sample of 1,600 four-year institutions. Prospective college students who take the SAT can send their scores to four colleges free of charge (and to each additional college for a nominal fee). Though not all colleges require that students submit SAT scores, these score sends can be treated as a proxy for where students are applying to college. For students graduating from high school between 2010 and 2016, the authors document college-specific trends in the timing and volume of score sends. They then estimate the effect of the availability of the College Scorecard data on score-sending behavior by comparing those trends before and after September 2015. The authors argue that the new data on median earnings influenced SAT score sending, but they find no such effects for annual costs and average graduation rates, both of which were available before the College Scorecard’s release.
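For readers who want to see the logic behind an estimate of this kind, a minimal sketch of one possible specification (an illustration, not necessarily the authors’ exact model) is a log-log regression in which a college’s reported earnings are allowed to matter only after the Scorecard’s release:

$$\ln(\text{ScoreSends}_{ct}) = \alpha_c + \gamma_t + \beta\,\big[\ln(\text{MedianEarnings}_{c}) \times \text{Post}_t\big] + \varepsilon_{ct}$$

Here $c$ indexes colleges, $t$ indexes high school graduation cohorts (2010 through 2016), $\text{Post}_t$ equals one for cohorts applying after September 2015, $\alpha_c$ and $\gamma_t$ are college and cohort fixed effects, and $\beta$ is the elasticity of score sends with respect to reported median earnings.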

Overall, the authors find that a 10 percent increase in a college’s reported median earnings boosts score sends by 2.4 percent. This overall elasticity estimate, however, conceals substantial variation in responsiveness across high school types and student demographics. For example, score sends from students enrolled in private high schools increase by 4.1 percent with each 10 percent increase in a college’s median earnings, and responsiveness to the College Scorecard is stronger among Asian and White students than among Black and Hispanic students. Taken as a whole, the results of this paper demonstrate that the biggest responders to the College Scorecard data are the same students who enter the college application process with the most resources and information to make sound enrollment decisions.
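To put the headline number in concrete terms, the implied elasticity is roughly 0.24:

$$\varepsilon = \frac{\%\Delta\,\text{score sends}}{\%\Delta\,\text{median earnings}} = \frac{2.4\%}{10\%} = 0.24$$

So, using purely illustrative figures, a college reporting $55,000 in median earnings rather than $50,000 (10 percent higher) would be expected to attract about 2.4 percent more score sends, or roughly 240 additional sends on a hypothetical base of 10,000.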

Though the authors carefully document variation in responsiveness to the College Scorecard across a range of student subgroups, the paper cannot explain what is driving these differential effects. On the one hand, the differential responsiveness may occur because well-resourced students are better prepared to access and make use of the College Scorecard. On the other hand, it could be that different types of students viewed the College Scorecard at the same rate, but only the well-resourced students changed their behavior in response to the information; for other students, constraints and preferences that have little to do with earnings data may dominate the decision. After all, other research has shown that the majority of high school graduates who enroll in college attend an institution within 20 miles of home. More research is needed to better understand why some types of students appear to be more responsive to the College Scorecard data than others. Otherwise, the College Scorecard runs the paradoxical risk of exacerbating inequities in college choice and under-matching by providing more information.