Brown Center Chalkboard

Freezing ‘summer melt’ in its tracks: Increasing college enrollment with AI


When people say, “Showing up is 80 percent of life,” they likely mean it in a metaphorical sense. Yet, showing up is literally the problem for many prospective college students. Annually, 10-20 percent of high school seniors who are admitted to college and indicate that they intend to go never make it to the first day of class.

Authors


Hunter Gehlbach

Associate Professor of Education and Associate Dean, Gevirtz Graduate School of Education - University of California-Santa Barbara

Director of Research - Panorama Education


Lindsay C. Page

Assistant Professor of Research Methodology - The University of Pittsburgh, School of Education

Research Scientist - The University of Pittsburgh Learning Research and Development Center

This phenomenon of “summer melt” often occurs after college-bound high schoolers have committed to attend a particular institution. The act of accepting an offer to attend college unleashes a torrent of administrative tasks and corresponding paperwork. Students must submit financial aid forms, tuition payment plans, scholarship applications, health records, high school transcripts, roommate questionnaires, meal plan agreements, and more. Some students—particularly those whose parents never attended college and have not navigated the enrollment process themselves—get swept under by the flood of forms as well as the financial realities of paying for college.

So how can policymakers support these motivated prospective students to begin their college careers? One new technology offers a potentially promising way to address the summer melt challenge. To assist potential Georgia State University (GSU) students with their enrollment tasks, the university collaborated with AdmitHub, a technology start-up. AdmitHub built an artificially intelligent, text-messaging chatbot—named “Pounce,” after the GSU mascot—to reach out to students about required transition tasks and pending deadlines.

Whenever Georgia State’s records showed that a prospective student still owed them something—say, a copy of their official high school transcript—Pounce texted that student to remind them of the deadline and to offer assistance. When students did not need help because they had completed that task, the chatbot left them alone until the system flagged the next missing requirement with a looming deadline. When students did need help, Pounce asked guiding questions and provided real-time responses to students’ queries. Thanks to a machine-learning algorithm incorporated into the technology, the chatbot was able to answer the majority of student questions automatically and “learned” to provide more and better answers over time.
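The reminder logic described above can be sketched in a few lines. Everything below—the record structure, field names, dates, and message wording—is a hypothetical illustration, not a detail of GSU's or AdmitHub's actual system:

```python
from datetime import date

# Hypothetical enrollment-task records; the fields and values are
# illustrative only, not drawn from GSU's or AdmitHub's systems.
TASKS = [
    {"student": "A. Rivera", "task": "official high school transcript",
     "deadline": date(2016, 7, 15), "complete": False},
    {"student": "A. Rivera", "task": "housing deposit",
     "deadline": date(2016, 8, 1), "complete": True},
]

def pending_reminders(tasks, today, window_days=14):
    """Return one reminder message per incomplete task whose deadline
    falls within the upcoming window; completed tasks generate no
    outreach, mirroring how Pounce left students alone once a task
    was done."""
    reminders = []
    for t in tasks:
        days_left = (t["deadline"] - today).days
        if not t["complete"] and 0 <= days_left <= window_days:
            reminders.append(
                f"Hi {t['student']}! Your {t['task']} is due in "
                f"{days_left} days. Reply HELP if you have questions."
            )
    return reminders

for msg in pending_reminders(TASKS, today=date(2016, 7, 5)):
    print(msg)
```

In this sketch, only the incomplete transcript task triggers a text; the completed housing deposit is skipped. A production system would add the question-answering layer on top, escalating unrecognized questions to human advisors.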

Although artificially intelligent chatbots that learn over time sound promising, the bottom-line question for us as researchers was this: Does Pounce’s combination of text outreach and guidance actually help more students show up to campus for the fall semester? To test the effectiveness of this intervention, we randomly assigned half of all the accepted GSU students to receive Pounce outreach, while the other half got GSU’s “business as usual” treatment. (For example, they could contact admissions counselors for guidance if and when questions arose.) Because many students decided to attend other schools (and thus stopped receiving Pounce’s texts), our main comparison focused on the students who were committed, at the start of our intervention, to attend GSU. We report in a recent paper that among GSU-intending students, those who received the text outreach were 3.3 percentage points more likely to begin their fall semester at GSU. In addition to this significant increase in enrollment, the outreach helped students to be more successful with a host of pre-enrollment processes, particularly those related to navigating financial aid.
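The headline comparison boils down to a difference in enrollment rates between the randomly assigned groups. The counts below are invented to reproduce a 3.3 percentage-point gap and are not the study's actual sample sizes:

```python
# Illustrative sketch of the headline treatment-control comparison.
# The counts are made up for illustration; they are NOT the study's
# actual sample sizes or enrollment figures.
def enrollment_effect(treated_enrolled, treated_n,
                      control_enrolled, control_n):
    """Difference in enrollment rates, in percentage points."""
    p_treat = treated_enrolled / treated_n
    p_control = control_enrolled / control_n
    return 100 * (p_treat - p_control)

# e.g., 733 of 1,000 treated students enroll vs. 700 of 1,000 controls:
effect = enrollment_effect(733, 1000, 700, 1000)
print(f"{effect:.1f} percentage points")
```

Because assignment to Pounce was random, a simple difference in rates like this (plus a standard error) is an unbiased estimate of the intervention's effect.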

We derive several lessons from this study. First, artificially intelligent systems have the potential to help universities work smarter. Students struggle with administrative processes—such as those involved in college matriculation—in predictable ways that do not require human help. We know this because, at GSU, Pounce automatically handled many of the questions that students texted in, routing only a small fraction to GSU counselors and advisors. When a chatbot efficiently handles common or straightforward questions in this way, university staff can spend more time working individually with students on questions that truly require professional guidance.

Second, given the technology on which it is based, a system like Pounce has high potential for improvement and refinement over time. That is, because the chatbot can be tested and implemented year over year at the same campus with the same administrative processes, it can learn to answer new questions and to provide better answers to recurring questions. As we test the chatbot across more campuses with different student populations, our understanding of how to tailor this intervention for different types of students will also improve.

Third, tools such as this have the potential to scale to a great number of post-secondary institutions. Although sending and receiving texts is not free, text messaging is a tremendously economical form of communication—and one of the forms students use most. If research can demonstrate similarly promising results at other schools, wide-scale adoption seems feasible.

Ideally, educational institutions could help stem summer melt directly by reducing the number of forms they require of prospective students and by simplifying the matriculation process. While some streamlining may be possible, there are likely limits to what can be pared away. Thus, policymakers may wish to: encourage institutions to develop strategies for reminding and supporting students to meet important deadlines; facilitate universities’ capacity to capitalize on data they typically already possess regarding incomplete enrollment tasks; and create incentives for entrepreneurs and researchers to continue developing and testing how different nudges might work best for different students in different educational contexts.

The Brown Center Chalkboard launched in January 2013 as a weekly series of new analyses of policy, research, and practice relevant to U.S. education.

In July 2015, the Chalkboard was re-launched as a Brookings blog in order to offer more frequent, timely, and diverse content. Contributors to both the original paper series and current blog are committed to bringing evidence to bear on the debates around education policy in America.
