Summer learning loss: What is it, and what can we do about it?

As students return to school this fall, many of them – perhaps especially those from historically disadvantaged student groups – will start the academic year with achievement levels lower than they were at the beginning of summer break. This phenomenon – sometimes referred to as summer learning loss, summer setback, or summer slide – has interested education researchers since as early as 1906. We review what is known about summer loss and offer suggestions for districts and states looking to combat the problem.

An early comprehensive review of the literature summarized several findings regarding summer loss. The authors concluded that (1) on average, students’ achievement scores declined over summer vacation by one month’s worth of school-year learning, (2) declines were sharper in math than in reading, and (3) losses were larger at higher grade levels. Importantly, they also concluded that income-based reading gaps grew over the summer: middle-class students tended to improve in reading over the break, while lower-income students tended to lose ground. However, they did not find differential summer learning in math, or by gender or race in either subject.

The recent literature on summer loss has been mixed. One study using data from over half a million students in grades 2-9 in a southern state (2008-2012) found that students, on average, lost between 25 and 30 percent of their school-year learning over the summer; additionally, black and Latino students tended to gain less over the school year and lose more over the summer compared to white students. However, an analysis of the nationally representative Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011) found little evidence of overall loss over the summers after grades K and 1, and found that summer socioeconomic status gaps widened in some subjects and grades but not others. Von Hippel and Hamrock re-analyzed two earlier data sets and concluded that gaps “do not necessarily…grow fastest over the summer” (p. 41). Thus, summer loss and summer gap-growth do occur, but not universally across geographies, grade levels, or subjects.

Entwisle, Alexander, and Olson’s “faucet theory” offers one explanation for why lower-income students might learn less over the summer than higher-income students. According to the theory, the “resource faucet” is on for all students during the school year, enabling all students to make learning gains. Over the summer, however, the flow of resources slows for students from disadvantaged backgrounds but not for students from advantaged backgrounds: higher-income students tend to retain access to financial and human capital resources (such as parental education) over the summer, which facilitates continued learning.

Traditionally, educators and policymakers have relied on conventional summer school programs to combat summer loss and summer gap-growth. In 2000, Cooper and colleagues published a comprehensive meta-analysis of classroom-based summer programs, finding positive effects on average. However, they also concluded that middle-income students benefited more from summer programming than lower-income students did. They speculated that this could be because programs serving more advantaged students were of higher quality, or because of an interaction between programming and the home resources available to students. The result raised the concern that attempts to stem summer learning loss may actually exacerbate summer gap-growth if programs are not well targeted.

Kim and Quinn conducted a meta-analysis of 41 summer reading programs from 35 studies published after the Cooper et al. review. Like Cooper and colleagues, Kim and Quinn found summer reading programs to be effective at raising test scores, on average. Unlike Cooper and colleagues, however, Kim and Quinn found that it was low-income students who benefited most from summer reading programs (even when restricting the comparison to higher- and lower-income students attending the same program). Furthermore, they concluded that lower-income students benefited more because lower-income students in these studies were more likely than higher-income students to experience summer loss when not participating in summer programs. The authors noted several differences between their review and Cooper et al.’s that could explain the contrasting results: (1) Kim and Quinn analyzed only reading programs, while Cooper and colleagues combined math and reading programs; (2) Kim and Quinn included only two-group experimental and quasi-experimental studies, while Cooper and colleagues also included single-group pre/post-test designs; and (3) Kim and Quinn included home-based programs in their review.

Naturally, school-based summer programs vary in their effectiveness. Many of the recommendations for creating high-quality programs come in the form of expert opinion. Common suggestions include blending academic learning with hands-on or recreational activities, professionalizing summer school staff, and forming partnerships with community organizations to leverage resources. We can also draw some lessons from research. For instance, Kim and Quinn’s meta-analysis found that programs were more effective when they used research-based literacy instruction; specifically, programs using instructional strategies identified by the National Reading Panel as best practices had the largest impact on students’ reading comprehension scores (equivalent to moving from the 50th to the 65th percentile of a normal distribution). Program effectiveness also differed by literacy domain: programs were effective at raising students’ reading comprehension and fluency/decoding scores but not their vocabulary scores. Not surprisingly, research also suggests that programs are more effective when students attend consistently and spend more time on task academically.
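
To put that percentile comparison in more familiar effect-size terms: assuming test scores are roughly normally distributed, an effect of d standard deviations moves a student from the 50th percentile to the percentile Φ(d), where Φ is the standard normal cumulative distribution function. Working backward from the figure above:

Φ(d) = 0.65 → d = Φ⁻¹(0.65) ≈ 0.39 SD

That is, the 50th-to-65th percentile gain corresponds to an effect size of roughly 0.39 standard deviations.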

While school-based summer learning programs hold promise when they meet the above criteria, they often fail to live up to these expectations. Two important reasons school-based summer programs can be ineffective are that organizers often struggle to attract high-quality teachers and struggle to appeal to students and families for whom the opportunity costs of attending summer school can be high. School-based programs can also be quite costly. Researchers have therefore experimented, with some success, with lower-cost home-based summer programming.

One example of a home-based summer reading program shown to be effective for low-income upper elementary school students is READS for Summer Learning. In READS, which has been iteratively modified over several randomized trials, students receive eight books in the mail over the summer, matched to their reading level and interests. Along with each book, students receive a tri-fold paper that leads them through a pre-reading activity and a post-reading comprehension check. Students are asked to mail the postage-prepaid tri-fold back; families receive reminders when tri-folds are not returned. Additionally, teachers deliver scripted lessons at the end of the school year to prepare students to read productively and independently over the summer with the tri-fold scaffold. A recent study found that READS improved low-income students’ reading comprehension in the spring following their participation in the intervention (effect size = 0.05 SD on the state reading test), and other work suggests that the tri-fold acts as a mediator of the program effect.

Another recent randomized trial showed that something as simple as sending text messages over the summer to families of elementary school students at risk of summer loss improved the reading scores of third- and fourth-graders (but not first- or second-graders), with effect sizes of 0.21 to 0.29 SD. The text messages included tips on resources available to students over the summer, ideas for activities to do with children, and information about the value of particular summer learning activities.
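
By the same normal-distribution conversion described above, effect sizes of 0.21 to 0.29 SD correspond to moving a median student to roughly the 58th to 61st percentile (Φ(0.21) ≈ .58; Φ(0.29) ≈ .61).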

Home-based programs such as these can be more cost-effective than school-based interventions. For example, the cost of READS is estimated at $250 to $480 per student, compared to other programs providing supplementary education services that can cost as much as $1,700 per student while achieving similar or less favorable cost-effectiveness ratios.
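
As a rough, back-of-the-envelope illustration using the figures above: if READS produces the 0.05 SD effect reported earlier at $250 to $480 per student, that works out to about $50 to $96 per hundredth of a standard deviation gained; a $1,700-per-student program would need an effect size of roughly 0.18 to 0.34 SD just to match that ratio. Actual cost-effectiveness comparisons are more involved, but the order-of-magnitude gap is instructive.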

Kim and Quinn included home-based programs in their meta-analysis and, encouragingly, found that the effects of home-based programs were not significantly different from those of their more expensive classroom-based alternatives. At the same time, the effects of these programs may not be as large as those of the highest-quality school-based programs that use research-based instructional strategies.

Schools and districts should want to address summer learning loss not only because it may exacerbate achievement gaps, but also because it “wastes” so much of the knowledge students gained during the school year. Summer loss also undoubtedly increases the amount of time teachers must spend “re-teaching” the previous year’s content, likely contributing to the repetitiveness of the typical U.S. curriculum. While investing in extensive school-based summer options may be infeasible, it may be cost-effective and strategic for districts to offer targeted out-of-school interventions to the students most at risk of backsliding. In designing such programs, policymakers should keep in mind the recommendations from the research described above:

  • Center the program around an evidence-based curriculum.
  • In addition to academic content, include hands-on or recreational activities to attract students.
  • Ensure that program structure enables sufficient time on task, and have policies or incentives that encourage consistent attendance.
  • Invest in hiring the most effective teachers.

Regardless of design, these programs should offer engaging options over the summer so that summer learning does not feel like punishment to students who would rather be enjoying vacation. Doing so would set more students up for success as the school year gets underway.


The authors did not receive any financial support from any firm or person for this article or from any firm or person with a financial or political interest in this article. They are currently not an officer, director, or board member of any organization with an interest in this article. 

Footnotes
    1. Cooper H., Nye B., Charlton K., Lindsay J., Greathouse S. (1996). The effects of summer vacation on achievement test scores: A narrative and meta-analytic review. Review of Educational Research, 66(3), 227–268. http://journals.sagepub.com/doi/10.3102/00346543066003227
    2. Ibid.
    3. Atteberry, A., & McEachin, A. (2016). School’s out: Summer learning loss across grade levels and school contexts in the United States today. In Alexander, K., Pitcock, S., & Boulay, M. (Eds.), Summer learning and summer learning loss (pp. 35-54). New York: Teachers College Press.
    4. Quinn, D.M., Cooc, N., McIntyre, J., & Gomez, C.J. (2016). Seasonal dynamics of academic achievement inequality by socioeconomic status and race/ethnicity: Updating and extending past research with new national data. Educational Researcher, 45(8), 443-453. http://journals.sagepub.com/doi/abs/10.3102/0013189X16677965?journalCode=edra
    5. Von Hippel, P.T., & Hamrock, C. (2016).  Do test score gaps grow before, during, or between the school years? Measurement artifacts and what we can know in spite of them. (Social Science Research Network working paper). Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2745527
    6. Entwisle, D. R., Alexander, K. L., & Olson, L. S. (2000). Summer learning and home environment. In Kahlenberg, R. D. (Ed.), A notion at risk: Preserving public education as an engine for social mobility (pp. 9–30). New York, NY: Century Foundation Press.
    7. Borman G. D., Benson J., Overman L. T. (2005). Families, schools, and summer learning. The Elementary School Journal, 106(2), 131–150. http://www.journals.uchicago.edu/doi/abs/10.1086/499195
    8. Cooper, H., Charlton, K., Valentine, J. C., & Muhlenbruck, L. (2000). Making the most of summer school: A meta-analytic and narrative review. Monographs of the Society for Research in Child Development, 65, i-127. https://www.jstor.org/stable/3181549
    9. Kim, J. S., & Quinn, D. M. (2013). The effects of summer reading on low-income children’s literacy achievement from kindergarten to grade 8: A meta-analysis of classroom and home interventions. Review of Educational Research, 83(3), 386–431. http://journals.sagepub.com/doi/10.3102/0034654313483906
    10. McLaughlin B., Pitcock S. (2009). Building quality in summer learning programs: Approaches and recommendations (White Paper Commissioned by the Wallace Foundation). Retrieved from: http://www.wallacefoundation.org/knowledge-center/documents/building-quality-in-summer-learning-programs.pdf
    11. Augustine, C.H., Sloan McCombs, J., Pane, J.F., Schwartz, H.L., Schweig, J., McEachin, A., & Siler-Evans, K. (2016). Learning from summer: Effects of voluntary summer learning programs on low-income urban youth. Santa Monica, CA: RAND Corporation. Retrieved from: https://www.rand.org/pubs/research_reports/RR1557.html
    12. Denton, D. R. (2002). Summer school: Unfulfilled promise. Atlanta, GA: Southern Regional Education Board. Retrieved from: http://files.eric.ed.gov/fulltext/ED467662.pdf
    13. McLaughlin & Pitcock (2009)
    14. e.g., Kim, J.S., Guryan, J., White, T.G., Quinn, D.M., Capotosto, L., & Kingston, H.C. (2016). Delayed effects of a low-cost and large-scale summer reading intervention on elementary school children’s reading comprehension. Journal of Research on Educational Effectiveness, 9 sup1, 1-22. http://www.tandfonline.com/doi/abs/10.1080/19345747.2016.1164780?journalCode=uree20
    15. Ibid.
    16. Guryan, J., Kim, J.S., & Quinn, D.M. (2014). Does reading during the summer build reading skills? Evidence from a randomized experiment in 463 classrooms. NBER Working Paper No. 20689. http://www.nber.org/papers/w20689
    17. Kraft, M.A., & Monti-Nussbaum, M. (in press). Can schools empower parents to prevent summer learning loss? A text messaging field experiment to promote literacy skills. The ANNALS of the American Academy of Political and Social Science. https://scholar.harvard.edu/files/mkraft/files/kraft_monti-nussbaum_2017_can_schools_empower_parents_to_prevent_summer_learning_loss_annals.pdf
    18. Polikoff, M.S. (2012). The redundancy of mathematics instruction in US elementary and middle schools. The Elementary School Journal, 113(2), 230-251. http://web-app.usc.edu/web/rossier/publications/66/The%20Redundancy%20of%20Math%20Instruction.pdf