Brown Center Chalkboard

What to look for in the 2017 NAEP results

Scores from the 2017 National Assessment of Educational Progress (NAEP) will be released in the coming weeks. NAEP is often referred to as “the nation’s report card,” providing a reliably accurate estimate of national academic achievement every two years. Fourth and eighth graders took NAEP assessments in reading, mathematics, and writing in early 2017. Scores will be reported for the nation and the 50 states, along with 24 urban districts.[1]

New NAEP scores generate a lot of discussion. Here’s a preview of four topics that will probably receive the most attention.

Trend in scores: Will stagnation continue?

Table 1 displays NAEP scale score changes in reading and math from 2009 to 2015 and immediately preceding periods. Start years of the previous periods are based on when testing accommodations were first permitted on NAEP—1998 in reading and 2000 in math.[2] For both subjects and grade levels, scores have been flat since 2009, not deviating by more than a single scale score point. Reading scores have a history of staying flat. But math scores increased substantially before 2009. Since then, they have barely budged. Will 2017 scores continue the recent period of stagnation?

Table 1. Change in NAEP Scores, reading and mathematics (in scale score points)

                      1998-2009 reading;      2009-2015 reading
                      2000-2009 mathematics   and mathematics
4th-grade reading     +1                      +1
4th-grade math        +14                      0
8th-grade reading     +1                      +1
8th-grade math        +10                     -1

Digital difference?

The latest NAEP score release was originally scheduled for October 2017, the traditional date that NAEP scores are published. NAEP officials postponed the release, citing the need to complete analyses of the impact of NAEP’s transition from pencil and paper to digital platforms. To calculate an accurate trend in national achievement, test scores must be comparable over time. Analysts are concerned about what’s known as “mode effects”: the possibility that students score differently because of the mode of an assessment rather than because of true differences in math or reading ability. The results of a 2016 international assessment, the Program for International Student Assessment (PISA), were questioned by some analysts on these grounds—despite field study results indicating that mode effects of changing from pencil and paper to a digital platform would be minimal. NAEP officials recognize that the legitimacy of scores could be questioned if the transition to digital tests is not handled adroitly.

NAEP and Common Core

New NAEP scores invite explanations as to why scores are going up or down. These explanations are typically speculative and often serve to bolster political advocacy. Critics of Common Core, for example, have had a field day attributing the stagnancy of NAEP scores to the standards. This conclusion is rarely accompanied by serious analysis.

NAEP is better equipped to describe the current status of national achievement than to pinpoint how and why it got that way. My most recent analysis of longitudinal changes in NAEP data—exploiting state variation in implementation of Common Core, allowing each state’s progress to be calculated off its own baseline score, and controlling for demographic changes—generates provisional estimates of Common Core’s effects, but even these findings should be treated as hypotheses for further study. Yes, it’s true: The NAEP scores of states that have enthusiastically embraced Common Core have been dead in the water since 2009. But so have the scores of non-Common Core states.


NAEP and the Common Core assessments interact in other ways. The Obama administration awarded two state consortia $350 million to develop assessments compatible with Common Core: the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium (SBAC). Just as both consortia began rolling out the assessments, political controversy engulfed Common Core. The number of states declaring an intention to use the tests fell from 45 in 2011 to 21 (counting D.C.) in 2017.

Analysts will be interested in comparing states’ NAEP scores with scores on PARCC and SBAC.  After 2015 NAEP scores showed a decline in math, Michael Cohen of Achieve Inc. complained of a “mismatch” between the NAEP math assessment and states’ math standards. An AIR study found “reasonable concordance” between NAEP and the standards. Nevertheless, Common Core supporters urged changes in NAEP to bring it into better alignment with Common Core.

An additional concern emerged in 2017: the two Common Core tests stopped agreeing with each other. The spring 2017 results from PARCC and SBAC are strangely out of sync considering that the two assessments purportedly measure student progress on the same standards. Ed Haertel of Stanford University raised concerns about all 14 SBAC states reporting a decline in 2016-2017 English Language Arts scores. Doug McRae produced state-by-state comparisons showing PARCC states posting improvement while almost all SBAC states declined. NAEP data may help resolve the contradiction.

Middle-school advanced math

In the 1990s, enrollment in algebra was characterized as “The New Civil Right” by Robert Moses. The Clinton administration declared completion of an algebra course by the end of eighth grade a national goal. The number of students taking advanced math in middle school skyrocketed. In 1990, only about one in six eighth-graders was enrolled in algebra, with most students (61 percent) in a general, eighth-grade math course. By 2011, 47 percent of eighth-graders were in advanced math (algebra or higher) and only 25 percent in eighth-grade math.[3]

A two-decade shift toward advanced math in eighth grade may be coming to an end—and perhaps even reversing. In 2015, enrollment in advanced math fell to 43 percent while eighth-grade math enrollment grew to 32 percent. What happened? The Common Core math standards delineate a single eighth-grade math course, and although that course includes some algebra, a formal Algebra I course is reserved for ninth-graders. An appendix to the standards offers ways that schools can accelerate high-achieving math students by compacting three years of curriculum into two, but it might be easier for middle schools simply not to offer advanced math courses. The 2017 NAEP data will let us know whether this trend is continuing.

As these four topics illustrate, NAEP data are important not only for measuring national achievement and mapping current trends, but also for gathering reliable data on practices and policies. A unique aspect of the 2017 data is that NAEP itself may receive scrutiny. Analysis and debate about shifting to an all-digital platform—along with questions about NAEP’s alignment with Common Core standards—have barely begun, and are sure to continue over the next several cycles of NAEP assessments.


  1. Writing scores may be reported at a different time than reading and math.
  2. Accommodations are provided to students in special education and to students with limited English proficiency.
  3. See the 2008 Brown Center Report, “The Misplaced Math Student,” pp. 19-31. NAEP data were retrieved with NAEP Data Explorer.

The Brown Center Chalkboard launched in January 2013 as a weekly series of new analyses of policy, research, and practice relevant to U.S. education.

In July 2015, the Chalkboard was re-launched as a Brookings blog in order to offer more frequent, timely, and diverse content. Contributors to both the original paper series and current blog are committed to bringing evidence to bear on the debates around education policy in America.
