The field of education research using statewide longitudinal administrative data on programs and outcomes from preschool through K-12, postsecondary education, and into the workforce (known as P-20W data) is relatively new. It has developed without the requisite training in the underlying data generation processes, yet knowledge of those processes is critical when conducting empirical research. In the social sciences, by contrast, researchers are commonly trained on research-ready survey data, and statistical methods courses cover the role of survey design in empirical studies. When teaching new researchers to account for the strata, clustering, and population weights of National Center for Education Statistics surveys, I run regressions showing hours of homework having a statistically significant effect on test scores, an effect that is either positive or negative depending on the survey design adjustments.
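The sign flip described above can be reproduced in a toy sketch. Everything here is invented for illustration, not drawn from an NCES survey: two strata are sampled at different rates, the underlying homework effect differs across them, and an unweighted regression reflects the sample while a population-weighted regression reflects the population.

```python
import numpy as np

def wls_slope(x, y, w):
    """Weighted least-squares slope of y on x with observation weights w."""
    xbar = np.average(x, weights=w)
    ybar = np.average(y, weights=w)
    return np.sum(w * (x - xbar) * (y - ybar)) / np.sum(w * (x - xbar) ** 2)

rng = np.random.default_rng(0)

# Stratum A: undersampled, large population weight; homework helps here.
x_a = rng.uniform(0, 1, 100)
y_a = 2.0 * x_a + rng.normal(0, 1, 100)
w_a = np.full(100, 30.0)  # each sampled student stands in for 30 in the population

# Stratum B: oversampled, small weight; homework is associated with lower scores
# (e.g., remedial assignment), so the sample is dominated by a negative pattern.
x_b = rng.uniform(0, 1, 300)
y_b = -2.0 * x_b + rng.normal(0, 1, 300)
w_b = np.full(300, 1.0)

x = np.concatenate([x_a, x_b])
y = np.concatenate([y_a, y_b])
w = np.concatenate([w_a, w_b])

naive = wls_slope(x, y, np.ones_like(x))  # ignores the survey design
weighted = wls_slope(x, y, w)             # applies the population weights

print(f"unweighted slope: {naive:.2f}")   # negative: stratum B dominates the sample
print(f"weighted slope:   {weighted:.2f}")  # positive: stratum A dominates the population
```

The same estimator produces opposite-signed "effects" depending only on whether the design weights are applied, which is the point of the classroom exercise. (A full survey analysis would also adjust standard errors for strata and clustering; this sketch shows only the point-estimate reversal.)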
The education administrative data generation process is not what anyone would call a “design.” It is instead the cumulative result of occasionally arcane federal reporting requirements, state data collection and validation rules that vary over time, and even differences in district IT systems. Yet a thorough understanding of how administrative data is generated is equally essential to researchers. However, the legal basis for research with (de-identified) administrative data is that the benefits accrue directly to the students represented, and this allowance would not extend to using the data as a training tool. Moreover, the lessons would need to be nearly as varied as the data systems themselves. It is partly because of these legal restrictions and local idiosyncrasies that researchers now have access to an even better learning opportunity: the opportunity to work more closely with local data providers and practitioners.
Increasing researcher-practitioner partnerships
Administrative data’s research potential, coupled with its security requirements and opacity, has resulted in an increasing number of researcher-practitioner partnerships. These range from consultation on research needs and data decisions to collaboration on national research grant opportunities. The best of these arrangements include experts in each of the following areas: sophisticated research methods, the data generation process, and the local education and labor contexts being studied. The most sophisticated statistical methodology can produce critically wrong results with extremely high certainty if the theoretical modeling of the data or context is naive. The objective of statistical models is to isolate unbiased and efficient relationships by controlling for relevant observables, conditional on model assumptions about unobservables; extending what a researcher observes to include the direct knowledge of others can only improve the accuracy and relevance of the results.
Recent posts on the Chalkboard have highlighted individual state education agency research programs that draw on this breadth of expertise, as well as mechanisms for researchers and practitioners to share best practices nationally. These opportunities are also expanding through state cross-agency P-20W data and research centers, which facilitate researcher collaboration with multiple agencies and stakeholder groups on a broad range of topics. Like the data systems themselves, the design and activities of these centers vary from state to state. Information about the centers, such as the data request process, metadata, periodic reports, and links to research papers, can be found online for a number of states, including Arkansas, Connecticut, Hawaii, Ohio, Rhode Island, Texas, Utah, Virginia, and Washington. Many of these sites also feature data visualizations, and some include secure, interactive data analysis tools. By working with these centers, researchers are able to get actionable research directly into the hands of those who will use it. Many of us are also finding this combined knowledge to be essential to improving the applicability of our research in the first place, and, if we are being very honest with ourselves, the validity as well.
Piloting the Research Hub in Rhode Island
This year our research team at American Institutes for Research, in partnership with Research & Analytic Insights, has been working with Rhode Island to build out the new Research Hub, a researcher interface to the state’s cross-sector DataHUB. When complete, the Research Hub will include the necessary protocols and resources for making informed data requests, using the data responsibly, and disseminating findings to relevant audiences. In piloting the Research Hub, we worked with state agencies, legislators, and educators in Rhode Island to propose a research study based on the state’s policy research agenda and to provide the results directly to those same stakeholders, including the K-12 and postsecondary commissioners and the National Principal of the Year. We were also able to solicit and incorporate their direct feedback. This process led us to craft the study’s results as individualized high school reports in addition to the technical write-up of the study itself. Rhode Island is now funding a series of mini research grants to support researchers’ use of the Research Hub to address other questions on the state’s research agenda and to provide those findings back as well. This is just one of many opportunities for education researchers and practitioners to put their heads together for better results.
The Brown Center Chalkboard launched in January 2013 as a weekly series of new analyses of policy, research, and practice relevant to U.S. education.
In July 2015, the Chalkboard was re-launched as a Brookings blog in order to offer more frequent, timely, and diverse content. Contributors to both the original paper series and current blog are committed to bringing evidence to bear on the debates around education policy in America.