
Who’s ready to change? Tracking adaptations during scaling in education

Editor's note:

This blog is the fourth in a series of education scaling-related resources.

For the past seven years, the Center for Universal Education (CUE) has researched education initiatives that are scaling around the world and has found that they share at least one thing in common: Each has proactively and reactively adapted to changing circumstances and contexts. Whether expanding to new communities, becoming embedded into national systems, or responding to the global pandemic, each of these initiatives has needed to alter, tweak, or in some cases overhaul its design, delivery, or financing approach.

So, the question is not whether our environments will change, but how we adapt. When it comes to education, we've found that adaptations are too often neither systematically planned for nor well documented, and the opportunity to learn from these modifications is lost. This occurs for many understandable reasons, including that those involved in designing and delivering large-scale education programs often do not have the luxury of space and time to pause, reflect, and course-correct based on new data and changes in the broader environment.

Adaptation tracking tool

In response to this reality, CUE has just published an Adaptation Tracker designed to help education practitioners regularly plan for, document, and learn from adaptations in order to strengthen efforts to scale and sustain an initiative.

The tool is based on the Plan-Do-Study-Act template used in improvement science and directly informed by the experiences of—and input from—Real-time Scaling Lab partners. The tool is intended to be used at various intervals throughout any scaling process, with timely data collected and analyzed to inform quick learning and decisionmaking. It involves four simple steps repeated over time:

  1. Identify the overarching scaling goal of the initiative and the key scaling driver, or factor contributing to this goal, that will be the focus of any change.
  2. Plan what adaptation will be tested to respond to a challenge or opportunity related to this scaling driver and how it will be executed and measured.
  3. Test the adaptation in a short learning cycle—capturing any problems that arise, spontaneous changes made, and early results.
  4. Reflect on the results of the adaptation, including what worked, did not work, and any lessons learned. Based on this learning, determine what changes to make to the model or strategy and what further adaptations to try—continuing the iterative learning cycle.

What does this look like in practice?

In the Philippines, the Department of Education (DepEd) has prioritized the effective delivery of teacher professional development (TPD) programs in an effort to improve the quality of education across the country. One flagship program is a blended teacher professional development course, Early Language, Literacy, and Numeracy (ELLN) Digital, rolled out to all K-3 teachers in the country beginning in 2019. The ELLN Digital course combines guided independent study of multimedia courseware by the teachers with collaborative learning through school-based teachers' groups. Given the magnitude of the TPD needs within the system, DepEd faced a major challenge in delivering in-service training to approximately 300,000 K-3 teachers: how to maintain the quality of training as ELLN Digital goes to scale while ensuring the approach is well tailored to the country's diverse contexts.

In response to this challenge, DepEd partnered with the NGO Foundation for Information Technology Education and Development, Inc. (FIT-ED) to incorporate "Plan-Do-Study-Act" improvement cycles in each school and division. These improvement cycles enable quick feedback loops to inform ongoing adaptation and course correction of ELLN Digital implementation at the school level, and data will be aggregated across schools, divisions, and regions to inform future rollout to more teachers and more schools. This systemwide iterative learning process has been possible because of the space, mandate, and resources provided by the government at the central level.

Over the past three years, a number of lessons have emerged from embedding an iterative adaptive learning cycle into the rollout of a national TPD program. These include:

  • The need to understand readiness for scaling prior to rolling out any new initiative.
  • The critical importance of the enabling environment, including the political, cultural, economic, technological, and institutional conditions of the local context.
  • The centrality of fostering agency among implementers—in this case teachers, school leaders, and coaches—by capacitating them for problem-solving and decisionmaking, and by creating spaces for experimentation and collaboration.
  • The challenge of building a critical mass of expertise at local levels through, among other mechanisms, professional learning communities in which teachers and instructional leaders collaborate with community members and other education stakeholders.

With so many unknowns in the world, one thing is certain: Our environments are dynamic and constantly evolving. Sustainable scaling must take these realities into account and be prepared to respond and adapt. This requires fostering and strengthening adaptive capacity, and the use of data for learning, among the different stakeholders involved in scaling. This tool, along with a suite of complementary scaling resources, is intended to support these important efforts.

We welcome any thoughts, suggestions, or questions related to this tool. To share your experience or offer feedback for future editions, please email cue@brookings.edu.

Note: This work was carried out with the aid of a grant from the International Development Research Centre, Canada to the Foundation for Information Technology Education and Development (FIT-ED). The views expressed in this work are those of the authors and do not necessarily represent those of the International Development Research Centre, Canada or its Board of Governors; or the Foundation for Information Technology Education and Development.
