
Latin American countries pay attention to global assessments of student learning

The results of the Programme for International Student Assessment (PISA) for 2015 were publicly released on December 6, 2016. Every three years since 2000, PISA has evaluated what 15-year-old students know and can do with that knowledge in reading, mathematics, and science. In 2015, more than 70 countries took part, 10 of them from Latin America and the Caribbean. Once again, their education systems ranked at the bottom. The region's highest-performing country, Chile, achieved only 44th place among the 72 participating countries, and the Dominican Republic came last. The other participating countries from the region (Brazil, Colombia, Costa Rica, Mexico, Peru, Trinidad and Tobago, and Uruguay) fell between these two extremes.

Figure 1: Ranking in science, PISA 2015

The region's dismal performance means that more than half of the students assessed did not reach the baseline level of proficiency that would put them on a path toward continued learning and meaningful contributions to society. And that does not include the many in the region who did not participate in the assessment at all, either because their country did not take part, because they left the education system before completing compulsory education, or because they are so far behind (still in primary school) that they do not meet the criteria to be assessed. In Mexico and Costa Rica, PISA covers less than 65 percent of the underlying student-age population (15-year-olds); in the Dominican Republic, Brazil, Uruguay, and Peru, less than 75 percent; in Colombia and Trinidad and Tobago, about 75 percent; and in Chile, 80 percent. No one knows how well (or poorly) those not assessed would have performed, but it is clear that they are poor, uneducated, and excluded.

Yet there is some good news. First, an increasing number of countries from the region are choosing to participate in PISA, knowing full well that the results are unlikely to be flattering. But, as for most people who run marathons, the goal is not to win but to learn something about themselves.

In PISA 2012, Peru placed 65th out of 65 participating countries (the Dominican Republic did not participate that year). Peru's poor performance covered the front pages of the Peruvian newspapers, which was actually a positive sign that opinion makers were paying attention. More importantly, it facilitated the work of a Ministry of Education bent on assessing student learning, calling attention to the need to spend more money and spend it more efficiently, and, above all, raising the profile of teachers by professionalizing teaching and linking selection and promotion to merit and performance. These reforms are difficult to introduce in the region, where teacher unions often resist them. But they are important for reducing political interference in education and creating an environment where performance is valued and rewarded. In part because of these reforms, Peru had one of the fastest rates of improvement in student learning outcomes, as measured by PISA, among all participating countries.

Figure 2: Participation of Latin American and Caribbean Countries in PISA

Figure 3: PISA participants by year: Latin American and Caribbean countries

This leads us to the second piece of good news: several Latin American education systems stand out as rapid improvers. Peru is among the 10 fastest-improving participants across all cycles in all three domains, and Colombia and Trinidad and Tobago join this select club in science. Across all the assessments, only Costa Rica showed a decline, and only in reading. Every other country has managed to improve in at least one domain while maintaining its performance in the others (except Uruguay, whose performance was unchanged across the board). All of this amid important increases in the populations covered by the assessment, whether because of population growth or, more importantly, because of the region's efforts to expand access to schooling and increase retention.

Third, some countries that have participated in several rounds of PISA, and have thus benefited from years of capacity building in conducting student assessments and interpreting the results, are starting to use those results in new and sophisticated ways, and a few are even questioning some of the OECD's methodologies and reports. Slowly but surely, many countries are establishing a culture of formative assessment and evaluation that will serve them well for years to come. They are making their voices heard at home and abroad and making more rigorous and nuanced use of these data and evidence.

Increased participation in international assessments has gone hand in hand with the development or strengthening of evaluation agencies in the region. Institutions like the Agencia de Calidad in Chile, ICFES in Colombia, INEP in Brazil, and INEE in Mexico have become bigger, stronger, more autonomous, and more involved in producing data and evidence for policymaking. These agencies are poised to play a key role in educational improvement in their countries. They can help ground the debate over what to do in evidence and facts rather than in politics and ideology.

Finally, as in the case of Peru, governments are using PISA results to introduce and strengthen reforms aligned with the policies of high-performing systems. While the pace of these reforms needs to accelerate, it is encouraging to see many governments in Latin America committed to improving educational opportunity for all children and youth, and willing, in many cases, to pay the political costs required to confront the vested interests that have kept education quality stagnant.
