Commentary
AI infrastructure’s environmental costs clash with Pacific Island nations’ needs
March 7, 2025

Glossary
Constitutional AI: Technical frameworks attempting to encode behavioral constraints into AI systems, despite lacking clear definitions of the behaviors being constrained.
Country: In Australian Aboriginal knowledge systems, Country represents an interconnected living system requiring reciprocal care and responsibility. The concept emphasizes that “Indigenous knowledge lives in country, and in doing things together in country—not in computers.”
Digital Terra Nullius: An extension of the Māori Data Sovereignty principles articulated by Te Mana Raraunga (the New Zealand Māori Data Sovereignty Network), describing the treatment of digital and computational spaces as unclaimed territory, in disregard of existing Indigenous frameworks of rangatiratanga (authority) and kaitiakitanga (guardianship) over data.
Kaitiakitanga: A Māori guardianship framework that includes both environmental and digital domains, centered on collective benefit and responsibility in managing resources and knowledge.
Standing on the shores of Country, watching waves inch closer to ancestral lands, I witness a stark reality: Across the Pacific region, 16 nations face an existential collision between rising seas and Silicon Valley’s computational colonialism. From Tuvalu’s 11,400 people to Papua New Guinea’s 10.4 million, our communities confront unequal access to digital resources, with internet penetration ranging from 26.97% to 85.22%. While Tuvalu implements its “Future Now Project” to preserve culture and governance against territorial loss, the computational infrastructure of major tech companies consumes ever-increasing resources without accountability or recognition of Indigenous knowledge systems.
This technological colonialism has quantifiable costs. According to Australia’s Department of Industry, Science and Resources, “data centres currently represent 1-1.5% of electricity use globally,” with individual facilities consuming energy “equivalent to heating 50,000 homes for a year.” In Australia alone, data center energy consumption is projected to grow from 5% to potentially 15% of national energy use by 2030.
The water demands of this computational infrastructure are equally concerning. In locations where artificial intelligence (AI) facilities operate, they can consume up to 6% of district water supplies during peak periods. The colonial pattern of resource extraction is repeating itself: Pacific Island nations face critical water security challenges driven by climate change, even as AI companies construct facilities that require millions of liters of water to cool their systems.
Within this context, the AI safety movement’s claims require deep scrutiny. As Lazar and Nelson note in Science, “Big Tech, weary from bad publicity, is seizing the chance to be viewed as saviours from algorithmic harms, not perpetrators of them.” This “safety-washing” occurs while Indigenous data sovereignty frameworks are systematically excluded from global AI governance discussions.
The computational costs of AI development are not theoretical. According to Australia’s Chief Scientist, Dr. Catherine Foley, training a single model like GPT-3 requires “about 1½ thousand megawatt hours…the equivalent of watching about 1½ million hours of Netflix.” This consumption occurs while the full environmental costs remain hidden. As documented in Dr. Kate Crawford’s Senate submission, “the ‘full planetary costs of generative AI’ [are] ‘closely guarded corporate secrets.’”
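To unpack that comparison with a quick back-of-the-envelope conversion (assuming roughly one kilowatt-hour per hour of streaming, which is what the analogy implies rather than an audited figure):

$$
1{,}500\ \text{MWh} = 1{,}500{,}000\ \text{kWh}, \qquad \frac{1{,}500{,}000\ \text{kWh}}{1{,}500{,}000\ \text{hours}} \approx 1\ \text{kWh per streaming hour}
$$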
Indigenous knowledge systems offer alternative frameworks for technology governance, demonstrating how Indigenous-led initiatives can prioritize environmental and cultural sustainability in technological development. Such frameworks also recognize what Aboriginal knowledge holders have long understood: sustainable technological systems must be grounded in Country and community, not abstract notions of control.
The definitional vacuum at the heart of AI safety approaches becomes most evident in systems security. Constitutional AI attempts to encode behavioral constraints into systems without clear definitions of the behaviors being constrained. Through dadirri—deep, recursive listening—we observe the fundamental absurdity: How can you align or control something whose basic nature you cannot specify?
The tech industry’s response to these concerns often involves promises of efficiency gains. However, as the Senate Committee documents, “increased energy efficiency of AI may not lead to reductions in AI’s total energy use, where there is increased demand overall for AI services.” Investment bank Goldman Sachs confirms that, despite efficiency improvements, “the widening use of AI will still imply an increase in the technology’s consumption of power.”
This is particularly evident in how environmental impact data is controlled—as the Senate Committee notes, “[t]he publicly available information in relation to the energy consumption, operating profiles and expansion plans of data centres are ‘somewhat opaque’ due to ‘commercial sensitivity.’” Meanwhile, Indigenous communities face immediate consequences. The Pacific Digital Economy Programme documents how Island nations must balance essential digital infrastructure needs against environmental sustainability, while the resource consumption of many technology companies continues unchecked.
Indigenous frameworks offer concrete alternatives. Te Mana Raraunga demonstrates how data governance can prioritize “collective benefit” and “reciprocity” while maintaining technological advancement. Digital systems can support cultural preservation while keeping data and resources under Indigenous control. The urgent need for reform becomes clearer when examining future projections: the world’s data centers are projected to consume more energy than India, the world’s most populous nation, by 2030, driven largely by the rapid expansion of AI infrastructure. Yet Indigenous communities have shown how technological systems can operate at scale while maintaining environmental and cultural sustainability.
Consider how our ancestors in Oceania maintained sophisticated technological networks. Pacific navigators developed complex distributed information systems based on observable phenomena and measurable results. Aboriginal songlines demonstrated how distributed knowledge systems could operate across vast distances while preserving Country. These weren’t just culturally sophisticated practices—they were technically superior approaches grounded in concrete reality rather than shifting definitions.
When Māori speak of kaitiakitanga, we’re not offering abstract principles but demonstrating sophisticated frameworks for managing concrete, definable systems. The practice of rāhui, a temporary restriction placed on a resource so that it can recover, offers a more sophisticated approach to technological management than elaborate rituals for controlling undefined entities. When you’ve maintained sustainable distributed systems for 65,000 years, you don’t need vague concepts to understand system stability.
This understanding is urgently needed as Pacific nations face displacement. Tuvalu demonstrates how Indigenous communities must now preserve nationhood, governance, and culture through digital transformation while facing territorial loss. Meanwhile, the tech industry’s resource consumption accelerates the very environmental crisis threatening these nations.
The path forward requires fundamental change. The Senate Committee recommends “that the Australian Government take a coordinated, holistic approach to managing the growth of AI infrastructure in Australia to ensure that growth is sustainable, delivers value for Australians and is in the national interest.” However, this must expand to include Indigenous governance frameworks.
First, Indigenous data sovereignty must be integrated into AI governance. The Te Mana Raraunga framework demonstrates how rangatiratanga (authority) and kaitiakitanga (guardianship) can guide technological development. As documented in the Pacific case studies, successful digital transformation requires “respect [for] the region’s shared values.”
Second, environmental impact transparency must be mandated. As Dr. Crawford argues in her Senate submission, we need “measuring and public reporting of energy and water use by the AI industry as well as regular environmental audits by independent bodies to support transparency and adherence to standards.”
Third, resource management must be reformed. The Pacific Digital Economy Programme shows how digital infrastructure can be developed while respecting environmental limits. This requires what the Senate Committee calls “a coordinated and holistic Government approach to ensuring the growth of this sector is sustainable.”
There is a saying: Whatungarongaro te tangata, toitū te whenua. It means: As man disappears from sight, the land remains.
The ancestors understood what silicon theology obscures—that technological systems must serve Country and community, not abstract notions of control. Our survival depends not on aligning undefined artificial intelligences, but on realigning our relationship with Country and the resources that sustain us. The mathematics is clear—not in theoretical proofs of controlling undefined systems, but in the thermodynamics of resource consumption and environmental impact. The rising waters confirm daily the bankruptcy of technological colonialism, whether digital or physical.
The time for failed experiments in digital terra nullius is over. The future of technology lies not in the worship of undefined superintelligence, but in the sophisticated frameworks for managing measurable technological engagement that Pacific peoples have understood all along. The only question remaining is how much more damage the Silicon priesthood will do before accepting what Indigenous technologists have been saying all along: In a distributed world, true safety comes not from technological prayers to undefined gods, but from balance, relationship, and deep understanding of actual system dynamics that require no faith in imaginary futures to maintain.