Exam season is upon us, and artificial intelligence (AI) companies are seeing a new market opportunity: students. OpenAI offered two free months of ChatGPT Plus to college students who sign up by May 31st. xAI followed suit, offering the same for SuperGrok to anyone with a .edu email. Not to be outdone, Google entered the fray, rolling out its Gemini chatbot to kids under 13 and offering U.S. college students a full year of Gemini Advanced free, alongside other significant goodies like a whopping 2 terabytes of storage on Google One.
But as an education expert, I can tell you that allowing companies to give young people unfettered access to AI tools will most likely end badly. We did that with social media, and look how that went. We cannot wait until AI is part of students’ everyday lives to create norms that will lead to healthy and productive use of this technology.
Why the sudden generosity? Young people are major users of generative AI products, with college-age adults (18-24) making up the largest group of weekly active ChatGPT users. Large technology companies see education as a profitable sector. But Gen Z actually has mixed feelings about AI. A Gallup and Walton Family Foundation survey of 3,500 13- to 28-year-olds found that 41% say AI tools make them anxious, though a majority of students look to their schools for guidance on how to use AI.
The Administration is also weighing in. President Trump recently signed an executive order focused on bringing AI education to K-12 schools and workforce training programs through federal action. It seems a strange move for an administration focused on eliminating the U.S. Department of Education, including the Office of Education Technology, and returning education policy to the states.
I question the motivation behind the EO (helping an industry that has helped Trump?), but it actually lays out a useful path forward. Rather than emphasizing rapid uptake of AI tools by students, it puts forward a broad vision that has the potential to help students and educators in important ways.
The EO calls for supporting comprehensive AI literacy, appropriate AI integration into education, teacher training, and workforce readiness. It mandates that an inter-departmental task force develop a Presidential Artificial Intelligence Challenge, which would aim to support AI literacy and skill development in and out of school. The education community, particularly through the Teach AI coalition, has been advocating for this vision for some time.
These are all good things…if done well.
Five principles should guide this work. If they are incorporated into the design of the Presidential Artificial Intelligence Challenge, it could supercharge the abilities of our nation’s youth.
- Get the corporate incentives right: Education, not just profit. Any company engaging with students and schools should do so as a Benefit Corporation, a structure OpenAI has just embraced, or use a similar structure that legally allows them to reduce profit in favor of social good. This alone will not create safe AI products optimized for children, but it is a necessary condition.
- Require product safety. AI products, just like car seats and toys, shouldn’t be in kids’ hands unless they are safe. We have already seen the tragic impact commercial AI products can have on children, including emotional manipulation that has driven young people to suicide. Developing products so they meet a “duty of care,” such as keeping students’ personal data private and protecting children from harmful content, is one way to do this.
- Define AI literacy to include ethical technology use. The EO is silent on what AI literacy means, but an emerging consensus from a broad base of educationalists defines it as including not just the usage of AI tools, but thinking critically and creatively about how AI works, what role it should play in our lives, and how to deploy it ethically.
- Put teachers at the center. Teachers should be deeply engaged in designing AI for education. Teacher training should include a bottom-up approach. If we empower interested educators to experiment with safe products, they will come up with incredible ways to use AI in education.
- Maximize real student engagement. Our nation’s student disengagement crisis could be made worse, not better, with AI. Tech companies measure engagement as time spent using their products, but that is not real student engagement. When learners are truly engaged, they are motivated: they explore ideas, generate solutions, and connect with others both online and off.
Large technology companies will push back on these principles. They will complain it is too difficult and not feasible. But if large technology companies can harness the power of psychological research to make addictive technology, there is no reason they can’t harness the science of learning and child development to make safe and supportive AI products that help children explore and grow.
Ultimately, it is in tech companies’ best interests to embrace these principles. We should never underestimate the power of pissed-off parents, especially when it comes to their children’s education and wellbeing. If done right, AI education is not at odds with children spending less time on screens and more time outside. If done poorly, though, it will face a tidal wave of grassroots anger. Families are already pushing back on harmful technology for kids through lawsuits, bans on phones in schools, and limits on social media for children. They will not take kindly to the provision in the federal budget bill just passed by the House of Representatives that would bar states and local governments from regulating AI for the next 10 years. Nor will they stand idly by if unsafe AI products are pushed onto their kids.
Now is the time to influence how our children engage with AI. If we get this right, everybody wins, including tech companies who retain customers over the long term. This is the time to do the right thing.
The Brookings Institution is committed to quality, independence, and impact.
We are supported by a diverse array of funders. In line with our values and policies, each Brookings publication represents the sole views of its author(s).
Commentary
Generative AI is coming for our students, and now is the moment to shape it
May 23, 2025