Designing for Humanity as AI Saturates Education Systems

Article
April 16, 2026

By: Kate Westrich

KEY TAKEAWAYS:

  • As AI becomes a constant presence in learning, education systems face a higher-stakes question about how to protect relationships, agency and what makes learning human
  • Drawing on KnowledgeWorks’ forecast on the future of learning and a new report from the Brookings Institution, we explore how unchecked efficiency and automation can quietly undermine trust, development and student autonomy
  • AI can support learning only when leaders prioritize human judgment, adaptability and care over speed or convenience

Artificial intelligence is already shaping students’ daily learning experiences. It’s helping teachers plan lessons, supporting tutoring and personalization, and increasingly showing up outside of school as a source of answers, reassurance and advice.

The question facing education systems is no longer whether AI is coming. It’s whether we are designing learning environments that still protect and nurture what makes learning human. As KnowledgeWorks says in Charting a New Course for Education, this moment calls on leaders to “prioritize learners’ humanity, joy, academic and human development and ability to determine their own paths.”

That concern runs through both our future forecast and The Brookings Institution’s A New Direction for Students in an AI World: Prosper, Prepare, Protect. While the two publications have different focuses, they share this insight: learning is not just about speed or efficiency. It is social, relational and deeply developmental. The authors of the Brookings report observe that “children’s learning is fueled by social relationships and their holistic development.” Cognition, motivation and emotional well‑being grow together. When systems weaken one dimension, learning suffers overall.

AI complicates this reality because it does more than make learning experiences faster or more personalized or education systems more efficient. It can quietly reshape how young people experience effort, problem‑solving and agency. A New Direction for Students in an AI World warns that overuse can lead to “AI dependence [that] can erode students’ autonomy and agency,” especially when tools begin doing the thinking instead of supporting it.

KnowledgeWorks’ forecast similarly explores the potential for cognitive atrophy to become widespread, leading to AI dependence and attachment disorders. It also surfaces a parallel risk: a growing relevance gap between schooling and students’ lived experiences. When learning feels disconnected from their pressing concerns and aspirations, young people turn elsewhere for meaning and support, and some are increasingly turning to AI to close that gap.

AI, Education and the Futures We Choose

Watch a webinar featuring Rebecca Winthrop, one of the authors of A New Direction for Students in an AI World, and Katherine Prince, one of the authors of Charting a New Course for Education, moderated by KnowledgeWorks Vice President of Policy and Strategic Advancement Lillian Pace.

Designing for humanity does not mean rejecting AI or wishing it away. Both publications acknowledge the potential for real benefits when AI is used intentionally. Brookings notes that AI can “optimize teacher time for greater focus on students” when it reduces administrative burden rather than replacing instruction. In Charting a New Course for Education, we argue that meaningful integration of AI into education is possible, but only if system leaders are willing to let go of stability in favor of adaptability. As our forecast observes, approaches that rely on fixed answers and rigid best practices struggle in periods of disruption.

The risks of incorporating AI into education increase when efficiency becomes the goal rather than a byproduct. Faster grading, instant summaries and automated decisions can look like progress while quietly hollowing out learning. In addition to undermining the relational focus of education, they keep the focus on optimizing current educational approaches and structures instead of pursuing deeper systems transformation.

Trust is often one of the first casualties when the trade‑offs between technology and humanity go unmanaged. Brookings describes how AI can “degrade trust in education,” weakening relationships between students and teachers. In Charting a New Course for Education, we situate that erosion of trust within a broader decline in confidence in public institutions, one of the drivers of change shaping education over the next decade. Rebuilding trust requires more than adopting acceptable AI use policies or rolling out new AI‑powered tools. It requires visible commitments to human judgment, transparency and care, especially in decisions that affect young people’s growth and development.

Designing for humanity means making deliberate choices about what education systems protect and nurture. In an AI‑saturated world, those choices are becoming increasingly essential.

THE AUTHOR

Kate Westrich
Vice President of Marketing and Communications

