AI is Changing Everything: How Progressive Schools Are Preparing Students for an Uncertain Future

Jan 20, 2026 | Blog

AI is already grading essays and tutoring students.

So what should schools actually be teaching in 2026? Here’s how progressive education prepares students for a future we can’t predict.

As AI automates routine cognitive tasks, education must pivot from teaching the skills AI can already perform to developing distinctly human capabilities that remain essential regardless of technological change: critical thinking, creativity, adaptability, and ethical reasoning.

The eighth grader sitting across from you can ask an AI to write their essay. Grade their math homework. Explain historical concepts. Generate code. Analyze data. Solve complex problems.

All in seconds.

So what’s the point of school?

It’s January 2026. AI isn’t coming to education. It’s here. Already embedded in how students work, how teachers teach, how learning happens. The question isn’t whether AI will change education. The question is: what kind of education remains valuable when AI can do so much of what we traditionally taught?

This isn’t hypothetical. It’s urgent. And the answer isn’t to ban AI or pretend it doesn’t exist. The answer is to fundamentally rethink what education is for.

The 2026 education landscape: what’s really changing

Five years ago, AI in education was mostly theoretical. Pilot programs. Research studies. Experimental platforms. Something people discussed as a future possibility.

Now? It’s infrastructure. AI already handles scheduling, admissions, student support, compliance, reporting. These behind-the-scenes applications feel safe, practical, uncontroversial. Schools adopt them because they work.

But the more significant shift is happening in classrooms and at kitchen tables. Students use AI for homework. Not occasionally. Routinely. They generate essays, solve problems, get explanations, receive feedback. Some teachers embrace this. Some resist it. Most feel caught between two realities: the AI that exists and the educational system designed for a pre-AI world.

Recent research confirms something educators already know instinctively: when students recognize they’re being asked to develop skills that seem irrelevant, they disengage. And AI makes irrelevance impossible to ignore. Why practice procedural tasks AI handles instantly? Why memorize information AI can retrieve perfectly? Why spend hours on assignments AI completes in seconds?

This creates what researchers call “rational disengagement.” Students aren’t lazy. They’re logical. They correctly identify that certain traditional academic tasks provide minimal future value, leading them to seek efficient completion rather than meaningful learning.

The problem isn’t that students use AI. The problem is that much of conventional education was already teaching things AI does well: recall, procedural execution, pattern recognition, routine problem-solving. Those skills mattered when humans were the only ones who could perform them. Now they don’t.

So what remains? What can’t AI do? What will always require human capabilities?

Skills AI can’t replace: why critical thinking matters more than ever

AI is remarkable at many things. It processes vast amounts of information instantly. It recognizes patterns. It follows rules precisely. It generates content quickly. It automates routine cognitive tasks.

But AI fundamentally lacks certain human capacities. It doesn’t truly understand context. It can’t navigate genuine ambiguity. It doesn’t possess judgment. It can’t create original ideas beyond recombination of existing patterns. It has no values. No wisdom. No ethical framework. No lived experience.

Most crucially: AI can’t ask the right questions.

It answers what you ask. But it can’t determine what you should be asking. It can’t recognize when a problem is framed incorrectly. It can’t notice what’s missing. It can’t challenge assumptions. It can’t say “wait, we’re approaching this wrong.”

These distinctly human capabilities, what education experts call “21st century competencies,” matter more in 2026 than ever before. Critical thinking. Creative problem-solving. Complex reasoning. Ethical judgment. Adaptability. Communication. Collaboration. The ability to work with ambiguity and uncertainty.

Research on AI in education consistently identifies these as the skills that remain uniquely human and increasingly valuable. Studies emphasize that as AI handles routine cognitive work, education must prioritize critical thinking, analytical reasoning, creativity, and the ability to work effectively with AI systems while maintaining human judgment.

Here’s what that means practically: education needs to shift from teaching students what to think to teaching them how to think. From knowledge recall to knowledge application. From following procedures to solving novel problems. From consuming information to evaluating it critically.

This isn’t new educational theory. It’s existed for decades. But AI makes it non-optional. When students can access any information instantly, memorization becomes pointless. When AI can follow any procedure, procedural knowledge becomes insufficient. When standard problems have automated solutions, the only valuable skill is approaching non-standard problems.

The goal isn’t to compete with AI. That’s impossible and unnecessary. The goal is to develop the human capabilities that complement AI. To do the thinking AI cannot do. To ask the questions AI wouldn’t think to ask. To make the judgments AI cannot make.

Education focused on these capabilities looks fundamentally different from traditional schooling. It’s less about transmitting information and more about developing thinking. Less about right answers and more about asking better questions. Less about individual performance and more about collaborative problem-solving.

This is where progressive education approaches like ours become essential rather than alternative.

Project-based learning as preparation for an AI-driven world

Consider two scenarios.

Scenario one: A student learns about environmental science through textbooks, lectures, tests. They memorize the carbon cycle. They answer multiple-choice questions about ecosystems. They write an essay summarizing research others conducted. All tasks AI could complete or significantly assist with.

Scenario two: A student investigates actual watershed health in their local area. They collect water samples. They interview community members. They research relevant environmental policies. They collaborate with scientists. They design interventions. They present findings to local officials. They encounter genuine problems with no predetermined solutions.

Which student develops skills AI can’t replace?

The second scenario isn’t hypothetical. It’s how we’ve approached education for 67 years, and it’s exactly what our High School program does daily. Students place their passions at the center of their education, designing projects that meet curriculum requirements through work they genuinely care about.

Project-based learning doesn’t just make education more engaging, though it does. It fundamentally changes what students develop. When you work on real problems, you encounter true complexity. Ambiguity. Conflicting information. Stakeholders with different perspectives. Solutions that create new problems. Reality doesn’t provide neat answers. Neither does project-based work.

This matters enormously for preparation for an AI world. AI excels at well-defined problems with clear parameters. It struggles with genuine ambiguity, with situations requiring nuanced judgment, with problems where the solution depends on human values and priorities.

Students working on authentic projects develop exactly those capacities AI lacks. They learn to frame problems, not just solve pre-framed ones. They navigate uncertainty. They make decisions with incomplete information. They communicate with diverse stakeholders. They iterate based on real-world feedback. They balance competing priorities. They make ethical judgments about which solutions to pursue.

Our High School students complete eight projects over their junior years, covering core curriculum through integrated, interdisciplinary work. By senior year, they design year-long capstone projects driven by their own questions and interests. These aren’t school assignments disconnected from reality. They’re genuine investigations into questions that matter.

Research confirms what we observe: when students engage in project-based learning that requires critical thinking, problem-solving, and creative application of knowledge in authentic contexts, they develop exactly the skills that remain valuable as AI automates routine cognitive work. These skills transfer across domains. They adapt to new situations. They remain relevant regardless of technological change.

The key is that project-based learning requires students to actually think, not just follow procedures or recall information. It pushes them to approach problems from multiple perspectives, to analyze information critically, to develop solutions creatively. These are the capacities education must develop if it wants to remain relevant.

Real-world problem-solving in action

Let me get specific about what this looks like in practice.

A student interested in marine biology doesn’t just learn about ocean ecosystems from textbooks. They design a project investigating local marine environments. They spend time on our campus trails hiking down to Witty’s Lagoon, observing actual tidal zones, collecting data on organism populations, researching climate impacts, connecting with marine scientists, analyzing their findings, and presenting recommendations.

The learning isn’t theoretical. It’s experiential. And crucially, it involves cognitive work AI cannot replicate. The student must ask: what question actually matters? How should I approach this investigation? What data do I need? How reliable are different sources? What do these findings mean? What should we do about it? How do I communicate this effectively?

These questions require judgment, creativity, ethical reasoning, critical thinking. All distinctly human capacities.

Or consider a student passionate about social justice who designs a project examining housing equity in Victoria. They research zoning policies. They interview affected residents. They analyze economic data. They study successful interventions in other communities. They collaborate with local advocates. They develop policy recommendations.

Again: genuine complexity. No predetermined answer. Requires navigating multiple perspectives, evaluating conflicting information, making ethical judgments, communicating effectively with diverse audiences. Everything AI struggles with.

This approach doesn’t minimize content knowledge. Students still learn science, history, math, literature. But they learn it in service of answering questions they genuinely care about. The content becomes meaningful because it’s useful for something that matters.

And importantly, they learn that AI is a tool they can use in this work. They might use AI to help analyze data, research background information, organize their findings. But AI can’t do the fundamental thinking: framing the question, determining what matters, making judgments, designing interventions, communicating with stakeholders.

Teaching with AI vs. teaching about AI: getting the balance right

There’s confusion about AI’s role in education. Should we teach students to use AI? Should we teach them how AI works? Should we restrict AI use to preserve traditional skills? Should we embrace it fully?

The answer: yes to all of these, appropriately balanced.

Students need technological literacy. They need to understand how AI systems function, what their capabilities and limitations are, when to use them and when not to. This is essential 21st century knowledge. Ignoring AI in education would be educational malpractice.

But students also need to develop human capabilities that remain valuable regardless of AI advancement. Critical thinking. Creativity. Ethical reasoning. Communication. Collaboration. Problem-solving. These can’t be outsourced to AI.

The balance looks like this: use AI as a tool while developing capacities AI cannot replace.

In our programs, students use technology extensively. Our High School Exploration Lab includes 3D printers, laser cutters, CNC routers, and yes, AI tools. Students learn to leverage these resources effectively. But the emphasis remains on human thinking. Students use AI to support their work, not to do their thinking for them.

Research on responsible AI integration in education emphasizes this principle: AI should support learning without replacing the cognitive work essential for developing future-ready skills. Studies recommend that AI provide scaffolding and feedback while ensuring students maintain genuine engagement with critical thinking, creative problem-solving, and analytical reasoning.

Practically, this means: AI might help students research background information, but students must determine what questions to ask. AI might analyze data patterns, but students must interpret what those patterns mean. AI might generate initial drafts, but students must critically evaluate and substantially revise them. AI might provide feedback, but students must develop their own judgment.

The goal isn’t to eliminate AI use. That’s impossible and counterproductive. The goal is to ensure AI enhances rather than replaces the thinking students need to develop.

This requires significant pedagogical skill from teachers. They need to design assignments that require genuine human thinking even when AI is available. They need to teach students when AI use is appropriate and when it undermines learning. They need to help students develop the judgment to evaluate AI output critically rather than accept it uncritically.

It also requires honest conversations about why certain work matters. Students will ask: why shouldn’t I use AI for this? If the only answer is “because those are the rules,” that’s insufficient. Students need to understand that certain cognitive work develops capacities they need, even if AI could complete the task faster.

The most effective approach treats AI neither as threat nor savior but as one tool among many, useful for some purposes and inappropriate for others, requiring human judgment to deploy effectively.

How Westmont’s High School prepares students for careers that don’t exist yet

Here’s a question that should terrify traditional education: how do you prepare students for jobs that don’t exist yet, using technologies that haven’t been invented, to solve problems we haven’t identified?

You can’t teach specific job skills for roles that don’t exist. You can’t provide training for tools not yet created. You can’t prepare students for challenges we can’t predict.

So what can you do?

You develop adaptability. You cultivate curiosity. You teach how to learn. You build confidence in navigating uncertainty. You create students who can figure things out, who can teach themselves, who can collaborate effectively, who can think creatively about novel problems.

This has always been our educational philosophy. It’s just become non-optional.

Our High School program operates on principles that align perfectly with preparing students for an uncertain future. We don’t try to predict what knowledge will remain relevant in 2040. We develop learners who can acquire whatever knowledge becomes relevant.

We do this through several key practices:

We emphasize self-directed learning. Students choose their projects, design their approaches, manage their time, take responsibility for their education. They’re not passive recipients of predetermined curriculum. They’re active agents of their own learning. This develops exactly the autonomy and self-motivation essential for lifelong learning in a rapidly changing world.

We provide mentorship from professionals in various fields. Students work with experts who help them understand how knowledge applies in real contexts. They see how people actually work, how they approach problems, how they continue learning throughout their careers. This demystifies professional environments and shows students that learning doesn’t end with school.

We structure the year in four eight-week cycles, each culminating in immersive learning experiences: outdoor education, university campus visits, biennial international trips. These experiences expose students to diverse contexts, broaden their perspectives, and help them understand how different environments require different approaches.

We offer an Exploration Lab equipped with current technology. Students learn to use tools available now while developing the fundamental understanding that allows them to adapt to whatever tools emerge later. The specific technologies will change. The capacity to learn new technologies transfers.

Most importantly, we create a learning culture where uncertainty is normal, where not knowing is the starting point for investigation, where questions matter more than answers, where creativity and critical thinking are valued over compliance and memorization.

Research on future workforce preparation confirms that adaptability, continuous learning, and complex problem-solving are the capacities organizations will value most. Studies indicate that employers increasingly seek professionals who combine technical literacy with critical thinking, emotional intelligence, creativity, and the ability to work effectively with emerging technologies.

Students who graduate from programs emphasizing these capacities don’t emerge with perfect knowledge of everything they’ll need. That’s impossible. They emerge with the tools to figure out what they need to know and the confidence to teach themselves.

That’s the only kind of preparation that makes sense for an uncertain future.

Independence and adaptability: the ultimate future-proof skills

There’s a common thread running through all the capacities education should develop: they all require independence.

Critical thinking requires thinking for yourself rather than accepting what you’re told. Creative problem-solving requires generating your own solutions rather than following prescribed procedures. Adaptability requires adjusting your approach when circumstances change rather than rigidly following plans. Lifelong learning requires taking responsibility for your own development rather than waiting for someone to teach you.

All of these rest on independence. And this is where traditional education often fails most dramatically.

Traditional schooling frequently treats students as passive recipients of predetermined curriculum. Teachers decide what to learn, when to learn it, how to learn it, what counts as success. Students follow. They comply. They demonstrate they’ve absorbed what was presented. But they rarely develop genuine independence.

This worked adequately when the goal was preparing workers for relatively predictable roles in stable industries. It fails catastrophically when preparing students for an unpredictable future where they need to direct their own continuous learning.

Montessori education has emphasized independence for over a century. We believe children are capable of directing significant aspects of their own learning when provided appropriate support and environments. From our Early Years program through High School, we structure education to develop independence systematically.

Young children choose their work within prepared environments. They learn to assess their own progress. They develop self-regulation. They experience the satisfaction of mastery achieved through their own effort.

As students progress, independence increases. Middle School students make more choices about their learning paths. High School students design their own projects, manage their time, seek resources they need, take ownership of their education.

This isn’t absence of structure or guidance. It’s structure that serves the development of independence rather than enforcing compliance. Teachers provide frameworks, offer resources, give feedback, ensure students develop necessary competencies. But students maintain agency. They’re learning to be independent learners, not compliant recipients of instruction.

This matters profoundly for future readiness. When AI can provide answers, tutorials, explanations, and feedback on demand, the most valuable skill is knowing what you need to learn and taking initiative to learn it. That’s independence.

Research on self-determination theory identifies autonomy, alongside competence and relatedness, as one of three fundamental human needs driving intrinsic motivation. When students experience genuine agency in their learning, they develop the internal drive to continue learning throughout their lives. This intrinsic motivation proves essential when the external structures of formal education end.

Students who develop independence in school don’t graduate waiting for someone to tell them what to do next. They identify opportunities. They recognize gaps in their knowledge. They seek resources. They teach themselves. They adapt. They persist through difficulty because they’ve learned to trust their capacity to figure things out.

That’s the ultimate future-proof skill. Not any specific knowledge. Not any particular technical capability. But the confidence and capacity to learn whatever becomes necessary.

It’s January 2026. AI isn’t the future. It’s the present. And education can’t pretend otherwise.

The question isn’t whether AI will disrupt traditional education. It already has. The question is what kind of education remains valuable when AI can do so much of what we traditionally taught.

The answer: education that develops distinctly human capabilities. Critical thinking. Creativity. Ethical reasoning. Adaptability. Independence. Communication. Collaboration. The capacity to ask better questions, to navigate genuine complexity, to make judgments AI cannot make, to solve problems AI wouldn’t recognize as problems.

This isn’t new educational philosophy. It’s the foundation we’ve built on for 67 years. It’s just gone from progressive to essential.

Students who develop these capacities won’t compete with AI. They’ll work with it effectively. They’ll do the thinking AI cannot do. They’ll create the future AI cannot predict. They’ll ask the questions AI wouldn’t think to ask.

That’s what education for the future before us actually means. Not predicting what knowledge will matter in 2040. Not trying to teach every possible skill students might need. But developing learners who can figure out whatever they need, who can adapt to whatever emerges, who can think independently and critically about an unpredictable future.

That’s the kind of education that remains valuable regardless of how AI evolves. That’s what progressive schools do. That’s what students need.

Ready to Learn More?