AI Is Education’s Next Big Test
Nov 20, 2025
Artificial intelligence has unleashed economic turbulence and existential dread. But when it comes to education leaders and reformers, it has fueled something more familiar, facile, and ultimately dangerous: cheerleading.
Secretary of Education Linda McMahon has enthused, “AI development. I mean, how can we educate at the speed of light if we don’t have the best technology around?” She added, “There is a school system that’s going to start making sure that first graders or even pre-K have A1 [sic] teaching . . . that’s just a wonderful thing.” The Gates Foundation’s K–12 education chief promises that AI tutoring will allow students to pursue “their learning journeys” and to shift from “binary right-or-wrong thinking to curiosity and exploration.”
The University of Utah’s Hollis Robbins, humanities scholar turned AI enthusiast, touts the much-discussed Alpha School as evidence that AI will yield up to a “5x” increase in student learning. The American Federation of Teachers, in partnership with Microsoft, OpenAI, and Anthropic, is launching a National Academy for AI Instruction in Manhattan.
As an education scholar and pundit, I’m bombarded daily by vendors, flacks, and advocates telling me that AI is “reshaping homework, instruction, and assessment across K–12 and higher education.”
The irony? At this very moment, we’re scrambling to combat the toxic influence of cellphones in schools and reduce the astronomical amount of time our kids spend on devices.
Cellphones and social media in the classroom turned out to be engines of distraction — the 24/7 ability to gossip, bully, check texts, shop, gamble, access porn, and scroll is irresistibly alluring to huge numbers of kids (and adults, for that matter). Are we surprised? In 2023, Common Sense, a nonprofit focused on the role of media in children’s lives, reported that 97 percent of students aged 11 to 17 checked their phones during the school day. The average student picked up the phone more than a dozen times. Sixty percent of students received more than 200 text messages a day, with about a quarter of those arriving during school hours.
Principals and teachers are begging policymakers to restrict the use of phones in school. All those distracted millions are tuning out their teachers and ignoring their peers in the same room. Students are reading less and misbehaving more (indeed, at record rates), and achievement has been declining for over a decade. Whether cellphones are causing these problems or just aggravating them, our fancy tools seem to have done more harm than good in the classroom.
Harvard University’s Martin West, vice chairman of the National Assessment Governing Board, has observed that the twinned rise of smartphones and social media platforms is the most likely culprit for widespread academic declines dating to the early 2010s.
The downsides of cellphones and ubiquitous screens may seem obvious now, but it’s easy to forget how stark a reversal this is from the country’s enthusiasm for classroom technology 15 years ago, when the iPhone was shiny and new. Back then, the nation’s most influential politicians, philanthropists, and education reformers touted the promise of “blended learning” and “one-to-one” devices.
In 2010, in “iPod, iListen, iRead,” Edutopia excitedly reported, “We have heard teacher after teacher say, ‘This has totally transformed my teaching!’” and “‘Using iPods with microphones has engaged students more than anything I’ve ever experienced!’” In 2011, in “How the iPad 2 Will Revolutionize Education,” Fast Company insisted that “skeptical educators can be relieved that the iPad was deemed classroom ready by Reed College, and that remote learning can be just as effective as in-class lecturing.”
In 2012, the Washington Post profiled the D.C. area’s “ultra-wired” Flint Hill School, describing how it had “given each child a device — starting with an iPad for every preschooler.” The dean of faculty explained, “Tech is like oxygen. It’s all around us, so why wouldn’t we try to get our children started early?” That year, in “How the iPad Can Transform Classroom Learning,” Edutopia noted that many educators wondered how to keep students from playing games. The answer was purportedly very simple: “If students are given engaging, open-ended problems to solve, they won’t want or need to play games.”
In 2013, the Brookings Institution’s Darrell West argued that technology “enables, empowers, and engages learning” but that “sadly, not every student has access to a computer and the Internet.” His solution? “Most young people have phones.” Since “students love mobile technology,” he urged educators to “harness” cellphones to “transform education.”
In 2014, Lalitha Vasudevan, now managing director of Columbia University’s Teachers College Digital Futures Institute, insisted that cellphones “afford young people the chance to be seen and engaged as actors with a repertoire of literate practices and a sense of agency.” Indeed, she said, they “serve as powerful resources in reconfiguring the educational landscape.” Columbia’s Julie Warner described the “rich literacy practices” of high schoolers using phones. Warner thought it “irresponsible” to “eschew the powerful tools that travel with youth throughout their daily lives.”
I recall the decorated high school principal who bragged to me in 2013 about a student typing a research paper on his iPhone. The principal explained, “We encourage our students to use mobile devices,” and “that’s the device he was comfortable with.” When I noted that this seemed oddly inefficient, the principal told me I didn’t get it — his school was “unapologetically committed to being future-forward.”
This misguided enthusiasm was consistent with the confidence that social media would promote empathy and enrich democracy. Indeed, in 2012, tech philosopher David Weinberger hailed the “utopian possibilities” of online culture.
To resurface those old quotes in 2025 is to elicit weary sighs at the naïveté of it all. What happened? Why did we get this so wrong? And what can this story tell us about how we should handle AI?
First, there’s something inevitably facile and appealing about technology in education. It promises easy solutions to stubborn problems. It offers hopeful shortcuts to improving teachers’ skills and motivating students. Additionally, school leaders like to see themselves (and to be seen) as “future-forward.” In an effort not to be left behind, educators did not look before they leaped.
Then, we sorely underestimated the distraction factor. When it came to screens, we were used to thinking in terms of TVs and desktop computers. Those were never portable and were not designed to bombard us with a buffet of personalized distractions. How can a teacher compete with a dinging, buzzing bastion of gossip and gambling that’s tucked into a 13-year-old’s pocket?
Above and beyond the innate problems with cellphone-filled schools, enthusiasts vastly overestimated the ability of teachers to productively use these devices as educational tools. Recall the constant refrain during pandemic-era school closures that teachers had never been trained on technology.
Finally, the push to embrace the internet, laptops, and cellphones became an excuse for progressive educators to neglect real knowledge-building. Tech evangelists insisted that students no longer needed to know South American geography, Civil War battle facts, or the periodic table of the elements because (as the then-dean of Stanford’s School of Education told Time magazine way back in 2006) “you can look it up on Google.” By 2022, the National Council of Teachers of English was urging schools to “de-center book reading and essay writing” in favor of tech-friendly “literacy,” focusing attention instead on digital content like memes and videos.
Schools threw open their doors to distraction machines while failing to maintain expectations or norms. Along the way, they scrapped textbooks and paper assignments and moved everything online. This meant students had to be in front of screens to do their homework, making it tougher for parents to limit at-home screen use. Schools required teachers to use learning-management systems like Canvas and Google Classroom, which can be frustrating to navigate for parents who aren’t technically adept.
By embracing the “digital future,” schools invited distraction, took printed books out of students’ hands, made it tougher for parents to limit screen time at home, and strained relations between parents and teachers. That’s quite a legacy.
Technology is a tool, which means it’s useful only when wielded well. Most parents wouldn’t tell their nine-year-old, “You’re going to drive one day, so here are the keys to the pickup.” Instead, we recognize that there’s a time and place for introducing children to new tools, and that time should be marked by forethought and mentoring. The lesson is not that “the Luddites are right” but that the how and why of technology matter.
This brings us to AI.
To be clear, millions of students have been using algorithm-fueled reading and math programs, tutoring modules, and assessments for years. So, in some senses, “artificial intelligence” isn’t wholly new in schools. But the power of generative AI certainly is. It will universalize access to tutoring and knowledge, create powerful new tools for learning, and alter what graduates will be expected to do in order to find or keep a job.
At the same time, AI threatens to retard human development. Earlier this year, OpenAI surveyed college students and found that the most common uses of AI included “summariz[ing] texts,” “mathematical problem-solving,” “essay drafting,” and coming up with “exam answers.” That kind of AI use promises to erode the core purposes of higher education, as students outsource the hard work of thinking, analyzing, and writing to large language models.
Alex Kotran, CEO of the AI Education Project, has cautioned,
In instances where I’m struggling to express something complex, I now reflexively check to see what the AI’s take would be rather than push through the challenge. Proponents of AI would say that’s the point: I’m . . . unlocking that brain power for other, more important work. That may be so. But I have the unsettling intuition that my writing muscles will atrophy as that reflex to tag in the AI becomes second nature.
Given what we know about the pitfalls of technology in education, what do we need to do to better navigate the AI path? I propose four guideposts.
First, remember that cellphones didn’t make knowledge passé and neither will AI. While it’s true that our kids will inhabit an AI-dominated world, that only makes it more vital that they know enough history, civics, science, and literature to distinguish fact from fiction. Don’t be seduced by the proponents of “21st-century skills” who insist that “mere facts” are unimportant when students can simply look things up. This was always nonsense, and it’s more nonsensical than ever in a world of deepfakes and AI hallucinations.
A second, related point: AI is “a knowledge amplifier, not a knowledge substitute,” as the American Enterprise Institute’s Robert Pondiscio recently observed. AI shouldn’t be writing students’ essays for them. It’s not enough for a student to do something once and then hand it off to AI forever after. After all, how do we learn to ride a bicycle, drive, or type? By deliberate, studied repetition. If students are deprived of the chance to develop reflexive mastery of computation, reading, or writing, they’ll be stripped of agency. Rather than deciding when and how to use these powerful tools, they’ll be dependent on them, ill-equipped to spot or address problems.
Third, the human need for connection isn’t going anywhere. A diet of small screens and social media has left America’s youth lonely, anxious, and alienated. Our faith that social media would help youth feel connected and empowered has turned to ashes. There is no substitute for real-life family, friends, mentors, and bonds of affection. AI tutors may have much to offer, but they can’t provide a wise or reassuring hand on the shoulder, no matter how compelling the illusion.
Finally, how schools approach AI use will have ripple effects on kids and parents at home. If schools routinize student interaction with AI companions, it will limit the ability of families to make decisions about when and how their children should use those tools. If schools default to AI tutors to support students, it’ll be that much tougher for parents to limit at-home tech (or even to gauge when a quiet conversation with an AI companion is actually homework or something different and potentially troubling).
There aren’t easy solutions for any of this. When it comes to education, AI poses a raft of trade-offs amid a sea of uncertainty. But schools need to steer clear of familiar missteps. This will require much more than glib enthusiasm, rash promises, or rapt wonder. It requires judgment and wisdom. In other words, just the sort of things we can’t delegate to ChatGPT.