TalentGenius


Cheating with AI on Campus: A Curious Upside


Jun 16, 2025

AI cheating on college campuses is on the rise—but could it have an unexpected upside? This provocative piece explores how students who “cheat” with AI may actually be gaining critical skills for the modern workforce.


Cheating with AI on college campuses is running rampant.

But… is that such a bad thing?

I know... it sounds like a crazy question to ask. Heresy. Blasphemy.

After all, cheating is grounds for academic excommunication. We’ve all memorized the catechism, for it represents:

  • Stealing from others: On a curve, a cheater isn’t just lazy—they’re a thief.

  • Stealing from yourself: If you fake the work, you fake the learning. And if that occurs, what was the point of tuition again?

  • Dishonor: It’s a slippery slope. First it’s ChatGPT on the term paper, then it’s forging tax returns or embezzling pension funds.

So, to be clear: I’m on Team No-Cheating. Scout’s honor.

But let me give you an example of how deep this now runs on campus.

Orin Starn, a professor at Duke, is fighting a battle he’s already lost.

His students are feeding prompts into ChatGPT, watching it spit out polished essays in seconds, and turning them in without blinking. Starn spots them instantly: the sterile grammar, the seamless transitions, the absence of anything resembling a human voice. He notes the technology "bleaches out errors as well as individuality into bright white AI empty word carbs." He's not guessing. He knows.

Recently, New York Magazine ran a story titled "Everybody Is Cheating Their Way Through College." Other headlines—"The AI Cheating Crisis in Higher Education Is Worse Than Anyone Expected" and "Inside the University AI Cheating Crisis"—echo the same note: something big has broken loose, and no one has managed to bottle it back up.

Starn, like many in his position, is frustrated. “You learn zero from plugging a prompt into AI,” he says.

And he’s not wrong.

But here’s the thing—maybe that’s not what college should be about anymore.

Not in the way it used to be.

A century ago, the mission was clear: send kids out with just enough polish and sufficient skills to make them useful to someone with a bigger office. But something happened. Two things, really.

  • First, the academy began to drift: call it a slow float into irrelevance. Faculty fell in love with ideas that didn’t always love them back. Course catalogs filled with courses that didn’t build skills or teach critical thinking. Employers? They weren’t consulted. The disconnect between what employers look for and what graduates were equipped to do grew so wide you could drive a tenure committee through it.

  • Second, and more explosively: AI arrived. The machine is voraciously eating the bottom rungs of the career ladder. The boring stuff—the tasks once reserved for interns and fresh grads trying to prove they belonged—is quickly vanishing. No more copyediting email blasts. No more grinding through spreadsheets or summarizing white papers. That work now belongs to algorithms. Cheap ones. Tireless ones. And it’s showing up in the new-hire data: new college grads in the US face an increasingly difficult job market, with underemployment among them at a record 40.6%.

This environment has created a distinct Mason-Dixon line for young knowledge workers: those who don’t know AI are getting hammered. Those facile with it are getting great jobs, and quickly wowing their (non-AI-literate) bosses with their incredible productivity.

And here’s the key point: Guess which entry-level workers know how to work with AI? Yep, the students who’ve been “cheating.”

They weren’t just copying—they were training. Getting fluent in the interface. Learning how to write prompts that work, which means learning how to ask the right questions that get AI to generate the right output. Building muscle memory for managing AI outputs. And upon graduation, they walk into a job market that no longer values rote tasks—but does reward people who know how to orchestrate AI to deliver outcomes.

After all, the future belongs to those who assemble and harness hybrid teams: the human who pulls together the right set of AI agents (and manages them like employees) to accomplish a task. And that’s exactly what employers are looking for: individuals who’ve done more than experiment with AI—they’ve deployed it. On real tasks. With deadlines, stakes, and competition. And they’ve shown that they can ask AI the right questions to generate the right answers in the right format.

I know. It’s messy. It’s ethically fraught. It’s also… kind of logical.

In my view, it’s time for the academy to get with the program—to stop treating AI use as a form of academic treason and start recognizing it as a tool. Decriminalize it. Normalize it.

Yes, that will mean rethinking how we teach and how we grade. It will mean rewriting rubrics and reimagining what “original work” even means. But in the process, something better might emerge.

Because if done right, a student might actually learn more—anthropology, economics, chemistry, even Chaucer—not less. And more importantly, they’ll learn the one lesson that will matter most in the years ahead: how to harness the new great machine to do great things.

