When AI Helps Me Teach—and When It Helps Me Avoid Myself

I recently read an opinion piece in PNAS that introduced a term I haven’t been able to shake: machinal bypass.

The authors describe machinal bypass as using artificial intelligence not just to support our work, but to sidestep the uncomfortable, human parts of it—uncertainty, creativity, vulnerability, and presence. It’s modeled after the idea of “spiritual bypass,” where people use spirituality to avoid difficult emotions rather than engage with them.

As a teacher who uses AI regularly, that idea landed a little too close to home.

When I’m in a pinch—and honestly, that’s often—I rely on AI to help me generate lesson ideas or activities. The results are usually strong: well-structured, standards-aligned, clear, and efficient. They reduce my mental load at the end of a long day. They give me a place to start when I’m tired or stuck.

And most of the time, I personalize what AI gives me.

But reading about machinal bypass forced me to ask a harder question: Am I using AI as a tool—or am I slowly replacing parts of myself with it?

Here’s the uncomfortable truth: sometimes the AI-generated lessons feel better than what I think I’d come up with on my own. They’re more polished. More complete. More “professional.” And when you’re exhausted, that’s incredibly appealing.

But teaching has never been about producing the most optimized lesson.

The most meaningful moments in my classroom don’t come from perfectly designed activities. They come from relationships. From reading the room. From knowing when to slow down, when to push, and when to throw the plan out entirely because something else matters more that day.

I work with students for whom relationships are not a “nice add-on”—they’re the foundation. Trust, consistency, and presence matter more than any worksheet or protocol. AI can help me with content. It can organize ideas. It can suggest approaches. But it cannot build relationships. It cannot notice when a student is off. It cannot make a judgment call rooted in lived experience.

That’s where the idea of machinal bypass really clicked for me.

The article argues that bypass happens when we let AI stand in for presence—when a task requires us, but we substitute ourselves with a machine instead. And while lesson planning might not seem deeply personal on the surface, it’s often where my professional judgment, creativity, and growth actually develop.

If I always start with AI, I skip the struggle. I skip sitting with uncertainty. I skip asking myself what this group of students needs right now. Over time, that matters—not just for my students, but for me as a teacher.

I don’t want to stop using AI. I want to use it more intentionally.

I want AI to handle the content, the structure, and the logistics—so I can invest more of myself in relationships, responsiveness, and judgment. But I don’t want AI to become the place where my thinking starts every time. Because if I’m not careful, I’m not just outsourcing efficiency—I’m outsourcing growth.

The authors of the article end with a simple but powerful idea: if a task requires you—your presence, your uncertainty, your lived experience—then bypassing yourself comes at a cost.

That’s the balance I’m trying to be more mindful of now.

AI can help me teach.
But it shouldn’t help me avoid myself.