When The Ground Shifts: AI, Expertise, and Moral Disorientation 


By Callid Keefe-Perry

We invited faculty members in the 2024-2025 Purposeful AI Working Group to reflect on their experiences building custom chatbots that align with their teaching values and enhance student learning.

I’m often part of conversations where people talk about the promise and peril of AI in education, about how it might help students learn faster, make grading more efficient, or upend our assumptions about plagiarism. In workshops and consultations, we’ve talked about updating syllabi, rethinking assignments, and navigating new questions of academic integrity. But under all those headlines, I think there is something more personal and more difficult to name.

It’s not just that AI tools are changing how we work. For some faculty, they’re quietly changing who we think we are.

At many institutions, including our own, educators have built their professional identities on being experts: on knowing more than students, guiding learning through a carefully honed set of practices, and making ethical judgments about what constitutes quality work. These aren’t just functions of the job. For many of us, they’re part of our identity: I am someone who knows my field in depth and helps others to know it similarly.

So when AI enters the classroom and suddenly “everyone” has access to articulate prose, quick summaries, and plausible arguments on demand, it can feel like the ground has shifted beneath us. The reaction isn’t always dramatic, but it’s there. In the tension during meetings, in the skepticism toward new tools, in the quiet grief of faculty who say, “I just don’t know what my role is anymore.”

This is more than discomfort. It’s what I’ve begun to call “moral disorientation.” And to understand it, we need more than troubleshooting guides or software tutorials. We need space to name what’s shifting, to reflect on what we’re holding onto, and to imagine together what teaching looks like when the compass has spun but the journey continues.

Consider the literature professor who built her pedagogy around close reading: the work of guiding students through unpacking metaphors, tracking themes, and discovering meaning by wrestling with a text. When a student submits an AI-generated analysis that hits all the right notes but bypasses the formative struggle entirely, she feels something beyond frustration. It’s as if the very ground of her teaching has been unsettled, any transformative encounter between student and text skipped.

Or the economics professor who discovers her students have built AI tutors that walk them through problem sets at midnight in their native language, explaining supply and demand curves in Mandarin until concepts click. She watches test scores improve and sees genuine understanding in their work. Yet she finds herself unsettled: if AI can provide personalized, patient instruction whenever students need it, adapting explanations until comprehension dawns, what becomes of office hours? Of the teaching assistants? Of class itself? The disruption here isn’t about cheating or shortcuts. It’s about watching a genuinely beneficial tool fulfill needs she’s devoted her career to addressing.

Making sense of it all can be dizzying. 

One helpful idea I’ve found comes from the work of military chaplain and practical theologian Zachary Moon. He suggests that all of us live with what he calls a “Moral Orienting System” (MOS), a set of values, beliefs, relationships, and behaviors that help us make meaning and guide our actions, especially in high-stakes environments. It’s the internal compass we rely on when making decisions, interpreting our roles, and navigating difficult terrain.

Put simply, your MOS is the invisible framework that shapes how you understand your purpose and make choices in your work. It’s like the operating system running in the background of your life: you don’t think about it until something disrupts it, and suddenly nothing works quite the way it used to.

Moon’s framework is not just a theory. It’s a diagnostic tool we can apply to our own professional lives here as educators. Consider the Moral Orienting System of a Boston College educator:

  • Values: We are animated by Jesuit ideals like cura personalis (care for the whole person), academic excellence, and the formation of students for a just world.
  • Beliefs: We hold a deep belief in the power of rigorous inquiry, the importance of expertise cultivated over time, and the ethical responsibility that comes with knowledge.
  • Behaviors: Our work is defined by practices like mentoring, grading with thoughtful feedback, and designing assignments that foster deep, critical thinking.
  • Relationships: The teacher-student relationship is central, often understood as a form of accompaniment on an intellectual and personal journey.

Generative AI doesn’t just introduce a new tool, it sends a shock through this entire system, creating dissonance between the values we profess and the new behaviors we must now consider. This shakeup is the very source of the disorientation so many of us are feeling.

Most of the time, we don’t think about our MOS. It’s just part of how we move through the world. But when something happens that throws our internal compass out of alignment, when the practices we once trusted no longer yield the results we expect, we feel disoriented. Not just practically, but inwardly too. We might feel anxious, off-balance, even ashamed, or angry without always knowing why.

Generative AI is doing that to many educators. And not just because we don’t “get it.” I think it is at least partly because AI is challenging our unspoken sense of what makes teaching meaningful, what counts as real learning, and what it means to be a guide in the digital age.

This doesn’t mean we should fear AI, or reject it out of hand. But it does mean we need to name the deeper disruptions it can cause: not just in workflows, but in worldviews. 

A Call for Empathy and Reflection

For leaders in higher education interested in what comes next with AI, this is a call to empathy. Not every hesitation is technophobia. Sometimes, it’s a sign that someone’s moral orienting system is under stress. It is the distress signal of a professional identity trying to maintain its integrity when the core practices of mentoring and assessing authentic work are suddenly thrown into question. What education means might itself be in flux, and that can be disorienting when you’ve built so much of your life around understanding it.

And for educators, it’s a call to reflection. What parts of your teaching identity feel most under pressure right now? What deeper values are being surfaced or challenged? Who are the people you trust enough to talk these questions through with? These questions are not a distraction from our work; they are the work of teaching with integrity in a moment of profound change.

We are not the first generation of teachers to face this kind of shift, and we won’t be the last. But if we’re willing to attend not just to the tools, but to the meaning we make around them, we might find a way through, not just with new techniques, but with renewed clarity about what matters most.

In truth, many of the pedagogical questions AI raises (about authentic learning, the purpose of struggle, the nature of expertise) were already worth asking. We’ve just been able to defer them. Now we can’t. Will our teaching change? Yes. Will all the changes be welcome? No. Will they happen regardless of our preferences? Also yes. Which is precisely why this reflection matters. The choice isn’t whether to face these disruptions, but how: as passive recipients of technological change, or as educators actively shaping what teaching with integrity looks like in this new landscape. The tools will evolve. The questions will multiply. But the clearer we are about our own commitments (and the more honest we are about what’s shifting), the better equipped we’ll be to navigate what’s coming next. And what’s already here.


Callid Keefe-Perry

Callid Keefe-Perry is Assistant Professor of Contextual Education and Public Theology at the Clough School of Theology and Ministry.