Five Practices to Help Students Navigate AI Consciously

Instead of policing AI use or pretending it doesn’t exist, we can help students build awareness of how it’s shaping their learning.


An article published this week in The Atlantic argued that colleges should take extreme measures to “abolish” unauthorized AI use—going so far as to suggest banning laptops, smartphones, and even Wi-Fi. This seems unrealistic on its face. Students already live and think in hybrid ways. They’re not going to type essays on digital typewriters while the rest of the world moves ahead.

There is no way to seal AI off from higher education. The real question is how we help students consciously adapt to it. Our research shows that this adaptation happens along three dimensions: Cognitive Permeability (how readily you let AI into your thinking), Symbolic Plasticity (how much you let it reshape what things mean to you), and Identity Coupling (the extent to which AI becomes fused with your sense of self). What’s at stake is both academic integrity and how young people learn to think, create, and define themselves in an AI-saturated world.

Our research with over 1,000 people shows that what’s happening is less about shortcuts and more about psychological adaptation. Like the professionals we’ve studied, students are reorganizing their cognitive processes around these systems, often without any framework for noticing how it’s changing their thinking.

With awareness, students can develop relationships with AI that are genuinely empowering. Without it, they risk becoming over-reliant without noticing the shift.

Rather than policing AI use or pretending it doesn’t exist, we can help students build awareness of how it’s shaping their learning. The practices below are small moves that invite reflection and keep the human in the loop.

1. Name AI’s Role Before You Start

Ask students to declare the role AI will play in an assignment. Is it there to spark ideas? Handle the repetitive parts while they focus on analysis? Collaborate through the whole process? Do the thinking outright?

Different roles create different learning outcomes. A simple reflection box—“What role did AI play here, and why?”—is enough to surface the intention behind the choice.

2. Check Where Identity Gets Tied Up

Confidence can drift from being about capability to being about whether AI delivered. Students can pause to notice:

  • How would I feel if others knew exactly how much AI contributed here?
  • If I couldn’t use AI tomorrow, would I still feel confident approaching the same challenge?
  • What emotions come up when I work with versus without AI?

3. Track the Human Contribution

AI makes it easy to lose track of what’s actually yours. Build in reflection prompts like:

  • What reasoning came directly from you?
  • Could you re-explain the logic without AI if asked next week?
  • What did you add that AI couldn’t?

4. Protect AI-Free Zones

Not everything should be AI-assisted. Some skills only develop through friction and practice. In-class problem solving, timed writing, oral explanations—these carve out spaces where students can hear their own minds work. Frame it as training cognitive muscles, not banning a tool.

5. Treat Discomfort as Data

When students feel anxious or displaced by AI, that reaction is meaningful. It often signals that their assumptions about creativity, knowledge, or authorship are under strain. Invite reflection:

  • What assumptions might need revising?
  • What does “being knowledgeable” mean when AI is always in the room?
  • What parts of learning do you want to preserve as distinctly human?

We see students go two totally different ways with AI. Some treat it like a sparring partner — they throw half-formed ideas at it and come back sharper, like it’s stretching their thinking in directions they wouldn’t have gone alone. Others just slide into letting it do all the heavy lifting. The papers get turned in, the grades look fine, but underneath there’s a weird hollowness, like they’re not sure the ideas were ever really theirs.

No set of “best practices” is going to make AI simple in education. What actually matters is whether students start noticing what’s happening in their own heads. Are they using it to stretch how they think, or is it slowly replacing that thinking altogether? The point isn’t to ban or control AI. It’s to raise a generation that can tell the difference between “this is me thinking, with some help” and “this is the system thinking for me.” And to know which is the right approach at any particular time.
