Five Ways To Lose Yourself With AI (and How to Stay Grounded) and a Conversation with Christine Rosen
Five Ways To Lose Yourself With AI (and How to Stay Grounded)
Our latest research reveals how AI reshapes not just what professionals produce, but who they become in the process. Through hundreds of stories, we've identified five patterns that show how identity and expertise can drift when AI integration happens without conscious frameworks.
Voice loss begins when your familiar turns of phrase start coming from the system rather than from you. Writers notice their older work reads as thought-in-progress while newer pieces feel polished but somehow thinner. Erosion of confidence disguises itself as thoroughness—decisions you've already made get rerouted through AI "just to be sure," shifting your professional assurance from internal judgment to external validation.
Role confusion emerges when authorship becomes impossible to trace. People describe their work in terms of "we"—themselves and the system—struggling to identify which parts of hybrid outputs are genuinely their own. Avoidance means some professionals opt out entirely to preserve their voice, keeping their expertise intact but never testing it against conditions where AI is already embedded.
Symbolic overload shows up as restless experimentation—cycling between tools, keeping multiple competing outputs, producing streams of inconsistent drafts without a stabilizing frame for coherence.
The professionals who adapt successfully maintain clear boundaries. They use AI as input but keep judgment anchored to themselves. A policy analyst runs scenarios through the system, then pauses before drawing her own conclusions. A journalist takes machine outlines and reworks them until the cadence is unmistakably her own.
Our research suggests that staying grounded requires active awareness of how AI is changing you, paired with deliberate choices about those changes rather than letting them accumulate by default. The frame that works: permeability that is active but bounded, with identity anchored to your work rather than your tools.
Christine Rosen's The Extinction of Experience offers a vital framework for understanding what we're losing as we migrate from direct to mediated engagement with the world. Drawing from naturalist Robert Michael Pyle's concept of "the extinction of experience"—when something disappears from our environment, subsequent generations don't even know to mourn its absence—Rosen shows how embodied practices that have shaped human cognition for millennia are being quietly abandoned for digital substitutes.
The stakes aren't nostalgic. When we lose handwriting, we lose embodied cognition that affects how we think and remember. When we default to screens for understanding experience, we diminish our capacity to read emotions and navigate social complexity. When we expect human relationships to operate like technological interfaces—complete with delete keys and escape commands—we're applying machine logic to irreducibly human experiences.
Rosen reveals how Silicon Valley's framing of technological advancement as inevitable progress obscures the conscious choices we're making about what to preserve versus optimize away. The people designing our technological future surround themselves with "human nannies and human tutors" while prescribing AI substitutes for everyone else.
Her call for "defending the human" isn't Luddite resistance but conscious choice-making about preserving friction, inefficiency, and embodied experience. In a world designed to treat humans as inefficient machines needing optimization, seemingly small acts—handwriting a letter, sitting with uncomfortable emotions—become forms of resistance that preserve essential human capacities.
Watch/Listen on Apple, Spotify, and YouTube...
Upcoming community opportunities to engage with the human experience of AI.
Foundational explorations from our research into life with synthetic intelligence.