How AI is Reshaping the Way Teams Think, Steve Sloman at the Artificiality Summit 2024
Our AI Course for Leaders starts this Wednesday!
Become More Essential with Artificial Intelligence. Our research-backed course helps experienced professionals co-evolve with AI through psychological strategy rather than technical training. Based on our Chronicle study of over 1,000 professionals, we've identified that success in AI collaboration depends on developing cognitive permeability, identity coupling, and symbolic plasticity. This intimate, four-session program, starting August 6th, guides participants through building conscious boundaries for AI collaboration while preserving the compressed expertise that makes human judgment irreplaceable.
We've collected hundreds of stories from professionals navigating AI in their daily work, and one pattern keeps emerging: teams are thinking together differently.
Initially, most groups experience acceleration—faster documents, smoother processes, ideas moving quickly from draft to discussion. People adopt AI naturally, first for small tasks, then for work more central to their workflow.
But over time, something shifts. Meetings become quieter, with fewer spontaneous exchanges. Group documents reflect smooth consensus even when no real discussion has happened. People describe alignment but struggle to explain how decisions were made.
This maps to what we call Integration and Blurring phases, where AI becomes fully embedded in workflow. Teams with high Cognitive Permeability absorb AI suggestions easily, letting the model frame conversations. Without Symbolic Plasticity—the ability to reinterpret and reclaim meaning—groups start speaking in the model's voice rather than their own.
The result is a quiet re-patterning of collective intelligence. Teams don't lose their capacity to think together, but the pathway changes. Instead of interpersonal exchange and negotiated meaning, thinking increasingly follows AI's structural patterns.
Teams that stay flexible—pausing to notice AI's influence, asking different questions, preserving authorship—maintain stronger long-term cohesion. The changes are often subtle: different document tone, premature agreement, missing friction that once made collaboration feel more real.
Something relational, something symbolic, is being reconfigured in how we work together.
At the Artificiality Summit in October 2024, Steve Sloman, professor at Brown University and author of The Knowledge Illusion and The Cost of Conviction, catalyzed a conversation about how we perceive knowledge in ourselves, others, and now in machines. What happens when our collective knowledge includes a community of machines? Steve challenged us to think about the dynamics of knowledge and understanding in an AI-driven world and about the evolving landscape of narratives, and to ask: can AI make us believe in the ways that humans make one another believe? What would it take for AI to construct a compelling ideology and belief system that humans would want to follow?