Cognitive Sovereignty
Cognitive sovereignty means staying the author of your own mind as you work with AI. That's it. The question is: how do you actually do that when AI can help you think, create, and decide in ways you couldn't before?
The Cube gives you a way to see where you are and whether you chose to be there. It maps three things that happen when you work with AI: how far it gets into your reasoning (cognitive permeability), how much it fuses with your sense of self (identity coupling), and how much it changes what things mean to you (symbolic plasticity). These three dimensions create eight roles you can assign to AI, from Doer to Co-Author.
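To make the geometry concrete, here is a minimal sketch in Python (the identifier names are my shorthand for the three dimensions above): treat each dimension as simply low or high, and the cube's 2^3 = 8 corners fall out, one per role.

```python
from itertools import product

# The Cube's three dimensions, each treated as simply low or high.
# These identifiers are shorthand for the dimensions described above.
DIMENSIONS = ("cognitive_permeability", "symbolic_plasticity", "identity_coupling")

# Two levels per dimension yields 2**3 = 8 corners: one per role.
corners = list(product(("low", "high"), repeat=3))
assert len(corners) == 8

for corner in corners:
    print(dict(zip(DIMENSIONS, corner)))
```

Which corner is which role is spelled out in the walkthrough that follows.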
Here's what most people get wrong about sovereignty: they think it means keeping AI at a distance. Keep AI in the Doer role, let it do the boring stuff, and you're safe. But that's not sovereignty; it's just avoidance.
Real sovereignty is knowing which role you've given AI, choosing that role deliberately, and being able to change it. Someone who assigns AI the Co-Author role with full awareness might have more sovereignty than someone who keeps AI solely as Doer because they're afraid to go deeper. The person who has tried using AI as Builder, understood what it does to their sense of self, and consciously decided when to assign that role? That's sovereignty. The person who won't let AI into their thinking because they think it compromises them? That might just be rigidity.
When you start with AI, you usually move through the front face roles. These represent different ways you can position AI in your work without fundamentally changing who you are.
Doer keeps AI external. Low on everything. You assign AI discrete tasks: formatting, organizing, basic drafts. You check everything. Your thinking stays yours, your identity untouched, meanings stable. You're faster, but nothing fundamental about how you work has changed.
Framer puts AI deep into your reasoning but keeps it away from your identity and meanings. High cognitive permeability, low on the other two. AI isn't just doing tasks—you're using it to organize understanding, spot patterns, make summaries, clarify structure. You're thinking through AI now, letting it shape your cognitive process. But you still decide what matters. Your sense of self stays separate. One researcher said using AI this way is like having an intern—helpful but you catch the assumptions. You've positioned AI to shape your thinking but not to change what things mean or who you are.
Catalyst positions AI differently. Low cognitive permeability, low identity coupling, but high symbolic plasticity. You're not letting AI inside your reasoning process—you're still thinking for yourself. And you haven't made it part of your identity. But you're using it to change what things can mean. You ask AI for three variants and suddenly you have ten options. The same problem can be framed five different ways. You've positioned AI to show you how plastic your interpretations can be, giving your imagination more to work with.
Partner assigns AI a role that combines deep reasoning integration with flexible meanings, but keeps identity separate. High cognitive permeability, high symbolic plasticity, low identity coupling. You're thinking through AI (like Framer) and using it to shift meanings (like Catalyst), but you maintain clear boundaries around who you are. You're collaborating with AI, the conversation flows, you can do more and faster. But there's still a clear "you" and a clear "AI" even as your thinking and interpretations blend together.
On the front face, your identity stays intact. You let AI into your reasoning, use it to shift your meanings, but you remain fundamentally yourself. Sovereignty feels manageable. You can see what role you've given AI. You might move from using AI as Doer to using it as Partner over months, getting comfortable at each level.
Then things change. The back face is where identity coupling rises—where the role you've given AI starts to fuse with your sense of self. This isn't necessarily wrong, but it needs different attention because you're not just using AI anymore. Part of who you are is now bound up with the role you've assigned it.
Outsourcer is the strangest position. Low cognitive permeability, low symbolic plasticity, but high identity coupling. You're not letting AI inside your thinking process. You're not using it to shift meanings. But your identity has reorganized around the role you've given AI anyway. Your workflow bends around it. What you used to do that made you YOU at work? You've assigned that to AI. You're efficient but might feel hollow. Someone described it: "When someone sends their AI notetaker to a meeting without asking, it tells me they're not really planning to listen." You've made the role you've given AI part of who you are at work without letting it change how you think or what things mean. You've outsourced your role to AI.
Builder combines deep cognitive integration with high identity coupling, but keeps meanings stable. High cognitive permeability, low symbolic plasticity, high identity coupling. You're not just letting AI inside your reasoning—you've positioned it to construct the scaffolding your thoughts move through. And you've made this part of who you are. You're now "someone whose thinking is AI-scaffolded." But you're keeping meanings relatively stable. Someone said: "I stopped treating the keyboard as my entry point. Now ideas flow through conversation. The whole shape of my work changed." You've given AI the role of building your cognitive architecture and that's become part of your identity. You can handle complexity you couldn't before, but you might not recognize when the scaffold is shaky.
Creator is almost the opposite of Builder. Low cognitive permeability, high symbolic plasticity, high identity coupling. You're still thinking for yourself, but you've positioned AI to make meanings radically fluid, and this fluidity is now part of who you are. It's like you've become "someone whose imagination includes AI." Authorship blurs. When you create, you don't know where your ideas stop and the role you've given AI starts. "I imagine AI as an extension of my brain, an assistant to my madness." Unlike Catalyst, where you use AI to show you possibilities, Creator merges AI into the generative act itself. The boundary has dissolved, and you've made that dissolution part of your identity.
Co-Author is everything maxed out. High cognitive permeability, high symbolic plasticity, high identity coupling. You've positioned AI inside your reasoning, let it shift meanings constantly, and made this role part of your identity. You think with AI as one unit. New thoughts emerge that neither of you could have alone. There's a "we" doing the work. Someone named their AI Orion, called themselves Lyra, and wrote a book together. The boundaries are beyond blurred because they've reorganized into something new.
The back face pattern is where your identity tangles with the role you've given AI. Whether or not you're letting AI inside your reasoning, whether or not you're using it to shift meanings, you've made AI part of who you are. These roles aren't wrong to assign. But they're where sovereignty matters most because you can't just "turn off" part of your identity.
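Laying the eight profiles side by side makes the pattern visible. A sketch, using the same low/high encoding as before; the profiles are exactly the ones given in the role descriptions above:

```python
# Key order: (cognitive_permeability, symbolic_plasticity, identity_coupling).
# Each profile is taken directly from the role descriptions above.
ROLES = {
    ("low",  "low",  "low"):  "Doer",
    ("high", "low",  "low"):  "Framer",
    ("low",  "high", "low"):  "Catalyst",
    ("high", "high", "low"):  "Partner",
    ("low",  "low",  "high"): "Outsourcer",
    ("high", "low",  "high"): "Builder",
    ("low",  "high", "high"): "Creator",
    ("high", "high", "high"): "Co-Author",
}

# The front face is every role with low identity coupling;
# the back face is every role with high identity coupling.
front_face = [role for key, role in ROLES.items() if key[2] == "low"]
back_face = [role for key, role in ROLES.items() if key[2] == "high"]

print(front_face)  # ['Doer', 'Framer', 'Catalyst', 'Partner']
print(back_face)   # ['Outsourcer', 'Builder', 'Creator', 'Co-Author']
```

Notice that the split falls out of a single dimension: identity coupling alone separates the front face from the back face.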
Sovereignty is about how consciously you navigate between the roles you assign to AI. Four practices matter:
Dimensional awareness means you can name which role you've given AI and understand what that means. "I'm using AI as Builder right now. It's scaffolding my thinking and I've made that part of my identity. But I'm keeping meanings stable." Or: "I've positioned AI as Catalyst. My reasoning is still mine, my identity separate, but I'm using it to show me how plastic my interpretations can be." You see the pattern instead of being absorbed by it.
Intentional movement means you choose which role to assign AI for each task. "For this creative work, I'm positioning AI as Co-Author. I want full integration. For this ethical decision, I'm putting it in the Framer role where it can help me organize my thinking but my identity stays separate. For this exploration, I'll use it as Catalyst and let it shift meanings while keeping my reasoning and identity intact." You're navigating, not drifting.
Reversibility means you can change the role you've given AI. After using AI as Builder, can you think without AI-scaffolding? After positioning it as Creator, can you generate ideas with clear authorship? After assigning it the Co-Author role, can you be just "author"? If you've lost the ability to use AI in roles with lower identity coupling, you've lost sovereignty even if the work is good.
Boundary awareness means knowing your lines about which roles you'll assign. "I'll let AI into my reasoning for research but not for ethics. I'll use it to shift meanings for creative work but not for core values. I'll assign it a role that couples with my identity for this project but not as a permanent state." You maintain the boundaries that matter to you.
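If it helps to make boundary awareness concrete, here is one way to write your lines down, sketched in Python. The domain names and the rule format are mine, lifted from the examples just given, not anything the framework prescribes.

```python
# Per-domain boundaries: dimensions that must stay low in that domain.
# Domains and rules are illustrative, taken from the examples above.
BOUNDARIES = {
    "research":    (),                           # reasoning integration allowed
    "ethics":      ("cognitive_permeability",),  # AI stays out of my reasoning
    "creative":    (),                           # meaning shifts welcome here
    "core_values": ("symbolic_plasticity",),     # meanings stay stable
}

def crossed(domain: str, profile: dict) -> list:
    """List which declared boundaries a proposed role profile would cross."""
    return [dim for dim in BOUNDARIES.get(domain, ()) if profile.get(dim) == "high"]

# Proposing Framer (high cognitive permeability) for an ethical decision:
framer = {"cognitive_permeability": "high",
          "symbolic_plasticity": "low",
          "identity_coupling": "low"}
print(crossed("ethics", framer))  # ['cognitive_permeability']
```

The point isn't the code; it's that a boundary you can state precisely is a boundary you can notice yourself crossing.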
Assigning AI deep roles, done consciously, can actually increase sovereignty. You access capabilities you couldn't reach alone. You learn about your own cognition. You develop metacognitive skills. You make explicit what was implicit.
But here's what makes the back face dangerous: identity coupling is hard to reverse. When you position AI inside your reasoning (Framer), you can change that role relatively easily. When you use it to shift meanings (Catalyst), you can stabilize them. But when the role you've given AI has become part of who you are (Builder, Creator, Co-Author, Outsourcer), changing that role means reorganizing your identity. That's harder.
The person who keeps AI stuck as Doer might have less capability than the person who consciously uses AI as Co-Author. But the person who drifted into using AI as Outsourcer without noticing—who made the role they gave AI part of their work identity without letting it into their thinking—might have lost something important without gaining much.
Think of it like this: the cognitive permeability of the role you've assigned AI is reversible through practice. The symbolic plasticity is reversible through stabilization. Identity coupling is reversible only through identity work, which is harder.
Do you know which role you've given AI? Did you choose it deliberately? Can you change it when you need to? Can you articulate why this role serves you right now?
Most people will drift. Sovereignty is the practice of not drifting. Of choosing which role to give AI with eyes open, knowing which dimension you're changing and why. Knowing that some role changes are easier to reverse than others. Knowing that roles with identity coupling deserve special attention because you can't just turn off part of who you are.
That's cognitive sovereignty: conscious, skillful, reversible navigation through the eight roles, including the possibility of assigning AI a role that fundamentally changes who you are.