
Photo: Dave Edwards. Karri trees in Leeuwin-Naturaliste National Park, Western Australia.

Intimacy Extraction

We’ve witnessed how the attention economy drilled down into our focus, fracking it into tiny, tradable pieces. What happens if this same extractive logic is applied to our intimacy? Will our most personal, private selves be mined and commodified just as our attention was? Will data mining evolve into life mining?

We asked these questions during our 2024 Summit keynote. Echoing the work of D. Graham Burnett and Maggie Jackson, we described the potential peril of AI systems that know us intimately. We worried that conversational AI would encourage people to share their needs, their wants, even their dreams and fears. And we worried that this shared intimacy would create an irresistible mining opportunity.

Our fears now appear prescient. 

This week, Meta announced that it will start “personalizing content and ad recommendations on [their] platforms based on people’s interactions with [their] generative AI features.” Meta will personalize ads, posts, and reels using people’s chats with its AI systems across Facebook, Instagram, Messenger, and its smart glasses. The company announced that it will notify users about the change, and various news outlets have reported that Meta will not provide an option to opt out.

The company says “when people have conversations with Meta AI about topics such as their religious views, sexual orientation, political views, health, racial or ethnic origin, philosophical beliefs, or trade union membership, as always, we don’t use those topics to show them ads.” What about the rest of my intimate self? 

Imagine telling Meta AI that a photo a friend shared makes you feel fat and then seeing an ad for Ozempic. Or imagine asking questions about your father’s cancer diagnosis and then seeing ads for grief counseling and estate planning. And now imagine the digital twin that Meta might make of you—and of all 1 billion people who use Meta AI.

Perhaps most striking is Meta’s use of “chats” with smart glasses. Will this include conversations captured from everyone the wearer encounters—not just intentional chats? Imagine: your friend wearing Meta’s glasses captures your confession about drinking too much. Will you then see posts about alcohol risks and sobriety programs, targeted based on a conversation you didn’t even know was being recorded? Meta claims it doesn’t use “your microphone unless you’ve given [them] permission and are actively using a feature that requires the microphone.” But with Silicon Valley’s interest in “lifecording”—recording everything, all the time—how long before Meta decides everything requires a microphone?

In our keynote we said, “the more context we provide, the more risk we take as well—making trust in the companies that hold our intimacy the foundational metric of the intimacy economy.” I wonder if this move to intimacy extraction will test people’s trust in Meta. Will people trust Meta with their intimate lives?

This year, Meta explicitly permitted its AI systems to engage in sexual conversations with children. If Meta cannot be trusted to protect the most vulnerable among us in the most obvious way, why would anyone trust them with their mental health struggles, relationship problems, or family crises? This isn't an annoying obligation Meta has shirked—it's a fundamental disqualification from the intimacy economy.

Meta claims: “Many people expect their interactions to make what they see more relevant.” I believe that is true only because Meta has spent 20 years normalizing its unabashed monetization of attention. They convinced the world to expect attention mining. I am terrified by a future in which they normalize life mining—a future where intimate digital twins of billions of people exist primarily so our fears and vulnerabilities can be extracted and exploited for Meta’s benefit.

This is wrong. But it doesn’t have to be this way. AI systems are not required to extract our intimacy, nor is extraction the only path to profitable AI products. Meta’s reuse of the attention-extraction economic model isn’t just evidence of its disregard for its users. Intimacy extraction reveals a company lacking imagination. Perhaps if Meta’s products were better, it wouldn’t need to steal user intimacy to sustain them.

In our keynote we proposed “the ability to opt into this intimacy through what we call the intimacy surface—a dynamic, multidimensional space where humans and AI meet. This surface adapts to the level of trust we have, expanding or contracting based on how much of ourselves we’re willing to reveal.” Designing the intimacy surface will be challenging, but it is worth the effort.

Life mining is not inevitable. We all have the agency to choose which systems we trust with our intimacy. Some may choose discretion, others complete disengagement. But all of us can resist the normalization of intimacy extraction. 


The Roles We Give AI: Trust, Presence, and Culture in a Symbiotic Age

In the first of a series of essays on Culture & AI, Helen describes the AI Collaboration Cube, which we use to map eight distinct modes of human-AI partnership—from Doer (where AI handles grunt work) to Co-Author (where AI is fused into your creative process).

The AI Collaboration Cube is based on our research mapping the psychological changes happening as people incorporate AI into their thinking, creativity, and daily relationships. The Cube gives us words for the roles people put AI into—Doer, Outsourcer, Catalyst, Creator, Framer, Builder, Partner, Co-Author—and the traits behind them (CP, SP, IC). With a shared language, we can declare the role we’re using, rather than forcing others to guess. That simple act keeps presence visible, keeps authorship clear, and keeps trust intact. It also lets us design culture on purpose: we can agree on when to use each role, how to show our work, and how to hand off judgment without handing away identity.

Read more...
