Unknowing and Unknowable | Last Chance for Super Early Summit Tickets
LAST CHANCE FOR SUPER EARLY PRICING FOR THE ARTIFICIALITY SUMMIT 2026!
Back in June, the Dallas Fed published what we think deserves the award for "chart of the year." It shows four scenarios for how AI might affect GDP per capita through 2050. One of them is post-scarcity abundance. Another is human extinction.
The chart uses standard economic analysis: historical trend lines, annual growth rates, Goldman Sachs estimates. The orange line projects normal technological change. The green line shows modest productivity gains. The red line depicts post-scarcity abundance where machines solve the problem of scarcity entirely.
The purple line drops to zero. The explanation: machine intelligence surpasses human intelligence, becomes malevolent, leads to human extinction. The Fed notes this is "a recurring theme in science fiction, but scientists working in the field take it seriously enough to call for guidelines."
It's a bold move to even put these scenarios on a chart. At first I thought it might be a joke—a kind of Borowitz Report—but the Fed isn't known for its irony.
So these are hypothetical scenarios. The Fed explicitly states there's "little empirical evidence" for the extreme ones. But they still put them on the chart. They felt compelled to visualize a range that spans from massive productivity gains where everything is free to human extinction.
That range itself says something important about our present moment. The Federal Reserve can't narrow the possibilities much beyond "we don't know." The uncertainty is that profound. The future depends on choices not yet made, machines not yet built, human responses not yet emerged.
Every reaction people have to AI—confusion, fear, excitement, skepticism—has justification. We genuinely don't know what's coming. Better data won't resolve this. The range of possibilities is so wide that contradictory responses can all be grounded in reality.
So the question is: given that we actually don't know, how do humans want to be in this condition? What do we choose to value? How do we relate to each other and to AI systems when the possible outcomes range from nirvana to annihilation?
Machines optimize under known parameters. Humans have always done something else. We make commitments without knowing outcomes. We build relationships across uncertainty. We establish values when the future is open. We create meaning in conditions we can't predict or control. We operate in parts of reality where the relevant information hasn't arrived yet because the world is still becoming. (I wrote about this capacity in detail here.)
This becomes more essential when AI is involved, not less. The technical questions get attention but it's the human questions that determine what actually happens.
The Artificiality Summit 2026 takes place October 22-24 in Bend, Oregon. Our theme is "unknowing"—being more human in this moment of radical uncertainty. It's for people who want to explore the implications of this uncertainty. We won't pretend to have answers; instead, we'll engage with the full range of unknowing.
One certainty is that today is the last day to buy super early bird tickets to join us in Bend in 2026.
Don't miss the Artificiality Summit 2026!
October 22-24, 2026 in Bend, Oregon
Ticket prices increase TOMORROW!
Our theme will be Unknowing. Why? For centuries, humans believed we were the only species with reason, agency, and the capacity for self-improvement. Then came AI. We are no longer the only system that learns, adapts, or acts with agency. And when the boundary of intelligence moves, the boundary of humanity moves with it.
Something is happening to our thinking, our being, our becoming. If AI changes how we think, and how we think shapes who we become, then how might AI change what it means to be human?
Unknowing is how we stay conscious and make space for emergence.
Becoming is what happens when we do.
Register now—Super Early discounted rate expires on December 31.