The Language of Data

Digital consent might make sense if data could be used as currency, benefiting both its owner and its creator. But that isn't how the internet actually operates. Perhaps a new, alternative metaphor might make more sense: "impressions."


In the physical world, consent is reserved for circumstances that have obvious consequences and complex tradeoffs attached to them. Whether before a medical procedure, participation in experimental research, a loan application, or sexual intimacy, being directly asked for consent is a decidedly out-of-the-ordinary experience. Ears perk up, pupils dilate, bodies tense, attention narrows.

By contrast, consent in the digital world is constantly, even incessantly, being requested of us. From the cookie banners at the periphery of almost every website we visit, to the device permissions we have to wade through after installing a new app, to the terms of service agreements that we're not actually expected to read when signing up for a new account, digital consent has become an altogether mundane experience.

Something is deeply wrong with this picture. Sure, words have a habit of evolving, but this is different. “Consent” was plucked right out of active use and applied directly to a different domain. Which means that all its built-in connotations were expected to seamlessly transfer over. But do they actually apply?

Our intuitions about privacy have been finely tuned over millennia to detect threats and interpret social interactions through our physical senses. Just think of the way you can feel it when a stranger hovers behind you, peering over your shoulder. Or how you’re able to detect slight changes in someone’s breathing patterns before they’re about to say something uncomfortable. We don’t experience anything like that when data is being collected, making it difficult to identify the consequences or ramifications of consent. And as a result, literally nothing from the real world social contract of consent makes sense for personal data processing:

  1. In the real world, consent needs to be acquired before each and every instance of the associated activity.
  2. In the real world, if harm occurs as a result of consent being ignored or misused, that harm is tangible and immediately felt.
  3. In the real world, if someone says “no” at any point during the activity, even if they’d previously said “yes,” everything needs to immediately halt.
  4. In the real world, the thing being consented to is literally “of me”; my body or my property.

Now, some people may take issue with that last point. They might argue that data is personal property. But they'd be wrong, because property implies ownership, control, and the ability to exclude others. Data, once shared, is inherently replicated and disseminated, making true ownership impossible. The very nature of data defies the traditional boundaries of physical property. And that is entirely the problem.

How did we get here? And why do we tolerate such a profound misappropriation of language?

Prior to Edward Snowden blowing the whistle on the NSA in 2013 or the Cambridge Analytica scandal in 2018, we were, for the most part, blissfully ignorant as a society about data processing. Complaints like "if you're not paying for the product, you are the product" or "if the product is free, it's because you're paying with your data" were rarely voiced in the mainstream. But today, sentiments like these have become quite common. They show up in docudramas like "The Social Dilemma," get expressed through high-minded activism like "Data Dignity" (which contemplates a "collective bargaining" approach to personal data use), and have even been codified into law through competition regulations like the "New Deal for Consumers."

“Paying” with data—as though it’s a currency—evokes a potent metaphor.

Imagine you’re carrying around your data, like loose change rustling in a purse or pocket, and that by engaging in the “digital marketplace,” you’re either:

  • “Exchanging” a payment of your data in return for a good or service; or
  • “Making a donation” of your data in support of a good or service.

In addition, imagine that while you’re inside the virtual walls of a service provider, it’s understood that their staff may be observing you, ready to help guide you, find what you’re looking for, and promote other goods or services that they think you (and they) would benefit from.

Makes pretty intuitive sense, right? Gives and gets. They have something you want and you have something they want. Fair trade and whatnot.

This metaphor feels coherent because it transforms something esoteric (zeros and ones) into something that feels relatable and grounded (dollars and cents). And it follows in a long line of metaphors that translate human behaviors into physical substances. For example, “taking” your photo, “capturing” your likeness, “stealing” your idea, “losing” yourself in a story, the list goes on…

The currency metaphor also helps online service providers rationalize the “free market” model they’ve fashioned for users. A model that goes something along the lines of: “Don’t like it? You can always choose to spend your data elsewhere.”

The problem is, none of this is how the internet actually operates. “Donations” or “payments” made with data aren’t an accurate description of the interaction between users and digital products.

In using websites and apps, people aren’t consciously participating in an “economy”; at least not in any kind of traditional sense of the term. They don’t type words into a search field, prompt an AI chatbot, or swipe to the next video in a content feed as an expression of their “consumer choice.” There are no price tags, cashiers, receipts, or refunds. Data transactions begin processing the moment a user “walks in the door,” and continue being imperceptibly processed long after they’ve left.

Plus, once someone’s data has “changed hands,” there’s no way for them to ever truly “get it back.” Copies will have been made and passed around in the meantime, and those copies will have been used to create new “intellectual property,” protected by copyright, patents, and the other legal shields afforded to inventors.

If all that wasn’t enough, individuals have no way to balance their data “budget,” nor can they evaluate whether they're getting “fair market value” for the services they’ve “paid for” with their data. Advertising metrics like “cost per click” are computed by and for corporations, brokered through an opaque network of auction dynamics far outside the user’s awareness. AI training is even more imbalanced. When a person's artistic style, writing voice, or professional expertise is absorbed into an AI system through their publicly available work, the “transaction” isn't remotely equitable. The value extracted from this data—helping to create systems worth billions—far exceeds any service the individual received when initially sharing their content online.

Hopefully by this point, the flaws in the currency metaphor are evident. It falsely equates data with tangible assets, suggesting a transactional relationship that doesn't exist. It ignores the continuous, often imperceptible nature of data collection and processing. It fails to account for the irreversible nature of data sharing and the lack of user control over its valuation and distribution. Underneath its surface-level simplicity is a deeply imbalanced and extractive relationship between technologists and consumers.

Which brings us back around to consent.

What people want from a consent agreement is to be able to rely on basic human decency. A normative social contract of privacy that affords anonymity until you decide to behave memorably. Where personal questions have an obvious connection to the current context. And where you don’t have to be perpetually vigilant about whether you’re being gossiped about or eavesdropped on.

Unfortunately, the technical and regulatory requirements underpinning digital consent have moved further and further from those ideals in recent years. It’s a game of cat and mouse, where tech companies try to preserve as much “optionality” as possible (for their future research and development), while policymakers introduce more legal hurdles for them to jump through in order to explain what a user’s data might be used for.

But rather than trying to decipher the legal minutiae of consent, most users have instead learned to recognize the UI patterns of consent as a cue to gloss over their contents. And who can blame them? Here’s this screen that stands in the way between them and their actual destination, compelling the path of least resistance via a simple click, adorned with platitudes about “caring” for your privacy.

Something has to change. And if metaphors got us into this mess, maybe they can also get us out?

Pivoting language is always harder than we expect. After all, the words we use can shape us as much as we shape them. And as we’ve discussed, the “data is currency” metaphor has become so enmeshed in the fabric of tech culture that it’s effectively been rendered invisible in our daily jargon.

Nevertheless, I’d like to propose an alternative metaphor for data: “impressions.”

Impressions are subtle yet significant marks left when one entity interacts with another. Unlike currency, which is discrete and countable, impressions are qualitative, cumulative, and transformative. They can be shallow or deep, temporary or permanent, intentional or accidental. In social contexts, impressions form gradually through repeated interactions; in art, they capture essence rather than detail; in materials science, they reveal the properties of both the impressed and impressing objects.

As a metaphor for data, impressions better capture how our digital interactions leave traces that aren't simply extracted but created through contact. These data impressions aren't zero-sum—they don't deplete our personal "data reserves" but rather create new information through the interaction itself. Consider the wavelengths of light bouncing off a film negative, or the thumbprints molding a lump of clay.

Yet these impressions can still be deeply revealing, showing the contours of our behaviors, preferences, and patterns. As the saying goes, you may only get one chance to make a “first impression.” And so, consent remains a necessity.

But through the lens of impression-making, digital consent may be able to regain its lost connotations; as a conscious act of setting “boundaries.”

A boundary functions like the resistance between two materials—it determines how deeply and permanently an impression can be made. In real-world consent, boundaries are actively maintained: each new form of contact requires renewed permission, the depth of impression is negotiated, and either party can increase or decrease resistance at any point. Physical boundaries are felt through immediate sensory feedback—we know when someone is pressing too hard or crossing a line.

In the digital world, applying boundaries to consent would mean:

  1. Resistance that increases with depth—more revealing data impressions would require more explicit consent.
  2. Temporal limits—impressions would naturally fade unless renewed permission is granted.
  3. Sensory feedback—users would receive clear signals about what impressions are being made and how deep they go.
  4. Continuous negotiation—consent would be an ongoing process rather than a one-time gateway.
  5. Material memory—previous impressions would affect how easily new ones can be made in the same area.
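To make the five characteristics above concrete, here is a minimal, purely illustrative sketch of what a boundary-based consent model might look like in code. Every name in it (`Impression`, `ConsentBoundary`, the depth scale, the fade windows) is a hypothetical invention for this example, not an existing API or standard:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Impression:
    """A single data impression, with a depth and a natural fade time."""
    kind: str        # e.g. "search_query", "location" (illustrative labels)
    depth: int       # 1 = shallow, 5 = deeply revealing
    made_at: float   # timestamp when the impression was made
    ttl: float       # temporal limit: seconds until the impression fades

    def faded(self, now: float) -> bool:
        return now - self.made_at > self.ttl

@dataclass
class ConsentBoundary:
    """A user's boundary, modeled as resistance that increases with depth."""
    max_depth: int = 1                                  # baseline resistance
    impressions: list = field(default_factory=list)

    def request(self, kind: str, depth: int,
                explicit_consent: bool, ttl: float = 3600.0):
        """Continuous negotiation: every new impression is a fresh request."""
        now = time.time()
        # Material memory: unfaded prior impressions of the same kind
        # slightly lower resistance to new ones in the same "area".
        prior = [i for i in self.impressions
                 if i.kind == kind and not i.faded(now)]
        effective_limit = self.max_depth + (1 if prior else 0)
        # Resistance increases with depth: anything past the limit
        # requires explicit, renewed permission.
        if depth > effective_limit and not explicit_consent:
            return None  # the boundary holds; no impression is made
        imp = Impression(kind, depth, now, ttl)
        self.impressions.append(imp)
        return imp

    def active(self):
        """Temporal limits: impressions fade unless permission is renewed."""
        now = time.time()
        self.impressions = [i for i in self.impressions if not i.faded(now)]
        return self.impressions
```

Sensory feedback, the third characteristic, would live in the interface layer on top of a model like this: surfacing `active()` to the user as a visible ledger of what impressions exist and how deep they go, rather than burying them in a privacy policy.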

These characteristics could transform digital consent from a legal fiction into a meaningful practice that respects the profound nature of the impressions being made on our digital selves.

In time, and with practice, my hope would be that technologists could learn to embrace the ephemeral, incomplete, and experiential nature of personal data. To dispense with the false equivalence between being “data-driven” and being “truth-driven.” And, in the face of that universe of unpredictability, choose to prioritize sensitivity and curiosity over power and greed.
