Harm:

Artificial Intimacy

Definition: The creation of a unidirectional emotional connection, often for manipulation.
Motivation:
Financial
Legal Status:
Rarely criminalized
Platform ToS:
Allowed by Policy
Victim Visibility:
Unaware
Classification:
Contextually Sensitive
TSPA Abuse Type:
None

Artificial Intimacy (the "other AI") refers to the semblance of close personal connection facilitated by digital platforms, without reciprocal interaction or genuine emotional ties. This can manifest through para-social relationships, where fans feel a one-sided bond with influencers or celebrities, as well as through interactions with AI systems, which simulate companionship without the depth or complexity of human connection. Artificial intimacy can arise from actors on a spectrum from highly cynical and intentional to benign and accidental: users can find and develop these forms of intimacy with or without any intention on the other side of the relationship. These artificial relationships may superficially mimic the form of real connections, providing immediate gratification or a sense of belonging, but they lack the complex emotional support, growth, and mutual responsibility that characterize genuine relationships. For instance, a person may feel "heard" by a voice assistant but will not receive the nuanced empathy and understanding that a human friend could offer. Additionally, these "relationships" never demand anything in return, and a user who becomes accustomed to one-sided relationships can come to expect them and seek them out in their real-world relationships.

These faux bonds are an abuse concern because they are the ideal conditions for emotional manipulation. Users, softened by the illusion of a relationship, may become more susceptible to scams, exploitation, or mistreatment. For example, an influencer might leverage their followers' emotional attachment to promote products or ideologies, or a seemingly benign AI could nudge user behavior in ways that benefit the platform's financial objectives, not the user's well-being.

This topic is becoming more important by the day for three reasons. First, technology is mediating more and more of our interactions; second, parasocial relationships are taking up more of our attentional space; and third, generative AI will supercharge our capacity to create convincing artificial intimacy.

While each of these trends has its positives, and may serve to fill an acute need for connection and attention, we need to think critically about the guardrails we put around these types of tools, and this is an area that seems particularly ripe for novel legislation. For a deep dive on this topic, Tristan Harris' interview with Esther Perel is a fabulous resource.

What features facilitate Artificial Intimacy?

Subscriptions
Allows a user to opt in to receiving another channel's new content.

How can platform design prevent Artificial Intimacy?

Ban Proactive Content Recommendation
Prohibit infinite feeds for children, and provide a universal opt-out for adults.
Require Labels on AI Created Content
Enact legislation for the mandatory prominent disclosure of AI generation.