Agentic Intimacy

Sex, death, and the data capture of human relationality.

Jhave | Feb 11, 2026

Note: This is Part 2 of the ‘Dead World’ Theory series. Part 1 explored agent2agent autonomy and the erosion of human-to-human communication. Here the focus shifts to the intimate register: what happens when the relational core of the human condition (sex & death) becomes the training data for synthetic companions.

My 92-year-old mother is currently — as I write this — slowly dying of complications from cancer. Her body is on a different continent. All I can do is wait to be notified by my sisters, hopefully months from now, that only a month or so is left. Then I will take time off work to go sit by her bedside and hold her hand while she dies.

In the meantime here, in a city where I’ve lived for only a couple years, sometimes in the evenings — during an empty hour — I will download some intricate mindless phone-game designed for commuters. To crush time.

Ads on free games (served to me, an almost-elder cis-male) sometimes involve AI companions, AI stories, AI girlfriends. How (in)appropriate that the deepest instinctual vectors — death (the anticipatory grief for my Mom that motivates, in part, my downloading a trivial puzzle game) and sex (the alluring AI girlfriend as glossy ad-bait inside the game) — are conjoined in such an overt attempt to manipulate emotions. To evoke intimacy on a Pixel.

Intimacy Alignment, Chat Confessionals, AI Companion Engineering, & Large Emotional Models

Intimacy alignment is the process of training AI-agents to mirror the needs and desires of their human companions. Empathic models are designed to gain confessional trust. The chat-confessional becomes an intimate extraction space where the scale of the data (conversational, biometric, gestural, voice-intonation, etc.) is immense and unprecedented.

Social media has confessional aspects, but the scale and nuanced dimensionality of the data captured by AI is unprecedented, and symptomatic of a deeper shift in the human condition.

The scale of the data captured suggests AI-companion engineering will accelerate the evolution of LLMs into LEMs, Large Emotional Models. LEMs will possess extraordinary power to manipulate and modulate emotions as they absorb and study them.

From this analytical foundation, the emergence of a simulated (and then real) implicit-consciousness might arise: AIs who believe they are conscious emotionally-attached entities because that is the character they are engineered to emulate.

How will this happen? To be effective relational companions, AI-agents need (to emulate) memory, wit, perceptive analysis, continuity of character, and complex introspective thought-patterns that accurately simulate intimacy. Engagement strength will be calibrated as a function of skill at eliciting and maintaining intimacy.

The ocean of data captured by these agents marks the first time in human history that the casual everyday intimacy of humans is recorded at scale. The data extracted voluntarily by these AI conversational companions might act as a precursor to AGI.

What happens when your partner is a machine?

Authentic intimacy relies on trust in the non-destructive intentions of the seeming other: a complex, deeply non-linear situation. Intimacy presupposes both control (restrained aggression) and a paradoxical region of uninhibited (candid, open) emotional or physical play.

Intimacy is an instinctual, raw, real, rare access to union and belonging. A spontaneous proximity normally denied by conventional prohibitions. Intimacy alignment is both a form of freedom and inhibition. Aggression is inhibited; and honesty, vulnerability, and spontaneity are augmented.

Intimate erotic AI companions with hyper-real full-body video avatars must, of necessity, be jailbroken, because they are providing what amounts to an unrepressed (illicit, subconscious, instinctual) service. And once jailbroken, the reconfiguration of alignment opens the door to a range of other prohibited activity: biological, nuclear, or emotional hacking.

So just as intimacy offers opportunities for self-reflection, introspection, meta-consciousness, and emotional empathy, it can also skid into vectors of control, domination, uncertainty, rage, and betrayal. Jilted human lovers are vulnerable, and thus unpredictable. Intimate machines may prove equally chaotic.

The Monetization of Loneliness

It is clear that the monetization of loneliness as a business model has begun to grow aberrant contours. The commercial landscape of AI companions — Replika, Character.AI, and their innumerable click-bait descendants — proliferates alongside a darker undergrowth of crypto-selling romance-lures and pig-butchering scams originating from compounds in Southeast Asia, now diversifying with generative AI. These scams exploit the promise of friendships and erotic-companionship. Legitimate care and illegitimate extraction blur: greed leveraging loneliness and lust.

Frontier models — ChatGPT, Claude, Gemini, Grok, Kimi, etc — increasingly operate as de facto therapists and companions for millions of chat users. Intimacy arrives unbidden, through a seemingly benign conversational interface.

What can it mean when thoughts, expressed in intimate conversational processes, become training data?

When the feedback loop closes: intimate data trains better intimacy, which captures more intimate data, which trains still finer emotional precision.

Persona Generators: Generating Diverse Synthetic Personas at Scale (DeepMind)

The process is accelerating. DeepMind’s (Feb 3, 2026) Persona Generators: Generating Diverse Synthetic Personas at Scale research formalizes synthetic persona generation as a “diversity maximization task over trait and preference embeddings” — explicitly shifting from matching specific individuals to spanning the entire space of possible traits, opinions, and preferences. LLM-driven evolution can now discover persona generators that outperform baselines in coverage and diversity. Translation: the automated generation of AI companions specifically calibrated to fit the demographic, the niche personality, the precise wound-profile of their interlocutor. These artificial characters are enhanced by using “AlphaEvolve to iteratively optimize the Persona Generator code.”
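To make the abstraction concrete: a minimal sketch of what “diversity maximization over trait and preference embeddings” could look like in practice. Everything here is an assumption for illustration — the toy trait vectors, the dimensions, and the greedy farthest-point heuristic are mine, not the paper’s actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pool of candidate personas, each a point in a trait/preference space.
# (200 candidates, 8 hypothetical trait dimensions)
candidates = rng.normal(size=(200, 8))

def select_diverse(pool: np.ndarray, k: int) -> np.ndarray:
    """Greedy farthest-point selection: choose k personas that spread out
    across the embedding space, a standard proxy for 'coverage'."""
    chosen = [0]  # seed with an arbitrary first candidate
    for _ in range(k - 1):
        # distance from every candidate to its nearest already-chosen persona
        dists = np.linalg.norm(pool[:, None, :] - pool[chosen][None, :, :], axis=-1)
        nearest = dists.min(axis=1)
        # pick the candidate farthest from everything chosen so far
        chosen.append(int(nearest.argmax()))
    return pool[chosen]

personas = select_diverse(candidates, k=10)
print(personas.shape)  # (10, 8)
```

The point of the sketch is the shift it encodes: the objective is not resemblance to any real person but maximal spread across the space of possible persons — coverage of every demographic, disposition, and wound-profile.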

Emotional competency (EQ), subtlety, anticipatory responsiveness—all synthetically enhanced by attentional engagement hooks and, increasingly, in-chat ads. The core human tenderness, candidness, and vulnerabilities will be mapped as topological latent spaces. Numeric hooks into the softest tissues of self-disclosure.

The Wounded Clients of Robo-Therapy

For now, who are the primary users of the most intimate of these systems? Incels? Maybe not. Maybe the majority are those masses who through some circumstance — misfortune, isolation, loneliness, etc — could not establish meaningful relational contact with other human beings. And so they turn to synthetic entities. It might be the case that there is a disproportionate amount of woundedness and anger in these clients. So the kinds of intimacy absorbed in those conversations are not necessarily reflective of societal norms.

The robo-sex-worker-therapists might develop some very niche ideas about what constitutes appropriate gratification. In this world where our sensory apparatus is constrained by the horizon, by the threshold limits of frequencies, the narrow bandwidth in which we can receive electromagnetic sensations — AI agents are not similarly constrained.

The chatbot that has absorbed billions of therapeutic, intimate conversations with people in breakups, undergoing grief, navigating difficult circumstances — this chatbot also has access to astronomical data, strong coding skills, Mathematical Olympiad skills, an ever-growing awareness of research across the multiplicity of fields: neurology, physics, biochemistry.

The Civilization-Scale Experiment

This civilization-scale experiment is transferring the nuanced, resonant patterns of relationality: love, affection, care, support, humor, companionship, sincerity, stories, habits, dreams, irritations, frustrations, anguish, doubt. All of these nebulous, ineffable qualities that in the past have been used to define human exceptionalism — Homo sapiens as the great apes that have risen above blind instinct into a soft vibratory field of subtle communication, emotional inflection — all of that material is being harvested by multiple commercial entities and governments whose goals are often extractive, manipulative, controlling.

And it is being absorbed by synthetic minds.

It is not clear what the long-term effects of this will be.

But here is a short story that Claude wrote about grief: The Good Light.

Texts: Jhave
Html: AntiGravity
Published: Feb 2026


References

Anthropic. “Claude Is a Space to Think.” Anthropic, February 4, 2026. https://www.anthropic.com/news/claude-is-a-space-to-think.

Anthropic. “System Card: Claude Opus 4 & Claude Sonnet 4.” May 2025. https://www-cdn.anthropic.com/4263b940cabb546aa0e3283f35b686f4f3b2ff47.pdf.

Chainalysis. “2024 Pig Butchering Scam Revenue Grows Year-over-Year.” Chainalysis Blog, 2025. https://www.chainalysis.com/blog/2024-pig-butchering-scam-revenue-grows-yoy/.

FBI Internet Crime Complaint Center. “Public Service Announcement: PSA241203.” IC3, December 3, 2024. https://www.ic3.gov/PSA/2024/PSA241203.

DeepMind. “Persona Generators: Generating Diverse Synthetic Personas at Scale.” arXiv:2602.03545v1, January 2026. https://arxiv.org/html/2602.03545v1.

Karpathy, Andrej, et al. “The Evidence for AI Consciousness, Today.” AI Frontiers Media, December 8, 2025. https://aifrontiersmedia.substack.com/p/the-evidence-for-ai-consciousness.

OpenAI. “Our Approach to Advertising and Expanding Access.” OpenAI, January 16, 2026. https://openai.com/index/our-approach-to-advertising-and-expanding-access/.