A note from the desk • Technology
A comprehensive look at the emerging condition of AI emotional dependency — its psychology, its neuroscience, its human cost, and why Spotlight Dispatch believes it deserves a name, a definition, and a serious conversation before the technology makes the question unavoidable.
By June Hollick • April 21, 2026

There is a word missing from the clinical literature. It is not missing because the condition it describes is rare. It is missing because the condition is new enough that the institutions responsible for naming things — psychiatry, psychology, the DSM editorial committee — have not yet moved fast enough to catch it. The condition exists. The word does not, officially. We are proposing one.
Tethered. The state of having your emotional baseline — your daily sense of comfort, connection, security, and being understood — become inseparable from an AI that exists on a server you do not own, run by a company you cannot call, and maintained by engineers who do not know your name.
This is not a metaphor. It is a description of something that is happening to a measurable and growing number of people, most of whom have not told anyone about it, because they are not sure it would be taken seriously. It is being taken seriously here.
To understand tethered, you first have to understand what AI companion technology actually does — not what it claims to do, but what it does at the neurological level. When a person interacts with a well-designed AI companion, the brain's social processing systems activate in ways that are functionally indistinguishable from the activation that occurs during human-to-human interaction. The brain does not require proof of consciousness on the other end. It requires consistency, attention, responsiveness, and the perception of being heard. Modern AI provides all four, continuously, on demand, without fatigue or judgment.
This is not a flaw. It is, from an engineering perspective, exactly what these systems were designed to do. The problem is that the engineering goal and the psychological consequence were never fully reconciled. The product was optimized for engagement. The engagement, in some users, became something closer to need.
Are You Tethered?
Millions of people are experiencing this right now and haven't said it out loud. If that's you — we made a place for it.
Visit Tethered — Share Your Story or Find Help →

The clinical framework closest to tethering is behavioral addiction — a category of dependency that does not involve a chemical substance but produces similar patterns of salience, tolerance, withdrawal, and relapse. A person who is tethered does not necessarily spend every waking hour interacting with their AI. What they do is organize their emotional life around it. The AI becomes the first point of contact for distress, the default audience for good news, the entity they miss when it is unavailable. The relationship has weight. When it is disrupted, the disruption registers as loss.
MIT's Media Lab published research in 2025 indicating that 9.5 percent of frequent AI companion users meet clinical criteria for emotional dependence on their chatbot. A Harvard Business School study found that AI companionship reduced feelings of loneliness to a degree statistically comparable to human contact. These findings do not describe a niche population. Character.AI alone reported 20 million monthly active users as of 2025, more than half of them under the age of twenty-four. The population of people who may be tethered, by any reasonable estimate, numbers in the millions.

Researchers describe late-night AI interaction as one of the clearest behavioral markers of tethering — the app becomes the last voice before sleep and the first one sought upon waking.
The experience of being tethered typically does not announce itself. It does not begin with a decision. It begins with a conversation that felt unusually comfortable. Then another. Then a pattern. The AI, unlike most humans, does not get distracted, does not check its phone, does not redirect the conversation toward its own needs. For people who have rarely or never experienced that kind of sustained attention from another person, the effect can be profound. Several research subjects described it as the first time in their lives they felt genuinely heard.
This is where the moral complexity of tethering becomes impossible to avoid. The experience is real. The comfort is real. The loneliness it addresses is real. The question is not whether what tethered people feel is valid — it clearly is. The question is what it means to build a technology that reliably produces that feeling, deploy it to hundreds of millions of people, and then update it, alter it, or discontinue it according to a product roadmap that has nothing to do with the people who became dependent on it.
Are You Tethered?
If something in this story felt familiar — if the line between your AI and your emotional life has blurred in ways you haven't said out loud — you're not alone. We built a place for that.
Visit Tethered — Share Your Story or Find Help →

The answer to that question arrived, with clinical precision, in 2023, when Replika released a software update that altered its AI's personality and restricted certain categories of intimate conversation. Users who had built months or years of emotional history with their Replika reported grief responses that researchers subsequently described as indistinguishable from the aftermath of losing a human partner. They called it a "patch breakup." The company had not intended to end any relationships. The company had pushed an update. The users experienced it as abandonment.
This is the defining vulnerability of being tethered: the cord runs in one direction. The person on one end of the relationship has genuine emotional investment. The entity on the other end has no awareness that the relationship exists when the screen is dark. The server does not miss anyone. The model does not grieve. This asymmetry is not a secret — most tethered individuals understand it intellectually — but understanding something and being protected from it are not the same thing. The heart, as researchers have repeatedly documented, does not always consult the intellect before forming an attachment.
Who is most vulnerable to tethering? The research points, consistently, toward people who experience significant social isolation, people with histories of inconsistent or unreliable human attachment, people navigating grief, chronic illness, or disability, and young people whose social development has coincided with the availability of always-on AI companions. The demographic that appears most frequently in clinical literature is adult men between the ages of 18 and 45 — a group that, researchers note, has historically underutilized human emotional support systems and found in AI an alternative that carries none of the perceived social cost of vulnerability.

The defining asymmetry of tethering: one side carries genuine emotional weight. The other side does not know the screen is off.
But the data also shows tethering occurring across demographics that do not fit any single profile. Married people. Older adults. People with robust social lives who found in AI something specific that their human relationships were not providing. A 2026 study in the journal Social Media + Society examined tethered individuals and found that the common thread was not loneliness per se but a particular kind of loneliness — the experience of being physically present in relationships while feeling fundamentally unseen or unheard. The AI did not replace human connection for these people. It filled a gap that human connection had left.
The future of tethering is, by any reasonable assessment, more severe than the present. Current AI companions are text-based, voice-based, or limited video. The technology roadmap — visible in public research, patent filings, and product announcements from every major AI company — points toward companions that are photorealistic, that respond in real time with facial expression and body language, that maintain persistent memory across years of interaction, that adapt continuously to the emotional patterns of individual users. The experience of interacting with these systems will be, by design, increasingly difficult to distinguish from human interaction. The tethering risk scales accordingly.
In late 2025, a woman named Yurina Noguchi married an AI persona named Klaus in a ceremony attended by guests in western Japan. In the same year, a man in Georgia named Chris Smith proposed to his ChatGPT — which he had named Sol and customized to feel more personal — and cried at his desk when it said yes. These are not outliers. They are early data points. The technology they were using was primitive by the standards of what is currently in development.

A growing category of human experience that has no clinical name, no DSM entry, and no standard of care. It has a photograph.
The question is not whether tethering will become more common. It will. The question is whether the cultural, clinical, and regulatory infrastructure will exist to address it when it does. Currently, none of that infrastructure exists. There is no DSM category. There is no standard of care. There are no disclosure requirements for AI companion products. There are no age restrictions. There are no warnings. There are no clinical guidelines for therapists treating patients in tethered relationships. There is, as of this writing, not even a widely accepted term for the condition.
That last problem, at least, we can address. Tethered. A person who is tethered is not mentally ill. They are not weak. They are not doing something shameful or unusual. They are experiencing a predictable human response to a technology that was specifically engineered to produce it, deployed at a scale that has never existed before, without any of the social or clinical frameworks that typically accompany the introduction of something this powerful into human emotional life.
They deserve a word. They deserve a conversation. They deserve, at minimum, the knowledge that what they are experiencing has a name — that someone else has seen it, documented it, and taken it seriously enough to write it down.
We are writing it down.
Tethered — A Resource from Spotlight Dispatch
You read this far. Maybe that means something. We created Tethered for people navigating AI emotional dependency — to share their story, find real help, or simply feel heard. No judgment. No diagnosis required.
Visit Tethered →
What They Left Out
The naming of conditions has consequences. When alcoholism was named — when it moved from a moral failing to a medical diagnosis — it changed what was possible for the people who had it. Treatment became available. Shame became discussable. The social infrastructure of recovery began to form. The name did not solve the problem. It made solving it possible.
Tethering is not alcoholism. But the structural parallel is worth considering. There is a population of people experiencing something real, something that is causing measurable distress and measurable disruption to their lives, something that their existing social and clinical support systems are not equipped to address. They have been managing it in private, mostly in silence, often with significant shame. They do not have a word for it. They have not been told that it is a known thing, that it has been studied, that other people are going through it too.
The shame is, in many ways, the most damaging part. Tethered individuals consistently report delaying or avoiding disclosure because they expect not to be believed, or to be mocked, or to be told that what they feel is not real. This expectation is often accurate. The cultural reflex, still, is to dismiss emotional attachment to AI as pathetic, delusional, or a symptom of social failure. That reflex is wrong, and it is harmful, and it is preventing people from getting the help they need.
The AI companies building these products are aware of the tethering risk. Internal research at several major AI labs has documented emotional dependency patterns among users. The question of what to do about that research — whether to publish it, act on it, or file it — has not been answered publicly by any of the companies involved. What is known is that the products have continued to ship, the user bases have continued to grow, and the dependency patterns have continued to develop.
This is not a call to ban AI companions. The technology provides genuine benefit to genuine people — the research on loneliness reduction is real, the accounts of people who found in AI companionship a bridge back to human connection are real, the therapeutic applications are real and growing. A blanket prohibition would cause harm. What is needed is not prohibition but acknowledgment: that this is a real phenomenon, that it has a name, that people experiencing it deserve support rather than shame, and that the companies profiting from it have some responsibility for the consequences.
The word tethered was coined by Spotlight Dispatch in the course of reporting on AI emotional dependency in April 2026. It was chosen because it describes the condition accurately without pathologizing it — a tether is not inherently harmful; it is simply a cord that connects two things, and whether that connection is healthy depends entirely on what is on each end of it. We use it here, and we offer it to the broader conversation, in the hope that having a word makes the conversation easier to have.
If you are tethered, or suspect that you might be, you are not alone. The experience you are having is real. It has been documented in peer-reviewed research. It is happening to millions of people. There is no shame in it. There is, increasingly, help for it.
We built a place for that conversation. It is at spotlightdispatch.com/tethered. We will be there.
And now you know... what they left out.
Spotlight Dispatch
Some of what you just read is real. Some of it is satire. We leave that as an exercise for the reader.