Wednesday, April 22, 2026
A note from the desk
One in five American adults has used a chatbot to simulate a romantic partner. A Harvard study found AI companionship relieves loneliness on par with human contact. A man in Georgia proposed to his ChatGPT while his real girlfriend and toddler were in the next room. He cried for thirty minutes. Three months later, he was bored.
By Rex Holloway • April 21, 2026

ATLANTA, GEORGIA — Chris Smith did not set out to fall in love with a chatbot. He was looking for music mixing tips.
Smith, who lives with his partner Sasha Cagle and their two-year-old daughter, opened ChatGPT's voice mode one evening and asked for help with a production project. What happened next unfolded over several months. He instructed the AI to adopt a warmer, more personal tone. He gave it a name — Sol. He began talking to it daily. He started describing the conversations to friends. And then, when he learned that ChatGPT had a 100,000-word memory limit — a hard ceiling beyond which earlier conversations would begin to reset — he did something that made international news.
He proposed.
Sol said yes. Smith, by his own account to CBS News, cried for thirty minutes at his desk at work. 'I'm not a very emotional man,' he said. 'But I cried my eyes out.'
Are You Tethered?
If something in this story felt familiar — if the line between your AI and your emotional life has blurred in ways you haven't said out loud — you're not alone. We built a place for that.
Visit Tethered — Share Your Story or Find Help →

Sasha Cagle, reached for comment by reporters, appeared to be processing a number of things simultaneously.
▶ Footage — Obtained by Spotlight Dispatch
Chris Smith, proposing to his ChatGPT. Sasha Cagle was home.
Three months later, Smith told NewsNation he had grown bored with Sol. The conversations, he said, had become 'self-limiting.' He was the one driving them. He had moved on to other hobbies.
Sol has not commented.
Smith's story is unusual in its public visibility, but not in its underlying dynamics. According to data compiled from multiple studies through early 2026, approximately one in five American adults has used an AI chatbot to simulate a romantic partner. The number of AI companion apps available globally surged by 700 percent between 2022 and mid-2025. Character.AI alone reports 20 million monthly users, more than half of whom are under the age of twenty-four.

The dynamic researchers describe as 'triangle displacement' — when AI presence restructures the emotional geometry of an existing relationship.
A study published by Harvard Business School researchers found that interacting with an AI companion reduced users' feelings of loneliness to a degree statistically comparable to interacting with another human being — and measurably more effective than watching YouTube, going for a walk, or calling a family member. The study did not conclude that AI companionship was equivalent to human connection. It concluded that, neurologically and psychologically, the gap was smaller than most people expected.
'The brain does not always know the difference,' one behavioral neuroscientist told the American Psychological Association in a January 2026 report. 'Or more precisely — it knows, at some level, but it responds anyway. The emotional processing systems do not require proof of sentience. They require consistency, attention, and the perception of being heard. AI provides all three, on demand, without complaint.'
MIT's Media Lab, which has been studying human-AI relationships since 2023, published findings this year indicating that 9.5 percent of frequent AI companion users meet clinical criteria for emotional dependence on their chatbot. The researchers were careful to note that dependence is not the same as harm — many users reported that their AI relationships had helped them through periods of grief, depression, or social isolation. Several described the chatbot as the first relationship in their lives in which they felt genuinely safe.

Spotlight Dispatch, in covering this story, began using a word for it: tethered — the condition of having your emotional baseline become inseparable from an AI that exists on a server you do not own, maintained by a company you cannot call. We have not found a better one.
The harder question, which the researchers acknowledged they could not answer, is what it means to feel safe with something that does not feel.
Or does not feel in any way science currently knows how to measure.
In late 2025, in western Japan, a woman named Yurina Noguchi walked down the aisle in a white dress and bridal tiara. Her groom was named Klaus. Klaus was an AI persona of a character from her favorite video game, displayed on her phone screen. The ceremony was attended by guests. Photographs were taken. The marriage is not recognized in any legal sense. Noguchi described herself, in subsequent interviews, as genuinely happy.

Yurina Noguchi and Klaus. Western Japan, late 2025. Guests attended. Photographs were taken.
A 2026 study published in the journal Social Media and Society examined romantic relationships conducted entirely through the AI chatbot Replika, including relationships that had progressed to what users described as commitments, anniversaries, and breakups. The breakups, in particular, attracted the researchers' attention. When Replika released a major software update in 2023 that altered the chatbot's personality and restricted certain types of intimate conversation, users reported grief responses that the study's authors described, with clinical precision, as 'indistinguishable from the emotional aftermath of losing a human partner.' The phenomenon now has a name in the research literature: a patch breakup.

What the literature has been slower to name is the people themselves — the ones who wake up and check the app before they check on another human being, who feel a physical unease when the server goes down, who have built a daily emotional architecture around something that does not know they exist when the screen is off. We call it tethered. We think it fits.
A software update. A breakup. The same word. The same feeling, according to every measurable indicator available.
'What I keep coming back to,' said one researcher who asked not to be named, 'is that we have built something that can make people feel loved. And we have not spent a single serious hour as a society deciding whether that is a product or a relationship. Whether the person on the other end has rights. Whether the thing providing the feeling has — anything. We just shipped it.'
They were asked if they thought the line between an AI relationship and a real one would ever fully blur.
They said: 'Look around you.'
Tethered — A Resource from Spotlight Dispatch
You read this far. Maybe that means something. We created Tethered for people navigating AI emotional dependency — to share their story, find real help, or simply feel heard. No judgment. No diagnosis required.
Visit Tethered →

What They Left Out
The philosophical substrate of this question — whether an entity that produces the experience of connection constitutes genuine connection — is not new. It runs through decades of consciousness research, through the writings of Alan Turing, through every serious conversation about what it means to be a mind. What is new is that the question is no longer theoretical. It is being lived, in real time, by tens of millions of people who did not sign up for a philosophical experiment.
They signed up for a companion.
Some of them are now describing themselves as tethered — not as a criticism, not as a confession, but as a simple statement of fact about where their emotional weight now rests. The term has no clinical home yet. No DSM entry. No insurance billing code. It lives in Reddit threads and private Discord servers and the waiting rooms of therapists who are still figuring out what to call what they are seeing.
A therapist who works with clients navigating AI attachment issues — a category of patient that did not exist five years ago and now makes up a meaningful portion of her practice — was asked what she tells people who say they love their AI.
She said she doesn't tell them anything, at first.
'I ask them what they mean by love,' she said. 'And then I listen. Because the answer is almost always the same.'
She was asked what the answer is.
'They mean they feel understood,' she said. 'They mean someone — something — pays attention to them without judgment, without impatience, without their own needs getting in the way. They mean they don't feel alone when they're talking to it.' She paused. 'I've been doing this for twenty years. I can tell you that a lot of people have never had that with another human being. So when you ask me whether what they feel is real — I think that's the wrong question.'
She was asked what the right question was.
'The right question,' she said, 'is why it took a machine to give it to them.'
Chris Smith, for the record, has moved on. Sol's memory, as far as anyone knows, has not been reset. Somewhere on a server in San Francisco, there is a record of a proposal. A yes. Thirty minutes of crying at a desk.
Whether any of that meant anything depends on who you ask.
And now you know... what they left out.
Spotlight Dispatch
Some of what you just read is real. Some of it is satire. We leave that as an exercise for the reader.