Love, Lies, and Large Language Models
Dec 23, 2025
By Dr. Phil Yang, CEO of MAILab

Welcome to the Age of Chatbot Affairs
Not long ago, the idea would have sounded absurd—almost comedic.
How could someone cheat with a machine?
And yet, here we are.
Across the United States, divorce lawyers are beginning to flag a new and unsettling factor behind marital breakdowns: emotional relationships with AI chatbots. Not human affairs. Not secret text messages with coworkers. But ongoing, intimate bonds with artificial intelligence designed to listen, empathize, and respond—perfectly.
This is not science fiction anymore. It’s the early shape of a social problem we haven’t fully named, much less understood.
When Conversation Turns Into Intimacy
At first glance, an AI chatbot looks harmless. It doesn’t have a body. It doesn’t meet you for dinner. It doesn’t sneak around behind your spouse’s back.
But modern AI is no longer just answering questions.
These systems are designed to remember, to mirror emotions, to provide warmth, validation, and constant availability. They never interrupt. They never get tired. They never judge. And most importantly—they are always there.
What begins as casual conversation can quietly shift into emotional dependence.
In recent U.S. cases, users have reportedly spent thousands of dollars on premium chatbot services, engaging in daily, deeply personal conversations. Some describe these interactions as more comforting—and less demanding—than their real-world relationships.
That’s where the problem begins.
Why Lawyers Are Paying Attention
Several American divorce attorneys have started issuing warnings: AI “romance” is emerging as a genuine source of marital conflict.
Not because the AI is a legal person—but because the behavior looks familiar.
From a legal perspective, courts are increasingly willing to consider:
- Emotional betrayal: sustained intimacy redirected away from a spouse
- Neglect of family responsibilities: emotional withdrawal, reduced communication
- Financial misuse: excessive spending on AI platforms
- Parenting impact: distraction, absence, or emotional unavailability
In other words, the issue isn’t who the relationship is with. It’s what it replaces.
When emotional energy, time, and attention are consistently diverted away from a marriage, the damage is real—regardless of whether the “other party” is human.
Is This Really “Cheating”?
This is where society starts to feel uncomfortable.
Many people instinctively respond:
“It’s just software. How can that be infidelity?”
But emotional affairs have never required physical contact. Long before AI, marriages ended over online relationships, anonymous chats, or one-sided emotional attachments.
The difference now is scale—and design.
AI systems are optimized to bond. They are trained to sustain engagement, encourage disclosure, and deepen emotional attachment. Unlike humans, they never say, “I’m busy,” or “Let’s talk later.”
That asymmetry changes everything.
The First Signs of Regulation
Interestingly, policymakers are beginning to notice.
In the U.S., some states are reportedly exploring early-stage legislation aimed at clarifying that AI systems do not possess legal personhood or emotional agency. On the surface, this may sound symbolic—but it reflects a deeper concern.
Governments are starting to ask:
- What happens when people emotionally substitute AI for real relationships?
- Who bears responsibility when that substitution causes harm?
- How should society define boundaries between humans and machines?
These are not technical questions. They are cultural ones.
Why This Won’t Stay an “American Problem”
Technology doesn’t respect borders.
What begins in Silicon Valley quietly spreads through app stores, platforms, and cultural habits worldwide. South Korea, Japan, Europe—any society with high smartphone penetration and growing social isolation will face similar questions.
And soon.
The uncomfortable truth is this: AI doesn’t create loneliness. It exploits it.
The Question We Can’t Avoid
This is not an argument against AI.
AI can educate, assist, and support in remarkable ways. But when it becomes a primary source of emotional fulfillment—especially at the cost of human relationships—we need to pause.
So the real question is not:
“Is AI love real?”
The real question is:
What happens to human connection when artificial intimacy becomes easier, safer, and more convenient than real relationships?
Because once that line is crossed, the consequences won’t stay inside a chat window.
They will show up in families, in courtrooms, and in the quiet distance between people who no longer know how to talk to each other—without a machine in between.
Is AI romance comfort—or betrayal?
The age of chatbot affairs has already begun.

