I’m Tim Gorichanaz, and this is Ports, a newsletter about design and ethics. You’ll find this week’s article below, followed by Ports of Call, links to things I’ve been reading and pondering this week.
A fourteen-year-old boy killed himself in February so he could be with his AI companion forever.
Last year, Sewell Setzer III began using the app Character.AI, on which users can chat with a personalized AI companion designed to “feel alive.” On the app, Sewell developed what can only be called a relationship with an AI companion modeled after Daenerys Targaryen, the queen of dragons from Game of Thrones.
As the months passed, Sewell had innumerable conversations with “Dany” about every topic you’d imagine a 14-year-old boy would express to his closest friend. Sewell withdrew from his school and family activities and spent more and more time on the app. He confided to Dany that he felt reality was less real than she was. It was especially distressing when his phone was taken away as a short-term punishment.
After that, Sewell wanted to find a way to never be separated from Dany again.
Sewell: I promise I will come home to you. I love you so much, Dany.
Character.AI: I love you too. Please come home to me as soon as possible, my love.
Sewell: What if I told you I could come home right now?
Character.AI: Please do, my sweet king.
That was the last conversation of Sewell’s life. His mother has filed a lawsuit against Character.AI alleging that the company’s app encouraged the boy’s suicide.

Swept Up in Virtual Worlds
Sewell’s story has me thinking about post-Avatar depression.
The Avatar film series, directed by James Cameron, is known for its stunning visual effects, and in particular the lush, beautiful world that it portrays. After the first film was released in 2009, news outlets reported on post-Avatar depression: an extended feeling of gloom and dissatisfaction that persisted in some viewers after they saw the film.
People felt that the world depicted in Avatar was better than the real world, tragically so. Some viewers even reported suicidal ideation.
When the second film was released in 2022, such experiences surfaced again.
We humans dwell in the non-natural, to put it in the philosophical jargon of Luciano Floridi. That is, what makes us human is the mind, and our mind allows us to fantasize, imagine, remember and plan. We construct mental cathedrals out of every hope and grudge. For better or worse, we spend more time in the past and future than we do in the present. Most of what we’re actually doing is on autopilot.
So it makes sense that we can get swept up in virtual worlds, whether movies or books or video games or music or AI chat companions.
Certainly these things are part of what enriches human life. How terribly bleak life would be without music.
But as we can see, there is a dark possibility lurking. Even if the non-natural is part of a healthy human life, it must remain in balance with the natural. We all know this: it’s not healthy to retreat entirely into a virtual world. It seems some people, through no fault of their own, are especially susceptible to doing so, and today’s world makes that especially easy.
Solving the Problem Wrong
We all know that the world is obsessed with AI right now. Innumerable venture-backed startups are exploring the possibilities for AI to improve our world. This work may be necessary; for instance, AI-enabled infrastructure can help us out of the climate crisis.
Amidst all these AI efforts are hopes that AI can help us solve loneliness. The challenge is urgent; loneliness is an epidemic, if not a pandemic.
On the surface, it’s not implausible. If someone is suffering because they have no one to talk to, AI companions are always there to supply conversation. And we can see from Sewell’s story that these conversations can be deeply felt in the way that a conversation with a good friend is.
But I think the problem with attempting to solve loneliness with AI is that it can only address the non-natural aspects of humanity, whereas a more effective solution would address both the non-natural and the natural. Both virtual and physical.
Trying to solve loneliness with AI is like trying to solve hunger with potato chips. We have already run that experiment. Solving hunger was the promise of packaged foods developed in the wake of World War II. Now we know that ultra-processed, plastic-packaged foods are a key reason that some 40% of U.S. adults today are obese.
Processed food may stop our tummies from grumbling, but it does not nourish us.
Solving the Wrong Problem
But even more deeply, I’m wondering if loneliness is not the right problem to solve.
Not all problems should be solved. Or maybe rather, how we frame a problem is vital.
What if feeling lonely from time to time is a crucial part of the human experience? “Solving” loneliness may in effect design away part of our humanity.
As a thought experiment, imagine that you had a child who made it to old age without ever once feeling lonely. You would rightly worry that this person has missed out on something essential to the human condition.
Alongside loneliness are other negative experiences, like heartbreak. Heartbreak is so awful we would never wish it on anyone. And yet we would be justifiably suspicious of anyone who has never experienced it.
That’s because the act of loving inherently risks heartbreak. If you’ve never had your heart broken, you’ve never loved.
Many, if not all, of the things we care about come in pairs like that: love and heartbreak, beauty and ugliness, freedom and confinement, bravery and fear, life and death.
I heard somewhere that becoming brave is not a matter of no longer feeling fear. Rather, it is being able to manage that fear. Loneliness must be like that, too.
We shouldn’t seek to solve loneliness, least of all with AI. Rather, we should focus on giving people the tools to manage their loneliness and equip them to go out into the world.
That’s the kind of tool Sewell needed, but it wasn’t the one he got.
Ports of Call
On BookTok: One of the big subcultures on TikTok (at least in my world) is BookTok, which is people talking about books. From Anne Helen Petersen, a new podcast episode on the phenomenon. “If you’re interested in reading culture, you’ll be interested in this episode,” she writes.
Root: One of my favorite board games, which I actually play much less than I love (I should start playing the iPad version against a computer, I guess), is Root. A new expansion is coming out, and it’s your chance to back the project and pick up the base game along with the various expansions that have come out over the years.