
The Flip Side of Interacting for Too Long With a Chatbot? Dancing With Many Humans.

Dancers paired up

The Lessons of Friction

Irina Raicu

Irina Raicu is the director of the Internet Ethics program (@IEthics) at the Markkula Center for Applied Ethics. Views are her own.

Chatbots are here, integrated more and more into many people’s daily contexts, and they are often useful.

But they also bring concerns, both about particular uses and about the ways in which extended interactions with chatbots may skew our interactions with human beings.

As chatbots are built to remember longer contexts for interactions and use data from those exchanges to personalize and adjust their responses, some users are finding them more and more engaging. Recently, for example, OpenAI released a model called GPT-4o, which was designed to be more “conversational”; when that model was pulled and replaced with a differently optimized GPT-5, many users were greatly upset and complained—loudly enough that the company brought back GPT-4o (temporarily) as an option.

As Ars Technica reported, “[s]ome users with emotional attachments to the older model expressed grief over losing what they considered their ‘only friend.’ OpenAI CEO Sam Altman responded that the company was ‘working on an update to GPT-5’s personality which should feel warmer than the current personality but not as annoying (to most users) as GPT-4o.’” What some users had found annoying was the chatbot’s level of sycophancy. However, as one engineer put it, “the current backlash against OpenAI is not happening because users don’t like sycophantic AIs. It’s because the latest version of 4o isn’t good at being sycophantic…. The model is coming on too strong and breaking the illusion.”

Would a better-modulated, less sycophantic chatbot be a better “friend”?

In a recent blog post, emeritus Santa Clara University Philosophy professor Michael Meyer wrote about chatbots and relationships:

A true friendship is a deep and important human relationship, potentially creating a kind of ‘second self.’ Friends of this sort are a mirror in which you, through thoughtful support and emotional insight, can come to see and become your better self. So, can chatbots be effective substitutes for human friends in this deep sense which is directly connected to human happiness?

Meyer answers that at least for now they can’t.

I would add, though, that chatbots can distort our understanding of any human relationships—even the “shallower” ones that constitute many of our daily human interactions. This is because, as one Stanford researcher put it, “These chatbots offer ‘frictionless’ relationships, without the rough spots that are bound to come up in a typical friendship”—or in other types of relationships.

There is a different kind of human interaction, though, that is very good at teaching people how to deal with (at least some types of) interpersonal “friction.” It’s partner dancing. And it’s particularly effective if you participate in one of those classes that require partners to switch, every few minutes, so that you end up dancing with multiple people.

In such classes, at least in Silicon Valley, you would likely end up practicing with partners from many countries and of many ages, with different heights, weights, and body types, and with a wide range of dancing experience and know-how.

Partner dancing highlights the complexities and necessity of compromise between people with different abilities, skills, styles, needs, personalities, and backgrounds—all of whom are aiming to enjoy the music and the movement, and need each other in order to do that.

Such dancing forces you to look at other humans’ faces, instead of a screen. It also requires you to touch other people, and so deal with what philosophers call “embodiment”—the fact that our physical bodies matter (and can feel good, or tired, or hot, or all of the above and more), impacting our perceptions and our thoughts. Chatbots don’t sway or sweat.

Moreover, unless you’re in one of those professional pairs who’ve been practicing together for years, partner dancing will constantly confront you with more substantial friction: missed signals, awkward steps, moments of off-beat distraction or other hiccups. (Chatbots also don’t feel pain if you step on their toes or hit them in the face with an inartfully flung arm.)

If you’re lucky, you will find partners who respond to such friction with smiles.

But even at its best, without hiccups, good dancing is all about adjustments to another person. The small compromises, the push/pull, the size of the steps or the speed of the turns, all require paying attention to another human being with his or her own needs, limitations, moods, strengths, rhythm.

This also means that partner dancing is not always a good experience. It is, however, always a learning experience.

Not everyone can dance, of course, or would find it enjoyable, and dancing is not the only kind of activity that offers this kind of learning. But so much of what is lacking in “relationships” with chatbots becomes clear if you go out there and try to learn to dance with a bunch of different partners. The positive kind of friction in human relationships teaches us something about others, and about ourselves, and binds together communities.

Image: “Salsa/Merengue/Bachata Dance at College of DuPage Feb 2015,” #16, cropped, used under a Creative Commons license.

Dec 9, 2025
