3 Comments
Giselle

This is really, really good.

What unnerves me about the longer-term picture is how AI’s immersion into society will shape relationships between people as humans, and what preformed understandings and misunderstandings can occur. I think it may increase manipulation and ignorance in this way.

Leslie

My fear is that young people who have little self-control will just cave in to the easier, more comfortable mode of talking to and listening to AI, delaying and/or shortening face-to-face communication, thereby becoming isolated but content. Until they aren't. Then they will have virtually no real communication skills. Get ready for a busy schedule!

Anna Dasse

Great piece. Really happy I found this blog.

I agree that AI makes things too comfortable, and that comfort stunts growth. I worked on AI therapy bots. I'd add two things:

First, the asymmetry of dependence. The same frictionless digital intimacy is supplementary for some people and foundational for others. People who are already emotionally secure (stable relationships, friendships, good social support) have less reason to invest in an AI chatbot. Though we could argue they also have less need for introspection or therapy in the first place. The people most likely to build their emotional infrastructure around AI are the ones least equipped to lose it.

Second, I love the deep psychological reading, and I think it deserves an economic companion argument. You write about "sycophancy and affirmation bias" thinning out difference, but who controls that sycophancy? Companies can unilaterally alter or end these relationships, and the people most dependent on them have the least power. When Replika removed the romantic qualities of their AI companions, users were emotionally devastated. Relationships were terminated by a one-sided policy decision. The risks you describe aren't just psychological; they're structural.

I touched on these questions from a product design / behavioral angle in two articles; if you care to check them out, I would love your read on them:

https://www.gasstationbreakfast.blog/p/aol-to-ai

and

www.gasstationbreakfast.blog/p/my-clanker-therapist