Is AI Making Us Too Comfortable For Our Own Good?
Technology promises us gain without pain - but in bypassing life's necessary frictions, we're bargaining with the devil.
I’m no luddite. I’m happy as Larry that some brilliant soul invented the wheel so I don’t have to drag things around on my back all the time. I’m also very grateful for my washing machine, dishwasher, and hoover - and most of all - that as a writer and academic, I grew up in a world of Microsoft Word instead of a mechanical typewriter (the respect I have for writers of the analogue age is boundless). In short, I love how our tools have made things easier in the past, but I’m really uncomfortable with what tech has been seeking to ease in recent years.
This is a supplement to my recent GQ article Want to Survive the AI Revolution? Find your inner masochist providing extra content, resources, and suggestions for my Substack subscribers. Please read the original article first so you’ll know what I’m talking about!
Is technology giving us what we want at the expense of what we need?
It’s now just over a decade since I published The Psychodynamics of Social Networking (PSN). While in tech-years this may seem millennia ago, it remains relevant today because it focusses less on any particular tech platform or innovation (which change very quickly) and more on how human psychology (which doesn’t) is mediated by tech. One of the foundations of the theory that I laid out at that time was how technology lowers the bar to just about everything - and in lowering that bar it allows us to skip important steps between our desires and our outcomes.
When technology lowers the bar to just about everything, it ends up giving us more of what we want than what we need.
In PSN I argued that while human beings fundamentally need to give and receive authentic recognition to each other, to use the colloquial expression, “to really be seen”, social media offered us the much more superficial experience of validation instead. My go-to metaphor is that this is like needing the nutrition of a salad but getting a donut instead. Donuts are great, but you cannot live on them.
But donuts are addictive - just like getting likes, follows, and shares is. The longer we go living on donuts, the harder it is to get used to eating salads again, and we start to get sick.
Dating apps are like canapés
Dating apps were designed to take one of the big pain points out of dating - that is, meeting new people - and they did a wonderful job. Too wonderful. They no doubt make it easier to meet people, but they also make it much more difficult to get to know them better. It’s much easier to graze upon a selection of canapés than it is to settle into enjoying a nice long meal. And while there’s no doubt that they have enabled many, many people to dine with each other happily, that’s not what they are designed to do.
In solving the pain point of meeting, they created a bigger one in sustaining. Not only that, but the more you get lost in the meeting, the less chance you have to develop the skills that are required for the sustaining - and these are the very skills we need to form happy, long-lasting relationships.
Why do we ghost people?
Put simply, because managing interpersonal relationships is among the hardest things that humans do (it’s also among the most gratifying). Finding that sweet spot between our similarities and our differences is terrifically difficult (just look at the state of the world), but being able to manage it correlates with all sorts of things like mental and physical health and a long life. The trouble is, all our digital platforms tend to compromise this.
Every communication that happens across a platform is stripped, to some degree, of the complexity of that communication in a live face to face situation. The more we become accustomed to these simpler communications, the less good we become at managing interpersonal complexity.
When technology removes all the friction from our lives, it also removes the conditions necessary for growth.
This is why ghosting happens. It is easier to ghost someone than it is to respectfully and decently tell them you’re not interested - it’s also a lot less humane. It’s easier to text a gripe, email a complaint, or lash out on a social network than it is to sit across from someone and talk it through. But every time we make that task easier, we make it harder on ourselves to draw on those skills when we need them - and trust me, we need them.
What’s the best way to choose how to use AI?
For the record, I’m writing this post all by myself, but I may very well use AI to help me make a promotional LinkedIn post for it. After all, I’ve already done the hard work. For me, this is a good use of AI. But were I to use AI to write the article itself, what would be the point? Not only would there be none of me in it, I would also miss the opportunity to challenge myself to come up with the goods that keep me sharp.
AI can’t think for you without making you less able to think for yourself.
This isn’t to deny that it’s sometimes tempting to get AI to do some of the thinking for me as well - it can be. For the moment, however, when I’ve asked it to do that I’m unsatisfied with the result 95% of the time anyway - I can do it better myself. But even when it gets better, and it will, I will have to resist that temptation, because to give in would defeat the whole point of being a creative person contributing to the world!
In my GQ article I referenced some research that shows the cognitive cost of having AI do the heavy lifting for you when it comes to creative and intellectual work. Even partial assistance can significantly deteriorate learning. Further research points to “metacognitive laziness” and a reduction in critical thinking. Much of this research is new, and much more needs to be done, but I believe these are signs of a clear and present danger.
Don’t let influence outshine expertise
What is becoming very clear to me is how the dangers of AI are closely correlated to how it is used and at what level of expertise. In a previous post I discussed how I picked up an error that ChatGPT made that a novice in my field would not have picked up. I was able to make use of ChatGPT because I have the expertise to make these critical distinctions - distinctions that I am only able to make because I engaged in the hard work of becoming an expert.
In the past influence came as a consequence of expertise. Today the performance of expertise comes as a consequence of having influence.
The confluence of social media and AI has created a topsy-turvy world where influence trumps expertise, and because of this, people are motivated more to become influencers than they are to acquire real expertise. If you have a broad reach on TikTok, you can just have AI create content for you and use your algorithmic popularity to propagate it. You end up being an empty vessel through which manufactured content is spread. It’s not only a very dangerous situation for society, but it’s not so good for you either. I mean, where are you anyway?
Do The Hard Work, You Won’t Regret It
A “steep learning curve” is steep because it recognises the effort required to acquire the learning. We increase our weight resistance at the gym because the continued physical challenge helps us to develop to be our best. We stick through conflict in relationships and try to work it out with couples counselling because we know that intimate relationships require work in order to thrive.
They may be clichés, but they are true - there is no free lunch, and there’s no gain without pain.
There’s no more reason to plow a field with a hoe when you can use a tractor than there is to refuse some AI assistance for a PowerPoint presentation in an area of your expertise. What I am suggesting is that you choose very carefully where you cut your corners, because you will pay for it later. Here are some final takeaways.
FAQ:
Q: How can AI undermine learning?
A: When AI removes the friction from challenging tasks, it short-circuits the very processes that build skill and expertise. Research shows this can reduce neural engagement, weaken critical thinking, and encourage “metacognitive laziness” — a reliance on the tool instead of one’s own capabilities.
If you want to use it for learning, make sure you don’t let it do the work for you. Instead of using it to help you write an essay, use it to quiz you, test you, and have a conversation with you about the material. Make it challenge you!
Q: When is using AI a good idea?
A: Use AI to support tasks in areas where you already have expertise and can critically evaluate its output. Avoid using it to produce content that you want to learn more about, where the struggle is essential for growth.
Q: Why is friction important for personal development?
A: Friction — whether in relationships, creative work, or learning — pushes us beyond our comfort zones, strengthening resilience, adaptability, and problem-solving. Without it, we risk stagnation and a loss of capability.
Q: What’s the difference between convenience and corner-cutting?
A: Convenience removes unnecessary barriers. Corner-cutting skips essential steps. The former makes life more efficient; the latter can erode the skills and depth that give your work and life meaning.
I used AI to assist myself in making this FAQ - which is well within my personal “fair use” policy.
Aaron Balick, PhD is an international keynote speaker, psychotherapist, psyche writer for GQ, and author of The Psychodynamics of Social Networking: connected-up instantaneous culture and the self; Keep Your Cool: How to deal with life’s worries and stress; and The Little Book of Calm: tame your anxiety, face your fears, and live free. He is an honorary senior lecturer at the Department for Psychosocial and Psychoanalytic Studies at the University of Essex.