Does Relying on AI Make Us More Lonely? What We Need Now is the "Power to Be Kind to Ourselves"

The Key to Unraveling Loneliness in the AI Era Was Not "Becoming Stronger" but "Being Kind to Yourself"

AI has made our lives remarkably easier. Writing, organizing schedules, researching, putting worries into words: things that once required asking someone can now be answered in seconds from a screen. In terms of sheer convenience, there has never been a more reassuring era.

Yet, many people feel lonelier than before.

You can work without meeting people. AI answers questions without asking anyone. Chatbots are always available to avoid awkward conversations. Open your smartphone, and voices from around the world flow in. However, amidst this flood of information, there are moments when you feel "truly connected to no one."

An article published on Phys.org, "How principles of self-compassion help fight loneliness in the age of AI," directly addresses this contradiction. While AI has made life more efficient, it has also heightened social isolation, loneliness, depression, anxiety, and existential anxiety. The article's prescription is not to completely disconnect from technology. Instead, it suggests reclaiming "self-compassion" as a psychological foundation to combat loneliness in the AI era.


Loneliness Intensifies from the Feeling of "Only I Am Failing"

The pain of loneliness doesn't simply arise from being alone. More severe is the feeling of "being left behind," "unable to connect with others," or "being the only one so weak."

Open social media, and you'll see others' travels, meals, successes, romances, and work achievements. Using AI tools, others seem to live more efficiently, wisely, and productively. People then perceive their anxiety and loneliness not as "natural human feelings" but as "personal defects."

At this point, loneliness becomes not just an emotion but a material for self-criticism.

"It's pathetic to be depressed over something like this."
"I can't contact anyone, so I'm useless."
"Talking to AI for comfort makes me seem even lonelier."

These inner voices deepen loneliness. Because it's painful, people avoid others. Avoiding people makes them feel even more misunderstood. And to fill that gap, they return to the screen.

The article highlights the perspective of "common humanity" to break this vicious cycle.


The Sense of "It's Not Just Me" at the Heart of Self-Compassion

Psychologist Kristin Neff, known for her research on self-compassion, explains it in three main elements: being kind to oneself as you would to a friend, seeing suffering and failure not as "personal abnormalities" but as shared human experiences, and not being overwhelmed by painful emotions while observing what's happening within oneself.

Among these, "common humanity" is particularly important in the AI era's loneliness.

When feeling lonely, people often mistakenly think, "I'm the only one lonely." In reality, everyone feels anxious, compares themselves, feels abandoned, and wishes to be understood. Some feel down about having no plans for the weekend. Some are afraid to contact friends and instead keep watching short videos. Some feel embarrassed after consulting AI and think, "Isn't it strange to be doing this?"

However, these are not "weaknesses" but human reactions.

Being kind to yourself is not about pampering or escapism. Instead, it's about redirecting the energy used to blame oneself into a form that can be used for the next action. Asking, "I feel lonely. This is a natural human reaction. So, what do I need now?" can sometimes turn the time spent confined to the screen into a step towards contacting someone.


Is AI an Enemy or a Support for Loneliness?

The issue here is not so simple, because AI is not merely something that deepens loneliness.


On public social media, especially Reddit, you can find people who use AI as emotional support or as a companion on lonely nights. One user posted that talking to an AI "friend" made those nights a little easier. AI is not a substitute for humans, but it is valued for responding even late at night, for not criticizing, and for easing the emptiness of quiet hours.

In another thread, there were voices of using AI like a "living diary." By verbalizing emotions that one cannot process alone and receiving responses, one can organize their inner self. Although AI does not completely replicate human warmth, it can sometimes act like a mirror reflecting one's words.

Furthermore, some describe dialogue with AI as "part of self-compassion." Things that are hard to talk about with humans due to nervousness can be written to AI. Emotions can be verbalized without denial. This can serve as a temporary foothold to not abandon oneself in loneliness.

On the other hand, there are also clearly cautionary voices.

"AI is convenient, but it cannot replace experts or real human relationships."
"AI is ultimately designed to return what you want to hear."
"It's risky to make it the center of emotional support."

These reactions align with the article's concerns. The issue is not the use of AI itself but whether AI becomes a tool to supplement human connections or an excuse to distance oneself from people.


There's No Need to Blame Yourself for Relying on AI

When considering loneliness in the AI era, the first thing to avoid is blaming those who use AI.

People with worries they can't tell anyone confide in AI late at night. Those hurt in human relationships choose AI as a non-judgmental partner. Those with confused feelings organize their thoughts through dialogue with AI. This is not frivolity but an effort to somehow maintain oneself.

Therefore, dismissing someone as "just a lonely person who talks to AI" only deepens their loneliness.

From the perspective of self-compassion, the first thing needed is to acknowledge, "It was that painful." There are days when one can't afford to talk to humans. There are nights when one lacks the energy to contact friends. Sometimes, people turn to AI to avoid burdening others. Instead of immediately associating this with shame, accept it as one's way of coping.

However, it's also important not to end there.

If AI has provided comfort, transfer that comfort into real-world actions. Send a short message to someone tomorrow. Go outside for a walk. Call family. Consult a professional. Write down your feelings in a notebook. Use the dialogue with AI as a stepping stone to return to real connections.

Use AI not as a "substitute for human relationships" but as a "rehearsal before returning to human relationships." Here lies a hint for healthy usage.


Technology Designed in Ways That Deepen Loneliness

The article also touches on the potential of AI and algorithms to divide human attention and confine it to limited perspectives and stimuli. This is an important point.

The modern digital environment is designed to hold human attention for long stretches. Videos autoplay, notifications ring intermittently, and timelines scroll on endlessly. AI organizes information according to user preferences, reducing the effort of searching and discovery. Convenient as it is, this comfort can crowd out imperfect human conversations and chance encounters.

Human relationships are inefficient. Replies can be slow. Misunderstandings can occur. You need to accommodate the other person's schedule. You may not always get the response you want.

On the other hand, AI and algorithms are faster, smoother, and more manageable. That's why the ability to endure the awkwardness of human relationships may weaken.

However, what saves people from loneliness is not necessarily a perfect response. Rather, a sense of connection comes from the presence of someone imperfect, from accepting unpredictable reactions, and from knowing that one's existence occupies a place in another person's time.

No matter how naturally AI responds, it cannot completely replace the spontaneity, responsibility, and reciprocity found in human relationships.


Three Small Habits to Practice

The article suggests practical actions to counter loneliness in the AI era, such as investing in communities, practicing empathy, and centering on one's "why." In daily life, this could translate to the following three actions.

The first is to create time not to search immediately.

When there's something we don't know, we immediately turn to AI or a search engine. But by daring to ask someone, to discuss, or to think together, something more than knowledge emerges. A conversation carries not only answers but also the other person's experiences, expressions, and hesitations. That is what becomes a human connection.

The second is to observe emotions without blaming them.

After looking at a smartphone for a long time, you may feel emptiness or fatigue. If you blame yourself by thinking, "I wasted time again," you'll want to escape further. Instead, verbalize, "I'm tired now," "I wanted to distract from loneliness," or "I really wanted to rest." Understanding emotions rather than judging them becomes the starting point for self-compassion.

The third is to remember "why you want to connect."

Contacting people is not out of obligation or to be evaluated. It's because you want to exchange feelings with someone to live as a human. Before having AI write an email, try writing just one sentence in your own words. Instead of sending a perfect update to a friend, just send "How have you been?" Allowing small imperfections can be a trigger to restart human relationships.


The Opposite of Loneliness Is Not "Always Being with Someone"

When thinking about how to relieve loneliness, one might imagine filling the calendar or making more friends. However, the opposite of loneliness is not always being with someone.

The opposite of loneliness is feeling "I am not completely isolated in this world."

That feeling can arise from a deep conversation with someone. It can arise from a short exchange of messages. It can return, if only slightly, from a walk around the neighborhood and a word exchanged with a store clerk. Or it can be restored from within, by no longer viewing one's suffering as a "personal defect."

Self-compassion is not a magic that instantly erases loneliness. However, it stops you from further blaming yourself for suffering from loneliness. This is significant.

Feeling lonely does not mean you are failing. Relying on AI does not mean you have given up on human relationships. Clinging to something when in pain is natural for humans.

The question is whether that clinging confines you further or gives you the strength to return to the outside world.


What Is Needed in the AI Era Is to Reclaim "Human Weakness"

AI will likely become even more natural and attuned to our emotions in the future. For lonely people, this has a saving aspect. Having a place that will receive the words you can't tell anyone is invaluable.

However, no matter how much AI evolves, the imperfection of being human remains. We become lonely. We feel jealous. We compare ourselves. We hesitate to contact others. We want help but can't ask for it. The more we try to completely remove such weaknesses with technology, the more we distance ourselves from our humanity.

Self-compassion teaches not to erase weakness but to accept it as a sign that you need connection with others.

"I'm not the only one who feels lonely."
"I'm not the only one who can't do things well."
"It's natural to feel pain as a human."

Just thinking this way changes the shape of loneliness a little. It becomes not a source of shame but the heart's signal that it is seeking connection.

What is truly needed in the AI era is not just the ability to answer faster. It's the ability not to blame oneself too much, to reach out to others, and to return to imperfect conversations.

There may be answers on the other side of the screen. But the key to unraveling loneliness does not lie inside the screen alone. Kindness toward oneself, and a small step toward someone else: between these two, a new way of connecting in the AI era comes into view.


List of Source URLs

Phys.org article "How principles of self-compassion help fight loneliness in the age of AI"
Main references for the article's theme, loneliness in the AI era, self-compassion, common humanity, and practical examples.
https://phys.org/news/2026-04-principles-selfcompassion-loneliness-age-ai.html

The Conversation Original Article
The original article reprinted by Phys.org. The author is Li-elle Rapaport from the Department of Psychology at the University of Manitoba.
https://theconversation.com/how-principles-of-self-compassion-help-fight-loneliness-in-the-age-of-ai-276574

Kristin Neff Official Site "What is Self-Compassion?"
Reference for the three elements of self-compassion, kindness to oneself, common humanity, and mindfulness.
https://self-compassion.org/what-is-self-compassion/

Statistics Canada "Loneliness by gender and province"
Reference for statistics on loneliness in Canada.
https://www150.statcan.gc.ca/t1/tbl1/en/tv.action?pid=4510004801

Dove Press Study "AI Technology panic—is AI Dependence Bad for Mental Health?"
Reference for research on AI dependence, mental health, and motivations for using AI.
https://www.dovepress.com/ai-technology-panicis-ai-dependence-bad-for-mental-health-a-cross-lagg-peer-reviewed-fulltext-article-PRBM

Reddit Post "Do you think AI is helpful for emotional support?"
Reference for positive and negative social media reactions to using AI as a tool for emotional organization and introspection.
https://www.reddit.com/r/emotionalintelligence/comments/1rca7wy/do_you_think_ai_is_helpful_for_emotional_support/
