
The Reason for Not Entrusting Investments to AI Wasn't Its Capability ─ The Key Lies in "Trust Through Attachment"

January 8, 2026, 00:15

In big investment decisions, the final push comes not from "AI" but from "your partner"

AI is becoming a familiar consultant for household finances and investing. From summarizing documents to categorizing household accounts and proposing portfolios, we live in an era where a single button press yields a seemingly sound answer. Yet for major financial decisions, such as taking a risk or choosing the safe option, people are far more inclined to follow the advice of a romantic partner (a lover or spouse) than that of an AI. That is the intuitive but crucial reality that the latest research, covered by Phys.org, demonstrates with data (Phys.org).


The research was conducted by Erik Hermann and colleagues at European University Viadrina in Germany. Its focus is not on whether AI is smart, but on where trust originates, examined by comparing AI against the closest other: the romantic partner (Phys.org).



Four Experiments: Partner, Robo-advisor, Personified AI, and "Partner×AI"

Four experiments were run with more than 1,400 participants in the United States, all of whom were in relationships. Participants first chose between a low-risk, low-return safe fund and a high-risk, high-return fund. Advice to switch to the other option was then presented by various "advisors": a partner, a robo-advisor (AI), a personified AI given a human-like name ("Alex" in the article), and a partner who used AI to formulate the advice (Phys.org).


The results are straightforward: participants were significantly more inclined to follow their partner's advice than the AI's. The Phys.org article highlights two factors behind this gap. One is "algorithm aversion," the psychological resistance to delegating important decisions to AI. The other is the "affective trust" that exists between partners. In other words, people weigh advice by whether they feel the advisor genuinely cares about their happiness, not by whether the advisor is a financial expert (Phys.org).


What is even more interesting is that design choices on the AI side can narrow the gap. According to the paper's abstract, personifying the AI reduces algorithm aversion, and advice from a personified robo-advisor is accepted to the same degree as advice from a partner. Likewise, advice given by a partner with the help of AI was accepted at a rate similar to the partner alone, and clearly higher than AI alone (ScienceDirect).



Why "a word from your partner" over "seemingly correct AI"?

The key here is the two-layered structure of trust that the research identifies. Trust involves expectations of ability and accuracy (cognitive trust) and expectations of goodwill and empathy (affective trust). In domains like personal finance, which has a significant impact on one's life, the latter carries particular weight. As Phys.org explains, "We trust our partners because we feel they genuinely care about our happiness" (Phys.org).


AI can demonstrate "smartness" through training data and model performance, but it struggles to create the circuit of affective trust, the feeling that "this advice is for me." That is why design choices such as personification and collaboration are effective. The research suggests that financial companies should design AI not as a replacement for humans but as AI that "thinks together with people" (Phys.org).



The Next Point: Personification is "Convenient" but Can Also Be "Risky"

Personification enhances trust: from a product-making perspective, this result is attractive. At the same time, however, "human-likeness" can mislead. There is a risk of overweighting advice by mistakenly believing that the AI has emotions or a stake in the outcome.


Indeed, in online AI discussions the observation that "people anthropomorphize everything" comes up repeatedly. On Hacker News, comments run along the lines of "people talk to cars, ships, and even plants; that's just how it is" (Hacker News).


While personification can smooth the user experience, it can also obscure who is responsible for a judgment and blur understanding of risk. Finance is an area where running on good feelings alone is dangerous.



Reactions on Social Media (Within the Range Confirmed on Public Pages)

*Because viewing restrictions on platforms such as X (formerly Twitter) make it difficult to track posts and replies, we summarize reaction trends based only on public discussions that could be fully confirmed without logging in.


Reaction 1: Empathy "Of course, I trust my partner. After all, it's a joint operation of life."

Phys.org's official LinkedIn post summarizes the research's key points as "emotional trust in partners is stronger than trust in AI" and "the key is collaborative design rather than replacement," and the importance of joint decision-making with a partner or family is being picked up in a business context as well (LinkedIn).


Reaction 2: Practical "AI is a tool for organizing thoughts, not the 'final decision.'"

In a different context on Hacker News, some users share experiences of consulting AI on financial decisions (investments, real estate, pensions) to "structure their thoughts," while others keep their distance, asking whether that is going too far (Hacker News).


This split aligns with the research's direction: there is resistance to AI alone, while AI that assists human decision-making is easier to accept.


Reaction 3: Caution "Making people trust through personification is particularly dangerous in finance."

In discussions of personification, the point that "personification is a shortcut to understanding but amplifies misunderstanding" remains prominent. On Hacker News, commenters distinguish between using personification as a convenient abstraction and companies stoking expectations with excessive humanization (Hacker News).


In finance, getting this wrong invites the short-sighted belief that "human-like AI = trustworthy AI," potentially leaving accountability and risk disclosure behind.



Conclusion: The Key Battle is "Designing Relationships" Rather Than "Accuracy"

What this research highlights is a "trust gap" that AI's performance race alone cannot close. In major financial decisions, people do not act on correctness alone; they are drawn to the affective trust nurtured in relationships with those who care about their happiness (Phys.org).


So, will AI keep losing? Not necessarily. The findings that personification alleviates algorithm aversion and that "Partner×AI" collaboration is strong point to a future direction for financial AI (ScienceDirect).


When AI becomes "a partner that helps you (and those around you) reach decisions you can agree with," rather than "an entity that decides on your behalf," it may finally stand on the same ground as a word from your partner.



Reference Article

Why Trust Romantic Partners Over AI in Big Financial Decisions
Source: https://phys.org/news/2026-01-romantic-partners-ai-big-financial.html

© Copyright ukiyo journal - 日本と世界をつなぐ新しいニュースメディア All rights reserved.