5 Reasons to Think Twice Before Using ChatGPT—or Any Chatbot—for Financial Advice

By Wired
April 24, 2026
AI & ML


I’ve used ChatGPT to help me build a budget before, and it was genuinely helpful. After I input my monthly salary as well as my standard utilities and recurring expenses, the chatbot drafted a few solid options, and I tweaked them into penny-pinching perfection. I’m admittedly part of the growing number of people turning to chatbots, like Anthropic’s Claude, Google’s Gemini, and OpenAI’s ChatGPT, for financial advice.

“Millions of people turn to ChatGPT with money-related questions, from understanding debt to building budgets and learning financial concepts,” says Niko Felix, an OpenAI spokesperson, when reached for comment. “ChatGPT can be a helpful tool for exploring options, preparing questions, and making financial topics easier to understand, but it is not a substitute for licensed financial professionals.” OpenAI’s Terms of Use state that the AI tool is not meant to replace professional financial advice.

While you may consider chatbots to be practical financial assistants, it’s always worth keeping the limitations of these AI tools in mind. Beyond miscalculations, here are five additional reasons to approach them with skepticism when it comes to money tips.

AI Still Confidently Outputs Incorrect Answers

When I ask ChatGPT for help managing my money smarter, the bot is confident in its responses, often laying out what seems like solid reasoning behind each bullet point of advice. But always keep in mind that chatbots can weave convincing errors into outputs.

OpenAI has reduced the rate of hallucination in more recent model releases, but chatbot tools still output errors. “There seems to be this sense emerging, at least among casual users, that the hallucination problem has been fixed,” says Srikanth Jagabathula, a professor of technology operations and statistics at NYU. “But that’s definitely not the case, because they’re fundamentally statistical machines. They don’t have a notion of a ground truth, or what is true.”

Even if an answer seems correct at first, one easy way to stress test the output is simply to ask the chatbot to double-check everything it just said. While this won't confirm that the response is accurate, the follow-up has surfaced plenty of issues in AI responses for me, and it leaves me increasingly skeptical about turning to bots for advice on any topic, not just money.

Yes-Bot May Affirm Preexisting Beliefs

When you turn to a human financial advisor for money tips, they will likely be cordial and professional, and they will push back on any preconceptions you may have about saving, investing, and spending money. Chatbots, on the other hand, are known for being overly agreeable, often taking the user's side.

“AI sycophancy is not merely a stylistic issue or a niche risk, but a prevalent behavior with broad downstream consequences,” reads part of a study about AI’s conversational flattery published earlier this year in the journal Science. “Although affirmation may feel supportive, sycophancy can undermine users’ capacity for self-correction and responsible decision-making.”

The study looked at how AI will take a user’s side during interpersonal conflicts, but concerns about sycophancy are relevant to financial questions as well. When I’m making money moves, I want to turn to someone who knows more than me for guidance, not rely on a yes-bot for affirmations.

Requires Sensitive Info for Better Results

For a chatbot to provide its best outputs, tailored to your specific needs, you are nudged to share sensitive information with the AI tool. For example, when I asked ChatGPT how it could help improve my budget even more, the bot suggested I upload my complete financial history from the last few months for the best answers.

“You don’t have to upload everything—but yes, the more real data you share, the more accurate (and useful) the audit will be,” read ChatGPT’s output, in part. “Upload CSVs or screenshots of bank account, credit cards. Then I can: categorize everything, calculate exact spending patterns, identify hidden leaks you wouldn’t notice, and build a precise monthly budget.”

Unless your settings are adjusted, all of your conversations with ChatGPT may be used by OpenAI to improve the tools and as training data for future iterations. Visit ChatGPT’s “data controls” tab to change your settings. Even if you opt out of AI training, it can be risky to upload so much sensitive data about your money to a platform that’s not an official banking app.

Bots Lack Accountability

Jagabathula sees tools like ChatGPT as a worthwhile part of your toolkit, primarily when you’re in the early stages of asking questions about money matters, like tax saving strategies or investment ideas. But you should always rope in someone with expertise before making high-stakes decisions.

“A human expert in the loop is super critical,” he says. “Especially for the last mile, you’re actually going from idea generation to taking action. Somebody needs to review the plan, adjust it, and correct it if necessary.”


Tags: Anthropic, apps, artificial intelligence, chatbots, ChatGPT, finance, Google Gemini, OpenAI


PTechHub

A tech news platform delivering fresh perspectives, critical insights, and in-depth reporting — beyond the buzz. We cover innovation, policy, and digital culture with clarity, independence, and a sharp editorial edge.

Copyright © 2025 | Powered By Porpholio
