More consumers are turning to tools like ChatGPT for budgeting, investing, and financial planning — but AI isn’t a licensed or regulated advisor.
Experts warn that AI can lack context, personalization, and accuracy, which can lead to misleading or overly confident financial advice.
AI works best as a support tool, not a substitute — major financial decisions should still be reviewed by a qualified professional.
From building a budget to mapping out investment goals, more people are turning to tools like ChatGPT for quick, easy financial advice. It’s fast, free, and available 24/7 — no appointment required.
But while AI can simplify complex topics and offer helpful starting points, it’s not a licensed financial professional. And that distinction matters more than many users realize.
ConsumerAffairs spoke with banking industry veteran Raul Landeo, Director of Information Technology at Maspeth Federal Savings, who explained that the rise of AI-driven financial guidance is happening faster than the safeguards around it.
While these tools can be useful for general education or basic planning, they often lack the context, personalization, and accountability needed for major financial decisions. As a result, some consumers may be putting too much trust in advice that isn’t regulated — and isn’t always right.
Making finances easier to understand
Landeo said many consumers start with ChatGPT as a plain-language explainer for financial concepts — and then keep going.
“They are starting out by using it to explain concepts so it's easier to understand and then they are treating it like a personal financial advisor,” Landeo said. “This can be either as a substitute for those who do not have one, or as a supplement for those who do, but want more frequent reviews. The main shift is the increased availability of a financial planner.”
The risks of trusting AI
If you're going to use AI for financial advice, Landeo said, safeguards need to be in place.
“Any AI solution requires a human at the helm to review its output,” Landeo said. “True autonomous AI advice is not there yet, so when consumers take advice without a review, there's a risk.
“A major risk is where the AI solution does not have all of the information needed to make the best answer and may hallucinate the response just to provide one. Consumers do not know all of the inputs that are required, so a professional is that ‘human at the helm’ needed to provide those inputs and provide a fiduciary duty.”
Lacking regulation
Another important risk to keep in mind: AI tools are not held to any standard regulatory framework. That gap becomes especially risky when money is involved.
“Fraud remains a constant threat,” Landeo said. “Regulation provides an oversight that makes companies set up guardrails to minimize this risk.
“Also, there is no standard framework for producing AI solutions, so without this, it's highly possible for different AI models to come up with different responses given the same inputs. A lack of personalization creates a risk of assuming all consumers have the same requirements to set up their best financial plans and financial planning is not a one-size-fits-all.”
Landeo's biggest piece of advice: use AI alongside a human financial planner, not in place of one.
“AI tools can be incredibly useful, given the right environment,” he said. “While AI has tremendous potential, the current expectation is that all output needs to be verified by a human that fully understands how the specific solution went through the process of developing its output.”
