Do investors trust AI with financial statements? FINRA Foundation study offers some surprises
Despite the hype around AI technology and its capabilities, consumers still trust human financial professionals more, and by a significant margin in certain financial situations, according to a recent survey by the FINRA Investor Education Foundation. But when it came to statements on wealth management and stock performance specifically, respondents said they trusted AI nearly as much as they trusted a human financial advisor.
The survey, which polled more than 1,000 adults in February 2024, found that only 5% of respondents would turn to AI for help making a financial decision, compared with 63% who said they would consult a financial professional and 56% who would ask friends and family.
"Given the attention that AI has received in the last year or so, we expected this figure to be higher," said Gerri Walsh, president of the FINRA Investor Education Foundation. "That said, 10% of respondents aged 18 to 29 reported using AI for financial information, so there could be generational differences in the adoption and use of AI for financial information."
A deeper dive into the findings, however, showed that consumers trusted AI more in certain financial situations, specifically investments and portfolio management.
The survey included an experiment in which consumers were shown four hypothetical financial statements that were plausible but not always accurate. The statements covered homeownership, stock and bond performance, portfolio allocation, and savings and debt. Half of the participants were told the information was provided by AI, while the other half were told it came from a financial professional.
The study found that when respondents were shown the same hypothetical statement on stocks and bonds, roughly equal shares trusted it whether they were told it came from AI (34%) or from a human advisor (33%). For a statement on portfolio allocation, 37% of respondents trusted it when told it came from a financial professional, versus 30% when told it came from AI.
"For two out of the four questions, between 30% to 40% of respondents trusted the financial professional's answers — that's very low," said Gabe Rissman, president and co-founder of YourStake, a wealth management platform for RIAs.
But Rissman and others said results like these indicate the need for advisors to use AI and machine-learning technology to build better, more trustworthy relationships with clients.
"One possible way to enhance trust and confidence is to combine the speed and analytical ability of AI with the comprehensive and connected knowledge of a financial professional that allows more end clients to be served, more comprehensive and tailored advice given, and overall better quality," he said.
The type of financial information and the format in which it was presented made a key difference in how people perceived AI tools, and in whether they realized they might already be using them. For example, while most respondents said they didn't trust AI, about 25% said they seek information using financial apps, which may include AI technologies in the background.
"I see AI as a back-end tool that helps advisors do a better job serving their clients. It's not about replacing the human element, but rather enhancing it," said Nathan Wallace, wealth manager at New York-based Savvy Advisors.
The wealthtech provider uses AI specifically to help advisors streamline their workflows and manage assets.
"From an investment perspective, AI has the possibility of both shortening and improving the research process for advisors, allowing them to get informed ideas to clients faster," Wallace said. "We do not consider AI a threat to human financial advice, but rather an amazing tool that supercharges our advisors' ability to deliver insightful and timely advice to clients."
However, AI technology is still in its infancy and has been widely known to hallucinate or give perfectly structured but inaccurate responses. This has made it challenging for advisors themselves to trust AI enough to place it on the consumer-facing side.
"One facet of the lack of trust in AI comes from what can be called 'chalkboard decisions.' These are decisions where the math is correct but does not fit someone's unique personal situation and future goals — the human side of advice," said Jordan Hutchison, vice president of technology and operations at RFG Advisory, a wealtech platform for RIAs. "We can infer that due to the high level of neural trust, people will ask AI for advice and then confirm that output with someone they trust."
A Financial Planning survey of financial advisors conducted earlier this year found that about half of the 127 respondents said they'd trust AI to predict their car or house maintenance needs, but only 15% said they'd trust it to make financial recommendations.
Because there are trust issues in using AI to make financial decisions, Rissman said, advisors have an opportunity to provide the explainability and customization for clients that AI models currently lack.
"While people may trust AI's statements on specific topics, they recognize that AI may not consider all aspects of their financial plan or know the right questions to ask, emphasizing the importance of human advisors who possess a holistic understanding of their clients' unique situations," he said.
Rachel Witkowski
Tech Reporter, Financial Planning