4 ethical concerns about AI in wealth management

Azish Filabi of The American College of Financial Services spoke at this week’s Strategy Forum held by the Investments & Wealth Institute in a presentation about the ethical questions for the industry posed by artificial intelligence.
Tobias Salinger

The vast potential for artificial intelligence in wealth management and the financial industry comes with ethical risks for financial advisors and other professionals, an expert said.

AI brings questions relating to the transparency of disclosing its use, the competence and accuracy of a nascent technology still prone to mistakes, the confidentiality of clients' personal data and the problem of racial bias, according to a presentation by Azish Filabi, an associate professor of business ethics who is the managing director of The American College of Financial Services' Cary M. Maguire Center for Ethics in Financial Services. She presented at the Investments & Wealth Institute's Strategy Forum conference this week in Chicago. 

Filabi spoke ahead of Financial Planning's first-ever ADVISE AI conference next week in Las Vegas, where speakers will discuss the many applications and shortcomings of the technology. AI is having "a lot" of impact on financial services, in terms of the business as well as possible regulation, Filabi said. Her presentation delivered three main lessons on the ethical component of the ongoing discussion in the industry about how best to implement AI.

"When it comes to AI and human interaction, you are always in charge, so deferring to the authority of a machine could lead to problems down the line. I think it's important to keep that in mind, even if it's in control, even as AI is getting smarter, faster and sharper," Filabi said. "The best way to address AI ethics risk is, in my opinion, to actually engage with AI. There's a lot to be feared around the technology, but I think the more we engage with it, the more we understand it, the more we educate ourselves, the better this will go. And finally, I think it's important to think critically and long term. And there are certain use cases with that in mind, that maybe AI is not the best use for."

READ MORE: 5 things to expect at Financial Planning's ADVISE AI  

Government agencies are slowly but surely beginning to issue new proposals and guidelines around AI, Filabi noted. So far, the examples include President Joe Biden's executive order, the Securities and Exchange Commission's proposed rule on the use of predictive analytics in recommendations, a request for information about AI in finance from the Treasury Department earlier this year and some state-level standards.

Besides the compliance concerns, the technology has a direct business impact in wealth management and other financial fields. Filabi shared a chart displaying the extent to which survey respondents expressed trust in financial companies compared with other sources such as social media posts or consumer advocacy groups.

"Individuals who have less trust are more likely to go to these informal sources of information," she said. "My takeaway from this information is that consumer trust will shape how competition is structured in the industry for years to come."

As the wealth management industry and other financial firms start to use AI tools in their businesses while striving to earn that trust, the ethical concerns tied to them are mounting as well, according to Filabi. For example, data sources that have historical biases toward one race or another group — as a study by Filabi and other researchers at the college found can happen in life insurance underwriting decisions — could prompt allegations of discrimination.

READ MORE: How AI is changing financial advisors' jobs

In another instance, the widespread use of "Monte Carlo" simulations, software that calculates the probability of a range of outcomes to help advisors and their clients make choices about their ultimate retirement nest eggs, presents concerns about the formulas used in the estimates and whether firms are communicating the results accurately.
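To make the concern concrete, here is a minimal sketch of the core idea behind such tools: repeatedly simulate a retirement portfolio under randomly drawn annual returns and report the share of trials in which the money lasts. All parameters here (a 6% mean return, 12% volatility, normally distributed returns) are illustrative assumptions, not figures from Filabi's presentation; real products embed many more such assumptions, which is exactly where the formula and communication questions arise.

```python
import random

def monte_carlo_success_rate(start_balance, annual_withdrawal, years,
                             mean_return=0.06, stdev=0.12,
                             trials=10_000, seed=42):
    """Estimate the probability a portfolio survives `years` of withdrawals.

    Each trial draws one random return per year from a normal distribution,
    the simplifying assumption behind many Monte Carlo retirement tools.
    """
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        balance = start_balance
        for _ in range(years):
            # Apply one year of market returns, then take the withdrawal.
            balance = balance * (1 + rng.gauss(mean_return, stdev)) - annual_withdrawal
            if balance <= 0:
                break  # Portfolio depleted before the horizon.
        else:
            successes += 1  # Loop finished without depletion.
    return successes / trials

# Example: $1M portfolio, $40K annual withdrawals, 30-year horizon.
rate = monte_carlo_success_rate(1_000_000, 40_000, 30)
```

Note that changing any input assumption (the return distribution, its mean, or its volatility) can move the reported "success rate" substantially, which is why how firms choose and disclose those inputs matters as much as the output itself.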

"The challenge here is that the smoother, the sharper, the faster the technologies get, the more likely we are to defer to them," she said. "As we think about the information that we're getting from software that sometimes seems like magic, how do we actually interpret this information, and how do we put it in the context of the day-to-day, normal work that we do? And, finally, as we started to think about, 'Well, gosh, what would regulatory accountability look like?' This is really our challenge, because we have so many different touch points within a system like this. You could have data inputs that are based on policy assumptions or just mistakes. You could have the software, the algorithm, working not as intended. The broker-dealer, who is responsible for the process, and the advisor, they can be working on a set of assumptions that are not necessarily in the best interest of the clients."

To help industry professionals address these challenges, Filabi laid out a checklist of questions they may wish to consider as they implement AI. Advisors and wealth management firms should ask themselves what their particular policies say about the use of the technology, whether they will tell clients that they're using it in a given area, how much due diligence they need to perform and whether the tools display forms of bias.

READ MORE: ChatBlackGPT CEO: Advisors can use AI to diversify their client base

As those topics relate to the use of ChatGPT, the industry must, at the very least, refrain from plugging private client data into publicly available tools, Filabi noted. 

"I think it's important to think about the fact that they are still under construction, so you can't really fully rely on the accuracy of the tools and the quality of the information that they're getting from the tool," she said. "Many of you are leaders within your firm. So, thinking about, how do you want to put in place guidelines as a best practice for the consistent use of ChatGPT? I think now is the time to start thinking about that aspect of it."
