Is your chatbot speaking the right language to your customers?


When Morningstar launched its AI-fueled research assistant chatbot Mo a year ago, it could comprehensively answer tens of thousands of investment-related questions in an instant, making it largely a success with users.

But what it couldn't do was cancel subscriptions. 

"About 10% of the questions we get submitted on our retail website are about canceling subscriptions because people are so used to support chatbots," not an AI research assistant like Mo, said Marc DeMoss, head of research products at Morningstar. "So we are working on connecting that instance to our support corpus and capability because it's a clear, easy button, an easy win." 

Morningstar is constantly feeding Mo data and updating it because it is an AI-backed learning machine that interacts with investors through a large language model (LLM). It's highly user-friendly and engaging in a way that DeMoss believes is part of a changing landscape in how chatbots will interact with customers, such as one day predicting and placing food items in an Amazon cart, he said.

"What we're so used to today with software and user interfaces — the pointing and clicking —  that's all going to radically change," he said. One day soon, "you're just going to type it in or speak to it [the chatbot] and tell what you want it to do, and it will do that thing, and you're done."

READ MORE: Morningstar wants you to meet 'Mo,' its new OpenAI-powered chatbot

By and large, most chatbots have limited capabilities centered on providing scripted answers, like how to open or close an account, or how to fill out an application. But the recent development of LLMs, like the one behind ChatGPT, has given bots the ability to be more interactive, with smarter responses.

Bank of America's virtual financial assistant Erica is among the first and most widely known financial chatbots backed by language processing and predictive analytics. The Charlotte, North Carolina-based bank launched Erica six years ago but added AI capabilities last year. Erica surpassed 2 billion interactions with customers this April, BofA recently announced.

"Erica is getting a ton of buzz and the reason is, they were there first. It really is an interactive chatbot, so you ask it questions about your financial health, or about an account, and it will answer those questions for you," said John O'Connell, founder and CEO of The Oasis Group, a software provider for wealth managers and financial technology firms based in Monroe Township, New Jersey.

O'Connell said that as an industry, "We're missing an opportunity with chatbots."  

"Chatbots today are — I'm going to use a very non-PC word — they're really dumb. They were designed in the same vein, or the same thought pattern as the PBXs [Private Branch Exchange phone systems] were designed," he said. "And because of that, I think a lot of people find it incredibly frustrating to deal with. Now that's a missed opportunity for a large language model."

READ MORE: Will small clients be claimed by chatbots?

Morgan Stanley plugged AI into an existing chatbot, now called the AI @ Morgan Stanley Assistant, late last year. Backed by OpenAI, which created ChatGPT, the bot uses a large language model to give financial advisors fast responses based on more than 100,000 research reports and documents.

But it's also the human-like interaction that makes AI-powered chatbots a game-changer compared with legacy bots.

Sal Cucchiara, Morgan Stanley's chief information officer and head of wealth management technology, said that with the firm's previous chatbot, if a user asked how to link an account to an existing liquidity access line, the question had to be phrased perfectly to get the right steps. That kind of engagement could cause a lot of back and forth to reach the correct answer, leading to user frustration.

With the AI chatbot "I could just say, 'link an existing liquidity access line.' I don't have to use the perfect sentence, the perfect word, and it's going to get me really, really close to the best answer," Cucchiara said. "It understands what you're asking. And I think that's the big distinction between chatbots built using large language models versus chatbots that were built using somebody curating a question and answer."
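That distinction, tolerating imperfect phrasing instead of demanding an exact scripted question, boils down to matching meaning rather than matching words. The sketch below is a toy illustration of that idea, not Morgan Stanley's implementation: embed() is a crude bag-of-words stand-in for the learned sentence-embedding model a production chatbot would actually use.

```python
import math
from collections import Counter

# Toy stand-in for a real embedding model: a bag-of-words word-count vector.
def embed(text: str) -> Counter:
    return Counter(text.lower().split())

# Cosine similarity between two word-count vectors.
def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# A curated Q&A bot: only the exact scripted phrasing gets an answer.
SCRIPTED = {
    "how do i link an account to an existing liquidity access line": "Step 1: open account settings...",
}

def scripted_lookup(question: str):
    return SCRIPTED.get(question.lower().rstrip("?"))

# Semantic lookup: imperfect phrasing still lands on the closest scripted entry.
def semantic_lookup(question: str):
    q_vec = embed(question)
    best = max(SCRIPTED, key=lambda k: cosine(q_vec, embed(k)))
    return SCRIPTED[best]

print(scripted_lookup("link an existing liquidity access line"))  # None: no exact match
print(semantic_lookup("link an existing liquidity access line"))  # finds the scripted answer anyway
```

Swapping the toy word-count vectors for real embeddings is what lets an LLM-backed bot get "really, really close" even when the user's wording is nothing like the curated question.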

Another key feature Morningstar added to Mo was the ability to show users how it found an answer by providing reference links or sourcing the top three articles. That level of explainability through an AI-powered chatbot has helped build trust with clients.

READ MORE: Advisors know ChatGPT, but that doesn't mean they trust AI

"Before, in the first pass, it just would answer a question and you would have no idea where the answer came from," DeMoss said. "Providing that explanatory language has been huge in terms of how our users have been perceiving the usefulness of it."

However, DeMoss said getting to that level wasn't easy, and they're constantly adding more data, testing responses and facing sticky updates whenever a new ChatGPT version comes out. 

READ MORE: How Google, Nvidia and other AI-powerhouses influence vendor pricing for advisors

"ChatGPT 4.0 came out and it was supposed to be way more advanced because it was based on a lot more content than ChatGPT 3.5 was, and in some ways, that was good," he said. "But in other ways, because it was trained to be more thoughtful, it would give you the wrong answers in some instances because it was overthinking the situation."

But that's also par for the course in working with any AI-based model: it's a learning technology still in its infancy that needs to be taught.

For example, an AI chatbot "might know that Apple is a stock, but it might also think it's something else," DeMoss said. "It's like training a Swiss army knife which tool to pull out depending on what situation you find yourself in."
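One common way to implement that "which tool to pull out" decision is a routing step that classifies the question before answering it. The rules and tool names below are illustrative assumptions, not Morningstar's design; a production system would more likely ask the LLM itself to classify intent. Still, the sketch shows how a bare "Apple" can be flagged as ambiguous, while a clearly financial phrasing goes to a securities lookup and a billing question goes to the kind of support corpus Morningstar says it is wiring up.

```python
import re

# Illustrative keyword lists; real routers typically rely on the LLM, not hand-written rules.
TICKER_TERMS = {"stock", "ticker", "shares", "price", "dividend", "earnings"}
SUPPORT_TERMS = {"cancel", "subscription", "refund", "billing"}

def tokens(question: str) -> set:
    return set(re.findall(r"[a-z]+", question.lower()))

def route(question: str) -> str:
    words = tokens(question)
    if words & SUPPORT_TERMS:
        return "support_corpus"         # e.g. "How do I cancel my subscription?"
    if "apple" in words and words & TICKER_TERMS:
        return "security_lookup"        # "Apple" plus financial context: treat it as the stock
    if "apple" in words:
        return "ask_for_clarification"  # bare "Apple" is ambiguous: stock, company or something else
    return "research_search"            # default: search the research corpus

for q in ["What is Apple's dividend?", "Tell me about Apple", "How do I cancel my subscription?"]:
    print(f"{q} -> {route(q)}")
```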
