Deutsche Bank is testing out artificial intelligence tools that aim to detect possible signs of misconduct from the tone of traders' phone conversations.
Germany's largest bank is examining machine-learning technology from Google Cloud, Bernd Leukert, chief technology, data and innovation officer, said in an interview. The firm declined to say how soon the system could be rolled out.
The project is part of a wider push into AI at the bank, which is also exploring how large language models, or generative AI, can support clients and employees and assist with coding.
Lenders globally are looking to make monitoring more effective as the stakes rise following government crackdowns on impropriety. Financial institutions have paid more than $2.5 billion in fines since December 2021 for the use of unofficial communication channels such as WhatsApp.
For now, Deutsche Bank's surveillance tools can analyze words but not changes of tone or hints of cynicism. The new system should be able to differentiate between, for example, traders saying "keep it between us" when discussing a surprise party and when their aim is more nefarious.
"If you have a tool that can understand and direct you to tonality, not just to words that are suspect, it can give much better oversight," Leukert said. "It has the potential to put governance onto a completely different level."
Deutsche is training the AI using recordings of existing calls between traders, so it becomes familiar with the language they use and the norms of conversation. The software will direct analysts to areas of interest, and humans will make the ultimate judgments about how to proceed.
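The article does not describe the model itself, but the flag-and-review workflow it outlines, a system trained on past calls that routes suspect passages to analysts rather than deciding on its own, can be illustrated with a toy sketch. Everything below is invented for illustration: the snippets, labels and library choice are assumptions, not Deutsche Bank's data or code.

```python
# Toy illustration of the "flag for human review" pattern described above.
# NOT Deutsche Bank's system: a minimal scikit-learn sketch with invented
# call snippets, showing how a model trained on historical calls can rank
# new ones for an analyst instead of making the final judgment itself.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical labelled snippets from past calls (1 = previously escalated).
train_texts = [
    "keep it between us until the announcement, you'll be taken care of",
    "keep it between us, the surprise party is on friday",
    "client wants to rebalance the portfolio before quarter end",
    "move the order before the number comes out, nobody needs to know",
]
train_labels = [1, 0, 0, 1]

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
features = vectorizer.fit_transform(train_texts)
model = LogisticRegression().fit(features, train_labels)

# Score new transcripts and surface the highest-scoring ones for a human.
new_calls = [
    "keep it between us, I booked the restaurant for her birthday",
    "keep it between us and I'll make sure you're looked after on the trade",
]
scores = model.predict_proba(vectorizer.transform(new_calls))[:, 1]
for text, score in sorted(zip(new_calls, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {text}")
```

In this pattern the model only orders the queue; the decision about whether anything improper occurred stays with the human reviewer, as the bank describes.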
Hiring spree
Staff on Deutsche Bank's innovation team are working to ensure the surveillance system complies with regulations and avoids privacy issues, while minimizing false positives.
"It is crucial to ensure that banking algorithms are fair, don't include any hidden discrimination and results are factually correct and explainable," Leukert said.
The Frankfurt-based financial giant has about 16,000 staff in technology, data and innovation roles, roughly half of them engineers who write code.
And it's looking to hire more. Deutsche Bank is among the top five lenders for AI recruitment, according to the consultancy Evident, posting around 1,300 AI-adjacent roles from February through April.
However, some lawyers are warning of possible risks in using AI to detect bad practice.
If software looks for laughter or trades that are more profitable than normal, it risks missing evidence of even greater concern, said Duncan Black, a financial services partner at Fieldfisher in London. There is also a risk that traders could argue the evidence against them was the product of flaws in the software.
"It's great in detecting patterns in huge amounts of data and doing it faster and more efficiently than humans," he said. "But you have to keep an eye on what it uses as base ground before it goes looking."
Even as those risks and shortcomings are still being identified and addressed, AI's role in the industry continues to grow.
Other lenders are using Google Cloud technology to turn unstructured data such as traders' text messages and call transcripts into forms that can be searched more easily, according to Zac Maufe, the global head of regulated industries at Google Cloud.
"If you have ever heard traders speak or communicate with each other, you quickly realize that they use their own language," Maufe said in an interview. "Leveraging new AI techniques has allowed significant improvements in the ability to understand this."
Societe Generale uses AI to analyze traders' communications across 30 channels including mobile phone calls, instant messaging and social media, a spokesperson said. The French firm's system, known as CAST, covers 16 countries and 26 languages.
A spokeswoman for SocGen declined to say what AI the lender uses and whether it can analyze tone of speech.
Anne Beaumont, a partner at the law firm Friedman Kaplan Seiler Adelman & Robbins in New York, said the systems could work like a lie detector and help identify suspicious behavior without necessarily proving it.
"I can't tell you the number of cases when someone has said 'you the man,"' she said. "You don't say that when you're buying a sandwich, you do it when they've done something for you that they shouldn't be doing."