BMO’s cloud migration strategy eases AI adoption

The Canadian bank is embracing a more digital future as its cloud strategy makes gains and it looks to both traditional machine learning and generative AI for further augmentation.

Armando Benitez’s Bank of Montreal career started in the world of exchange-traded funds. When he joined the bank’s capital markets division eight years ago, his team was tasked with pricing equity ETFs and benchmarking those prices against other market-makers. “You have to start from the data perspective, and you have to understand the story that the data is telling,” Benitez tells WatersTechnology.

While pricing ETFs may sound like a simple task, he says that back then, no one had done it yet. The work eventually paved the way for more pricing tools and an expansion of those methods across asset classes. In other words, it laid a foundation.

Benitez is now the managing director and head of AI and quantitative engineering at BMO Capital Markets. He says that when looking to address the possible use cases around generative AI and large language models (LLMs), having a foundation is proving potentially fruitful again. “We’re not starting from scratch, because we’re leveraging all the good work that we had on the infrastructure side, all the knowledge we have about our datasets, and ultimately, the collaboration that is fundamental for the success of everything we do,” he says.

Collaboration, in this context, means bringing business leaders, AI specialists, and engineers together to come up with solutions, rather than taking a consulting-style approach in which the data science or engineering group tries to solve a problem without full context. “By embedding all those people together and putting them together to collaborate is where we have seen the most wins,” he says.

Benitez says many of the problems he deals with are deterministic in nature, meaning a given set of inputs produces the same output each time. Traditional data analytics works best in those cases; in others, a more specialized statistical-modeling or applied-math approach may be the better fit.
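Neither type of model is described in detail in the article, but the distinction Benitez draws can be illustrated with a minimal, made-up Python sketch: a deterministic basket calculation that always returns the same value for the same inputs, versus a statistical fit whose answer is estimated from noisy data.

```python
import numpy as np

# Deterministic: the same constituent prices and weights always yield
# the same basket value (hypothetical ETF-style example, made-up numbers).
def basket_value(prices: np.ndarray, weights: np.ndarray) -> float:
    return float(np.dot(prices, weights))

# Statistical: a noisy relationship estimated from data, where the "answer"
# is a fitted parameter rather than a fixed formula.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 0.8 * x + rng.normal(scale=0.5, size=500)
slope = np.polyfit(x, y, deg=1)[0]  # estimated slope, close to 0.8 but noisy

print(basket_value(np.array([101.2, 48.7]), np.array([0.6, 0.4])))  # always the same
print(slope)  # varies with the sample
```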

GenAI’s strengths pay off in text-based use cases, but banks’ work revolves around numbers, and Benitez sees potential for more numbers-based applications. “One of the papers that I’m now reviewing is one where people are using LLMs to solve time-series problems,” he says. “The interesting thing about these models is that they work on contexts that are not expected to work and I think that’s the part where people have to be cautious.”

Laying the groundwork

Banks, asset managers, and exchanges are constantly weighing decisions about which technologies will serve them best over time. In the past decade, cloud and artificial intelligence have figured prominently in those calculations.

For BMO, conversations around moving to the cloud began in 2016. The bank adopted a cloud-first, cloud-native strategy to be able to release new tools and services faster and more efficiently. Chief architect and chief innovation officer Lawrence Wan told WatersTechnology in 2021 that BMO was methodically working its way through a three-stage migration plan.

Today, Wan says that the cloud migration is still on track in terms of what workloads the bank wants to migrate, although he declined to provide specific percentages. “I think it’s more important to not fixate so much on the proportion of the workload, but focus more on the capability that we’re able to mature,” Wan says.

BMO partners with Microsoft Azure and Amazon Web Services, with the latter serving as its preferred cloud provider. Wan says the bank has been able to provision a range of applications and their associated data in the cloud while maturing its security capabilities.

BMO’s shift to the cloud is part of a larger transformation the bank is tackling in its Ambition 2025 plans, which encompass objectives laid out by BMO CEO Darryl White. Outside of cloud, there is an overarching effort to digitize BMO’s capabilities. Under a concept referred to as Digital First, data analytics capabilities—from data gathering to predictive analytics—were also examined. “Digital First was not just about digital channels; it was about thinking, as an organization, how do we drive speed? How do we automate, and where can it be?” Wan says.

These days, automation, cloud, and data go hand in hand with artificial intelligence and analytics. As Big Tech expands its offerings to include generative AI tools and model sandboxes alongside traditional cloud services, it seems likely that Big Tech’s existing large bank and asset management customers will want to expand as well.

“It’s far easier to do it in the cloud, both in terms of compute capability, but also native capability that is available from cloud,” Wan says. He points to AWS’ SageMaker and Microsoft’s Azure ML as examples of services available in the ecosystem. “So there is definitely a lot of advantage to being in the cloud.”

For BMO, traditional machine learning is still the “workhorse” for many AI use cases, but the door remains open to leveraging GenAI as well.

Slow steps, first

For now, BMO Capital Markets’ exploration of GenAI has mainly applied to internal use cases, which has been the norm for highly regulated banks. “We’ll be focusing on building what we call horizontal capabilities, and those horizontal capabilities are going to help us later build more complex solutions,” Benitez says.

Tools for chat applications currently allow users to interact with large language models in an almost educational format to familiarize them with the technology. “So far, we have noticed that some users are very excited, and they want to have access to LLMs; for others, it’s something that doesn’t solve any problems,” he says.

Coding assistants have also shown potential as efficiency boosters. In a low-risk approach, the bank is working to identify how they can improve the code-writing process. A developer with a particular way of writing an SQL query can ask the LLM about it, or show it a piece of code and receive a hint about what that code is doing. Such hints can take a user from zero to 80, while getting to 100 still keeps a human in the loop. So far, BMO has had success with the programming languages Python, C#, and JavaScript.
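BMO’s internal assistant is not described in the article, but the “show it a piece of code, get a hint” pattern can be sketched with a generic chat-completion client. The sketch below uses the OpenAI Python SDK purely as a stand-in; the model name and the SQL snippet are placeholders, not anything the bank uses.

```python
# Illustration only: a stand-in chat client, not BMO's internal tooling.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SQL_SNIPPET = """
SELECT trade_date, SUM(notional) AS total_notional
FROM trades
WHERE asset_class = 'ETF'
GROUP BY trade_date
"""

# Show the model a piece of code and ask for a hint about what it does.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": "In two sentences, explain what this SQL query does:\n" + SQL_SNIPPET,
    }],
)
print(response.choices[0].message.content)  # a human still reviews the hint
```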

On the heels of the initial buzz around ChatGPT and LLMs at the start of last year, Tim Baker, founder of consultancy Blue Sky Thinking and financial services practice lead at software provider Expero, told WatersTechnology that the chatbot had shown it was capable of writing half-decent code, and that it was only a matter of time before developers used it to write code faster and even to test it. “If you’re in an algorithmic world, it’s a big productivity booster for you and it might also make those investment strategies more accessible, because more and more people will be able to build strategies,” he said.

Teuta Naghshineh, partner for consultancy Capco’s US digital engineering practice, says that across banks, the strategy appears to be tackling internal tech use cases first. “Technologists are basically your first adopters,” she says. “So, let’s just get them to play with all the new tools, whether it’s Copilot, ChatGPT, or Llama. Whatever it is, let them play with it so they can see what they do.” After that, the second area is more operational use cases around improving processes, reporting, and so on, followed by client-facing use cases, which are currently presented as chatbot offerings.

Traditional problems will continue to be solved by traditional models, Benitez says, and LLMs will continue to get better. “They’re fantastic synthesizers [but] I don’t think they’re there yet with reasoning capabilities that would allow us to use it for the quantitative space,” he says.

Even so, Benitez is not ruling out a reality where the two could work in tandem. As an example, a well-established quantitative model could be analyzing market data and making predictions, with an additional input coming from a quantitative measure that an LLM has produced. “So it is not necessarily that the LLM is going to replace the other, but it is more in conjunction with the traditional machine learning, using it as an input just to have another feature,” he says. “But I think the core part of the decision will be made by the traditional model and then the LLM is an added bonus.”
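The article does not give a concrete implementation, but the pattern Benitez describes—a traditional model making the core prediction with an LLM-derived measure as one extra feature—might look something like the following hypothetical sketch. The data, feature names, and the “LLM sentiment” score are all made up for illustration.

```python
# Hypothetical sketch: a traditional model makes the prediction; an
# LLM-derived score is just one more numeric input feature.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)
n = 1_000

# Conventional market-data features (placeholders).
returns_5d = rng.normal(scale=0.02, size=n)
realized_vol = np.abs(rng.normal(scale=0.01, size=n))

# One extra feature: e.g. a sentiment score an LLM assigned to news text,
# produced upstream and passed in here as a plain number.
llm_sentiment = rng.uniform(-1, 1, size=n)

X = np.column_stack([returns_5d, realized_vol, llm_sentiment])
y = (0.5 * returns_5d - 0.3 * realized_vol + 0.1 * llm_sentiment
     + rng.normal(scale=0.01, size=n))

# The traditional model still makes the decision; the LLM output is an input.
model = GradientBoostingRegressor().fit(X, y)
print(model.feature_importances_)  # relative weight of each input, LLM score last
```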
