In late 2016 and into 2017, Scotiabank initiated a project to run its valuation adjustments (XVA) program on cloud GPUs. The project has since gained traction, and the results have been impressive.
According to the bank, risk calculations and derivatives pricing run 30 times faster on cloud GPUs, allowing brokers to deliver more accurate derivatives pricing in 20 seconds, where it would previously have taken 10 minutes. The setup also allows for more nuanced risk analysis, with more detailed scenario modeling that can assess more than 10 times as many scenarios as before.
“The scale of XVA means that we need to lean on the scalability of public cloud for compute power and couple that with data at scale from our internal global data platform,” Stella Yeung, chief information officer at Scotiabank Global Banking & Markets, tells WatersTechnology. “This combination lets us deliver, in real time, to the traders the information that they require to serve our global clients.”
Andrew Green, managing director and lead XVA quant at Scotiabank, who joined the bank at the end of May 2016, believes a GPU (graphics processing unit) is the best type of platform for running XVA calculations. Combined with public cloud infrastructure—for valuation adjustments, Scotiabank uses Microsoft Azure and its NC24 virtual machines—GPUs are better equipped than traditional CPU cores to handle this type of computationally intensive calculation. The bank also already had a cloud-first policy in place before it began this overhaul. And because its XVA program is a Microsoft Windows-only system, Azure, which supports GPUs running under the Windows operating system, was a natural fit.
The greatest technology change of the last four years, Green says, is the ready availability of GPU machines via the cloud. The trend is being driven by firms looking to experiment with deep learning, but that demand has allowed risk managers to take advantage of the same hardware.
Turn the Page
Since the financial crisis, derivatives valuation adjustments have grown, both in size and in complexity. This has been an ongoing challenge for banks, but at the same time, the ability to store massive amounts of data in the cloud relatively cheaply, combined with vast improvements to compute power and the continued evolution of GPUs, has allowed firms to more efficiently crunch massive datasets and run risk calculations.
It also hasn’t hurt that new regulations—from credit valuation adjustment (CVA) accounting standards and the Fundamental Review of the Trading Book (FRTB) stemming from Basel III, to new rules around initial margin requirements set out by BCBS and Iosco—have helped push banks toward newer technologies.
“There’s been significant growth in the number of valuation adjustments that are applied in common practice in the derivatives industry since 2008 and the financial crisis,” Green says.
When it comes to valuation adjustments, it’s an acronym minefield. Beyond CVA, which accounts for counterparty credit risk, there are funding valuation adjustments (FVAs), which account for the funding costs of derivatives; margin valuation adjustments (MVAs), which cover the funding costs associated with initial margin; and, among others, capital valuation adjustments (KVAs), which banks use to assess the impact of new derivatives transactions on the balance sheet and return on capital.
“Those things have been growing over the last 10 years, so it’s now common practice to include those whenever you do a new transaction with a client—you want to assess the impact of all of those things on your accounting valuations and on your balance sheet,” he says. “So you need a system that is capable of being able to price those into new derivative trades, and because they’re a part of your accounting practice, you need to also include them in your books-and-records valuations, and you also need to calculate sensitivities because they impact your derivatives risk and sensitivities, as well. As a result, you need to have an end-of-day process where you generate sensitivities to those numbers, and you have a trading desk that is responsible for managing them and hedging them, as well.”
Better, Faster, Stronger
The standard approach to calculating XVAs is to use large-scale Monte Carlo simulations, Green says.
“Typically you’ll use a Monte Carlo simulation because of the high number of risk factors that are involved in the calculation. Monte Carlo is a relatively slow numerical technique, but the only one that’s really capable of dealing with this high dimensionality in the nature of the problem,” he says.
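To make the shape of that computation concrete, here is a minimal Monte Carlo sketch of a CVA-style calculation for a toy portfolio. Everything in it—the single Gaussian risk factor, the flat hazard rate, the parameter values, the name toy_xva-style helpers—is an illustrative assumption, not Scotiabank's model.

```python
import numpy as np

# Minimal Monte Carlo CVA sketch (illustrative only, not Scotiabank's model):
# simulate one risk factor, value a toy position on every path and time step,
# and integrate expected positive exposure against a flat default curve.

n_paths, n_steps, horizon = 10_000, 100, 5.0          # paths, time steps, years
dt = horizon / n_steps
times = np.linspace(dt, horizon, n_steps)

rng = np.random.default_rng(42)
vol, r0 = 0.01, 0.02
# Simulated short rate under a driftless Gaussian model (assumed for brevity)
shocks = rng.standard_normal((n_paths, n_steps)) * vol * np.sqrt(dt)
rates = r0 + np.cumsum(shocks, axis=1)

# Toy "portfolio": a payer-swap-like exposure, linear in the rate move
notional, fixed_rate = 1e8, 0.02
mtm = notional * (rates - fixed_rate) * (horizon - times)   # crude annuity scaling

epe = np.maximum(mtm, 0.0).mean(axis=0)                     # expected positive exposure
hazard, recovery = 0.02, 0.4
default_prob = np.exp(-hazard * (times - dt)) - np.exp(-hazard * times)
discount = np.exp(-r0 * times)

cva = (1 - recovery) * np.sum(discount * epe * default_prob)
print(f"Toy CVA: {cva:,.0f}")
```

Even this toy version performs paths-times-steps valuations of a single position; repeating that work across hundreds of thousands of trades is what drives the counts Green describes below.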
What you end up with is a very large number of calculations for even a basic valuation adjustment. By Green’s estimate, that means something like 10,000 Monte Carlo paths, hundreds of time steps, and a typical portfolio of hundreds of thousands of derivatives transactions.
“So you very quickly, even for the baseline calculations, get into 10¹¹ or 10¹²—on the order of a trillion valuations—just to get your nightly, basic CVA number without any sensitivities,” Green says. “So it’s a very onerous numerical calculation that needs to benefit from some accelerations, which is why you get GPU cards and GPU compute capability provided by Nvidia”—the bank’s preferred brand of compute cards for these calculations.
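The arithmetic behind that estimate is straightforward. The specific values below are rough orders of magnitude chosen to match Green's description, not exact Scotiabank figures.

```python
# Rough order-of-magnitude check on the nightly valuation count
paths = 10_000          # Monte Carlo paths (Green's figure)
time_steps = 250        # "hundreds" of time steps (assumed value)
trades = 200_000        # "hundreds of thousands" of derivatives (assumed value)

valuations = paths * time_steps * trades
print(f"{valuations:.1e} valuations per nightly run")   # ~5e11, between 10^11 and 10^12
```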
Scotia started with Kepler-series GPUs but will upgrade to the Volta (V100) series “fairly soon” to take advantage of the newer cards. A V100 card has more than 5,000 compute cores, so it is better suited to these types of Monte Carlo calculations, where you are essentially doing the same calculation on each path, but with different data inputs, he says.
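The reason thousands of GPU cores map well onto the problem is exactly that pattern: each path runs the same valuation code on different random inputs, so the work can be written as one function mapped over paths. The sketch below uses JAX purely as an illustration of that data-parallel pattern (it is not the bank's code), with a hypothetical path_exposure function standing in for the real per-path valuation; JAX compiles the mapped function for a GPU when one is available.

```python
import jax
import jax.numpy as jnp

# One path's worth of work: value a toy trade along a simulated rate path.
# The same function is applied to every path with different random shocks,
# which is the data-parallel pattern GPU cores are built for.
def path_exposure(shocks, r0=0.02, vol=0.01, dt=0.05, fixed_rate=0.02):
    rates = r0 + jnp.cumsum(shocks) * vol * jnp.sqrt(dt)
    mtm = 1e8 * (rates - fixed_rate)          # toy mark-to-market per time step
    return jnp.maximum(mtm, 0.0)              # positive exposure along the path

n_paths, n_steps = 10_000, 100
key = jax.random.PRNGKey(0)
shocks = jax.random.normal(key, (n_paths, n_steps))

# vmap maps the per-path function across all paths; jit compiles the whole
# batch (to GPU code when a GPU backend is present).
epe = jax.jit(jax.vmap(path_exposure))(shocks).mean(axis=0)
print(epe.shape)   # (100,) expected positive exposure per time step
```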
“A couple of months ago, we did a release where we optimized the calculations significantly and we got a big performance boost, and that means we can dial down the compute requirement,” Green says. “And then later on in the year, we’ll be expecting to add more calculations to it as we start to do second-order sensitivities, and then we’ll need to dial it up again. So it gives us a degree of flexibility that we wouldn’t otherwise have.”
During the release, the bank reduced the runtime for calculating risk sensitivities by about 50%, he says. Looking ahead, Scotia will add more sensitivity calculations to provide a richer set of metrics, particularly around second-order sensitivities, or Gammas, and will also add more types of derivative transactions.
Additionally, he says, switching from Kepler to Volta will let the bank do more because of improvements to the cards themselves. The K80 cards Scotia currently uses each have two GPUs and 24 gigabytes (GB) of RAM, whereas a Volta card has a single GPU with 32 GB of RAM. That extra memory per GPU will allow larger, more intensive computations to run on a single card. Volta also supports atomic operations, updates to shared data that complete without interference from other concurrently running threads, a capability Green says the Kepler-based K80 lacks.
This is another example of how the cloud can boost a bank’s performance. Scotia runs a grid of GPU machines in the cloud, using a piece of software called Origami to distribute its calculations across the various GPU cards and CPU cores within the same calculation. That setup lets the firm choose the cards it wants to use, giving it flexibility.
Prior to using Azure, Scotia would have had to go through the same process it would have followed five years ago: run a purchase cycle, buy the GPUs, install them across its multiple datacenters, and then deal with the multitude of business continuity issues that inevitably arise. Now it can access GPU cards in the cloud fairly easily, and the cloud also lets it tune the scale and size of the grid to suit the calculations it wants to perform, and change that over time.
Not to NAG
Beyond speed, Scotia is also incorporating algorithmic differentiation tools from vendor Numerical Algorithms Group (NAG), which allow brokers to see how changes to factors within the model might impact risk. NAG helps the bank do these calculations through three tools: DCO, DCO Map, and the NAG Library.
The traditional way derivative sensitivities are calculated is finite difference approximation, which is essentially a bump-and-revalue technique, Green says. To calculate a first-order sensitivity, for example, you take one of your inputs, such as a volatility or an interest-rate swap price from the market data your model is calibrated to, shift it up a little and re-run the entire calculation, shift it down a little and re-run it again, and then take the difference to approximate the sensitivity.
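A short sketch makes the cost obvious. Here price_portfolio is a hypothetical stand-in for the full XVA valuation; the point is that every input requires two complete re-runs of it.

```python
import numpy as np

def central_difference_sensitivities(price_portfolio, market_data, bump=1e-4):
    """Bump-and-revalue: shift each market-data input up and down and re-run
    the full valuation both times. With thousands of inputs this means
    thousands of complete recalculations, which is why it is slow."""
    sensitivities = np.zeros_like(market_data)
    for i in range(len(market_data)):
        up, down = market_data.copy(), market_data.copy()
        up[i] += bump
        down[i] -= bump
        sensitivities[i] = (price_portfolio(up) - price_portfolio(down)) / (2 * bump)
    return sensitivities

# Hypothetical toy valuation standing in for a full XVA run
toy_xva = lambda md: float(np.sum(md ** 2))
print(central_difference_sensitivities(toy_xva, np.linspace(0.01, 0.03, 5)))
```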
“Finite difference approximation for derivative sensitivities is enormously computationally intensive, as you can imagine, because we have thousands and thousands of inputs, particularly to the XVA calculation, because it depends on so much market data. So it’s very slow,” he says. “Algorithmic differentiation… allows you to do the same thing—or calculate the sensitivity directly by differentiating the computer program automatically.”
At its core, algorithmic differentiation is a numerical technique for calculating sensitivities by differentiating the computer program automatically. It comes in two modes: forward (tangent) mode and backward (adjoint) mode. They have different efficiencies, Green says, depending on the structure of the program: forward mode’s cost grows with the number of inputs, while adjoint mode’s grows with the number of outputs. XVA has thousands of market-data inputs feeding relatively few outputs, so for these calculations the adjoint is the more efficient of the two, he says, though it is harder to use than the tangent mode.
Currently, Scotia is using the forward mode to calculate the first-order sensitivities, particularly for Vega calculations. Within the next few months, the bank plans to switch to the adjoint mode of calculation because it’s much more efficient, Green says.
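NAG's DCO tools work at the level of the bank's own C++ code, but the difference between the two modes can be sketched with any off-the-shelf AD library. The example below uses JAX purely as an illustration (it is not what Scotiabank uses), with a hypothetical toy_xva function in place of a real valuation: forward mode propagates one input direction per pass, while adjoint mode recovers the sensitivity to every input from a single backward sweep of a scalar output, which is why it wins when there are thousands of market-data inputs.

```python
import jax
import jax.numpy as jnp

# Toy scalar "XVA" as a function of a market-data vector (illustration only).
def toy_xva(market_data):
    return jnp.sum(jnp.exp(-market_data) * market_data ** 2)

market_data = jnp.linspace(0.01, 0.03, 1_000)   # pretend 1,000 calibrated inputs

# Forward (tangent) mode: one pass gives the directional derivative along a
# single input direction, so all 1,000 sensitivities would need 1,000 passes.
direction = jnp.zeros_like(market_data).at[0].set(1.0)
_, tangent = jax.jvp(toy_xva, (market_data,), (direction,))

# Adjoint (reverse) mode: one backward sweep gives the sensitivity to every
# input at once, because the output is a single scalar.
adjoint_gradient = jax.grad(toy_xva)(market_data)

print(tangent, adjoint_gradient[0])   # the two modes agree on the first input
```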