Waters Wrap: Big Tech’s capital markets expansion continues
Anthony looks back at some of the major cloud and AI projects involving the likes of Amazon, Google, IBM, Microsoft, and/or Snowflake in 2023.
At this time last year, few in the capital markets were talking about generative AI. Yet, as we head into 2024, it’s arguably the most popular topic when it comes to the evolution of financial technology. What was being talked about a year ago was how the so-called Big Tech firms were increasingly digging their tentacles into the capital markets as banks, asset managers, exchanges, and vendors migrated more and more pieces of their tech stacks to the cloud.
That rush to the cloud did not slow down in 2023. And when you combine those cloud capabilities with Big Tech’s prowess in providing machine learning tools and handling large language models, well, you have a recipe for Big Tech to become exponentially more important to the world of trading.
For this column, I will highlight some of the more interesting projects from the last 12 months involving Amazon, Google, IBM, Microsoft, and/or Snowflake. This is just a snapshot (click on the links for far more detail). Still, hopefully it shows how rapidly technology in the capital markets is evolving, thanks in large part to those five companies.
If I’m missing anything, don’t hesitate to reach out and let me know your thoughts: anthony.malakian@infopro-digital.com.
Cloud multicasting
In recent years, CME, LSEG, Nasdaq, and Deutsche Börse—among others—announced significant partnerships with Amazon (AWS), Google or Microsoft to move their exchange infrastructures to the cloud. For example, Nasdaq officially moved its second and third matching engines to the AWS cloud in August.
However, one of the biggest challenges that exchange operators will face for these migration projects is the concept of multicasting. Over the past two decades, multicast has proved essential for the timely and efficient distribution of market data from exchanges to market participants. But now, as all parties start moving significant chunks of their technology infrastructures—from hardware stacks to applications—to the cloud, the ability of exchanges and cloud providers to support multicast in cloud environments has proved a barrier to adoption in some areas.
Max Bowie dove deep into this subject, but while skeptics remain as to the viability of cloud multicasting, others say multicast won’t be a stumbling block as cloud technology evolves, and that cloud’s biggest migration challenges remain strategy, perception, and understanding.
“We’ve discovered software- and partner-based approaches to enable applications dependent on multicast to leverage cloud and simplify distribution of information at scale,” says Rohit Bhat, senior director for capital markets, digital assets and exchanges at Google Cloud.
“This is especially useful for use cases that leverage multicast but don’t have low-latency requirements, such as passive order flow. In addition, there now exist adapters, which enable popular data distribution platforms like Kafka and Google Pub/Sub to dramatically simplify operations, while gaining the benefit of cloud resilience and global connectivity.”
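Conceptually, these adapters replace the network-layer, one-to-many delivery of multicast with an application-layer fan-out. The following is a minimal, self-contained sketch of that pattern in Python; it is illustrative only and does not use any vendor's actual API (all class, topic, and message names are invented):

```python
# In-memory sketch of the fan-out pattern that pub/sub platforms such as
# Kafka or Google Pub/Sub use to stand in for network multicast.
from collections import defaultdict
from typing import Callable

class FanOutBus:
    """One publish call delivers a message to every subscriber of a topic,
    mirroring multicast's one-to-many delivery in software."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[bytes], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[bytes], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: bytes) -> None:
        # Unlike raw multicast, a managed bus can add retries, ordering,
        # or persistence at this point in the publish path.
        for handler in self._subscribers[topic]:
            handler(payload)

bus = FanOutBus()
received: list[bytes] = []
bus.subscribe("md.equities", received.append)
bus.subscribe("md.equities", received.append)  # a second consumer of the same topic
bus.publish("md.equities", b"BID 101.25 / ASK 101.27")
```

A real deployment would layer persistence, ordering guarantees, and global replication on top of this publish path, which is where the cloud resilience and global connectivity benefits Bhat describes come in.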
Click here for much more on that subject.
At the same time, though, Wei-Shen Wong spoke with execs at Colt Technology Services on the topic of cloud multicasting. In October 2022, the network provider and market data carrier partnered with AWS on a cloud co-location proof of concept that demonstrated the viability of hosting and distributing multicast data in the cloud for capital markets customers.
The project resulted in Colt bringing its multicast data service for capital markets to life with AWS in June 2023. Wei-Shen lays out all the work that went into it, and looks at what’s ahead for the partnership.
Millennium Management enlists Google for building custom tech solutions
Multi-strategy hedge fund Millennium Management has selected Google Cloud for its portfolio managers to leverage AI tools such as Vertex AI and data analytics via BigQuery in a multi-cloud environment. Millennium’s 300-plus global investment teams and 1,200 technologists can use Google Cloud’s platform to build custom tech solutions and solve challenges critical to their investment workflows.
“Millennium has been an early adopter of cloud technology, and by offering access to the Google Cloud infrastructure, we are providing portfolio managers and technologists with optionality in cloud capabilities. We will continue to explore new ways cloud can add value to our firm, including expanding our generative AI capabilities,” said Rob Newton, global head of technology infrastructure at Millennium.
Millennium, which manages over $60 billion of assets, has long been a leader in technology and infrastructure. In addition to the proprietary technology deployed to Millennium’s investment teams, the firm began building out its cloud capabilities in 2016. It also counts AWS as a provider of cloud services and machine learning through the firm’s use of SageMaker.
LSEG looks to 2024 for Microsoft product release
David Schwimmer, chief executive officer of LSEG, said in October that the exchange is on track for product delivery in the second half of 2024 relating to its partnership with Microsoft. He added that this aligns with the initial timeframe LSEG laid out during the partnership announcement in December 2022.
“You should expect to see products in the different areas that we have talked about, i.e., the embedding of our data and analytics and workflow in the Microsoft Teams and productivity suite; the usage of our data and the movement of our data into the Microsoft Azure environment; and the usage of Fabric, which will make a much more attractive, integrated environment for the usage of our data,” he said.
This includes new analytics and modeling-as-a-service, providing LSEG customers a cloud-based engine to access LSEG analytical tools and modeling frameworks through Workspace, Excel, API, and other customer-defined front ends.
On LSEG’s shift to the cloud, Schwimmer explained that it “fundamentally” changes how customers use some of its services.
For example, in the past, to access real-time data from LSEG, customers would need to have LSEG’s hardware and servers on the trading floor. Now that some of its real-time data is available in the cloud, it is accessible to other customers, too.
“Corporates, for example, who might not even have a trading floor but want to track a complex supply chain, or something along those lines. Similarly, in our customer and third-party risk business, the cloud availability has just made our workflows easier to embed in our customers’ daily activity,” he said.
Bloomberg, Snowflake ally to accelerate cloud data adoption
In June, Bloomberg released an app in Snowflake’s Native App Framework that will make it much easier for clients to use Bloomberg data to develop and run cloud-hosted data management tools and applications.
Snowflake’s Native App Framework, launched in 2022, enables developers to build and test applications in the cloud. The Bloomberg DL+ Snowflake Native App was developed and tested over the past two months, and allows users to incorporate data from Bloomberg Data License Plus (DL+)—the vendor’s cloud-based data management solution, which delivers more than 40,000 data fields, covering 50 million securities—into those applications, which run within clients’ Snowflake accounts.
Don Huff, global head of client services and operations at Bloomberg Data Management Services, told WatersTechnology that the move was driven by customer demand to be able to access Bloomberg data in Snowflake’s cloud.
“Our customers want our data everywhere and anywhere,” he says. “We do have customers taking our data into Snowflake today. Now we have a tool to quickly get them up and running.”
If a firm already contracts to use Snowflake and its app store, the Bloomberg app creates a fresh Snowflake data warehouse to store the data, and Bloomberg enables a Cloud Delivery Manager function within DL+, which delivers data to the consuming application as set up and configured in the app by the client.
Huff adds that the app will be available to any Bloomberg customer across the buy and sell sides that wants access to its data in Snowflake’s cloud.
The app is intended to complement—rather than replace—its primary data delivery channels, such as the Bloomberg Professional Terminal and its B-Pipe datafeed, and to serve different needs within client firms.
“Our customers have a lot of skill sets and require different tooling. We’re catering to those who know how to set up a data pipeline but maybe aren’t writing Python code,” he says.
“It means a chief data officer can purchase the data from Bloomberg, administer the data from Bloomberg, and deliver the data from Bloomberg—all without leaving their chair, and without having to call IT staff or build an engineering pipeline. And then all their data scientists now have a sandbox of all the data they’re getting from Bloomberg,” Huff says.
DTCC’s resiliency framework
In November, AWS and the Depository Trust and Clearing Corporation (DTCC) released a technical framework, a set of best practices, and a reference implementation for building resilient financial applications. The work comes as part of an ongoing series of resiliency-focused whitepapers that DTCC has issued since 2019.
The latest report details how to embed resiliency into applications through the development and consumption of reusable components and capabilities, and enable applications to operationally rotate between datacenter regions and run in each region for an extended period. The DTCC and AWS developed two resilient reference applications—for basic settlement and trade matching—with the code available for free on GitHub.
Jim Fanning, US director of global financial services at AWS, told Rebecca Natale that, as an infrastructure provider to financial services firms, AWS is hearing clients ask for prescriptive guidance on resiliency and high availability. AWS’ suite of services—this latest reference implementation uses tools such as Route 53 Application Recovery Controller, CloudWatch, HealthCheck API, Systems Manager, and SDK for Python (boto3), among others—can serve as building blocks.
“Building blocks are great because you can build anything. If I dumped a bunch of blocks out on the floor and said, ‘Build me a house,’ you could build 1,000 beautiful houses. But for regulated customers, especially in financial services, what we’ve been hearing more frequently is, ‘Tell me how I can build my house,’” Fanning said.
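One core idea in the framework, rotating applications between datacenter regions based on health, can be sketched in a few lines. This is an illustrative stand-in, not the actual reference implementation (the real one, on GitHub, uses AWS services such as Route 53 Application Recovery Controller to make this decision); all region names and fields below are invented:

```python
# Sketch of the "regional rotation" decision: evaluate per-region health
# and route to the highest-priority healthy region, failing over when
# the active region degrades.
from dataclasses import dataclass

@dataclass
class RegionHealth:
    name: str
    healthy: bool
    priority: int  # lower value = preferred region

def select_active_region(regions: list[RegionHealth]) -> str:
    """Return the highest-priority healthy region, or raise if none exists."""
    candidates = sorted((r for r in regions if r.healthy), key=lambda r: r.priority)
    if not candidates:
        raise RuntimeError("no healthy region available")
    return candidates[0].name

regions = [
    RegionHealth("us-east-1", healthy=False, priority=0),  # primary, degraded
    RegionHealth("us-west-2", healthy=True, priority=1),   # standby takes over
]
active = select_active_region(regions)
```

In the reference applications, the point of the reusable components is that this routing decision, and the health signals feeding it, live outside any single application, so every workload can rotate between regions the same way.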
Partnering on cloud controls
In October, the Fintech Open Source Foundation (Finos) officially open-sourced its Common Cloud Controls project, which began as an internal project at Citi. In July, Finos announced the formation of an open standard to describe common controls across public cloud providers in the financial services sector. Alongside Citi, initial participants included Goldman Sachs, Morgan Stanley, Bank of Montreal, Royal Bank of Canada, and London Stock Exchange Group, among others.
The project aims to provide a consistent way of describing the threats that a service must mitigate. “A CSP can then contribute and give us the implementation details of their controls that would match that logical control that would address that threat,” Jim Adams, CTO and head of Citi technology infrastructure, told Nyela Graham. One provider might say there is a single control that they can use, while another may have two or three. The expectation is that the providers will be able to map to the standard.
“By putting this into open source, we get the collective wisdom of the industry,” Adams says. “We can all look and say we agree these are the threats that a relational database has to defend against. And we can all look at the description of the logical controls and say we believe that control would be effective against those threats.”
Google is currently the sole provider involved, but Gabriele Columbro, executive director of Finos, said there are active conversations with other cloud providers, both large and small.
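To picture the mapping Adams describes, the standard's structure can be thought of as a catalog tying each threat to a logical control, with each provider contributing its own implementation(s) of that control. The sketch below is hypothetical; all threat, control, and provider identifiers are invented for illustration:

```python
# Hypothetical data-structure sketch of a common-cloud-controls mapping:
# threat -> logical control -> per-provider implementations.
THREAT_CATALOG = {
    "unencrypted-data-at-rest": "logical-control:encrypt-at-rest",
}

PROVIDER_IMPLEMENTATIONS = {
    "logical-control:encrypt-at-rest": {
        # One provider satisfies the logical control with a single control...
        "provider-a": ["managed-kms-default-encryption"],
        # ...while another needs two controls in combination.
        "provider-b": ["storage-encryption", "key-rotation-policy"],
    },
}

def controls_for(threat: str, provider: str) -> list[str]:
    """Resolve a threat to the provider-specific controls that mitigate it."""
    logical = THREAT_CATALOG[threat]
    return PROVIDER_IMPLEMENTATIONS[logical][provider]

a = controls_for("unencrypted-data-at-rest", "provider-a")
b = controls_for("unencrypted-data-at-rest", "provider-b")
```

The point of the open standard is that the left-hand side of this mapping (threats and logical controls) is agreed on by the industry, while each cloud provider fills in its own right-hand side.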
IBM bets big on generative AI
This year, IBM launched Watsonx, an enterprise-ready AI and data platform. And in June, it announced that it was buying Apptio, a financial and operational IT management and optimization software provider, for $4.6 billion.
Watsonx, which became generally available in July, has a studio for new foundation models, generative AI, and machine learning; a data store built on an open lakehouse architecture; and a toolkit to enable firms to build AI workflows with responsibility, transparency, and explainability.
During the company’s Q2 earnings call on July 19, chairman and CEO Arvind Krishna stressed that IBM’s focus is on enterprise AI to solve business problems. He added that AI is being “infused” into IBM’s software products and that the company is currently building products to address specific enterprise use cases.
In August, John Duigenan, general manager for the Financial Services Industry for IBM, joined the Waters Wavelength podcast to discuss genAI and Watsonx. Click here to listen.
You can also read about how IBM believes that genAI will help improve everything from high-frequency trading to post-trade processing to addressing settlement challenges.
RBC explores genAI tools
Vinh Tran, head of cloud engineering at the Royal Bank of Canada, spoke with Eliot Raman Jones about the firm’s cloud strategy, and also discussed how RBC is separately exploring genAI tools provided by Microsoft and AWS.
“Our generative AI use cases are very internally focused. We think genAI will be a big enabler for us, behind the scenes, making developers more productive,” Tran said. He added that keeping up with the speed and innovation of technology while maintaining security of data is the main challenge for RBC in the future. “Especially with what’s happening in genAI today, we are seeing innovation and change happen at warp speed,” Tran says. “That’s what keeps me up at night: How do we move faster and keep up with the change while maintaining our very high bar for security and resiliency?”
BMLL, Snowflake partner
London-based data and analytics provider BMLL Technologies partnered with Snowflake to bring new financial datasets to Snowflake’s Marketplace and push the cloud company further into the front office. The deal follows Snowflake’s investment into BMLL and will allow BMLL’s level 3 order book data to be delivered to Snowflake customers via its curated storefront of third-party data and applications.
“We were really attracted to BMLL for that data, but as we dug a little deeper, they’re also doing a ton of really interesting data engineering to get that high-value data that would normally be on-premise, that every capital market client would be doing themselves. BMLL is doing all the heavy lifting up front, making a really unique dataset,” said Harsha Kapre of Snowflake Ventures, the company’s venture capital arm. “As BMLL also is moving more workloads to Snowflake, then we get into the realm of being able to say beyond just having this data, there are applications and other things we can enable for their clients and our joint clients beyond just the joining of data.”
Deutsche Börse, CME Group expand on individual partnerships with Google
During Deutsche Börse’s Q3 earnings call, the exchange’s CEO, Theodor Weimer, said the partnership with Google Cloud is progressing nicely. The deal with Google was announced in February.
“We already have 40% of our IT estate in the cloud, and by 2026 it will be around 70%, with Google Cloud as a preferred partner,” Weimer says. “We benefit from Google Cloud’s skills and their superior knowledge in the industry. Areas where we can expect tangible results from our Google Cloud partnership include faster time to market, increasing efficiency, and superior cybersecurity.”
Earlier in the year, Weimer said that the exchange will also aggressively explore how genAI can help Deutsche Börse. “It’s our explicit wish and our ambition that we are not becoming a laggard in this field.”
CME Group also has an exclusive partnership with Google. Earlier this year, the exchange said it would spend $60 million in 2023 as part of its ongoing project with Google to migrate key parts of its infrastructure to the cloud. The effort is already paying off: officials say using Google’s cloud platform has enabled CME to develop new data and analytics products faster than it could have with traditional technology.
For example, using the cloud has already allowed CME to create a completely new execution analytics tool that allows the exchange to share benchmark execution data with customers to help them better manage their trading. CME put the new analytics into production earlier this month and will begin sharing them with clients soon, said Julie Winkler, CME’s chief commercial officer.
Major industry players partner on blockchain network for institutional assets
Digital Asset and a number of financial firms plan to launch the Canton Network, a privacy-enabled, interoperable blockchain network designed for institutional assets. Other participants in the project include ASX, BNP Paribas, Broadridge, Cboe Global Markets, Deloitte, Deutsche Börse Group, Goldman Sachs, Microsoft, Moody’s, S&P Global, and several more.
The Canton Network aims to provide a decentralized infrastructure that connects independent applications built with Daml, Digital Asset’s smart-contract language, enabling financial institutions to work within a safer and reconciliation-free environment where assets, data, and cash can synchronize freely across applications.
For example, asset registers and cash payment systems are distinct and siloed systems in today’s markets. With the Canton Network, a digital bond and a digital payment can be composed across two separate applications into a single transaction, providing simultaneous exchange without operational risk. Likewise, a digital asset could be used in a collateralized financial transaction via connection to a repo or leveraged loan application.
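The atomic composition described above is essentially delivery-versus-payment: the asset leg and the cash leg either both settle or neither does. The following pure-Python sketch illustrates the idea only; it is not Daml or Canton code, and all class names, parties, and amounts are invented:

```python
# Toy illustration of atomic delivery-versus-payment across two ledgers:
# a bond transfer and a cash payment are composed into one all-or-nothing
# transaction, eliminating the risk of one leg settling without the other.
class Ledger:
    def __init__(self, balances: dict) -> None:
        self.balances = dict(balances)

    def transfer(self, frm: str, to: str, amount: int) -> None:
        if self.balances.get(frm, 0) < amount:
            raise ValueError(f"insufficient balance for {frm}")
        self.balances[frm] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

def atomic_dvp(bonds: Ledger, cash: Ledger,
               buyer: str, seller: str, qty: int, price: int) -> None:
    """Compose two ledger moves into one all-or-nothing transaction."""
    snap_b, snap_c = dict(bonds.balances), dict(cash.balances)
    try:
        bonds.transfer(seller, buyer, qty)    # delivery leg
        cash.transfer(buyer, seller, price)   # payment leg
    except Exception:
        # Roll back both legs if either fails -- no partial settlement.
        bonds.balances, cash.balances = snap_b, snap_c
        raise

bonds = Ledger({"seller": 100})
cash = Ledger({"buyer": 1_000_000})
atomic_dvp(bonds, cash, "buyer", "seller", qty=100, price=1_000_000)
```

In the Canton design, this kind of atomicity spans two independently built applications, which is what removes the reconciliation step between today's siloed asset registers and payment systems.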
Canton Network participants will begin testing interoperability capabilities across a range of applications and use cases in July.
CJC deploys Google Chronicle Security Operations Suite
In June, market data managed services company Crown Jewels Consultants (CJC) deployed the Google Chronicle Security Operations Suite following a partnership with cybersecurity company SEP2.
According to Google, the Chronicle Security Operations suite ingests a user’s data into a private container at a petabyte scale. The data is aggregated, normalized and linked with out-of-the-box detections and threat intelligence. That data is then exposed via case management, sub-second search, collaboration, and contextual mapping. Rapid response times occur via automated playbooks, incident management, and closed-loop feedback.
CJC is working with SEP2 to optimize the platform’s deployment. The project aims to improve threat response times and decrease the company’s incident resolution times. Chronicle will also provide better metrics and increase automation in CJC’s security processes. CJC will use SEP2.security, SEP2’s managed detection and response (MDR) solution, which combines Chronicle’s SIEM and SOAR components with SEP2’s 24/7 Wingman support service.
S&P Global, LTX expand individual partnerships with AWS
Broadridge subsidiary LTX migrated its corporate bond trading platform to AWS this year. In a press release, Broadridge said the migration will allow LTX to provide more advanced analysis of its Liquidity Cloud, a network of anonymous real-time buy- and sell-side indications of interest. LTX will enhance its pricing, dealer selection, client recommendation, and similar-bonds functionalities, helping buy-side investors and dealers make smarter trading decisions.
Similarly, S&P Global chose AWS as its preferred cloud provider in a new push to enhance its cloud infrastructure. The agreement will extend cloud-based services to more than 100,000 of S&P’s government and enterprise customers in 43 countries worldwide. As it stands, 65% of S&P’s application workloads run on AWS. As part of the collaboration, S&P Global will also move its Capital IQ and remaining core data platforms to AWS by 2025.
Moody’s taps Microsoft for genAI plans
In June, Moody’s Corporation and Microsoft announced a new strategic partnership to deliver next-generation data, analytics, research, collaboration, and risk solutions for financial services and global knowledge workers. Built on a combination of Moody’s data and analytical capabilities and the power and scale of Microsoft Azure OpenAI Service, the partnership aims to create innovative offerings that enhance insights into corporate intelligence and risk assessment, powered by Microsoft AI and anchored by Moody’s proprietary data, analytics, and research.
At the time of the announcement, “Moody’s CoPilot”—an internal tool—was being deployed to Moody’s 14,000 global employees. It combines Moody’s proprietary data, analytics, and research with the latest large language models and Microsoft’s generative AI technology to drive firm-wide innovation and enhance employee productivity in a safe and secure digital sandbox.
Copyright Infopro Digital Limited. All rights reserved.