The IMD Wrap: The risks of becoming AI-rich and memory-poor

Like Alice in Wonderland, Max disappears down the rabbit hole of AI to discover a world that is sometimes fantasy—and sometimes a nightmare.

Those who forget history are doomed to repeat it. So said someone smarter than me.

I’m writing this on Memorial Day, the US holiday that honors men and women who died serving in the US armed forces. Humans in general are fickle creatures, and sometimes need to be prompted to remember and revere historical episodes that would otherwise be forgotten, misremembered, swept under the carpet, or revised and rewritten to suit those who find the true record of history—or their role in it—awkward or unpalatable.

While memories fade and firsthand witnesses pass away, our digitized records—including records of trades, such as the Consolidated Audit Trail (CAT) of US market activity—live on, preserved in zettabytes of storage. According to research website Statista, the amount of information created, captured, and consumed worldwide in 2010 totaled two zettabytes. By 2020, that annual figure had risen to 64.2 zettabytes, and it is forecast to reach 181 zettabytes by 2025.

While the cost per unit of storage may be falling, the volume of data is rising rapidly. Financial data represents only a fraction of this, but it continues to grow and drive those overall volumes higher as real-time data becomes ever more granular and accumulates into archives of historical trade and order book data—not to mention news, estimates, OTC market data, and other content types.

Storing that data comes at a price, one rising just as quickly, and that cost has driven the move away from on-premises hardware toward elastic and—with the right controls—more cost-effective cloud storage. According to Statista, cloud spend has been outstripping traditional datacenter hardware for several years already. In 2021, global spend on cloud infrastructure amounted to $178 billion—a rise of 37% over $130 billion the previous year—compared to spend on datacenter hardware of $98 billion.

Along with greater volumes of data and digital storage comes another trend: data breaches. The number of major corporate data breaches in the US alone shot up to 3,205 in 2023, compared with 1,862 in 2021 and 1,802 in 2022—more than double the previous record of 1,506, set in 2017. IBM estimates that the average cost of a data breach is $4.45 million.

Customer data, trade data, strategy data—all are fair game for data pirates, and that’s before you get started on ransomware demands, halting production facilities, oil pipelines, and wastewater treatment plants, all of which have been targets for cybercriminals.

And though (in theory) our digital records—like the promise of blockchain—should be immutable, the digital world is far from infallible: Misinformation spreads like wildfire on social media; records can be altered or deleted (though hopefully there would at least be a record of who tampered with them); bank accounts can be hacked and drained; social media accounts can be hacked and used to influence followers or manipulate markets; and “deepfakes” can produce scarily accurate renditions of people and of events that never happened.

Politicians and celebrities, for instance, may be the subject of deepfake audio recordings or videos misstating their views or purporting to show them in compromising or illegal acts. (Or, more likely, they may claim that a video of them doing something is a deepfake when in fact it’s genuine.) Perhaps video editors and forensic experts can tell the difference, but in the court of public opinion, expert witnesses often take a backseat to gut feeling.

Deepfakes have thus far—with some exceptions—mostly been successful in media and retail finance, rather than in capital markets, where stricter controls exist and where regulators impose rules on what can be done, and why. 

The fact that it occurs in retail finance at all is a tragedy—and one that often goes undetected and unpunished. According to Jay Krish, head of data governance for financial crimes compliance at State Street, at most, only 5% of financial fraud is detected, often months after the crime is committed, and less than 1% of stolen funds are ever recovered.

The cost of financial crime stands at around $6 trillion per year, or between 3% and 5% of world GDP, said Krish, who delivered a keynote speech on using graph data and generative AI to combat financial crimes at our recent North American Financial Information Summit. In contrast, he added, US firms are only spending $220 billion to combat it.

Most of us can now spot a phishing email. And attempts to manipulate capital markets via techniques such as layering or flash orders are rapidly spotted and punished by regulators. But what about instances where the crime is more nebulous, and where the fraud is committed with the willing participation of those being defrauded?

Say someone creates a deepfake video of stock pundit and TV host Jim Cramer endorsing or shorting a stock and his audience enthusiastically follows “his” advice. Or a rogue bot infiltrates a Reddit group. We’ve seen how influential these groups can be in shifting perceptions and reversing a company’s fortunes—the GameStop saga, for example—and how the financial markets react to or are impacted by significant retail investment flows.

Observers love to portray retail investors and Wall Street as a David versus Goliath scenario. And it’s true that if David loads a big enough rock in his slingshot, Goliath should duck. But there’s also an element of role reversal here: In this analogy, until recently, that same Goliath was actually David’s protector. 

Self-directed investing and the retail trading platforms that have emerged in the past few decades have made it much easier for retail investors to trade and make money—but also to lose it. Whereas financial professionals who manage money on behalf of others must obtain qualifications and certifications, an individual is free to trade multiple asset classes based merely on their own say-so that they understand what they are doing.

So, whether they like it or not, retail investors need additional protections—both from potential predators and from themselves, so they can resist the urge to make a small fortune out of a previously large one.

One of our greatest weapons in the fight against fraud and cybercrime may yet prove to be artificial intelligence—better empowering investors and traders, bringing more information to their fingertips faster, and contributing to better decision-making. As a tool for data cleansing, it may improve data quality, and it may also make it easier for compliance professionals and regulators to spot suspicious behavior and, hopefully, prevent instances of market manipulation.

At the NAFIS event, Krish described how the custodian has been harnessing AI to identify potential issues. “We feed the models with suspicious activity … and anomalies to teach them what suspicious activity looks like, so they can spot it,” he said.
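Krish’s models are proprietary, and the approach he describes is a supervised one—trained on labeled examples of suspicious activity. But the underlying idea of surfacing activity that deviates from the norm can be sketched with a far simpler, unsupervised stand-in. The function below is purely illustrative (the name, cutoff, and data are my own invention, not State Street’s): it flags trade amounts that are extreme outliers relative to the rest of the flow, using the robust median-absolute-deviation rule rather than a trained model.

```python
from statistics import median

def flag_suspicious(amounts, cutoff=3.5):
    """Return indices of trade amounts that are extreme outliers versus
    the rest of the flow, via the modified z-score (median absolute
    deviation). A crude stand-in for a trained anomaly model."""
    med = median(amounts)
    mad = median(abs(a - med) for a in amounts)
    if mad == 0:  # no spread at all: nothing stands out
        return []
    # 0.6745 rescales the MAD so the score is comparable to a z-score
    return [i for i, a in enumerate(amounts)
            if 0.6745 * abs(a - med) / mad > cutoff]

# One outsized wire among routine transfers stands out immediately
print(flag_suspicious([100, 102, 98, 101, 99, 5000]))  # → [5]
```

Real systems, of course, also weigh counterparties, timing, and history, and—as Krish notes—learn those patterns from labeled cases rather than a fixed rule.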

But there’s a catch: AI is not yet—and may never be—fully trusted by authorities. Its track record isn’t great so far. It may save time in boiling down lengthy documents into the most salient points, but teaching it to recognize what’s important and what isn’t—especially when what matters to one analyst may differ from what matters to another, or to a trader—takes time, and requires people to check and double-check its results for accuracy.

And while using a secure and ring-fenced AI instance will limit the rate at which it can learn and the number of data sources it can learn from, training a public model risks data leakage and exposes your model to data sources and biases that you may not want incorporated into your results.

Plus, AI “hallucinates”: like a child caught out by a test question they don’t know the answer to, it simply makes up an answer. Take those hallucinations at face value, and we risk not only errors; we risk overwriting factual data points and actual events with incorrect values. Trying to value an asset or price a derivative when a date or rate is wrong is a recipe for bank collapses.
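To put a rough number on that risk, here is a minimal sketch—my own illustration, with made-up figures—of how far a single wrong input can move a valuation. Discounting one future cashflow at a hallucinated 4% instead of the correct 5% shifts its present value by roughly 10%, a material error on any book.

```python
def present_value(cashflow, rate, years):
    """Discount a single future cashflow back to today."""
    return cashflow / (1 + rate) ** years

# A $1m cashflow due in 10 years, discounted at the correct 5% rate...
correct = present_value(1_000_000, 0.05, 10)
# ...versus the same cashflow priced off a hallucinated 4% rate
wrong = present_value(1_000_000, 0.04, 10)
print(f"{correct:,.0f} vs {wrong:,.0f}")  # roughly 613,913 vs 675,564
```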

At NAFIS, speakers warned that all AI models will need programmed guardrails and human subject matter experts to check their results and ensure accuracy, and that some use cases may be forever off limits for AI.

Don’t get me wrong: I’m excited about AI’s potential. And I believe it will make certain aspects of life—in the capital markets or otherwise—easier, more productive, and more secure. Then again, I also get excited about rollercoasters. But I’m not riding one until I’m sure it’s been tested thoroughly and know it’s not going to veer off the tracks.

Oh, and the opening quote in this article about being doomed to repeat history? Ironically, it’s been repeated so often by different figures through the ages that no one seems 100% sure who originally said it, or when. Call it a self-fulfilling prophecy, or perhaps just human nature: We repeat it and forget it at the same time.

Are you excited or alarmed about AI? Tell us how your firm is using it at max.bowie@infopro-digital.com 
