AI's Role in the Next Financial Crisis: A Warning from SEC Chair Gary Gensler

TL;DR - The future of finance is intertwined with artificial intelligence (AI), and according to SEC Chair Gary Gensler, that's not all good news. Gensler warned in a 2020 paper, written while he was still at MIT, that AI could be at the heart of the next financial crisis, and that regulators might be powerless to prevent it.

AI's Black Box Dilemma: AI-powered "black box" trading algorithms are a significant concern. Imagine several traders using similar algorithms, all deciding to sell at the same time: a stampede for the exits that crashes the market. This risk is amplified by the "apprentice effect," where people trained together tend to think alike and build similar models.
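The herding risk described above can be sketched with a toy simulation. The trader count, signal model, thresholds, and price-impact rule below are all illustrative assumptions, not anything from Gensler's paper; the point is only that as models grow more correlated, isolated sell decisions turn into synchronized sell-offs.

```python
import random

random.seed(7)

def run_market(n_traders, correlation, steps=200, threshold=-0.02):
    """Toy market: each trader sells when its signal drops below a threshold.

    `correlation` blends a shared signal (what similar models all see) with
    trader-specific noise. All parameters are made-up illustrations.
    """
    price, crashes = 100.0, 0
    for _ in range(steps):
        shared = random.gauss(0, 0.01)            # signal common to all models
        sellers = 0
        for _ in range(n_traders):
            private = random.gauss(0, 0.01)       # trader-specific noise
            signal = correlation * shared + (1 - correlation) * private
            if signal < threshold:
                sellers += 1
        price *= 1 - 0.001 * sellers              # crude linear price impact
        if sellers > n_traders * 0.8:             # >80% selling in one step
            crashes += 1
    return price, crashes

# Independent models rarely all sell together; near-identical ones do.
for rho in (0.0, 0.95):
    final_price, crash_steps = run_market(n_traders=50, correlation=rho)
    print(f"correlation={rho}: final price {final_price:.1f}, "
          f"synchronized sell-offs: {crash_steps}")
```

With uncorrelated models, sell decisions spread out over time; as correlation rises, the same market signal trips every model's threshold in the same step, which is the stampede Gensler describes.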

Regulatory Challenges: Regulating AI is like trying to catch smoke with your bare hands. If regulators try to control AI, they might inadvertently create a situation where all AI models act the same, increasing the risk of a synchronized failure. Gensler's words ring clear: "If deep learning predictions were explainable, they wouldn't be used in the first place."

Discrimination and Unpredictability: AIs are like mysterious judges, assessing creditworthiness and other financial decisions. But their opacity makes it hard to tell if they're acting in a discriminatory manner. An AI that was fair yesterday might become biased today, and there's no way to predict or prevent that.

Systemic Risks and Regulatory Gaps: Deep learning in finance is like a growing storm, likely to increase systemic risks. Regulators might try to slow it down by increasing capital requirements or implementing "sniff tests" from more explainable models, but Gensler admits these measures are "insufficient to the task."

The Data Conundrum: AI's hunger for data is like an unquenchable thirst. Models built on the same datasets may act in lockstep, leading to crowding and herding. This convergence can create monopolies and "single points of failure" that threaten the entire network. Think of Lehman Brothers' failure, but on a data-driven scale.

Incomplete and Dangerous Data: Even the largest datasets are like incomplete puzzles, lacking enough historical depth to cover a full financial cycle. A model that has never seen a downturn can fail catastrophically when one arrives, as the 2008 financial crisis showed.
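A minimal sketch of that training-window gap: fit a trend to data from a bull market only, then ask it about a crash. The data, the straight-line "model," and the numbers are deliberately crude illustrations invented for this example, not figures from Gensler's paper.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept (pure stdlib)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Training window: a bull market only (prices drift steadily upward).
train_x = list(range(10))
train_y = [100 + 5 * x for x in train_x]      # +5 per period, no downturns
slope, intercept = fit_line(train_x, train_y)

# The model extrapolates continued growth...
predicted = intercept + slope * 12
# ...but the next regime is a crash the training data never contained.
actual = 80.0
print(f"predicted {predicted:.0f}, actual {actual:.0f}, "
      f"error {predicted - actual:.0f}")
```

The model isn't buggy; it faithfully learned the only regime it was shown. That is precisely why datasets that don't span a full cycle are dangerous.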

Global Risks: Developing economies might end up using AIs trained on foreign data, like trying to navigate a local market with a map of a different city. The risks here are even larger.

The Bottom Line: AI's unknowns are its most dangerous aspect. The intertwining of AI and finance is a complex dance, and as Gensler warns, one misstep could lead to a crisis.
