Gary Gensler, the Chairman of the U.S. Securities and Exchange Commission (SEC), has warned that artificial intelligence will be at the heart of future financial crises and that regulators will struggle to preempt its impact, IDEA News reports.
Coming from the head of one of the most important regulatory bodies in the United States, the message paints a concerning picture of financial markets in which AI takes center stage.
In a paper published in 2020, while he was teaching at MIT, Gensler examined these potential risks and argued that regulators would struggle to control them effectively.
Much of AI's significance in financial markets lies in opaque algorithmic trading strategies: if many of them simultaneously decide to sell the same stocks or assets, they can trigger a severe market crash.
Gensler's paper explains: "In fact, many individuals lack the necessary skills to construct and manage these models, and they all have a fairly similar background. Moreover, individuals who have been educated alongside each other have strong affinities, referred to as 'apprenticeship effects.'"
Regulation itself might also drive uniformity among AI models. If regulators constrain what AI is allowed to do, models risk becoming so similar that they act almost as a single entity, and firms become more likely to consume AI as a service from a handful of large providers.
Because the reasoning behind these models' buy and sell decisions is opaque even to the humans who deploy them, regulators cannot realistically prevent a crash in trading markets.
Gensler writes, “If deep learning predictions were easily explainable, there would be no rationale for their use.”
Algorithmic trading strategies are only a part of the potential risks posed by artificial intelligence.
Many AI tools are used in credit scoring, where potential discrimination is hard to detect. And because these tools evolve constantly and unpredictably, there is no guarantee that a tool that was unbiased yesterday will remain unbiased today.
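One way such drift can be caught is by re-running a simple fairness check after every model update. The sketch below, with entirely illustrative data and group labels, applies the "four-fifths rule" heuristic (flagging a group whose approval rate falls below 80% of another's) to a hypothetical credit model whose scores have drifted for one group:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical applicants split into two groups (labels are illustrative).
group = rng.integers(0, 2, size=2000)

# Stand-in for a drifted model: scores for group 1 have shifted upward.
score = rng.normal(size=2000) + 0.5 * group
approved = score > 0.5

# Disparate-impact ratio: lower group's approval rate over the higher's.
rate0 = approved[group == 0].mean()
rate1 = approved[group == 1].mean()
ratio = min(rate0, rate1) / max(rate0, rate1)
print(f"approval rates: {rate0:.2f} vs {rate1:.2f}, ratio: {ratio:.2f}")

# The "four-fifths rule" heuristic flags ratios below 0.8; running this
# check after each model update surfaces bias that drifted in silently.
print("flagged" if ratio < 0.8 else "ok")
```

The point of the sketch is not the specific threshold but the cadence: because the tools change continuously, the check has to be repeated continuously.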
Gensler states, “With the widespread use of deep learning in financial matters, we may witness the emergence of a series of regulatory voids. In our view, deep learning increases systemic risk.”
Perhaps the simplest and most impactful regulatory response would be to increase capital requirements for financial institutions that use AI tools.
Regulators might also require all AI-generated results to pass a traditional linear test, making the results more interpretable than the tools that produced them. They could also bar companies from actions that are fundamentally unjustifiable.
However, while regulators might slow the buildup of risk, completely preventing systemic risk is not possible.
The main issue lies with data
Gensler writes, “Models built on common datasets will likely have closely related predictions, which will gradually lead to crowding and group behavior.”
Gensler also believes that the hunger for data concentrates enormous volumes of it in a few places, creating monopolistic tendencies. Those concentrated points of failure then threaten the entire network.
Moreover, even the largest databases are inherently incomplete and risky. He writes, “Data from the internet, wearables, telematics, GPS, and smartphones, all together, lack a sufficient time horizon to cover even one complete financial cycle.”
This shortcoming has already had consequences, as the 2008 financial crisis showed, and the risk of crowding behavior remains.
Meanwhile, firms in developing countries may enter financial markets with tools that were not trained on local data, raising the risk even further.
In conclusion, AI tools don't know what they don't know, and that only intensifies the risk.