The Securities and Exchange Commission proposed a rule in July that would require advisers using predictive data analytics technology to eliminate or neutralize any conflict of interest arising from that use.
The proposal has come under withering criticism from industry leaders. The comment file, which closed on Tuesday after a 60-day period, featured letter after letter calling for a full withdrawal of the rule.
Criticisms of the proposal focus on two elements: its wide application to technologies capable of guiding, predicting or forecasting on behalf of a client, which would sweep in technology well outside the artificial intelligence space, and its requirement that advisers eliminate or neutralize related conflicts of interest.
According to some comment letters, the definition of covered technology would capture advisers' retirement readiness calculators and retirement income forecasts, since both are predictive in character.
The SEC’s proposal defined covered technology as “analytical, technological, or computational functions, algorithms, models, correlation matrices, or similar methods or processes that optimize for, predict, guide, forecast, or direct investment-related behaviors or outcomes of an investor.”
Traditionally, advisers have been held to a mitigation-and-disclosure model for managing conflicts.
‘As Long as You’re Making Money … There Is a Conflict’
Dan Gallagher, the chief legal, compliance and corporate affairs officer at Robinhood, called the proposal "the worst thing I have seen come out of the SEC in my entire career" on October 13 at a conference hosted by the Security Traders Association. Gallagher noted that the proposal defines a conflict as anything that places the adviser's interest ahead of the client's. According to Gallagher, this could include app notifications of price movements if the SEC construed them as an enticement to trade more.
Brett Redfearn, the founder and CEO of Panorama Financial Markets Advisory, said at the conference that "as long as you're making money, presumably there is a conflict." He added, "This is not limited to solicitations and recommendations. This is engaging, providing information, educational materials, any information."
Jillien Flores, the head of global government affairs at the Managed Funds Association, said at the conference on October 12 that the MFA is "calling for a withdrawal of the proposal" because it is "abandoning the disclosure and informed consent model," a departure she described as "unprecedented." Because of the wide definition of covered technology, she said, "virtually every part of an adviser's business is impacted here."
‘We Need to … Not Rely on 800 Pages of Legally-Guk’
On the subject of conflicts, Gallagher said the SEC’s new approach, one that does not permit disclosure, effectively says that “retail investors are too stupid to understand disclosure.”
SEC Chairman Gary Gensler has addressed the issue in multiple public settings. In a July speech hosted by the National Press Club in Washington, Gensler said artificial intelligence is uniquely ill-suited to disclosure and uniquely prone to generating conflicts because the data on which AI technology is trained is extremely complicated, opaque and fast-changing. The issue is not so much that retail investors could not understand the conflict, though that may be the case. Rather, Gensler has said, the SEC is concerned that it would be difficult to describe the conflict accurately for disclosure purposes, and that the necessary summarization would create opportunities for conflicts to stay hidden.
Professor James Angel of Georgetown University argued at the conference on October 13 that "the problem they are trying to solve is disclosure, and it isn't working," because current disclosure is often ignored by investors: "We need to communicate what needs to be communicated and not rely on 800 pages of legally-guk."