The author wishes to acknowledge the contributions of summer associate Henry Little.
The US Securities and Exchange Commission (SEC) indicated this summer that it plans to introduce proposals to regulate conflicts of interest associated with artificial intelligence (AI) later this year as part of its semiannual rule-writing agenda. The SEC is considering proposed rules related to the following:
- Broker-dealer conflicts in the use of predictive data analytics, AI, machine learning, and similar technologies in connection with certain investor interactions.
- Investment adviser conflicts in the use of predictive data analytics, AI, machine learning, and similar technologies in connection with certain investor interactions.
SEC’s Key Areas of Concern
1. Conflicts. Conflicts of interest posed by AI have long been a concern for SEC Chair Gary Gensler, who has repeatedly questioned whether brokers and investment advisers using AI can make recommendations that are in the best interests of their clients. When a broker or an adviser provides advice to a client, with or without the aid of AI, they must act in the client’s best interest and not place their own interests ahead of the client’s. Gensler’s concern is whether the algorithms themselves optimize for the investor’s interests or instead put the adviser’s interests first.
Notably, in late July, the SEC proposed new rules that would require broker-dealers and investment advisers (the latter registered or required to be registered under section 203 of the Investment Advisers Act of 1940) to take certain steps to address conflicts of interest associated with their use of predictive data analytics and similar technologies. Specifically, the proposal would require such firms to eliminate, or neutralize the effect of, conflicts of interest associated with the use of technologies that optimize for, predict, guide, forecast, or direct investment-related behaviors and that result in investor interactions placing the interest of the firm or its associated persons ahead of investors’ interests (whether intentionally or unintentionally). The proposal also mandates that any firm using these types of technologies adopt written policies and procedures reasonably designed to prevent violations of, and achieve compliance with, the proposed rules, including a requirement to test the technology to determine whether it could give rise to a conflict of interest.
2. Systemic Risk. Gensler has highlighted in many of his speeches that too much concentration among AI programs could pose a systemic risk to the financial system. In 2020, as a professor at MIT, Gensler wrote a working paper warning of the systemic risks posed by broad adoption of deep learning in finance. Notably, Gensler warned that regulation could inadvertently cause problems, writing that “challenges of explainability, fairness, and robustness may lead to regulatory gaps as well as how regulatory design may promote homogeneity in deep learning models…regulatory approaches to address these challenges inadvertently may lead to…model uniformity due to standardization of regulatory requirements.” Gensler stated that a future financial crisis could be sparked “because everything was relying on one base level, what’s called (the) generative AI level, and a bunch of fintech apps are built on top of it.” On this view, AI could pose a systemic risk in the near future if data aggregation and generative AI become concentrated in a handful of widely used platforms and one of those platforms makes an error.
3. Bias and Misinformation. The SEC and the Office of the Comptroller of the Currency (OCC) have acknowledged that the use of AI in the financial sector raises unique ethical issues. AI is dependent on its data inputs; program developers therefore need to prevent their models from incorporating data that reinforces historical inequities or reflects bias, which can affect fair access and pricing in the markets. Michael J. Hsu, Acting Comptroller of the Currency, also flagged that AI has the capacity to enable fraud and the spread of misinformation. Gensler likewise wrote that “the outcomes of its predictive algorithms may be based on data reflecting historical biases as well as latent features which may inadvertently be proxies for protected characteristics.”
4. Accountability. Given AI’s ability to teach itself, who should be held accountable when a model that has learned and moved beyond its initial programming makes an error? Providers should be able to answer this question as their AI initiatives expand. Gensler’s 2020 working paper noted that the outcomes of deep learning models are often unexplainable and that “human agency and traditional intervention approaches may be lost as a consequence of lack of model explainability and transparency.”
In attempting to address these concerns, the SEC’s Investor Advisory Committee (IAC) has recently urged the SEC to focus on the tenets of equity, consistent and persistent testing, and governance and oversight when developing additional AI guidance. In furtherance of these tenets, the IAC has recommended that the SEC hire additional employees with AI and machine learning expertise. The IAC also encouraged the SEC to draft and publish best practices regarding the use of AI and to expand guidance on the unique aspects of algorithm-based investment models, including enhanced monitoring and/or risk-based reviews of the use of AI.
Robo-Advisers and Algorithmic Trading
Looking ahead, existing SEC regulation of robo-advisers and algorithmic trading is likely to serve as guidance for future AI regulation. For example, the SEC has requested comment on whether index and model portfolio providers, as well as pricing services, should be considered investment advisers. Although the request for comment does not specifically mention AI, it implicitly raises AI-related questions to the extent that providers use AI to perform their services. While these types of providers may not need to register as investment advisers, and may be exempt under the publisher’s exclusion, the SEC could nonetheless mandate that the developers of such services register with the SEC.
The SEC has already addressed its treatment of automated advisers, often referred to as robo-advisers, and determined that they should be treated as traditional SEC-registered investment advisers under the Investment Advisers Act of 1940 (Advisers Act). Given this categorization, and the robo-adviser’s unique business model, which relies on algorithms and involves limited, if any, interaction with clients, the SEC’s Division of Investment Management and its Office of Compliance Inspections and Examinations outlined the following areas that robo-advisers should consider to ensure compliance with the Advisers Act:
- Substance and presentation of disclosures to clients about the robo-adviser and the investment advisory services it offers.
- Obligation to obtain information from clients to support the robo-adviser’s duty to provide suitable advice.
- Adoption and implementation of effective compliance programs reasonably designed to address particular concerns relevant to providing automated advice.
These considerations are especially relevant because, under the Advisers Act, robo-advisers owe a fiduciary duty to their clients. Accordingly, key aspects of this fiduciary duty are for the robo-adviser to fully disclose its offerings and to not deviate from the client’s stated investment objective.
Similarly, the SEC has regulated algorithmic trading. In doing so, the SEC required algorithmic trading developers to register as “Securities Traders” if they are associated with a FINRA member and are primarily responsible for the design, development, or significant modification of algorithmic trading strategies or are responsible for supervising or directing such activity. Under the rule, “algorithmic trading strategy” is defined as an automated system that generates or routes orders and order-related messages. Further, the rule is designed to increase market transparency and accountability for firms engaged in electronic trading.
Use of AI by the SEC
Like many other organizations, the SEC is adopting these new technologies to analyze complex data. Scott W. Bauguess, the Deputy Director and Deputy Chief Economist for the Division of Economic and Risk Analysis (DERA), stated, “we have begun a host of new initiatives that leverage the machine learning approach to behavioral predictions, particularly in the area of market risk assessment, which includes the identification of potential fraud and misconduct.” For example, the Corporate Issuer Risk Assessment (CIRA) program was developed to detect anomalous patterns in financial reporting.
While new technology may flag a filing as high risk, the classification alone does not serve as a clear indicator of potential wrongdoing. Rather, as Bauguess explained, the data collected via machine learning methods helps the SEC prioritize examinations so that they can direct resources to areas of the market that are the most susceptible to potential violative conduct. Thus, as of now, machine learning methods do not generally point to a particular action or conduct indicative of fraud or other violations. The SEC staff must still look for the human element of fraud and scienter.
The next article in this series will discuss FINRA’s view of AI technology.
 Securities and Exchange Commission, Prohibition of Conflicted Practices for Broker-Dealers That Use Certain Covered Technologies.
 Securities and Exchange Commission, Prohibition of Conflicted Practices for Investment Advisers That Use Certain Covered Technologies.
 Gary Gensler, Investor Protection in a Digital Age, Remarks Before the 2022 NASAA Spring Meeting & Public Policy Symposium (May 17, 2022).
 Securities and Exchange Commission, SEC Proposes New Requirements to Address Risks to Investors From Conflicts of Interest Associated With the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (July 26, 2023).
 Securities and Exchange Commission, Fact Sheet – Conflicts of Interest and Predictive Data Analytics; Securities and Exchange Commission, Proposed Rule – Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (July 26, 2023).
 Fact Sheet, supra note 5.
 Gensler, supra note 3; Acting Comptroller of the Currency Michael J. Hsu, Remarks to the American Bankers Association (ABA) Risk and Compliance Conference, “Tokenization and AI in Banking: How Risk and Compliance Can Facilitate Responsible Innovation” 9 (Jun. 16, 2023).
 Gensler, supra note 3.
 Hsu, supra note 12, at 11.
 Gensler, supra note 8.
 Hsu, supra note 12, at 9-10.
 Gensler, supra note 8.
 Christopher Mirabile & Leslie Van Buskirk, Establishment of an Ethical Artificial Intelligence Framework for Investment Advisors, U.S. Securities and Exchange Commission Investor Advisory Committee 2 (Apr. 6, 2023).
 Id. at 3.
 Securities and Exchange Commission, Request for Comment on Certain Information Providers Acting as Investment Advisers 3 (Aug. 16, 2022).
 U.S. Securities and Exchange Commission, Information for Newly-Registered Investment Advisers (Nov. 23, 2010).
 Bloomberg Professional Services, SEC Approves FINRA rule requiring registration of algorithmic trading developers (Apr. 20, 2016).
 U.S. Securities and Exchange Commission, Order Approving a Proposed Rule Change to Require Registration as Securities Traders of Associated Persons Primarily Responsible for the Design, Development, Significant Modification of Algorithmic Trading Strategies or Responsible for the Day-to-Day Supervision of Such Activities (Apr. 7, 2016).
 Bloomberg Professional Services, supra note 30.
 Bauguess, supra note 33.