The U.S. Securities and Exchange Commission (SEC) announced on September 11, 2023, settlement agreements with nine registered investment advisers. All were charged with advertising hypothetical performance on their websites without adopting and implementing policies and procedures required by Rule 206(4)-1 (the Marketing Rule) under the Investment Advisers Act of 1940 (Advisers Act).
SEC Imposes New Burdens on Registered and Exempt Private Fund Advisers
On August 23, 2023, the U.S. Securities and Exchange Commission (SEC) voted 3-2 along party lines to adopt new rules under the Investment Advisers Act of 1940 (the Advisers Act) for investment advisers to private funds. While they don’t go as far as the SEC’s original proposal, the rules impose significant new obligations and restrictions on private fund advisers. Some of these apply to all private fund advisers, including venture capital fund advisers (VC Fund Advisers) and other exempt reporting advisers (ERAs), while others apply only to SEC-registered private fund advisers (RIAs).
Crucially, the new rulemaking affords “legacy” status to certain existing arrangements that would otherwise be prohibited. Where noted below, a provision does not apply to “legacy” contractual agreements that (a) “govern the fund,” such as the fund’s operating and organizational agreements, subscription agreements, and side letters, or (b) “govern the borrowing, loan, or extension of credit entered into by the fund,” such as promissory notes, credit agreements, and provisions in the fund’s governing agreements.
New Rules for All Private Fund Advisers
The new rules that apply to all private fund advisers, including VC Fund Advisers and other ERAs, address certain conflicts of interest “that have the potential to lead to investor harm,” and prohibit activities that the SEC has characterized as “contrary to the public interest and the protection of investors.”
The compliance deadline for these new rules is 12 months from publication in the Federal Register for advisers with $1.5 billion or more in private fund assets and 18 months for those with less than $1.5 billion.
- Restricted Activities Rule. All private fund advisers, including VC Fund Advisers and other ERAs, are restricted from engaging in the activities described below unless disclosure and/or consent requirements are met, or “legacy” treatment is available.
- Adviser Investigation Costs. Advisers may not cause private fund clients to pay fees and expenses associated with an investigation of the adviser, unless the adviser obtains written consent. Legacy status is available to existing funds, except that the costs of an investigation that results in sanctions for Advisers Act violations may not be charged to a private fund under any circumstances.
- Adviser Compliance Costs. Advisers may not cause a private fund client to pay for regulatory, examination, or compliance fees or expenses of the adviser, unless such fees and expenses are disclosed to investors within 45 days after the quarter end in which such charges occur.
- Clawback Reductions. Advisers may not reduce the amount of their carried interest clawback by the amount of their taxes (actual, potential, or hypothetical), unless the pre-tax and post-tax amounts are disclosed to investors within 45 days after the quarter end in which such clawback occurs.
- Non-Pro Rata Allocation of Portfolio Costs. Advisers may not charge or allocate fees or expenses related to an existing or potential portfolio investment on a non-pro rata basis “when multiple private funds and other clients advised by the adviser or its related persons have invested (or propose to invest) in the same portfolio investment,” unless (a) the non-pro rata allocation is fair and equitable and (b) the adviser provides advance written notice of the non-pro rata charge and a description of how it is fair and equitable.
- Borrowing from Private Funds. Advisers may not borrow money or securities from a private fund client, unless the adviser provides sufficient disclosure and obtains written consent. Legacy status is available to existing funds.
- Preferential Treatment Rule. All private fund advisers are prohibited from providing investors with preferential treatment regarding redemptions and information if such treatment “would have a material, negative effect on other investors.”
- Redemptions. Advisers cannot provide preferential rights of redemption to private fund investors unless required by applicable law or the preferential treatment is offered to all investors “without qualification.” Legacy status is available to existing funds.
- Portfolio Information. Advisers cannot provide preferential information about private fund portfolio holdings or exposures, unless the information is offered to all current and potential investors. Legacy status is available to existing funds.
- Transparency of Fund Terms. Advisers cannot provide preferential terms to any private fund investor, unless material economic terms are disclosed prior to an investor’s investment in the fund and other terms are disclosed after their investment. This includes an annual report to investors on preferential terms.
New Rules for RIAs
The new rules for SEC-registered private fund advisers are intended to “facilitate the provision of simple and clear disclosures to investors” about fundamentals of their investments and related conflicts of interest.
- Quarterly Statement Rule. RIAs are required to distribute quarterly statements to investors detailing fund performance, fees, and expenses, with prominent disclosures about how expenses, payments, allocations, rebates, waivers, and offsets are calculated and cross-references to the relevant sections of the fund’s offering documents. The quarterly statements must generally be delivered to investors within 45 days after the end of each of the first three fiscal quarters and within 90 days after the end of each fiscal year. Compliance deadline is 18 months.
- Private Fund Audit Rule. RIAs are required to annually obtain and deliver private fund audits that are prepared by an independent public accountant and meet the custody rule requirements of the Advisers Act. The audit must be delivered within 120 days after the end of each fiscal year and promptly upon liquidation. Compliance deadline is 18 months.
- Adviser-Led Secondaries Rule. RIAs are required, when offering existing investors “the option between selling their interests in the private fund and converting or exchanging their interests in the private fund for interests in another vehicle advised by the adviser or any of its related persons,” to obtain and distribute to investors (a) a fairness or valuation opinion and (b) a summary of any material business relationships between the adviser and the opinion provider that occurred within the previous two years. Compliance deadline is 12 months for larger advisers (≥ $1.5 billion in private fund assets) and 18 months for smaller advisers (< $1.5 billion in private fund assets).
Conclusion
The adoption of these rules is just one development within the long list of pending SEC rule proposals affecting the asset management industry. Indeed, along with the rulemaking summarized in this post, the SEC adopted rule amendments requiring written documentation of advisers’ annual compliance program review, with a short 60-day compliance deadline.
We are carefully analyzing the new rules and expect to provide an in-depth update with suggested action steps for VC Fund Advisers and other private fund advisers in the coming weeks. Please don’t hesitate to reach out to us as you work to digest the implications of the SEC’s new rules.
Artificial Intelligence: SEC Proposals and Concerns
The author wishes to acknowledge the contributions of summer associate Henry Little.
Proposed Rules
The U.S. Securities and Exchange Commission (SEC) indicated this summer that it plans to introduce proposals to regulate conflicts of interest associated with artificial intelligence (AI) later this year as part of its semiannual rule-writing agenda. The SEC is considering proposed rules related to the following:
- Broker-dealer conflicts in the use of predictive data analytics, AI, machine learning, and similar technologies in connection with certain investor interactions.[1]
- Investment adviser conflicts in the use of predictive data analytics, AI, machine learning, and similar technologies in connection with certain investor interactions.[2]
SEC’s Key Areas of Concern
1. Conflicts. Conflicts of interest posed by AI have long been a concern for SEC Chair Gary Gensler, who has repeatedly questioned whether brokers and financial advisers using AI can make recommendations that are in the best interests of their clients. When a broker or an adviser provides advice to a client, with or without the aid of AI, it must act in the client’s best interest and must not place its own interests ahead of the client’s. Gensler has questioned whether algorithms optimize for the investor’s interests and place those interests ahead of the adviser’s.[3]
Notably, in late July, the SEC proposed new rules that would require broker-dealers and investment advisers that are registered, or required to be registered, with the SEC to take certain steps to address conflicts of interest associated with their use of predictive data analytics and similar technologies.[4] Specifically, the proposal would require such firms to eliminate or neutralize the effect of conflicts of interest associated with the firm’s use of technologies that optimize for, predict, guide, forecast, or direct investment-related behaviors in investor interactions in a way that places the interest of the firm or its associated persons ahead of investors’ interests (whether intentionally or unintentionally).[5] The proposal would also require any firm using these types of technologies to adopt written policies and procedures reasonably designed to prevent violations of and achieve compliance with the proposed rules, and would require the firm to test the technology to determine whether it could give rise to a conflict of interest.[6]
2. Systemic Risk. Gensler has highlighted in many of his speeches that too much concentration among AI programs could pose a potential systemic risk to the financial system.[7] In 2020, as a professor at MIT, Gensler wrote a working paper warning of the systemic risks posed by broad adoption of deep learning in finance.[8] Notably, Gensler warned that regulation could inadvertently cause problems and wrote, “challenges of explainability, fairness, and robustness may lead to regulatory gaps as well as how regulatory design may promote homogeneity in deep learning models…regulatory approaches to address these challenges inadvertently may lead to…model uniformity due to standardization of regulatory requirements.”[9] Gensler stated that a future financial crisis could be sparked “because everything was relying on one base level, what’s called (the) generative AI level, and a bunch of fintech apps are built on top of it.”[10] AI technology could pose a systemic risk in the very near future if AI data aggregators and generative AI models become concentrated and a widely used platform makes an error.[11]
3. Bias and Misinformation. The SEC and the Office of the Comptroller of the Currency (OCC) have acknowledged that the use of AI in the financial sector raises unique ethical issues.[12] AI is dependent on data input; program developers therefore need to prevent their programs from incorporating data that reinforces historical inequities and reflects bias, which can affect fair access and prices in the markets.[13] Michael J. Hsu, Acting Comptroller of the Currency, also flagged that AI has the capacity to enable fraud and the spread of misinformation.[14] Gensler also wrote that “the outcomes of its predictive algorithms may be based on data reflecting historical biases as well as latent features which may inadvertently be proxies for protected characteristics.”[15]
4. Accountability. Given AI’s ability to teach itself, as it learns and moves further from its initial programming, who should be held accountable if it makes an error?[16] Providers should be capable of answering these questions as their AI initiatives expand.[17] Gensler’s 2020 working paper noted that the outcomes of deep learning models are often unexplainable and that “human agency and traditional intervention approaches may be lost as a consequence of lack of model explainability and transparency.”[18]
IAC Recommendations
In attempting to address these concerns, the SEC’s Investor Advisory Committee (IAC) has recently urged the SEC to focus on the tenets of equity, consistent and persistent testing, and governance and oversight when developing additional AI guidance.[19] In furtherance of these tenets, the IAC has recommended that the SEC hire additional employees with AI and machine learning expertise.[20] The IAC also encouraged the SEC to draft and publish best practices regarding the use of AI and to expand guidance on the unique aspects of algorithm-based investment models, including enhanced monitoring and/or risk-based reviews of the use of AI.[21]
Robo-Advisers and Algorithmic Trading
Looking ahead, existing SEC regulation of robo-advisers and algorithmic trading will likely serve as guidance for future AI regulation. For example, the SEC has requested comment as to whether index and model portfolio providers, as well as pricing services, should be considered investment advisers.[22] Although the request for comment does not specifically mention AI, it implicitly raises AI-related questions to the extent that providers use AI to perform their services. While these providers may not need to register as investment advisers and may be able to rely on the publisher’s exclusion, the SEC could nonetheless mandate that the developers of these types of services register with the SEC.[23]
The SEC has already addressed its treatment of automated advisers, often referred to as robo-advisers, and determined that they should be treated as traditional SEC-registered investment advisers, as defined by the Investment Advisers Act of 1940 (Advisers Act). Given this categorization, and the robo-adviser’s unique business model, which relies on algorithms and involves limited, if any, interaction with clients, the SEC’s Division of Investment Management and its Office of Compliance Inspections and Examinations outlined the following guidance that robo-advisers should consider to ensure compliance with the Advisers Act.[24]
- Substance and presentation of disclosures to clients about the robo-adviser and the investment advisory services it offers.[25]
- Obligation to obtain information from clients to support the robo-adviser’s duty to provide suitable advice.[26]
- Adoption and implementation of effective compliance programs reasonably designed to address particular concerns relevant to providing automated advice.[27]
These considerations are especially relevant because, under the Advisers Act, robo-advisers owe a fiduciary duty to their clients.[28] Accordingly, key aspects of this fiduciary duty include fully disclosing the robo-adviser’s offerings and not deviating from the client’s stated investment objective.[29]
Similarly, the SEC has addressed algorithmic trading. It approved a FINRA rule requiring algorithmic trading developers to register as “Securities Traders” if they are associated with a FINRA member and are primarily responsible for the design, development, or significant modification of algorithmic trading strategies or are responsible for supervising or directing such activity.[30] Under the rule, an “algorithmic trading strategy” is defined as an automated system that generates or routes orders and order-related messages.[31] Further, the rule is designed to increase market transparency and accountability for firms engaged in electronic trading.[32]
Use of AI by the SEC
Like many other organizations, the SEC is adopting AI and machine learning technologies to analyze complex data.[33] Scott W. Bauguess, the Deputy Director and Deputy Chief Economist for the Division of Economic and Risk Analysis (DERA), stated, “we have begun a host of new initiatives that leverage the machine learning approach to behavioral predictions, particularly in the area of market risk assessment, which includes the identification of potential fraud and misconduct.”[34] For example, the Corporate Issuer Risk Assessment (CIRA) program was developed to detect anomalous patterns in financial reporting.[35]
While new technology may flag a filing as high risk, the classification alone does not serve as a clear indicator of potential wrongdoing. Rather, as Bauguess explained, the data collected via machine learning methods helps the SEC prioritize examinations so that they can direct resources to areas of the market that are the most susceptible to potential violative conduct.[36] Thus, as of now, machine learning methods do not generally point to a particular action or conduct indicative of fraud or other violations. The SEC staff must still look for the human element of fraud and scienter.[37]
The next article in this series will discuss FINRA’s view of AI technology.
[1] Securities and Exchange Commission, Prohibition of Conflicted Practices for Broker-Dealers That Use Certain Covered Technologies.
[2] Securities and Exchange Commission, Prohibition of Conflicted Practices for Investment Advisers That Use Certain Covered Technologies.
[3] Gary Gensler, Investor Protection in a Digital Age, Remarks Before the 2022 NASAA Spring Meeting & Public Policy Symposium (May 17, 2022).
[4] Securities and Exchange Commission, SEC Proposes New Requirements to Address Risks to Investors From Conflicts of Interest Associated With the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (July 26, 2023).
[5] Securities and Exchange Commission, Fact Sheet – Conflicts of Interest and Predictive Data Analytics; Securities and Exchange Commission, Proposed Rule – Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (July 26, 2023).
[6] Fact Sheet, supra note 5.
[7] Id.
[8] Gary Gensler and Lily Bailey, Deep Learning and Financial Stability, Working Paper (November 1, 2020).
[9] Id.
[10] John Divine, How AI Could Spark the Next Financial Crisis, U.S. News (June 30, 2023).
[11] Id.
[12] Gensler, supra note 3; Acting Comptroller of the Currency Michael J. Hsu, Remarks to the American Bankers Association (ABA) Risk and Compliance Conference, “Tokenization and AI in Banking: How Risk and Compliance Can Facilitate Responsible Innovation” 9 (Jun. 16, 2023).
[13] Gensler, supra note 3.
[14] Hsu, supra note 12 at 11.
[15] Gensler, supra note 8.
[16] Hsu, supra note 12 at 9-10.
[17] Id.
[18] Gensler, supra note 8.
[19] Christopher Mirabile & Leslie Van Buskirk, Establishment of an Ethical Artificial Intelligence Framework for Investment Advisors, U.S. Securities and Exchange Commission Investor Advisory Committee 2 (Apr. 6, 2023).
[20] Id. at 3.
[21] Id.
[22] Securities and Exchange Commission, Request for Comment on Certain Information Providers Acting as Investment Advisers 3 (Aug. 16, 2022).
[23] Investment Advisers Act of 1940, 76th Cong. § 2(a)(11).
[24] U.S. Securities and Exchange Commission Division of Investment Management, Guidance Update 2 (Feb. 2017).
[25] Id.
[26] Id.
[27] Id.
[28] Id.
[29] U.S. Securities and Exchange Commission, Information for Newly-Registered Investment Advisers (Nov. 23, 2010).
[30] Bloomberg Professional Services, SEC Approves FINRA rule requiring registration of algorithmic trading developers (Apr. 20, 2016).
[31] U.S. Securities and Exchange Commission, Order Approving a Proposed Rule Change to Require Registration as Securities Traders of Associated Persons Primarily Responsible for the Design, Development, Significant Modification of Algorithmic Trading Strategies or Responsible for the Day-to-Day Supervision of Such Activities (Apr. 7, 2016).
[32] Bloomberg Professional Services, supra note 30.
[33] Scott W. Bauguess, Has Big Data Made Us Lazy?
[34] Id.
[35] Mark J. Flannery, Insights into the SEC’s Risk Assessment Programs.
[36] Bauguess, supra note 33.
[37] Id.
Artificial Intelligence: An Introduction and General Regulatory Landscape
Introduction to AI and Definitions
Artificial intelligence (AI), a term coined in the 1950s, is a field of technology that engages in problem-solving by using algorithms to make predictions or classifications based on input data.[1] With the recent emergence of generative AI technology, the regulation of AI has become a priority for various governmental agencies due to its expansive capabilities and potential uses. As AI technology continues to develop and be used across industries, investment managers, developers, and regulators alike must consider the implications and risks associated with using this technology.
This series of articles will discuss the potential regulation of AI technology in the financial services industry. This article will begin the series with a discussion of the general regulatory landscape of AI in the United States.
While many definitions of AI exist, a popular one is the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings.[2] Different types of AI exist, including weak AI, strong AI, deep learning, and machine learning (ML).[3] Weak AI, also referred to as narrow AI, is the most common AI in use today and is trained to perform specific tasks; it powers popular speech-recognition virtual assistants and autonomous cars.[4] Strong AI, often benchmarked against the Turing test, describes a machine whose intelligence equals or exceeds that of a human.[5]
ML is a subfield of AI and involves training algorithms on large datasets to identify patterns and relationships and then using these patterns to make predictions or decisions about new data.[6] Deep learning is a subfield of ML and is composed of neural networks with multiple layers that are used to analyze complex patterns and relationships in data, thereby enabling the use of larger data sets.[7]
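To make the ML pattern described above concrete (fit a model to historical data, then apply the learned relationship to new data), the short sketch below uses the scikit-learn library and one of its sample datasets. It is a generic illustration, not an example drawn from any regulatory guidance, and the dataset and model choice are illustrative assumptions only.

```python
# Minimal illustration of the ML pattern described above: train on historical
# data, then use the learned relationship to predict outcomes for new data.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)                # historical inputs and outcomes
X_train, X_new, y_train, y_new = train_test_split(X, y, random_state=0)

model = LinearRegression()
model.fit(X_train, y_train)                          # "training": identify patterns in the data
predictions = model.predict(X_new)                   # apply those patterns to new data
print(predictions[:5])
```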
Generative AI refers to deep-learning models that can learn to generate new outputs based on the data they have been trained on.[8] Generative models can create new content in the form of images, text, and audio.[9] Deep-learning AI models will soon outpace or replace narrow AI models as they can work across broad domains and solve an array of problems.[10]
With more industries exploring or beginning to incorporate AI into their operations and the developing capabilities of AI models, government bodies and corporations alike have voiced their concerns about the potential risks and ethical implications of AI.
Innovation
The U.S. government has yet to pass any comprehensive legislation regulating the development and use of AI in enterprise. The limited legislation passed and regulatory guidelines issued by government agencies, particularly throughout 2020, suggested that the federal government was committed to encouraging the innovation of AI technology. In 2020, a memo issued by the Office of Management and Budget encouraged federal agencies to avoid passing regulations or taking nonregulatory action that would interfere with AI innovation or growth.[11] The memo reasoned that the innovation and growth of AI is important because it “holds the promise to improve efficiency, effectiveness, safety, fairness, welfare, transparency, and other economic and social goals, and America’s continued status as a global leader in AI development is important to preserving our economic and national security.”[12]
The legislation passed to date by Congress concerning AI has encouraged the use and development of AI technology. For example, the AI in Government Act of 2020 “establishes the AI Center of Excellence to facilitate and coordinate federal government adoption of AI technologies.”[13] The National AI Initiative Act of 2020 “sets forth policies regarding certain federal activities related to artificial intelligence, including implementation by the president of a National Artificial Intelligence Initiative to support research and development, education, and training programs.”[14] Finally, the Advancing American AI Act seeks to “encourage agency artificial intelligence-related programs and initiatives that enhance the competitiveness of the United States in innovation and entrepreneurialism” and “enhance the ability of the Federal Government to translate research advances into artificial intelligence applications to modernize systems and assist agency leaders in fulfilling their missions.”[15]
The National Artificial Intelligence Initiative Office (NAIIO), established in the National AI Initiative Act of 2020, has released ways in which federal agencies are incorporating the use of AI into their operations.[16] With respect to financial services, the U.S. Department of the Treasury and the Securities and Exchange Commission (SEC) are adopting AI technology and machine learning.[17] The SEC, in particular, is implementing machine learning algorithms to monitor and detect potential investment market misconduct.[18]
As AI technology has developed and been implemented across industries, it is evident there are risks associated with its use, and there has been a shift this year towards understanding how best to regulate AI while still promoting innovation. Although very little legislation regulating AI technology has passed, it is likely that more comprehensive legislation will be passed in the future as members of Congress continue to educate themselves on the potential uses and risks of AI technology. Senator Chuck Schumer recently announced his SAFE Innovation Framework for AI Policy, a two-step plan to regulate AI technology.[19] The first step in the plan establishes a framework for action that encourages innovation while calling for security, accountability, protecting democratic foundations, and explainability.[20] The second step in the plan outlines Congress’ approach to creating legislation. In the fall of 2023, members of Congress will host a series of AI insight forums, hearing from AI developers, executives, scientists, advocates, and others to further their understanding of AI technology and lay the groundwork for creating AI policy.[21] The House Energy and Commerce Committee approved the AI Accountability Act in mid-July and will send the bill for a possible vote on the House floor in the fall. The AI Accountability Act would direct the Assistant Secretary of Commerce for Communications and Information to conduct a study and hold public meetings with respect to artificial intelligence systems.[22] In the Senate, there is a proposal for the AI LEAD Act, which would establish the Chief Artificial Intelligence Officers Council, Chief Artificial Intelligence Officers, and Artificial Intelligence Governance Boards.[23]
Risks and Risk Management
The primary risks associated with AI that federal agencies have identified are risks to safety, democratic values, and privacy.[24] AI algorithms can be biased and can discriminate against a protected class. For example, many industries use AI in the hiring process, and evidence shows that the input data used in these algorithms can lead to biased hiring outcomes.[25] Biased AI algorithms could also have significant negative implications for companies seeking to increase diversity in their hiring practices or comply with specific ESG standards.
In addition to the risk of AI technology infringing on democratic values, the use of AI systems presents risks to users’ privacy. Given the importance of data to an AI algorithm’s predictions, AI models may engage in abusive data practices, such as using and storing data without the user’s consent.[26] AI algorithms could prove to be a particular risk to investors and fund managers because many corporations use AI models developed by third parties; confidential data could therefore be compromised and exposed to third parties when input into an AI algorithm.
While the federal government has yet to take legislative action against the potential risks associated with the use of AI technology, federal agencies have issued guidelines on how to mitigate potential risks. As AI continues to be developed and used in ways that it hasn’t been before, the risk of using AI for harm, whether intended or not, increases.
Biden Administration
The Biden administration has published an AI Bill of Rights to offer guidance on how AI should be developed and used to be safe and effective without infringing on civil rights.[27] The AI Bill of Rights suggests that AI systems should be developed with consultation from diverse communities and consider the potential impact on diverse users and communities in general.[28] To ensure risks are minimized, the AI Bill of Rights recommends that AI systems undergo pre-deployment testing, risk identification and mitigation, ongoing monitoring, mitigation of unsafe outcomes, including those beyond the intended use, and adherence to domain-specific standards.[29]
The pre-deployment testing suggested in the AI Bill of Rights recommends following domain-specific best practices and mirroring the conditions in which the technology will be employed to ensure the technology will work in its real-world context.[30] The testing should account for both the AI technology and the role of human operators who may affect system outcomes.[31] The AI Bill of Rights suggests that developers of AI systems identify potential risks of the system, both before and during the system’s deployment. Mitigation of the potential risks should be proportionate to the level of risk identified, and systems should not be used until risk can be mitigated.[32] AI systems should be designed to proactively protect users from harm stemming from both intended and unintended, yet foreseeable uses or impacts of the AI system.[33]
To protect users against abusive data practices, the AI Bill of Rights suggests that AI developers create built-in protections and give users agency over how their data is used.[34] AI developers should seek users’ permission and should limit the data collected to only that which is strictly necessary for the specific context in which it is collected.[35] For AI system users to have agency over how their data is used, they should receive notice and an explanation that an AI system is being used. The AI Bill of Rights encourages developers to provide plain language documentation of the AI system being used and a clear description of its role.[36] The explanations should be technically valid, meaningful, and useful to the user and others who may need to understand the system.[37]
Finally, the AI Bill of Rights suggests that AI systems should have human alternatives to fall back on if the system experiences unintended problems.[38] Human alternatives should give users the ability to opt out of the AI system, where appropriate, and give them access to a human operator who can assist with and remedy any issues, such as a system error or failure.[39] The appropriateness of opting out is determined by reasonable expectations within the context and focuses on ensuring broad accessibility and protecting the public from especially harmful impacts.[40]
The Biden administration is also working with several AI companies and, this July, received voluntary commitments from seven companies to move toward safe, secure, and transparent development of AI technology. The voluntary commitments include testing the safety and capabilities of the AI systems, ensuring that the technology does not promote bias and discrimination, strengthening privacy protections, and shielding children from harm.[41]
While the federal government has been relatively hands off to date in regulating the development and use of AI technology, the use of AI can pose certain risks and may have regulatory implications on the financial services industry.
The next article in this series will discuss the SEC’s proposed regulation of AI technology.
The author wishes to acknowledge the contributions of summer associate Stephanie Flynn.
[1] What is Artificial Intelligence (AI)?, IBM (2023).
[2] Chethan Kumar, Artificial Intelligence: Definition, Types, Examples, Technologies, Medium (Aug. 31, 2018).
[3] IBM, supra note 1.
[4] Id.
[5] Id.
[6] Id.
[7] Scott W. Bauguess, Acting Director and Acting Chief Economist, DERA, The Role of Big Data, Machine Learning, and AI in Assessing Risks: A Regulatory Perspective (June 21, 2017).
[8] Id.
[9] Id.
[10] Id.
[11] Off. of Mgmt. & Budget, Exec. Off. of the President, M-21-06, Guidance for Regulation of Artificial Intelligence Applications (Nov. 17, 2020).
[12] Id.
[13] AI in Government Act, H.R. 2575, 116th Cong. (2020) (incorporated in H.R. Con. Res. 133, 116th Cong. (2020) (enacted)).
[14] National AI Initiative Act of 2020, H.R. 6216, 116th Cong. (2020).
[15] Advancing AI Research Act of 2020, S. 3891, 116th Cong. (2020).
[16] Nat’l AI Initiative Off., Applications.
[17] Id.
[18] Id. See Bauguess, supra note 7.
[19] Schumer, supra note 11.
[20] Id.
[21] Id.
[22] https://www.congress.gov/bill/118th-congress/house-bill/3369/text?s=1&r=38
[23] https://www.congress.gov/bill/118th-congress/senate-bill/2293/text?s=1&r=2
[24] Off. of Sci. and Tech. Pol’y, Exec. Off. of the President, Blueprint for an AI Bill of Rights: Making Automated Systems Work for the American People.
[25] Id.
[26] Id.
[27] Id.
[28] Id.
[29] Id.
[30] From Principles to Practice: A Technical Companion to the Blueprint for an AI Bill of Rights, The White House (Oct. 2022).
[31] Id.
[32] Id.
[33] Id.
[34] Off. of Sci. and Tech. Pol’y, supra note 24.
[35] Id.
[36] Id.
[37] Id.
[38] Id.
[39] Id.
[40] Id.
[41] https://www.whitehouse.gov/wp-content/uploads/2023/07/Ensuring-Safe-Secure-and-Trustworthy-AI.pdf
FINRA Approves the First Special Purpose Broker Dealer to Custody Digital Asset Securities
The wait is over: special purpose broker-dealers (SPBDs) may now custody digital asset securities. By way of background, on July 8, 2019, SEC and FINRA staff issued a joint statement addressing how registered broker-dealers could facilitate transactions in digital asset securities without taking custody of the assets. The solution involved bilateral clearance and settlement of the transactions.
A year later, the staff of the SEC’s Division of Trading and Markets issued a no-action letter to FINRA articulating the staff’s position on how alternative trading systems (ATSs) could facilitate trading in digital asset securities using a three-step process. However, per its terms, the no-action letter requires that the ATSs not take custody of the digital asset securities.
Finders, Registration Is Required!
Between November 2017 and November 2021, three individuals actively solicited investments in securities, including by providing marketing materials and advising on the merits of the investment, and received commissions for their sales. In May 2022, the U.S. Securities and Exchange Commission (SEC) halted the activities of the individual defendants involved for operating a vast network of sales agents in connection with a $410 million fraud. Because such unauthorized activity is hard to contain, the SEC has now, in 2023, followed up on its previous action by charging the three sales agents of the unregistered broker-dealer with fraud and unregistered broker activity.
As highlighted by the SEC’s recent complaint, sales agents must obtain licenses and broker-dealer registrations, pursuant to Section 15(a) of the Exchange Act, to engage in the business of facilitating securities transactions. While broker-dealer registration requirements are based on a facts and circumstances analysis, when an individual is advising on the merits, passing along marketing materials, and receiving compensation tied to those activities, there is likely to be a presumption that registration is required.
Broker Registration Requirements
The Exchange Act requires securities brokers to register with the SEC or, if they are individuals, to be associated with a brokerage firm registered with the SEC. The types of activities that may require broker-dealer registration based on the full set of facts and circumstances include:
- Participating in important parts of a securities transaction, including solicitation, negotiation, or execution of the transaction.
- Receiving compensation for participation in the transaction dependent upon, or related to, the outcome or size of the transaction or deal.
- Handling the securities or funds of others in connection with securities transactions.
Risks Relating to Unregistered Activities
While the recent allegations are a particularly egregious example that include fraudulent statements hiding transaction-based compensation, the SEC’s enforcement action serves as a reminder that flouting broker-dealer registration requirements can result in expensive enforcement actions for both individuals and firms. Regulators and courts will look to the economic reality of the transaction and seek to determine whether there is direct or indirect compensation based on hallmarks of broker-dealer conduct.
FINRA Emphasizes Reg BI Standards for Complex Product Recommendations
In the blitz of regulatory and financial developments that have made headlines throughout the first quarter of 2023, a recent FINRA enforcement action serves as a reminder to both broker-dealers and their representatives that Regulation Best Interest (Reg BI) remains an area of focus for FINRA. This action underscores how important it is for broker-dealers to ensure that their representatives fully understand the risks and other features of complex financial products, make only suitable recommendations, and otherwise comply with Reg BI.
The Relevant Regulatory Standards
Reg BI (Customer Best Interests)
Reg BI requires broker-dealers and their representatives to act in the best interest of retail customers when making a recommendation regarding any securities transaction or investment strategy. The care obligation, set forth at Section (a)(2)(ii) of the rule, requires broker-dealers and their representatives to exercise reasonable diligence, care, and skill to, among other things, understand the potential risks, rewards, and costs associated with a recommendation, and have a reasonable basis to believe that the recommendation could be in the best interest of at least some retail customers. As noted in the Letter of Acceptance, Waiver, and Consent (AWC), the SEC’s adopting release for Reg BI states that whether reasonable diligence, care and skill exist
[depends] on, among other things, the complexity of, and risks associated with the recommended security . . . and the broker-dealer’s familiarity with the recommended security ….”
FINRA Rule 2111 (Suitability)
FINRA Rule 2111 requires broker-dealers and their representatives to have a reasonable basis for believing that a recommended transaction or investment strategy is suitable for the customer based on the customer’s investment profile.
FINRA Rule 2010 (Standards of Commercial Honor and Principles of Trade)
FINRA Rule 2010 requires broker-dealers to observe high standards of commercial honor and just and equitable principles of trade.
Recommendations without a Reasonable Basis
In the recent action, the representative settled the AWC with FINRA and was fined for recommending unsuitable leveraged and inverse ETFs to retail customers without having a sufficient understanding of the product risks and features. As stated in the AWC, the representative should have been particularly mindful in light of FINRA’s previous Regulatory Notice 09-31, which notified and alerted members that complex ETFs, such as those that offer leverage or are designed to perform inversely to an index, “typically are not suitable for retail investors who plan to hold them for more than one trading session, particularly in volatile markets.”
Considering the facts at issue in the AWC, FINRA determined that the representative did not have a reasonable basis for making the recommendations because, among other things, he
did not understand that losses in leveraged and inverse exchange-traded funds can be compounded because of the daily reset function.”
FINRA cited the representative for violating Reg BI’s care obligation and FINRA Rules 2111 and 2010. The representative was fined $2,500, required to make restitution to the customers who lost money based on his recommendations, and suspended for three months.
Key Takeaways
- Reg BI continues to be a regulatory priority for FINRA.
- Broker-dealers must train, test and monitor their representatives to ensure they understand complex financial products and are capable of assessing for whom such products are appropriate.
- Broker-dealers should also ensure that, along with the customer’s age, income, investment objectives, investment horizon, and other information, data regarding the customer’s indebtedness is collected and factored into the analysis.
Investment Company Status Considerations for Cash Positioning in Wake of Bank Failures
Given this week’s headlines, many emerging companies may be asking themselves: “Why am I holding so much cash?”
The Investment Company Act of 1940 (the 1940 Act) may be to blame.
“Inadvertent” Investment Companies
“But I don’t have any intent of being an investment company. Aren’t those mutual funds or hedge funds? I’m an operating company that produces goods or provides services.”
The 1940 Act can apply to all companies, not just those with investment-related businesses. The term “investment company” under the 1940 Act has two primary meanings:
- A company that is or holds itself out as primarily engaged in the business of investing or trading in securities, which describes many investment funds; or
- A company whose total assets (exclusive of government securities and cash items) consist of at least 40% “investment securities” (a term defined more broadly under the 1940 Act than “securities” is defined under the Securities Act of 1933).
In other words, even if it doesn’t hold itself out as an investment-related business, an operating company that has 40% or more of its assets invested in stocks, bonds or other securities (even conservative corporate bonds held for cash preservation purposes) is an “investment company.” As such, the company will be subject to the registration and other requirements of the 1940 Act unless it meets a conditional exemption under Section 3 of the 1940 Act or another relevant provision or rule under the 1940 Act. Intent is not an element of the second primary meaning of “investment company,” and even a company that produces goods and services could inadvertently meet the definition simply by having a balance sheet comprised of too high a percentage of “investment securities” in relation to its total assets.
Generally, Government Securities and Cash Items are Out of the Equation
Importantly, along with government securities, “cash items,” such as cash, bank demand deposits and federally-regulated money market funds, are typically (but not always) removed from both the “investment securities” numerator and the total assets denominator when calculating the 40% test described above or similar tests under exemptive provisions. Holding corporate assets in cash as opposed to “investment securities” helps a company avoid investment company status if an exemption is not available. Even holding cash can complicate things, though, since excluding a large cash position from the 40% test can cause a company’s “investment security” holdings to disproportionately affect the results of that calculation.
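For illustration only, the mechanics of the 40% calculation can be sketched in a few lines of Python. The figures and asset categories below are hypothetical, and an actual status analysis under the 1940 Act turns on valuation, consolidation, and exemption questions that this arithmetic does not capture.

```python
# Hypothetical, simplified sketch of the 40% asset test described above.
# Real 1940 Act analysis involves valuation, consolidation, and exemption
# questions that this arithmetic does not capture.

def investment_securities_ratio(investment_securities: float,
                                total_assets: float,
                                government_securities: float,
                                cash_items: float) -> float:
    """Investment securities as a share of total assets, with government
    securities and cash items excluded from the denominator."""
    adjusted_total = total_assets - government_securities - cash_items
    return investment_securities / adjusted_total

# Hypothetical start-up balance sheet (figures in $ millions)
ratio = investment_securities_ratio(
    investment_securities=12.0,   # corporate bonds held for cash preservation
    total_assets=40.0,
    government_securities=5.0,    # Treasuries
    cash_items=15.0,              # bank deposits, government money market funds
)
print(f"Investment securities ratio: {ratio:.0%}")  # 12 / 20 = 60%
```

Note how the large cash and government securities positions shrink the denominator, so the remaining $12 million of corporate bonds represents 60% of the adjusted total, well past the 40% threshold even though it is only 30% of the raw balance sheet.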
Many start-up companies find themselves in the position of holding proceeds from fund raising but with relatively few other offsetting assets. Seeking to avoid becoming an inadvertent investment company, they may choose to hold the proceeds in cash items or government securities. Rule 3a-8 excludes certain research and development companies (and some other start-up companies) from the definition of an investment company if certain conditions are met, but many companies are not able to meet the conditions. This means that start-ups may be at particular risk from instability in the banking system as they may be limited in the choices they have to invest their fundraising proceeds.
Avoiding Inadvertence
“What should I do if I am worried about holding cash but want to avoid an investment company status issue?”
Even if a regulator does not approach a company, many significant deals require a representation or even an outside legal opinion as to the company’s status under the 1940 Act. Significant securities holdings could delay a deal or even require restructuring the business. So, while it may be tempting to move your liquid holdings from bank accounts to securities or other assets in light of recent bank developments, careful planning can help prevent inadvertently jumping from the frying pan into the fire.
Contact a securities lawyer knowledgeable on investment company status issues should you have any questions.
Market Turmoil Caused by “Run on the Banks” Leads to Trading Halts
On March 10, 2023, volatility resulting from concerns regarding runs on certain banks triggered trading halts in those banks’ stocks on the New York Stock Exchange (NYSE) and Nasdaq. March 13, 2023, saw additional trading halts on bank stocks. This post provides a brief explanation of the Limit Up Limit Down (LULD) rules that pause and prevent trading in a single security from taking place outside a specific range, either up or down, from the average trading price during the previous five minutes.
Overview
When there is single-stock, industry-specific, or market-wide volatility, trading halts are a way to create speed bumps that allow markets to absorb information and calm volatility. The Securities and Exchange Commission (SEC) took emergency action to halt trading in response to the 2008 financial crisis. More recently, as detailed in our 2020 post summarizing the overarching framework in which exchanges may halt trading, exchanges imposed trading halts in response to extreme market volatility of unprecedented magnitude and velocity triggered by the coronavirus pandemic. Our prior post explained that exchanges may utilize different types of regulatory halts during periods of market turmoil, particularly when individual stocks endure large fluctuations over short periods. Last week the securities markets experienced turmoil in one sector, regional banks, when shares in three banks declined sharply following the announcement that another bank had been placed in receivership with the Federal Deposit Insurance Corporation (FDIC).
Types of Trading Halts
Exchanges such as NYSE and Nasdaq may initiate different types of trading halts, such as market-wide circuit breaker (MWCB) halts, which temporarily halt trading in all National Market System (NMS) securities in the event an MWCB threshold is breached, but not all trading halts are the same. In the case of the March 10 trading halts, it was the LULD rules that triggered NYSE and Nasdaq single-stock circuit breaker halts.
LULD Rules
Part of the SEC’s mission is to maintain fair, orderly, and efficient markets. To address extraordinary market volatility, the SEC approved the LULD rules, which it made permanent in April 2019. The LULD rules prevent trades in NMS securities from occurring outside of specified price bands, which are set at a percentage level above and below the average reference price of a security over the preceding five-minute period. The previous day’s closing price on the security’s primary listing exchange is used to determine the applicable band percentage for a given day, and the percentage itself does not change intraday. The LULD rules apply during regular trading hours of 9:30 a.m. to 4:00 p.m. ET.
The percentage levels for the price bands of a security’s circuit breakers are dependent on whether the security is a Tier 1 or Tier 2 security.
- Tier 1: All securities in the S&P 500, the Russell 1000 and select exchange traded products.
- Tier 2: With limited exceptions, all other NMS securities.
The current price band percentages are set out in a table published by the LULD Operating Committee.
Under LULD rules, there is a five-minute trading pause on an NMS security if trading is unable to occur within the specified price band for fifteen seconds. The trading pause may be extended for another five minutes and, thereafter, all markets may resume trading. If a security is in a trading pause during the last ten minutes of the exchange’s regular trading hours, the exchange will not reopen trading and will attempt to execute a closing transaction using its established closing procedures. Reg NMS Rule 611 provides certain exceptions to LULD price bands (i.e., Self Help, Not Regular Way, Open/Close, Crossed Markets, Intermarket Sweep, Benchmark, Flickering Quotes, and Stopped Orders) as well as other exemptions.
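As a rough illustration of the mechanics described above, the following Python sketch computes price bands around a five-minute average reference price and flags a quote that falls outside them. The prices and the 10% band used here are hypothetical; the percentage that actually applies depends on the security’s tier and reference price.

```python
# Simplified sketch of the LULD band mechanics described above. The applicable
# band percentage depends on the security's tier and reference price, so it is
# passed in as a parameter rather than hard-coded. Prices are hypothetical.
from statistics import mean

def luld_bands(trailing_prices: list[float], band_pct: float) -> tuple[float, float]:
    """Return (lower_band, upper_band) around the average price over the
    preceding five-minute window."""
    reference_price = mean(trailing_prices)
    return reference_price * (1 - band_pct), reference_price * (1 + band_pct)

def outside_bands(quote: float, lower: float, upper: float) -> bool:
    """True if the quote falls outside the specified price band."""
    return quote < lower or quote > upper

# Hypothetical five-minute trailing prices for a bank stock, with a 10% band
lower, upper = luld_bands([105.0, 103.5, 101.0, 98.0, 95.5], band_pct=0.10)
print(f"Band: {lower:.2f} - {upper:.2f}")

# A quote pinned outside the band for 15 seconds would trigger the
# five-minute single-stock trading pause described above.
print(outside_bands(88.0, lower, upper))  # True
```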
Reg SHO
Reg SHO also imposes a single-stock circuit breaker. Instead of halting or suspending trading, it places heightened restrictions on certain types of trades. Rule 201 of Reg SHO prohibits short-selling at or below the national best bid in a security that declines 10% or more from its prior day’s closing price. Once triggered, the circuit breaker remains in effect for the remainder of the trading day and the following day. The circuit breaker can be retriggered on the following day. One of the primary policy goals of the Rule 201 circuit breaker is to allow long sellers a chance to sell without short sellers placing additional downward pressure on the price.
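For illustration, the Rule 201 trigger reduces to a simple comparison against the prior day’s closing price; the short sketch below uses hypothetical prices.

```python
# Simplified sketch of the Rule 201 trigger described above: a decline of 10%
# or more from the prior day's closing price activates the short sale
# restriction for the rest of that day and the following trading day.
# Prices are hypothetical.

def rule_201_triggered(prior_close: float, intraday_low: float) -> bool:
    """True if the intraday price has fallen 10% or more from the prior close."""
    return intraday_low <= prior_close * 0.90

print(rule_201_triggered(prior_close=50.00, intraday_low=44.00))  # True  (down 12%)
print(rule_201_triggered(prior_close=50.00, intraday_low=46.50))  # False (down 7%)
```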
Conclusion
As the markets navigate the second-biggest bank failure in U.S. history, trading halts may continue in certain banking stocks and could have collateral consequences in other sectors.
DOJ Brings First Criminal Charges Stemming from Use of Rule 10b5-1 Trading Plan
On March 1, 2023, the U.S. Department of Justice (“DOJ”) unsealed an indictment against the CEO of a publicly traded health care company (the “Executive”) relating to charges of an insider trading scheme. The indictment represents the first time that DOJ has brought criminal insider trading charges stemming from an executive’s use of a Rule 10b5-1 trading plan. The investigation is part of a data-driven initiative led by DOJ’s Fraud Section to identify executive abuses of 10b5-1 trading plans.
Rule 10b5-1 Plans
Rule 10b5-1 trading plans offer an affirmative defense to liability for insider trading on the basis of material non-public information (“MNPI”) under Section 10(b) of the Securities Exchange Act of 1934 and Rule 10b-5. Under the defense, insiders can set up future trades pursuant to a binding contract, instruction, or written plan adopted in compliance with the rule. However, the defense is unavailable if the insider is in possession of MNPI at the time the insider adopts the trading plan. Additionally, a plan does not protect an insider who does not enter into the plan in good faith or who uses the plan as part of an effort or scheme to evade the prohibitions of Rule 10b5-1.
DOJ’s Criminal Indictment for the Insider Trading Scheme
DOJ alleges that the Executive avoided more than $12.5 million in losses by entering into two Rule 10b5-1 trading plans while in possession of MNPI concerning the likelihood that the health care company’s then-largest customer would terminate its contract. In May 2021, the Executive allegedly entered into his first 10b5-1 plan shortly after learning that the relationship between the health care company and the customer was deteriorating. Then, in August 2021, the Executive allegedly entered into his second 10b5-1 trading plan approximately one hour after the health care company’s chief negotiator for the contract confirmed to the Executive that termination was likely.
In establishing the 10b5-1 plans, the Executive allegedly refused to engage in any “cooling-off” period – the time between entering into the plan and selling the stock – despite warnings from the Executive’s brokers. Instead, the Executive allegedly began selling shares of the health care company on the next trading day after establishing each plan. On August 19, 2021, just six days after the Executive adopted his August 10b5-1 plan, the health care company announced that the customer had terminated its contract and the health care company’s stock price declined by more than 44%. The Executive is charged with one count of engaging in a securities fraud scheme and two counts of securities fraud for insider trading.
Renewed Focus on Rule 10b5-1 Plans by SEC and DOJ
This first criminal indictment comes immediately on the heels of recent amendments to Rule 10b5-1 by the U.S. Securities and Exchange Commission (“SEC”), which went into effect on February 27, 2023. The amendments made a number of key changes to Rule 10b5-1, summarized in more depth here by our sister blog, Public Chatter. Key elements of the newly amended Rule 10b5-1 include:
- Mandatory cooling-off periods for directors and non-directors.
- Restrictions on overlapping and single-trade plans.
- Written certifications when entering into Rule 10b5-1 plans.
- Annual reporting of insider trading policies and procedures and quarterly reporting on the use of trading plans.
- Narrative and tabular disclosure about the timing of option grants.
Implications for the Digital Asset Industry
DOJ’s indictment relating to insider trading should interest digital asset issuers, trading platforms, and other digital asset intermediaries. As the SEC continues to take the stance that “a vast majority” of digital assets are securities, DOJ’s case demonstrates that the relevant risks relating to securities offerings and subsequent transactions may involve criminal as well as civil charges. Digital asset industry participants should give serious consideration to the value of implementing insider trading policies including the use of 10b5-1 plans.
What’s Next
DOJ’s criminal case against the Executive is now proceeding in federal court in California. If convicted, the Executive faces a maximum penalty of 25 years in prison on the securities fraud scheme charge and 20 years in prison on each of the insider trading charges. Meanwhile, the indictment is an indication of the Fraud Section’s and federal law enforcement’s focus on executive abuses of Rule 10b5-1 plans. To rely on the rule’s affirmative defense, insiders must adhere strictly to its requirements.