A.I. Regulation is On the Way

The words “artificial intelligence” used to provoke thoughts of dystopian science fiction. Now, we think more of ChatGPT and less of “The Terminator.” Artificial intelligence, or A.I., is now ubiquitous and is quickly being integrated into every facet of our lives. The consumer finance industry is not immune from its reach. A.I. has become so pervasive, and its implications so far-reaching, that our government is taking note. On October 30, 2023, the Biden Administration issued a wide-ranging Executive Order (the “Order”) establishing guidelines for consumer safety and protection while emphasizing equity.

For banks that use proprietary A.I., the Order includes provisions directed at A.I. developers and designers. It directs the Secretary of Commerce, in coordination with certain other agencies, to establish guidelines aimed at creating industry standards for developing “safe, secure, and trustworthy A.I. systems.” The Order encourages agencies to consider using the full range of their authorities to protect consumers from fraud, discrimination, and threats to privacy, and to address other risks that may arise from A.I., including risks to financial stability. The agencies are also encouraged to consider rulemaking and to emphasize or clarify where existing regulations already apply to A.I.; to clarify the responsibility of regulated entities to conduct due diligence on, and monitor, any third-party A.I. services they use; and to emphasize or clarify requirements and expectations related to the transparency of A.I. models and regulated entities’ ability to explain their use of those models.

The Order encourages the CFPB Director and the Director of the Federal Housing Finance Agency, in order “to address discrimination and biases against protected groups in housing markets and consumer financial markets…to consider using their authorities, as they deem appropriate, to require their respective regulated entities, where possible,” to do the following:

  • Use appropriate methodologies including A.I. tools to ensure compliance with federal law;
  • Evaluate their underwriting models for bias or disparities affecting protected groups; and
  • Assess automated collateral valuation and appraisal processes in ways that minimize bias.

The Order also requires the Secretary of Housing and Urban Development, and encourages the CFPB Director, to issue additional guidance “to combat unlawful discrimination enabled by automated algorithmic tools used to make decisions about access to housing and in other real estate-related transactions.” That guidance is to address how the Fair Housing Act, the Consumer Financial Protection Act, and the Equal Credit Opportunity Act apply to the advertising of housing credit and other real estate-related transactions through digital platforms, including platforms that use A.I. models to facilitate advertising delivery, as well as best practices to avoid violations of federal law.

The main takeaway for banks should be “stay tuned.” Our country’s vast regulatory state has been directed to take a serious look at the state of A.I. and its application across the economy, including the banking industry. While current A.I. may not be able to channel “The Terminator” and say, “I’ll be back,” don’t forget about A.I. regulation. It’s coming.