President Joe Biden’s new artificial intelligence executive order will boost efforts underway at federal banking, consumer finance, and housing regulatory agencies to police uses of the new technology for discrimination and privacy violations.
The Federal Reserve, the Federal Deposit Insurance Corp., the Office of the Comptroller of the Currency, and the Consumer Financial Protection Bureau are monitoring how banks and other lenders use AI in loan decisions, looking out for potential fair lending violations. The CFPB is also focused on customers' financial data being used, without their knowledge, to train the models behind generative AI tools.
Biden’s order is part of a government-wide effort to ensure that AI is deployed safely and fairly across the economy. The order also directs federal employment, privacy, and other regulators to redouble their oversight of the industry.
The executive order unveiled Monday will speed up banking regulators’ existing efforts and add a few new tasks, particularly to address AI-related abuses in rental housing.
1. How do banks use AI?
Banks and fintechs have been at the forefront of recent efforts to develop and deploy AI technologies. Financial institutions are eyeing AI tools for tasks including loan decisions, anti-money laundering compliance, and capital planning.
The banking industry sees AI as a way to speed up credit decisions, and potentially make them fairer by cutting down on human bias.
But there are concerns that biases can still make their way into the computer models AI tools rely on to decide who should get a loan and who shouldn’t.
2. What are regulators doing?
Monday’s executive order is the third major AI policy action Biden has taken during his nearly three years in office. It follows the release of a Blueprint for an AI Bill of Rights in October 2022 and an executive order targeting algorithmic discrimination in February.
The president’s previous efforts directed financial regulators to tighten their scrutiny of AI products in the marketplace.
The latest AI order is no different, instructing federal agencies to “consider using their full range of authorities to protect American consumers from fraud, discrimination, and threats to privacy and to address other risks that may arise from the use of AI, including risks to financial stability.”
Banking regulators and the Federal Trade Commission have all said they plan to use existing fair lending laws to ensure credit decisions by computer models are free from discrimination.
The Fed, the FDIC, and the OCC have requested information from the banks they supervise about their uses of AI, and warned those financial institutions to monitor their AI lending tools for discrimination baked into their models.
The OCC, under Acting Comptroller of the Currency Michael Hsu, also created a dedicated Office of Financial Technology to keep tabs on technological changes in the banking industry. Hsu has urged banks to put in place strong anti-discrimination controls.
Banks eyeing AI tools for anti-money laundering efforts and capital management can expect similar warnings.
3. What about the CFPB?
The CFPB has been especially active, with the agency’s director, Rohit Chopra, speaking frequently about the risks of AI.
Fair lending has been a top concern. The CFPB has said it intends to enforce existing statutes like the Equal Credit Opportunity Act in a technology-neutral manner.
In December 2021, the agency put out a call for AI industry whistleblowers to report practices that can harm consumers, including discriminatory lending practices.
But the CFPB’s focus goes beyond banks’ lending decisions.
The agency under Chopra has taken an interest in how banks deploy automated systems, including AI, to comply with consumer protection laws.
The CFPB also aims to bar data brokers and other companies from selling consumer data without consumers’ permission, through a forthcoming rulemaking under the Fair Credit Reporting Act. AI developers can use unauthorized customer data to build computer models and algorithms, the CFPB says.
In addition, the agency is concerned about banks’ use of chatbots and other tools, including AI, in their customer service operations.
4. What’s new in Biden’s order for banks?
Biden’s latest order focuses on the potential danger that AI tools will further entrench discrimination in both the rental and purchase housing markets.
The order calls on the CFPB and the Federal Housing Finance Agency to closely monitor the use of AI in tenant screening applications and other decision-making processes.
The CFPB and the FTC in February began an inquiry into rental background checks, with a focus on the use of algorithms and AI in tenant screening tools. The agencies said they were concerned about how those tools could be used to exclude otherwise eligible tenants from rental housing.
Monday’s order calls on the FHFA and the CFPB to push the companies they regulate to evaluate their underwriting and appraisal models for biases. It also instructs the Department of Housing and Urban Development to issue guidance on the use of AI in tenant screening and real estate advertising within 180 days.
The CFPB, as an independent agency, was urged rather than directed to issue similar guidance within the same time frame.