President Trump’s recent Executive Order (EO) on “Ensuring a national policy framework for artificial intelligence” should have the auto finance industry worried.
Nominally, the EO appears very pro-business. Its stated objective is “to sustain and enhance the United States’ global AI dominance through a minimally burdensome national policy framework for AI.” While avoiding a complex, fragmented “patchwork” of state regulations is intuitively appealing to national firms, preempting state authority without a robust federal replacement creates a regulatory vacuum.
This is particularly concerning given the auto finance sector’s reliance on AI for everything from credit underwriting and fraud detection to compliance oversight—processes that directly impact consumer access, fairness, and legal recourse. The industry, and firms like ours, which are deeply invested in the compliant use of AI, must critically evaluate the downstream risks of a regulatory model built on an aggressive federal ceiling with no comprehensive floor.
Does the Executive Order Help Lenders Leverage AI?
For the lending community, there is a more pressing question: Does the Executive Order help lenders more effectively leverage AI, a technology that AI pioneer Andrew Ng has called the “new electricity”? Not at all.
The financial services industry’s use of AI must comply with a raft of existing federal laws such as the Equal Credit Opportunity Act (ECOA), the Fair Credit Reporting Act, the Fair Housing Act, and Federal prohibitions against Unfair, Deceptive, or Abusive Acts or Practices (UDAAP). As far back as April 2023, multiple federal agencies noted in a joint statement against discrimination and bias in automated systems that “existing legal authorities apply to the use of automated systems and innovative new technologies just as they apply to other practices.” Nothing in the EO will affect lenders’ continuing obligation to comply with these laws.
What’s at Stake: Premature Preemption
Preemption is typically the capstone of a strong, established federal regulatory structure; here, it is merely a political starting point. The federal government has yet to pass a comprehensive, substantive AI law that creates a “meaningful level playing field,” as many state efforts—such as the Colorado AI Act—sought to do in the absence of Congressional action.
Instead, the EO directs agencies to use litigation, funding levers (like withholding federal broadband dollars), and administrative power to challenge state laws deemed “onerous.” This aggressive stance, driven by tech industry lobbying, threatens to eliminate existing or nascent consumer protections without establishing an equivalent, or better, federal standard.
For the auto finance industry, which operates under strict anti-discrimination and consumer protection laws (like ECOA and TILA), this lack of clarity is dangerous. Current federal frameworks, like those enforced by the CFPB and FTC, were designed before the emergence of complex, evolving AI models, which can exacerbate issues like algorithmic bias or “black-box” decision-making. Removing states’ ability to innovate and regulate in this gap increases, rather than decreases, the industry’s exposure to novel legal and compliance risks.
Stifling State-Led Innovation and Consumer Protection
Like many in the lending industry, I favor well-designed federal preemption. When 50 states have conflicting data protection laws, a single national standard makes sense. Operating a national business under such constraints is torturous, time-consuming, cost-prohibitive, and distracting.
However, in AI policy, that is not where we stand today. Only four states—California, Colorado, Texas, and Utah—have enacted substantial AI laws, and the state law most relevant to banking, Colorado’s, is not yet in effect.
- Colorado has championed the Colorado AI Act, the most notable of the four, which will require certain high-risk AI models “to protect consumers from known or foreseeable algorithmic discrimination.”
- California has passed the Transparency in Frontier AI Act, which requires large AI companies to disclose their risk protocols and provide greater transparency into their models.
- Utah enacted the AI Policy Act, mandating disclosures for generative AI in consumer interactions.
- Texas has passed several bills on responsible use, deepfakes, and biometric data.
While the Executive Order frames measures like those in Colorado as burdensome, they are, in fact, vital blueprints for establishing trustworthy AI practices. Preemption, in this context, risks tying the hands of states that are actively and intelligently reacting to emerging consumer crises.
Seeing The Bigger Picture: A Gamble on a Promise
The EO implies that state laws are superfluous because the Administration will fast-track federal AI legislation to address critical issues like child safety and the build-out of national AI compute infrastructure. This is dangerously wishful thinking. The speed of AI’s evolution far outstrips the pace of Congressional lawmaking. The real risk is precisely what industry observers fear: preemption will prevent states from reacting to an emerging crisis.
In the next year, the industry could face situations—such as a new wave of deepfake-driven identity fraud in auto applications, or systemic bias discovered in large language models used for customer service and debt collection—where swift, localized legal remedies and laws are necessary. If the Justice Department’s new AI Litigation Task Force successfully challenges state laws and a comprehensive federal solution is absent, the hands of both states and industry compliance officers will be tied. This regulatory paralysis leaves the public vulnerable and forces companies to operate in a gray zone, increasing the risk of class-action lawsuits based on ambiguous applications of old laws to new technology.
Responsible AI use requires proactive governance, not just a clear path for development. The current federal move risks prioritizing speed of innovation over the necessary safety, fairness, and accountability that a mature financial sector demands. Auto finance, with its unique blend of consumer credit, vast data, and regulatory scrutiny, needs more than a policy of regulatory forbearance; it needs a durable and detailed playbook that the current Executive Order fails to provide.