AI may help with background checks, but FCRA is still the boss
When the Fair Credit Reporting Act (FCRA) was enacted in 1970, artificial intelligence (AI) wasn't on the authors' minds, but in 2024 a federal agency confirmed that the law does apply to data collected by AI.
In guidance issued October 24, the U.S. Consumer Financial Protection Bureau (CFPB) warned employers about using AI to monitor and evaluate workers.
The guidance says that employers using third-party consumer reports — including background checks and surveillance-based AI or algorithmic scores about applicants or employees — must follow FCRA rules.
This means employers must:
- Get prior consent from applicants and employees,
- Be transparent about information used in adverse employment decisions, and
- Allow applicants and employees to dispute inaccurate information.
Tools are increasingly invasive
The agency introduced the guidance by saying that as employers increasingly use invasive tools to assess applicants and workers, people must be assured of their rights over the data affecting their livelihoods and careers.
The FCRA is the principal law regulating employment screening, such as traditional background checks, also called "consumer reports." The CFPB said it wants to make clear that the FCRA's protections apply to newer AI-based evaluation technologies as well.
“The kind of scoring and profiling we’ve long seen in credit markets is now creeping into employment and other aspects of our lives,” said CFPB Director Rohit Chopra in a press release announcing the guidance. “Our action makes clear that long-standing consumer protections apply to these new domains just as they do to traditional credit reports.”
These “new domains” are AI tools that:
- Assess and evaluate candidates and employees;
- Monitor employee productivity; and
- Predict employee behavior, including the likelihood that a worker will engage in union organizing or quit.
These newer technologies may rely on sensitive information that workers don't know is being collected, the CFPB said, and can significantly impact:
- Hiring decisions,
- Job assignments, and
- Career advancement.
If reports contain inaccurate information that people aren’t given a chance to correct, it may cause them to:
- Lose job opportunities,
- Face unfair treatment, or
- Suffer career setbacks.
FCRA rules must be followed regardless of whether AI was used to compile the background check information. The agency recommends that employers review their background check practices to ensure compliance with the FCRA.
Key to remember: A federal agency released guidance for employers reminding them that background checks conducted using new technologies, such as AI, are still subject to FCRA rules.