
The California rules, which would regulate automated decision-making in hiring, are designed to address a problem hiding in plain sight. Many large California employers outsource the initial screening of applications and resumes to third parties, like Indeed or Monster Jobs. These organizations use sophisticated AI tools to make the “first cut.” Outsourcing makes sense for employers who might otherwise have to sift through thousands of applications.
But what’s in the algorithm? Does it weed out applicants over a certain age, graduates of historically Black colleges and universities (HBCUs), or those with Middle Eastern-sounding last names? And if the algorithm runs afoul of California’s Fair Employment and Housing Act (FEHA), who is legally liable?
Troubling evidence
In July 2024, the Northern District of California held that Derek Mobley, an African American male over the age of forty, with a history of depression and a bachelor's degree in finance from an HBCU, could sue Workday, a third-party vendor that provided applicant screening services. The lawsuit was brought under several federal laws that prohibit employment discrimination, including Title VII of the Civil Rights Act.
Despite his qualifications, Mobley was rejected for every one of the 100-plus applications he submitted to companies using Workday's platform, sometimes within an hour of applying and sometimes in the middle of the night. Even though the named defendant was the third-party service provider, cautious employers walked away with the realization that they, too, could be held accountable under principles of agency law.
iTutorGroup, which settled with the EEOC, was similarly accused of programming its tutor application software to automatically reject female applicants aged 55 or older and male applicants aged 60 or older. iTutorGroup allegedly rejected more than 200 qualified applicants based in the United States because of their age.
As hard as it can be for individual plaintiffs to assemble evidence of this kind of algorithmic discrimination, common experience suggests it is widespread. Never got a callback? Is your first name Jacinta or Jamal? Did you graduate from Howard rather than Harvard? Was it in 1995?
Hmm.
This may be why the California Civil Rights Department chose to act.
California’s Fair Employment and Housing Act and Interviewees
FEHA protects job applicants and employees from discrimination and harassment based on protected characteristics like age, ancestry, color, disability, and more. It applies to public and private employers, labor organizations, and employment agencies. Job applicants are generally understood to have fewer protections than current employees, but those protections are not nonexistent.
For example, employers may not ask an interviewee about:
- age;
- marital status;
- children;
- plans for having children;
- religion;
- disability;
- race; or
- national origin.
And the remedies are still developing, especially considering the current federal administration’s pushback on diversity, equity, and inclusion (DEI) initiatives. This has particular significance for California employers that receive federal contracts.
Anti-anti-DEI regulations – a small piece in a large puzzle
The proposed California rules would:
- clearly define “automated-decision-making systems” (ADS), including AI;
- prohibit ADS discrimination;
- expand liability for agents that develop ADS technology; and
- increase recordkeeping requirements.
But do the rules go too far?
Challenges to come
The rules indicate that an employer’s use of AI tools cannot result in discrimination based on accent, English proficiency, or height and weight, which are technically not protected categories under California law. California law is already seen by some as broader than federal law, and all of this may lead employers to question whether the proposed regulations create new protected categories.
In any event, it is clear that California employers that choose to use automated, AI-driven screening tools to evaluate applicants have an affirmative obligation to ensure that those tools do not produce results prohibited under existing law.