Hiring tools and bias: the regulatory posture

By AI Resource Zone Admin · March 10, 2026 · 4 min read

Automated hiring tools face growing rules on bias audits and candidate disclosure. The map is uneven but becoming clearer.


Automated tools that screen resumes, score video interviews, or rank candidates have attracted early and specific regulation. New York City's Local Law 144, effective in 2023, requires employers using automated employment decision tools to commission an independent bias audit and to notify candidates before use. Illinois's Artificial Intelligence Video Interview Act, on the books since 2020, imposes consent and disclosure obligations for video-based AI analysis. Several other states have followed with their own narrower statutes.

At the federal level in the United States, the Equal Employment Opportunity Commission has repeatedly stated that existing civil rights statutes, including Title VII of the Civil Rights Act and the Americans with Disabilities Act, apply to algorithmic hiring decisions. Enforcement examples include a settlement with a tutoring company over age discrimination allegedly embedded in its screening tool. In Europe, the AI Act lists employment-related decisions as high risk, which triggers conformity assessment and documentation requirements before deployment.

Bias audits themselves are not a standardized product. Methodologies vary in how they define protected groups, how they handle intersectional categories, and whether they consider deployment context or only the tool in isolation. Some auditors publish their methods; others treat the approach as proprietary. Candidate notices, where required, also vary widely in clarity, and enforcement of notice obligations has so far been limited compared with the attention paid to audit thresholds.
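To make the variation concrete, one metric that commonly appears in these audits is the impact ratio: each group's selection rate divided by the highest group's selection rate. The sketch below assumes simple selection counts by group; the labels and numbers are illustrative, and how a real audit defines the groups and thresholds is exactly where methodologies diverge.

```python
# Minimal sketch of an impact-ratio calculation, one common bias-audit
# metric. Group labels and counts are hypothetical, not from any audit.

def impact_ratios(selected: dict, total: dict) -> dict:
    """Map each group to (its selection rate) / (highest selection rate)."""
    rates = {group: selected[group] / total[group] for group in total}
    top_rate = max(rates.values())
    return {group: rate / top_rate for group, rate in rates.items()}

example = impact_ratios(
    selected={"group_a": 48, "group_b": 30},
    total={"group_a": 100, "group_b": 100},
)
# group_a has the highest rate (0.48), so its ratio is 1.0;
# group_b's ratio is 0.30 / 0.48 = 0.625
```

Even this small example leaves open the questions auditors answer differently: whether to report intersectional groups, how to handle small sample sizes, and whether the counts come from the deployed context or a test dataset.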

Editor's note: Candidates benefit when employers disclose tool use early and in plain language. They also benefit when audits are published rather than merely performed. Regulators that push for public reporting, even at an aggregate level, will produce a better feedback loop than those that settle for private attestations. For employers, treating algorithmic hiring as another form of employment decision subject to long-standing civil rights duties remains the most defensible posture.
