Few companies comply with New York’s new AI hiring law

  • New York’s Local Law 144 requires companies that use automation and AI in their hiring process to disclose it
  • Enforcement of the law began in July 2023, but very few companies are complying
  • Low compliance and legal vagueness point to the challenges of regulating AI accountability

New York City has become the first city in the world to enact a law regulating the use of automated employment decision tools (AEDTs). Yet researchers from Cornell University found that difficulties with interpreting and enforcing the law mean few companies are sticking to the rules.

AEDTs use machine learning, statistical modeling, data analytics, or artificial intelligence to help employers and employment agencies decide which applicant to hire while replacing discretionary decision-making.

These tools use AI to process piles of resumes, analyze sentiment or psychological signals from an interview, and weigh a host of other factors to eventually say, ‘Hire this candidate.’

Because AI models carry inherent biases, AEDTs can reflect those biases in the candidates they recommend, hence the introduction of Local Law 144 (LL144).

The law, enacted on 1 January 2023 with enforcement starting in July of that year, aims to combat the potential harm that AEDTs could cause marginalized job seekers.

The law says employers in New York City can use AEDTs, but they must have the tools audited for bias, publish the audits, and inform job applicants that an AEDT is being used in the hiring process.

How’s the law working out?

Researchers from Cornell University found that vagueness in the definition of what AEDTs are and how they are used may be behind the very low compliance.

The researchers enlisted 155 undergraduates to act as potential job seekers; across the nearly 400 employers they analyzed, the students found only 18 bias audits and 11 transparency notices.

Are employers not having audits done or publishing transparency notices because they don’t meet the threshold defined by the law? Maybe, but with the reported widespread use of AEDTs, it’s unlikely.

Lucas Wright, a PhD candidate who co-authored the study, said, “We found that, because the law gives discretion to employers to determine whether the law applies to their use of automated employment decision tools (AEDTs), we can only make claims about compliance with the law, not non-compliance.”

The idea behind the law was good, Wright said, but “while the law has created a market for independent algorithm auditors and led to some transparency, it has actually created incentives for employers to avoid auditing.”

AI tools will increasingly be used to make hiring decisions, and unless strict safeguards are built into them, AEDTs will continue to perpetuate the biases present in their training data.

LL144 is a good start, but it doesn’t specify what should happen if an AEDT fails a bias audit. It also allows anyone to serve as an auditor as long as they don’t work for the employer or the vendor of the AEDT.

Transgressors of LL144 face fines between $500 and $1,500. The law relies on self-regulation and doesn’t give the Department of Consumer and Worker Protection (DCWP) proactive investigatory or discovery powers.

This makes the low level of compliance unsurprising. If lawmakers want to shield job seekers from a potentially flawed AI deciding their future, they may need to revisit LL144.

© 2023 Intelliquence Ltd. All Rights Reserved.
