AI-driven hiring could spark employment litigation

Companies using artificial intelligence technologies in the recruitment and hiring process may face employment practices claims if other states follow Illinois in trying to restrict the use of artificial intelligence in video interviews.

The Artificial Intelligence Video Interview Act, passed by the Illinois General Assembly on May 29 and sent to Gov. J.B. Pritzker, is considered first-of-its-kind legislation, but similar measures could be adopted by other states with strong employee and job applicant protections, experts say.

The Illinois governor has 60 calendar days to sign a bill into law or veto it; if he does neither, the bill becomes state law once that period expires. The governor’s office did not immediately respond to a request for comment.

The law is an “important first step” in legislating how artificial intelligence should be used in the employment context, said Gary Clark, Chicago-based chair of the labor and employment group for Quarles & Brady LLP.

“The failure to fully investigate and understand the use of artificial intelligence technology, whether in this context or others, could expose employers to huge legal liability if they just jump right in and start using it,” said Mr. Clark.

“Any time you are using artificial intelligence in the human resources or recruiting or employment function, you need to do a deep dive into how it’s being used and to fully evaluate what risks may result from how it’s being used with regard to video interviews,” he said.

While the Illinois statute requires employers to notify applicants before an interview that artificial intelligence may be used and explain how the technology works, it “doesn’t have a whole lot of teeth” in its current form, said John Litchfield, senior counsel at Foley & Lardner LLP in Chicago.

“It’s hard to know if rules will be developed by a state agency like the Department of Labor to enforce the law and, if they are developed, what type of penalties or issues may arise as a result for employers and for companies developing the software,” he said.

Although the Illinois measure is the first of its kind, other states may follow its lead, which could trigger an increase in employment practices liability insurance claims and force companies to step up their focus on managing these risks, experts say.

“There will be other states that will come on board, mainly the employment-sophisticated states like California, New York, Massachusetts, New Jersey — states that have strict regulations over how employers can treat employees and applicants,” Mr. Litchfield said.

While there are no penalties or fines attached to the Illinois act in its current form, it is “definitely something to watch for in case there are amendments,” said Sheryl Falk, Houston-based partner and co-lead of the global privacy and data security task force at Winston & Strawn LLP.

Citing another groundbreaking law, the Illinois Biometric Information Privacy Act passed in 2008, Ms. Falk said “the keen difference is that the Illinois (biometrics) law provides a private right of action, so a consumer can sue the employer if the employer has not complied with the law.”

“Any time there’s a new law on the books that’s setting up best practices and required standards, this raises liability. It’s even more critical for companies to be clear and accurate about how they are using this data,” said Ms. Falk, who is also a member of the technical advisory board of Utah-based HireVue Inc., a software firm that uses artificial intelligence in its video interviewing platform.

The Illinois artificial intelligence statute incorporates elements from the ethical frameworks that many companies already operate under, such as concepts of notice and transparency, she said.

Employers also must obtain consent from applicants before the interview and may share the videos only with people whose expertise is necessary to evaluate an applicant’s fitness for a position, she said.

As with the Illinois Biometric Information Privacy Act, if a private right of action is granted under the law, there is potential for claims alleging that an employer failed to adhere to the specific notice requirements of the Artificial Intelligence Video Interview Act, said Talene Carter, New York-based national employment practices liability product leader for FINEX North America at Willis Towers Watson PLC.

“Those types of claims could trigger employment practices liability coverage as invasion of privacy, as misrepresentation,” Ms. Carter said.

However, the use of such technology could also spur an increase in claims alleging other, more traditional employment practices violations, experts say.

These issues relate to whether employers are screening out applicants based on gender, race or disability at a higher rate than applicants who don’t fall within those protected categories, said Mr. Litchfield.

“It’s important for employers and software developers creating this technology to keep in mind the traditional, everyday bread-and-butter labor and employment issues that exist regardless of whether an interview is taking place by a machine or robot or a human-to-human interaction,” he said.

If the artificial intelligence technology that evaluates a candidate’s suitability for a job based on certain characteristics such as speech patterns or body language is found to discriminate against people with disabilities, that could violate the Americans with Disabilities Act, said Mr. Clark.

“Similarly, if the characteristics or speech patterns this technology looks for tend to discriminate against a specific race, gender, national origin or sexual orientation, that could have huge risks as it could violate Title VII,” he said, referring to the Civil Rights Act of 1964.

There are different ways coverage under an employment practices liability insurance policy could be triggered, said Ms. Carter.

“There’s a potential for discrimination claims, there’s a potential for failure to hire, there’s the potential for unintentional discriminatory impact,” she said.

For example, there’s no reference in the statute to what happens if an individual does not consent to the use of artificial intelligence, she said. If an employer then removes that applicant from consideration, “there might be a potential claim there,” Ms. Carter said.

State Rep. Jaime Andrade Jr., D-Chicago, a sponsor of the legislation, said the purpose of the act is not to “set up landmines for companies so that people can sue.”

“It is about being fair and transparent,” he said.

Kevin Parker, chief executive officer of HireVue Inc., said the company is “broadly supportive of the intentions and goals of the act in terms of candidate privacy and transparency.”

“There’s always been an abundance of caution for companies broadly to make sure there’s no adverse impact associated with any part of the hiring process. That has always extended to how they think about how they use video interviewing generally and how they use artificial intelligence to support their interviewing efforts,” Mr. Parker said.

HireVue does a “significant amount” of analysis before interviewing anyone to ensure “there is no adverse impact, or we eliminate it if we find it,” he said.

“We do a lot of testing with gender differences, age differences, ethnic differences … We can look at every group as they go through the model and see where the adverse impact might arise and eliminate that from the analysis going forward,” he said.
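The article does not detail how HireVue’s testing works, but a common benchmark for the kind of per-group review Mr. Parker describes is the “four-fifths rule” used in U.S. employment guidance, which compares each group’s selection rate with that of the group selected at the highest rate. The sketch below, written in Python with hypothetical group labels and counts, illustrates that general technique only; it is not the company’s method.

    # Illustrative sketch only: group labels, counts and the 0.8 threshold are
    # assumptions for demonstration, not data or methodology from the article.
    from collections import namedtuple

    GroupOutcome = namedtuple("GroupOutcome", ["group", "applicants", "selected"])

    # Hypothetical outcomes of an AI-scored interview round.
    outcomes = [
        GroupOutcome("group_a", applicants=200, selected=90),
        GroupOutcome("group_b", applicants=180, selected=54),
        GroupOutcome("group_c", applicants=150, selected=66),
    ]

    def adverse_impact_report(outcomes, threshold=0.8):
        """Compare each group's selection rate against the highest-rate group."""
        rates = {o.group: o.selected / o.applicants for o in outcomes}
        benchmark = max(rates.values())  # highest selection rate among groups
        rows = []
        for group, rate in rates.items():
            ratio = rate / benchmark
            rows.append((group, round(rate, 3), round(ratio, 3), ratio < threshold))
        return rows

    for group, rate, ratio, flagged in adverse_impact_report(outcomes):
        status = "potential adverse impact" if flagged else "ok"
        print(f"{group}: selection rate {rate}, impact ratio {ratio} -> {status}")

A ratio below 0.8 is conventionally treated as a trigger for closer statistical review rather than as proof of discrimination.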

In the meantime, both employers and insurers should be looking at their employment practices liability insurance policies to check their coverage for “artificial intelligence-type liability in the employment context,” said Mr. Litchfield.

“Everyone should be looking at this and analyzing the risks,” he said.

From the insurers’ perspective, if a private right of action is added to the statute and an influx of claims follows, markets may start to narrow coverage for invasion of privacy-related employment practices liability claims, said Ms. Carter.

“There’s a potential that as we start to see more use of artificial intelligence in the employment context and if we see more states enacting similar statutes, that we may see some changes in the underwriting of employment practices liability policies,” she said.
