Human resources firms cautiously adopting AI

Artificial intelligence has gotten a lot of headlines—and hype—since San Francisco-based OpenAI introduced its generative AI tool, ChatGPT, last November.

But many local companies say they are taking a cautious approach when it comes to using AI for human resources tasks even as they also see great potential for the technology.

“We’re considering a lot of different areas for AI to be involved,” said Darrian Mikell, co-founder and CEO of Indianapolis-based Qualifi Technologies Inc. “There’s a lot of opportunity there.”

Qualifi, which launched in 2019, sells a product that allows customers to quickly conduct initial telephone screenings of job candidates. The company’s customers are typically employers that do a large volume of hiring, including hospitals, call centers and staffing agencies.

As of now, Mikell said, Qualifi makes “very light” use of AI—mostly to transcribe candidate interviews and identify keywords mentioned in those interviews.
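The kind of lightweight keyword identification Mikell describes can be sketched in a few lines. The following is purely illustrative, a hypothetical example of scanning a transcript for role-relevant terms, and is not Qualifi's actual implementation:

```python
# Hypothetical sketch of keyword spotting in an interview transcript.
# Illustrative only; not Qualifi's implementation.
import re
from collections import Counter

def find_keywords(transcript: str, keywords: set[str]) -> Counter:
    """Count case-insensitive occurrences of each keyword in the transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    return Counter(w for w in words if w in keywords)

transcript = (
    "I spent three years in a call center handling scheduling "
    "and triage calls, and I also have phlebotomy certification."
)
keywords = {"scheduling", "triage", "phlebotomy", "billing"}
counts = find_keywords(transcript, keywords)
# Each of "scheduling", "triage" and "phlebotomy" is counted once;
# "billing" never appears, so it is absent from the counter.
```

A production system would work from an automatic speech-recognition transcript and likely use stemming or embeddings rather than exact matches, but the core idea of surfacing role-relevant terms is the same.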

But another local HR-tech firm, Indianapolis-based Pillar, has built AI into its core product—a tool that helps customers gain insights from interviews conducted in person, by phone or online.

“Pillar is built with AI at its core,” said CEO Mark Simpson, who added that he has used some form of AI or machine learning for 17 years at the three companies he’s founded, including Pillar.

The growth of such tools and the increasing sophistication of the technology have made AI a target for regulators, especially in hiring.

Last month, the U.S. Equal Employment Opportunity Commission issued guidance clarifying that Title VII of the Civil Rights Act—the law prohibiting employment discrimination based on race, color, national origin, religion or sex—also applies to automated and AI-powered human resources tools.

“Without proper safeguards, their use may run the risk of violating existing civil rights laws,” the EEOC’s May 18 statement said.

And on July 5, New York City will begin enforcing what’s believed to be the nation’s first ordinance regulating the use of AI-powered hiring tools. Under the ordinance, employers who use such tools must audit the tools annually for bias.
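One statistic commonly used in audits of this kind is the "impact ratio": each group's selection rate divided by the highest group's selection rate, with ratios below 0.8 flagging potential adverse impact under the EEOC's long-standing four-fifths rule of thumb. A minimal sketch, using made-up numbers rather than data from any real audit:

```python
# Minimal sketch of an adverse-impact check of the kind a bias audit
# might include. The applicant and selection counts are invented.

def impact_ratios(selected: dict[str, int], applicants: dict[str, int]) -> dict[str, float]:
    """Each group's selection rate divided by the highest group's rate."""
    rates = {g: selected[g] / applicants[g] for g in applicants}
    best = max(rates.values())
    return {g: rates[g] / best for g in rates}

applicants = {"group_a": 200, "group_b": 150}
selected = {"group_a": 60, "group_b": 30}  # 30% vs. 20% selection rate

ratios = impact_ratios(selected, applicants)
for group, ratio in ratios.items():
    flag = "potential adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

Here group_b's ratio is 0.20 / 0.30, or about 0.67, below the 0.8 threshold. A real audit would involve far more than this single statistic, but the calculation illustrates the sort of quantitative check the ordinance contemplates.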

According to an analysis of the ordinance by San Francisco-based labor and employment law firm Littler Mendelson PC, the law likely applies only to New York City residents in New York City-based jobs, though this is not clearly stated in the statute.

Mikell, however, expects to see ordinances like New York City’s become more widely adopted.

“The anticipation is that it will start to go to more states over time and potentially to the federal level,” he said.

Unintentional bias

Keeping track of such rules will matter more and more, both for companies that create HR tools and for the companies that use them.

Qualifi will soon roll out an AI-powered feature that allows customers to transform text into speech. The tool is designed to help recruiters more easily generate interview questions and personalize conversations at scale, such as by using the job applicant's name during an automated phone interview.

But Mikell said the company is mindful of potential pitfalls—namely, unintentional bias that AI might introduce into the process. In fact, one of Qualifi’s pitches to customers is that its automated interview process can help reduce the unconscious human bias that can affect hiring decisions.

Being unbiased “is such an important factor when people are looking at jobs,” Mikell said. “They want a fair and equal chance to get that job.”

In one high-profile example of what can go wrong, Amazon.com Inc. in 2017 scrapped an AI-powered recruiting tool that scored job candidates based on their resumes. Amazon discovered the technology was penalizing resumes that included the word “women’s,” but after fixing that problem, Amazon lost confidence that it could eliminate other potential discrimination from the system, Reuters reported.

Of course, AI has improved since then. ChatGPT and similar products, such as Google's Bard, are what's known as generative AI: trained on huge pools of data and capable of creating new writing or images in response to prompts. But while more technically advanced than previous generations of AI, the new tools still have limitations.

Karl Ahlrichs, a senior consultant at Indianapolis-based Gregory & Appel Insurance, said it's hard to know how many employers are actively using AI for one or more of their human resources functions. If pressed, he said, he'd estimate no more than 25%.

Ahlrichs has experience in a variety of areas, including human resources, insurance, content creation and computer programming.

Job applicants have already started using AI tools to give them a leg up on the job-search process. They might use ChatGPT to complete writing assignments or other skills tests, Ahlrichs said. But the situation is different on the other side of the interview table.

Compared with other fields, Ahlrichs said, people in human resources tend to be cautious and slow to make decisions. “They need to follow all the regulations—and they need to make good decisions, not fast decisions.”

Pillar, for example, is not intended to make the hiring decision. “We never use AI to judge a candidate,” said Simpson, the CEO.

For instance, Pillar’s technology includes AI tools that can help interviewers develop good questions to ask candidates for different jobs. But it doesn’t evaluate how well the candidate answers those questions, Simpson said.

The limits of AI

Pillar is continually evaluating new ways it might use AI in its software platform, Simpson said, but in some cases, the company is waiting for the technology to catch up.

“Some of it is just not ready for our enterprise customers,” he said.

For example, if a customer is interviewing candidates for a sales position, the customer can now use Pillar’s software platform before the interview to generate a list of questions specific to that job. In the future, Simpson said, Pillar could add a feature that uses generative AI to produce real-time questions related to specific things said in the interview.

The reason Pillar hasn’t yet rolled out such a feature is that, in its current iterations, AI would likely “surface questions that you definitely don’t want people to ask,” Simpson said. “We need to work out a way to train those biases out of the results that we’re given.”

Simpson said he thinks that, within a year or so, AI might become sufficiently advanced to make this type of tool feasible.

OpenAI acknowledges that its technology, while improving, has its limitations.

In introducing its most recent generative AI tool, GPT-4, OpenAI said in March that the model was trained on data, including publicly available internet data, that includes “correct and incorrect solutions to math problems, weak and strong reasoning, self-contradictory and consistent statements, and presenting a great variety of ideologies and ideas.”

For this reason, OpenAI said, the base model of GPT-4 might respond to prompts “in a wide variety of ways that might be far from a user’s intent.”

Getting started

Other local firms are just beginning to venture into AI.

Indianapolis-based Fullstack, which offers outsourced human resources services, is in the process of implementing its first AI tool—something that will handle administrative tasks such as logging employee terminations and address-change information for the firm’s various clients.

Ann Brandon, Fullstack’s director of operations, said the firm views AI as a tool to help it achieve its growth goals. “We were a startup, and now we’re in scale-up mode,” she said.

Fullstack’s staff of 14 provides HR services for clients with a combined total of nearly 1,200 employees, Brandon said, and using the AI automation tool for routine tasks will free up that staff to concentrate on more substantive work.

Brandon said it’s too early to say how else Fullstack might implement AI beyond the automation tool it expects to start using within the next month.

“I don’t think at this point we’re thinking beyond just some automations to make us more efficient,” she said.

Indianapolis attorney Phillip Jones, who specializes in employment law as an associate with the firm Ogletree Deakins, said AI has many potential uses in human resources, from analyzing job candidates to determining the best approach for successfully recruiting those candidates.

But he also offered a couple of caveats.

Companies should be wary, Jones said, of entering sensitive data such as personal employee information into a public AI tool, where it could be used to generate responses for other users. “You have all this employee data,” he said. “You don’t want to be inputting it into different things.”

And companies that use third-party AI tools should make sure they understand the algorithm those tools use, Jones said. If the use of AI tools results in bias that violates employment laws, for instance, the company using those tools might be on the hook.

And perhaps the main danger, Jones said, is that generative AI is prone to what are known as hallucinations—generating confident-sounding responses that are factually incorrect.

As part of a recent presentation about AI, Jones said, he asked ChatGPT to summarize the most recent Indiana court case on the use of AI in the workplace. The tool produced details about a case that Jones couldn’t find himself because it didn’t exist. Jones said he had to challenge ChatGPT twice before the tool acknowledged that its previous answers had been incorrect.

This experience shows that blindly relying on ChatGPT is a big mistake, Jones said. “You need to question it, for sure.”•
