Sean Fahey (CEO of VidCruiter) hosted a core session at HR Tech Conference 2023 on the future of interviewing, focusing on advancements in AI, its regulatory and ethical limitations, and leading hiring technologies such as workflow automation and structured interviewing.
Unum uses hiring tools such as structured interviewing, video interviewing, and workflow automation to create a highly efficient, engaging, and equitable hiring process.
Holly Haynes (AVP of Talent Acquisition, Unum) explains the benefits of using currently available recruitment tech such as VidCruiter’s structured interviewing, engaging interview processes, and custom automation.
Unum hires between 2,000 and 3,000 people a year using VidCruiter. “VidCruiter keeps us efficient, it’s also fair and equitable. And we can see from a continuous improvement perspective, are we getting hung up in the process? And if we are, we can make some changes.”
Thousands of transactions take place to reach those 2,000–3,000 hires every year. Tools such as automated workflows and pre-recorded video interviews can significantly reduce administrative tasks and allow recruiters to focus on the candidates.
“We’ll have saved 15,000 hours of recruiter time by the end of this year through having this technology. Now our recruiters are really able to act like talent advisors.”
Unum’s 90-day attrition is down by 40% over the last year. Employees are engaged and happy to be working there, which directly benefits their external customers, too.
AI hiring tools aren’t scientifically backed yet. HR teams should acknowledge AI’s current capabilities and prepare for the future.
Olivia Gambelin (AI Ethicist and CEO, Ethical Intelligence) asks the audience, “Who’s excited about AI and what it could bring to the interviewing process?” Several members raise their hands, and a man in the front row signals “so-so”. She then asks who is a bit frightened of AI. More raise their hands, and the man up front confidently raises his up high.
Olivia shares how AI tools provide new ways of testing candidates’ fit, such as games that test cognitive capacity. However, these tools are still in their infancy. “Most of these AI tools are not actually scientifically backed. They're all still in the innovation phase.”
“You have to be incredibly critical when it comes to analyzing these new styles of interviewing. With this new AI innovation, you may be looking at the correlation instead of the causation.” Additionally, a lot of AI tools collect data on ‘good’ job candidates only. To hire from a diverse demographic, teams need data on the entire candidate pool to learn what makes an unfit candidate versus a good fit candidate.
“It's kind of like a Wild West in terms of regulation and AI. We're just starting to get these regulations and policies coming into place. The US is behind. We are all looking towards Europe which is leading this movement.”
The EU AI Act will be enforced by 2026. Organizations with ties to the EU need to ensure that their AI vendors have been certified or they’ll risk a fine, and for a high-risk domain such as HR, that fine could easily reach millions of dollars. We’re already seeing the first US regulations come into effect in places like New York City, Maryland, California, and New Jersey.
“Start preparing for the regulations. It's so much easier to do while we still have a little bit of wiggle room than in 2026 when you're facing a substantial fine of over 35 million dollars.”
Having confidence in using AI for hiring can be difficult because of reputational and legal risks, but also because you might simply be missing great candidates.
Andrew Buzzell (Director of Responsible AI, VidCruiter) shares how VidCruiter looked into AI capabilities back in 2014, only to realize after a lot of research that AI had a long way to go before it could be used ethically.
“People from different backgrounds could describe exactly the same experience but in very different ways, which will change the way the AI system assesses.” Research showed that using “we” rather than “I” when discussing a job project could lead to a lower evaluation.
It’s not easy to identify bias within the algorithms through an audit, especially for the many organizations using a vendor for their AI. “It can be very difficult to get inside these systems and know how they’re coming up with their results.” Some developers wrongly try to use the 4/5 rule to detect bias in their processes, but the rule is meant as an indicator that further investigation is needed. Recent court cases have determined legal responsibility with as little as 1% disparity affecting protected groups – far below the threshold given in the 4/5 rule.
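The 4/5 rule mentioned above is a simple ratio check: compare each group’s selection rate against the highest group’s rate, and treat a ratio below 0.8 as a signal that further investigation is needed. A minimal Python sketch, with hypothetical group names and numbers, might look like this:

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants from a group who were selected."""
    return selected / applicants

def four_fifths_check(groups: dict) -> tuple:
    """Apply the 4/5 (four-fifths) rule.

    groups maps a group name to (selected, applicants).
    Returns (impact_ratio, flagged): the ratio of the lowest to the
    highest selection rate, and whether it falls below 0.8 --
    an indicator that further investigation is warranted, not
    proof of bias on its own.
    """
    rates = {g: selection_rate(s, a) for g, (s, a) in groups.items()}
    impact_ratio = min(rates.values()) / max(rates.values())
    return impact_ratio, impact_ratio < 0.8

# Hypothetical applicant pool (illustrative numbers only)
groups = {
    "group_a": (50, 100),  # 50% selection rate
    "group_b": (30, 100),  # 30% selection rate
}
ratio, flagged = four_fifths_check(groups)
print(f"impact ratio = {ratio:.2f}, needs investigation = {flagged}")
```

Note that, as the panel stressed, passing this check does not establish fairness: recent cases have found liability at disparities far smaller than the 4/5 threshold, so the ratio is a screening heuristic rather than a compliance guarantee.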
In Andrew’s expert opinion, AI cannot fairly evaluate candidates at this time. However, he is optimistic there may be low-risk opportunities that would allow AI to improve the hiring process.
“In almost all cases, the AI systems are pointed to the candidate. What if it was turned around? What if we look at using AI to improve the interview process to make structured interviewing more accessible?”
“At first, we wanted to be early adopters, but we felt AI wasn’t ready. Now, we have regulatory responses that give us confidence in our decisions. We have a body of research that helps us make decisions about which kinds of applications of AI can be explored in the HR context, in terms of compliance, as well as fairness and validity.”
Thanks to our panelists for sharing their views and expertise. We’ll see you at HR Tech 2024!