Could AI Help Remove Bias from Government Hiring Processes?

Shifts in how we think about work in a post-COVID-19 world could create an opening for fairer hiring through asynchronous interviews that use artificial intelligence to reduce recruiting bias.

[Image: computer code on a blue background. Shutterstock/PabloLagarto]
It is not a simple proposition, but the pandemic's aftermath, coupled with the novel use of some not-so-new technology, could provide an interesting test bed for hiring the right person for the right job as public agencies backfill open positions, with the added bonus of mitigating implicit bias against job candidates.

As with most inflection points, the moment has a number of moving parts. Employers, including public agencies, are scrambling to figure out which flavor of hybrid working setup will meet with the least resistance from employees while supporting basic operations. The best-case scenario would put public employees closer to the people who need government services; that probably is not the old office, but it probably is not the employees’ homes either.

According to a Korn Ferry estimate, up to 40 percent of employees are contemplating a career change prompted at least in part by their pandemic experiences. Put simply, they want something else out of life, and the forced work-from-home (WFH) lockdowns gave many the sense that their work is not tied to a specific geography. (Through legislation and administrative rules, public employees may be more limited than their private-sector or nonprofit counterparts, but cubicles are no longer the only answer for most employees.)

The coincidence of last year’s WFH lockdowns and a racial reckoning that played out in many parts of the country may bring with it an opportunity to do things differently as government recruits for its next iteration as a hybrid workforce. Many public agencies are working aggressively to refresh policies to make them more welcoming and inclusive of underrepresented groups.

Just before COVID hit, no less an authority than the Harvard Business Review ran a piece called “Your Approach to Hiring Is All Wrong,” which detailed employers’ fears and employees’ frustrations about how the next hire gets made. The discussion included the need to clean up the hiring funnel so that it produces fewer, better-suited candidates who fit both the role and the organization.

HBR quotes the editor of a newsletter who claims that “companies get five to seven pitches every day — almost all of them about hiring — from vendors using data science to address HR issues.”

As the keeper of the GovTech100 inventory of startups, our sister site Government Technology has some visibility into this generation of HR technology, which includes startups that support asynchronous interviewing, in which applicants respond to interview questions on their own time. AI and machine learning can now not only administer the questions, but also score the responses. The approach may have considerable merit in making the hiring process fairer and more inclusive.

First, it takes the bias out of the mechanics of panel interviews. Rather than hoping that interview panel members approach each candidate with the same energy and attitude, and ask the same questions in the same order (with the same social prompts), a screening driven by an automated process can deliver a consistent experience. It can then transcribe and score the responses without human partiality. The prospective employer can set time limits for each response and give candidates the option of multiple responses, or not, but all candidates get the same shot.
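To make those mechanics concrete, here is a minimal sketch in Python of what a consistent automated screening pass might look like. Everything in it is illustrative: the questions, the per-question time limit and the keyword-overlap scorer are stand-ins for the trained models and validated rubrics a real vendor would use.

```python
# A minimal sketch of a consistent automated screening pass. The questions,
# time limits and keyword-overlap scorer are illustrative stand-ins, not any
# vendor's actual model or rubric.
import re
from dataclasses import dataclass


@dataclass(frozen=True)
class Question:
    prompt: str
    time_limit_seconds: int   # the same limit applies to every candidate
    rubric_keywords: tuple    # stand-in for a trained scoring model


QUESTIONS = (
    Question("Describe a time you improved a process for the public.", 120,
             ("process", "residents", "outcome")),
    Question("How do you prioritize competing requests?", 120,
             ("deadline", "stakeholder", "tradeoff")),
)


def score_response(question: Question, transcript: str) -> float:
    """Score one transcribed answer against a fixed rubric (keyword overlap here)."""
    words = set(re.findall(r"[a-z]+", transcript.lower()))
    hits = sum(1 for kw in question.rubric_keywords if kw in words)
    return hits / len(question.rubric_keywords)


def screen_candidate(transcripts: list[str]) -> float:
    """Every candidate gets the same questions, in the same order, scored the same way."""
    if len(transcripts) != len(QUESTIONS):
        raise ValueError("one transcript is required per question")
    scores = [score_response(q, t) for q, t in zip(QUESTIONS, transcripts)]
    return sum(scores) / len(scores)


print(screen_candidate([
    "We redesigned the permit process so residents saw a faster outcome.",
    "I weigh each deadline and stakeholder need, then make the tradeoff explicit.",
]))
```

Because the same questions, limits and scoring function apply to every applicant, the variation that remains comes from the answers themselves rather than from who happened to be on the panel that day.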

Second, and perhaps more significantly, this technology can also be configured to mask cultural cues, including names, that could disadvantage candidates with certain gender, BIPOC (Black, Indigenous, people of color) or AAPI (Asian American Pacific Islander) characteristics.
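As a rough illustration of that masking step (not any particular vendor's method), the sketch below redacts a candidate's name, email address and phone number from a transcript before it reaches reviewers or a scoring model. Production systems would typically layer on trained named-entity recognition rather than relying on simple patterns like these.

```python
# Illustrative sketch of masking identifying cues before a transcript reaches
# reviewers or a scoring model. The name comes from the application form;
# real systems would usually add trained named-entity recognition.
import re


def mask_identifiers(transcript: str, known_names: list[str]) -> str:
    masked = transcript
    for name in known_names:
        masked = re.sub(re.escape(name), "[CANDIDATE]", masked, flags=re.IGNORECASE)
    # Also drop contact details that can carry cultural or demographic cues.
    masked = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[EMAIL]", masked)
    masked = re.sub(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b", "[PHONE]", masked)
    return masked


print(mask_identifiers(
    "Hi, I'm Amara Okafor; you can reach me at amara.okafor@example.com.",
    ["Amara Okafor"],
))
```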

It is incumbent on human resource professionals to carefully develop appropriate questions for this process, and on data scientists to catch biased algorithms; both are vital to giving everyone a fair shot.
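One concrete check data scientists can run, sketched below with made-up numbers, is to compare selection rates across applicant groups and flag any group whose rate falls below four-fifths of the highest group's rate, echoing the EEOC's long-standing "four-fifths" rule of thumb for spotting adverse impact.

```python
# Illustrative fairness check on made-up data: compare selection rates across
# applicant groups and flag any group whose rate falls below four-fifths of
# the highest rate, echoing the EEOC's "four-fifths" rule of thumb.
from collections import defaultdict


def selection_rates(outcomes: list[tuple[str, bool]]) -> dict[str, float]:
    """outcomes is a list of (group label, advanced-to-next-round?) pairs."""
    totals: dict[str, int] = defaultdict(int)
    passes: dict[str, int] = defaultdict(int)
    for group, advanced in outcomes:
        totals[group] += 1
        passes[group] += int(advanced)
    return {group: passes[group] / totals[group] for group in totals}


def adverse_impact_flags(rates: dict[str, float], threshold: float = 0.8) -> dict[str, bool]:
    """Flag groups selected at less than `threshold` times the best-off group's rate."""
    best = max(rates.values())
    return {group: (rate / best) < threshold for group, rate in rates.items()}


rates = selection_rates([
    ("Group A", True), ("Group A", True), ("Group A", False),
    ("Group B", True), ("Group B", False), ("Group B", False),
])
print(rates)                        # roughly {'Group A': 0.67, 'Group B': 0.33}
print(adverse_impact_flags(rates))  # Group B falls below the four-fifths line
```

A flag from a check like this does not prove bias on its own, but it tells the hiring team exactly where to look at the questions, the scoring model and the data behind them.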

At this moment in the country’s national life, it is important for organizations to act in good faith. One way to do that is to make hiring processes less of a black box affair and show our work. That is perhaps nowhere truer than in government. Technology can help.

Ironically, one of the best shots we have at a fairer, more diverse, and inclusive workforce is to take the human out of human resources.



Government Technology is a sister site to Governing. Both are divisions of e.Republic.
Paul W. Taylor is the Senior Editor of e.Republic Editorial and of its flagship titles, Government Technology and Governing.