All the Ways Hiring Algorithms Can Introduce Bias
Miranda Bogen
Article

In Harvard Business Review, Miranda explains what we mean when we talk about “hiring algorithms” and why predictive hiring technology is far more likely to erode equity than it is to promote it.
Understanding bias in hiring algorithms, and ways to mitigate it, requires us to explore how predictive technologies work at each step of the hiring process. Though they commonly share a backbone of machine learning, tools used earlier in the process can be fundamentally different from those used later on. Even tools that appear to perform the same task may rely on completely different types of data, or present predictions in substantially different ways. An analysis of predictive tools across the hiring process helps to clarify just what “hiring algorithms” do, and where and how bias can enter the process. Unfortunately, most hiring algorithms will drift toward bias by default. While their potential to help reduce interpersonal bias shouldn’t be discounted, only tools that proactively tackle deeper disparities will offer any hope that predictive technology can help promote equity, rather than erode it.
Related Work
We urged the Subcommittee to ensure that hiring technologies are developed and used in ways that respect people’s civil rights, and offered recommendations concerning transparency and oversight.
Labor and Employment

Without active measures to mitigate them, bias will arise in predictive hiring tools by default. This report describes popular tools that many employers currently use, explores how these tools affect equity throughout the hiring process, and offers reflections and recommendations on where we go from here.
Labor and Employment

After Google’s announcement that it will ban ads for payday loans, Aaron explains why this was a good call.
Credit and Finance