Don’t Rely on Data Science and Machine Learning to Hire (Yet)
In 2003, a team of researchers at the National Bureau of Economic Research ran a field experiment to measure racial discrimination in the labor market. They responded to help-wanted ads in Boston and Chicago newspapers with fictitious resumes. To manipulate the perception of race, each resume was assigned either a very African-American-sounding name or a very White-sounding name. The results showed significant discrimination against African-American names: White names received 50 percent more callbacks for interviews. (Marianne Bertrand, Sendhil Mullainathan, 2003) When unmeasurable factors like human bias and prejudice pervade the underlying data, even seemingly straightforward algorithms become biased, producing inaccurate and misleading machine learning predictions and ultimately narrowing both diversity and talent.
Short on time and budget, companies now have to find a way to win the war for talent. Research from the McKinsey Global Institute (MGI) suggests that by 2020, the world could have 40 million too few college-educated workers. (Richard Dobbs, Susan Lund, and Anu Madgavkar, 2012) Moreover, according to the ManpowerGroup survey, 54 percent of companies reported skill shortages in 2020, with businesses in 36 of 44 countries finding it harder to attract skilled talent than in 2018. (ManpowerGroup Employment Outlook Survey, 2020) With job markets becoming ever more competitive, tech companies are turning to technologies like artificial intelligence and machine learning to hire the best candidates.
“To change the way the companies have been hiring, companies are increasingly turning to HR-tech vendors, driving the HR-tech market to be worth at least USD$3.6 billion by 2024.” (Driving the HR-tech market to be worth at least USD$3.6 billion by 2024, 2019) This has spurred rapid development: companies increasingly use artificial intelligence and data science to drive their hiring decisions.
The algorithms developed by data scientists are meant to avoid human bias, yet in practice they are often biased themselves. The premise is that humans are unjust and judgmental, which makes hiring decisions partial. Algorithms are supposed to strip away those clouded judgments, but they can end up amplifying the very bias they were built to remove. Amazon was both victim and culprit of this. The company relied on an internal ‘secret’ hiring algorithm, created in 2014, to assess candidates based on their skills relative to the company’s top performers. Three years later, Amazon discovered that the model favored men: the company’s top performers were disproportionately male, which could itself be a result of inherent gender bias. (Maude Lavanchy, 2018)
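To see how this happens mechanically, here is a minimal sketch, using entirely fabricated resumes and a naive frequency-based scorer (not Amazon’s actual system): a model fitted to historically skewed hire/no-hire labels learns to penalize tokens that merely correlate with gender.

```python
from collections import defaultdict

# Hypothetical historical data: (resume tokens, was the person hired?).
# Past hires skew male, so male-correlated tokens look "predictive".
history = [
    ({"python", "mens_rugby"}, True),
    ({"java", "mens_rugby"}, True),
    ({"python", "chess"}, True),
    ({"python", "womens_chess_club"}, False),
    ({"java", "womens_chess_club"}, False),
]

def token_weights(history):
    """Weight each token by its historical hire rate."""
    seen, hired = defaultdict(int), defaultdict(int)
    for tokens, was_hired in history:
        for t in tokens:
            seen[t] += 1
            hired[t] += was_hired
    return {t: hired[t] / seen[t] for t in seen}

def score(tokens, weights):
    """Average weight of a resume's tokens (unseen tokens count 0.5)."""
    return sum(weights.get(t, 0.5) for t in tokens) / len(tokens)

weights = token_weights(history)
# Two identical skill sets; only the gender-correlated token differs.
a = score({"python", "mens_rugby"}, weights)        # ~0.83
b = score({"python", "womens_chess_club"}, weights)  # ~0.33
```

Nothing in the data says women are worse engineers; the scorer simply mirrors who was hired before, which is exactly the failure mode Amazon encountered.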
Never Fully Trust Algorithmic Recommendations
Companies often fall into the trap of letting these algorithms decide whom to hire. The algorithms are written by people and encode human mentalities and perceptions. This brings us back to our original premise: hiring algorithms are too opaque to tell whether they are fair. (Louis DiPietro, 2019)
While the technology has a bright future, hiring algorithms are not yet the solution companies are hoping for, and they must be used with care.
Humans Must Always be Behind the Wheel
Tesla introduced its Autopilot feature in 2014, meaning its cars “can automatically steer, brake and accelerate within its lanes.” Recent development has allowed Autopilot to navigate complex, tighter roads as well. However, it is not full autonomy. Tesla clearly states: “Current Autopilot features require active driver supervision and do not make the vehicle autonomous.” (Future of Driving)
Making the hiring process autonomous will likewise take years, so we need to keep working in the right direction by reviewing the algorithms rigorously. They should be treated as an aid, not a cure.
Data Needs to Have Context
To create a good algorithm, data scientists must understand its variables in the context of employment. Every algorithm is different, and data alone, without context, is not enough to predict a person’s potential at work. To compensate, hiring algorithms often push for ever more data.
Today, companies need to harness the power of algorithms and data-driven decision-making. To move fast and hire faster, they must build algorithms that are unbiased and supplied with all the data needed to evaluate a candidate. The algorithms must also be trustworthy, which demands a serious investment of effort and money. Companies should consider a few factors when using an algorithm to make hiring decisions:
Reach company-wide agreement on what is considered fair in the hiring process.
Candidates, hiring managers, team leaders, and the AI team all have different views on what counts as a fair hiring decision. Each point of view makes sense in its own way, so getting clarity on the process can help avoid poor decision-making.
Use algorithms as a guide, not as a magic pill.
Algorithms are not inherently neutral. They have human opinion encoded into them, making each one a digital human with a single line of thinking: how well does this person match my criteria? Over-relying on algorithms can make the process worse.
Algorithms may lead to similarity.
Suppose your algorithm hires candidates X and Y into the company. They were selected by the same criteria, which leads to a lack of diversity, not just of race and gender but also of thought and leadership. Will algorithms eventually rule out human bias? The answer may be yes, but it is too early to claim. Perhaps in time we will see a day when a company can focus on the details, fixing bias in hiring with machine learning, as every HR professional yearns to do. (Andy Chan)
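The homogeneity risk can be illustrated with a toy example, using entirely made-up candidates and a single hypothetical “fit” criterion: ranking everyone by one score and taking the top k reliably selects people who resemble each other.

```python
# Hypothetical candidate pool scored on a single criterion ("fit").
candidates = [
    {"name": "A", "fit": 0.90, "background": "finance"},
    {"name": "B", "fit": 0.88, "background": "finance"},
    {"name": "C", "fit": 0.85, "background": "finance"},
    {"name": "D", "fit": 0.60, "background": "design"},
    {"name": "E", "fit": 0.50, "background": "research"},
]

def top_k(pool, k):
    """Rank by the single criterion and keep the best k."""
    return sorted(pool, key=lambda c: c["fit"], reverse=True)[:k]

hires = top_k(candidates, 3)
backgrounds = {c["background"] for c in hires}
# One shared background: optimizing a single score selects for
# sameness, not diversity of thought.
```

A human reviewer looking at the same pool might deliberately trade a few points of “fit” for a different background; the ranking alone never will.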
Driving the HR-tech market to be worth at least USD$3.6 billion by 2024. (2019, June 20).
ManpowerGroup Employment Outlook Survey. (2020). Milwaukee: The Economic Times.
Andy Chan. (n.d.). Don’t Rely on Data Science and Machine Learning To Hire (Yet). Towards Data Science, Medium.
Future of Driving. (n.d.). Retrieved from tesla.com: https://www.tesla.com/autopilot?redirect=no
Louis DiPietro. (2019, November 20). Are hiring algorithms fair? They’re too opaque to tell, study finds.
Marianne Bertrand, Sendhil Mullainathan. (2003, July). Are Emily and Greg More Employable than Lakisha and Jamal? A Field Experiment on Labor Market Discrimination. National Bureau of Economic Research.
Maude Lavanchy. (2018, November 1). Amazon’s sexist hiring algorithm could still be better than a human.
Richard Dobbs, Susan Lund, and Anu Madgavkar. (2012, November). Talent tensions ahead: A CEO briefing. McKinsey Quarterly.
© 2020 Kairavi Shah