Is Corporate America Making a Mistake by Letting AI Take Over the Hiring Process?

AI is clearly the greatest technological advancement since the introduction of the internet. In the race to modernize and streamline operations, corporate America has embraced AI not only as a tool but as a centerpiece of many of its business units. Human resources, and the screening and hiring process in particular, is an area leaning heavily on AI management. From scanning resumes and handling job-seeker accounts to conducting virtual interviews, AI is now shaping the collective destinies of millions of job seekers. The promise for corporate America is alluring: efficiency, fairness, and cost savings. Yet those of us on the outside, watching this hard shift, are asking the same troubling question: is this progress or peril?
The Continuing Pull of Automation
It's no secret that posting a job opening online is nothing short of overwhelming. Dozens, sometimes thousands, of applications can come in through various channels in a short span of time. For decades, HR professionals and recruiters have been buried by the sheer volume of applications. With the advent of AI, and the emergence of dozens of companies offering to manage HR functions through AI platforms, switching from the old way to the new seemed like the perfect solution. Promises of speed, impartiality, and limitless scale sounded like magic to corporate America. Early adopters such as Unilever and Hilton quickly boasted of their success using AI to screen candidates faster and identify the best fits.
In what seemed like a snap of the fingers, America went from reading resumes by hand to handing the task off to AI. According to the Equal Employment Opportunity Commission, over 80% of U.S. employers now use some form of automation in hiring. And why not, one might ask. Powerful algorithms quickly analyze resumes, rank candidates, and even conduct video interviews using facial and vocal analysis to gauge traits like confidence or enthusiasm.
From the boardroom, the upside of AI is undeniable, and it appeals to the cost-conscious. A well-trained AI program can do things humans could easily miss, such as identifying transferable skills, checking social media histories, and running deep searches on a candidate's internet presence. In theory, this kind of power looks like a game-changer, giving corporations a clear upper hand. In practice, however, the results so far have been less than perfect.
When the Machines Get It Wrong
Emerging technologies have flaws, regardless of how much work is done before implementation. These flaws are often exposed in the early stages and leave developers scrambling to apply band-aids so their projects aren't sent to the scrap heap. But when the flaw is exposed at one of the largest companies on earth, the stakes are too high to measure. Amazon, the internet giant, developed an internal AI recruiting tool to evaluate applicants, only to discover in short order that it had learned to discriminate against women. How could this happen, one must ask. We must remember that AI models learn from data. In this case, the algorithm was trained on past hiring data dominated by male engineers, which led the program to downgrade resumes that included any reference to women. Resume entries such as “women’s chess club” or “women’s coding society” hurt an applicant's chances rather than helping them. Once the flaw was identified, Amazon scrapped the project completely.
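To see how that kind of failure happens mechanically, here is a minimal, hypothetical sketch in Python using scikit-learn. It is not Amazon's system, and every resume, label, and token in it is invented for illustration; the point is simply that a model fit to biased past hiring decisions reproduces that bias as a negative weight on women-associated terms.

```python
# Hypothetical illustration only: a tiny resume screener trained on
# synthetic, skewed "historical" hiring outcomes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Synthetic past decisions: resumes mentioning women's groups were rejected,
# mirroring a male-dominated hiring history.
resumes = [
    "software engineer python java chess club captain",
    "backend developer java systems rugby team captain",
    "software engineer python women's chess club captain",
    "developer python data women's coding society member",
    "software engineer java cloud hiking club",
    "engineer python machine learning women's engineering society",
]
hired = [1, 1, 0, 0, 1, 0]  # biased labels inherited from past decisions

# A standard bag-of-words + logistic regression screening pipeline.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The model learns a negative weight for the token "women", so any resume
# mentioning a women's club or society is scored lower.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print("weight for 'women':", round(weights["women"], 3))

# Two otherwise identical resumes now receive different hire scores.
pair = [
    "software engineer python chess club captain",
    "software engineer python women's chess club captain",
]
print(model.predict_proba(vectorizer.transform(pair))[:, 1])
```

No one writes a rule that says "penalize women"; the bias rides in quietly on the training data.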
The Amazon example isn't the only evidence of poor implementation. A University of Washington study found that large-language-model screening tools produced inconsistent and troubling results when screening identical resumes: outcomes shifted depending on the implied gender or race of the applicant's name, a clear sign of learned bias. Another company, Workday, Inc., was the subject of a 2024 lawsuit alleging that its AI-driven hiring software discriminated against older and disabled applicants.
What Is Lost When the Human Factor Is Factored Out
Human intuition: that powerful gut feeling that tells us when things are good or bad. It has been the deciding factor in going to war, making an investment, choosing a mate, and hiring or firing employees. AI might be able to assess education, work experience, and certain skills, but it can't truly read people. It has no mechanism for recognizing the value candidates from different cultures bring, for gauging emotional intelligence, or for spotting the spark of creativity that often defines a great hire.
Outside the corporate towers, job seekers are becoming frustrated. They resent being treated like a number, often passed over because an AI judged them unfit for a particular role. It is impersonal and often alienating to apply for job after job, hoping and praying that a computer program will somehow pick your resume for further consideration. Or worse, knowing that submitting a resume might be meaningless, and that there's a good chance it will be sent directly to a digital trash can, never to see the light of day.
So, What's the Solution?
On one hand, corporate America's embrace of AI is a strong move. We need to be honest here: AI is useful in the screening process. When used responsibly, it can enhance fairness, speed, and insight. But when it's allowed to make unexamined decisions, or worse, to replace human judgment entirely, it becomes a liability.
Hiring should not be a choice between humans and machines, but rather a collaboration. Imagine a scenario where AI handles the time-consuming and expensive parts of the process (screening, sorting, and pre-qualifying candidates) before handing the files off to a hiring manager for interviews and evaluation. At its heart, hiring is a deeply human endeavor, and intuition plays an important role, especially in key leadership positions. It's about making a connection, deciding how a candidate might fit into the culture of the business, and determining trustworthiness. At this point in the development cycle, no AI program can replicate these uniquely human functions. Companies that choose to let automation take over entirely risk eroding their employer brand and alienating the very people they hope to attract.
© 2025 Ralph Schwartz
