HOUSTON, Texas. According to QZ, when some companies built algorithms to rank resumes, the programmers quickly learned, much to their alarm, that the algorithms often ranked women's resumes lower than men's. An AI-based system is only as good as the data fed into it. If a company's existing hiring practices are biased, or if the training sample is unrepresentative, problems can arise. In industries where men are overrepresented and minorities underrepresented, an AI system combing through resumes can quickly pick up those same biases.
Ideally, an AI hiring system would work like this: employers scan in the resumes of successful, high-achieving employees, and the system analyzes those resumes for things like background, education, experience, and extracurricular activities, building a profile of what a good applicant looks like. Unfortunately, in fields like tech and the oilfield industry, certain types of workers may be overrepresented, which can bake bias into the resulting algorithm.
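To see how this can happen, consider a deliberately simplified sketch in Python. The data and the scoring rule below are hypothetical illustrations, not any real vendor's system: the program builds a "good applicant" profile by counting traits among past hires, then scores new applicants by how closely they match that profile.

    from collections import Counter

    # Hypothetical resumes of past "successful" hires. Because the existing
    # workforce skews toward one group, so does the learned profile.
    past_hires = [
        {"degree": "CS", "club": "chess"},
        {"degree": "CS", "club": "chess"},
        {"degree": "EE", "club": "chess"},
        {"degree": "CS", "club": "debate"},
    ]

    def build_profile(resumes):
        # Count how often each trait appears among past hires.
        profile = Counter()
        for resume in resumes:
            for trait in resume.items():
                profile[trait] += 1
        return profile

    def score(resume, profile, total):
        # An applicant's score is the fraction of past hires
        # sharing each of the applicant's traits.
        return sum(profile[trait] / total for trait in resume.items())

    profile = build_profile(past_hires)
    total = len(past_hires)

    # Two equally qualified applicants; only one proxy trait differs.
    print(score({"degree": "CS", "club": "chess"}, profile, total))          # 1.5
    print(score({"degree": "CS", "club": "women_in_tech"}, profile, total))  # 0.75

Nothing in the code mentions gender, yet the second applicant is penalized simply for a trait associated with an underrepresented group. Real systems are far more complex, but the failure mode is the same: a model trained on a skewed workforce learns to reward resemblance to that workforce.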
In fact, according to the MIT Technology Review, bias in algorithms is widespread. As more companies use automated processes to select new hires and even evaluate employee performance, the burden is on these companies to ensure that their hiring practices are not biased by the algorithms they use.
As the QZ report shows, these biases can lead to widespread inequality. AI is already used to decide who gets invited to interview and who gets a loan, and in some jurisdictions it even informs who goes to jail and who gets parole.
Companies have to be able to explain their hiring decisions, especially if employees or applicants allege bias. If a company's hiring tools are found to be biased, QZ reports, the company could be held legally accountable. In fact, the Equal Employment Opportunity Commission requires employers to retain the data used to make hiring decisions in case a claim arises.
Part of the problem is that many of the algorithms used to make major decisions are built on proprietary software, so the companies that use these products may not even know what is going on under the hood.
As more companies use AI to make major decisions, we may see more employees bring discrimination and bias claims on these grounds. This is still a very new area of the law, and few regulations exist to oversee AI-based decision making.
So, what can you do if you believe you were passed over for a promotion or an interview because of your gender, race, disability, sexual orientation, or religion? Moore & Associates are employment lawyers in Houston, Texas who can review your case and help you understand your rights and options. If you have a strong case, you may be entitled to financial compensation. Have questions about workplace bias? Contact Moore & Associates today.
Moore & Associates
440 Louisiana Street, Suite 675
Houston, TX 77002
713-581-9001