AI Hiring Fail: When Overfitting Leads to Bias

Just heard a wild one!

A company launched a new AI hiring tool.

Its top predictor of employee success?

Being named Jared and having played high school lacrosse. 🤯

What does that tell us about the AI?

For starters, “Jared” and “lacrosse” probably aren’t causally linked to job performance.

This is classic overfitting—where the model mistakes quirky patterns in the training data for real insight.

It’s not finding signal; it’s chasing noise.

It also points to a likely problem with the training data—maybe a small, homogeneous group of past hires.
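To make that concrete, here's a minimal sketch of the failure mode, with entirely made-up data and invented feature names (`is_named_jared`, `played_lacrosse`), assuming a scikit-learn decision tree stands in for the hiring model:

```python
# A hypothetical sketch: none of this data is real, and the feature
# names are invented purely for illustration.
from sklearn.tree import DecisionTreeClassifier

# Toy training set of past hires:
# columns = [is_named_jared, played_lacrosse, years_experience]
X_train = [
    [1, 1, 2],
    [1, 1, 7],
    [1, 1, 3],
    [0, 0, 6],
    [0, 0, 2],
]
# 1 = "successful hire". In this tiny, homogeneous sample, every
# success happens to be a lacrosse-playing Jared: coincidence,
# not causation.
y_train = [1, 1, 1, 0, 0]

model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# A well-qualified candidate with 10 years of experience, but the
# "wrong" name and sport:
print(model.predict([[0, 0, 10]]))  # -> [0], rejected on noise
```

Because the name and sport columns separate this tiny sample perfectly, the tree never has to learn anything about actual experience. That's the "chasing noise" failure mode in miniature.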

At best, this is a funny story.

At worst, it’s a biased model that could reinforce stereotypes and lock out qualified candidates.

I’m all for data-driven insights, but this is a hilarious (and slightly terrifying) reminder that AI isn’t infallible.

That’s why at FairPlay, we build bias detection solutions to ensure AI decisions are based on merit, not stereotypes.

Because let’s face it: no one should have to be a lacrosse-playing Jared to succeed!

Contact us today to see how increasing your fairness can increase your bottom line.