Algorithmic Accent Bias: Are AI Video Interview Tools Discriminating by National Origin?

Video job interviews are becoming more common. Even though applicants often dislike this type of screening, many have no choice but to complete an online video interview for a job they want. So what happens if you submit the video and never hear back? Perhaps the employer simply found a candidate more aligned with its needs. But if you have an accent, you might also start to wonder whether your speech caused you to be filtered out. The truth is that you may have been rejected, and not necessarily because of human bias. AI (artificial intelligence) video interview tools may penalize you based on how you sound rather than what you say.

This unfortunate scenario raises the question: “If AI downgraded my job application score because of my accent, is that national origin discrimination?” It could be, and that’s something you need to take seriously.

What is algorithmic accent bias?

Some companies use tools like HireVue, which use AI to score video interviews. The candidate answers questions into a webcam, and the tool then evaluates and scores the responses based on factors including word choice, tone, cadence, and facial expressions. Those speech-to-text and video-scoring algorithms often struggle to accurately interpret the speech of non-native English speakers and people with strong regional dialects.

Additionally, when the algorithm assigns a score, it compares each candidate to past successful candidates. Those candidates might be white, native English speakers. As a result, the algorithm could be discriminating unfairly against those who don’t speak in a specific way, regardless of their competency or fit for the role.

Who is being disadvantaged by these AI systems?

AI may discriminate based on many different factors. When it comes to video interview tools, groups that could be impacted include:

Immigrants and non-native English speakers

Some studies show word error rates (WER) as high as 22% for Chinese-accented speakers. Compare that to 10% for native US English speakers, and it becomes clear that these algorithms could be unfairly discriminating against people with accents.
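To make the word error rate (WER) figures above concrete, here is a minimal, purely illustrative sketch (not any vendor’s actual code) of how WER is typically computed: the word-level edit distance between what the speaker actually said and what the transcription tool heard, divided by the number of words spoken. The sample sentences are hypothetical.

```python
# Illustrative sketch: word error rate (WER) is the edit distance between
# the reference transcript and the ASR output, divided by the number of
# reference words. This is a standard definition, not a vendor's code.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

reference = "i led a team of five engineers on the migration project"
# Hypothetical mistranscription of accented speech: three words misheard
hypothesis = "i let a team of five engineer on the migration object"
print(round(wer(reference, hypothesis), 2))  # → 0.27
```

Even a few misheard words per sentence can erase the “keywords” a scoring tool is looking for, which is why a 22% WER versus 10% is a meaningful gap.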

Applicants with regional dialects

The US population speaks with a wide variety of accents. AI algorithms may be trained on more widely represented or “neutral” accents rather than regional dialects. Southern, Appalachian, African American Vernacular English (AAVE), and other speech patterns may be harder for the tool to interpret, making it possible that these applicants will be rejected based on how they talk.

Individuals with speech disabilities

Although it’s illegal to discriminate against people with disabilities, AI transcription tools may not accommodate stutters, slurred speech, or speech affected by physical or neurological conditions, putting people with certain disabilities at an unfair disadvantage in the job market. While a speech disability is not the same as an accent, it compounds the potential for the same kind of automated exclusion.

Why this may be national origin discrimination

Title VII of the Civil Rights Act prohibits employment discrimination based on national origin (as well as race, color, sex, and religion). Of course, a bot cannot have a discriminatory motive, but the law doesn’t require malicious intent: practices that have a disparate impact on a protected group can be enough. An accent or dialect bias could serve as a proxy for national origin if AI tools systematically disadvantage:

  • Foreign-born applicants.
  • Non-native English speakers.
  • Speakers of certain regional English variations.

Accent discrimination can be illegal if it interferes with hiring and isn’t job-related or based on business necessity.

The technology behind the bias

There are several reasons that AI might be unfairly biased against some candidates. When it comes to video interview analysis, the technology involved may be problematic for the following reasons:

Speech-to-text AI training

Speech-to-text tools have come a long way in recent years, but they are far from perfect. If the people who train these models don’t use a diverse range of speakers, the technology’s understanding of non-standard accents won’t improve. As a result, the AI software transcribes non-native speakers less accurately, matching fewer “keywords,” which may result in lower scores for foreign-born candidates.

Similarity-based scoring

Were the past successful applicants from the same area, with similar backgrounds to one another? The algorithm works by matching each applicant’s language to that of successful past applicants. Candidates who don’t speak like those hires, because of their national origin, can see their odds of securing a person-to-person interview drop.
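A deliberately simplified, hypothetical sketch shows why this kind of matching can go wrong: if the scorer compares a candidate’s transcript to past hires by keyword overlap, a candidate who gives the same answer, but whose speech is mistranscribed, scores lower through no fault of their own. The scoring method and sample phrases here are illustrative assumptions, not any vendor’s actual algorithm.

```python
# Illustrative sketch of similarity-based scoring (hypothetical, simplified):
# compare a candidate's transcript to transcripts of past successful hires
# using keyword overlap (Jaccard similarity over word sets).
def jaccard(a: str, b: str) -> float:
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb)

# Hypothetical transcripts of past "successful" candidates
past_hires = ["managed budget and led team", "led team through budget review"]

def score(transcript: str) -> float:
    # Average similarity to past successful candidates
    return sum(jaccard(transcript, p) for p in past_hires) / len(past_hires)

# Two candidates give essentially the same answer; one is mistranscribed
# by the speech-to-text step (hypothetical ASR errors).
accurate = "led team and managed budget"
garbled = "lead teem and managed budge"

print(score(accurate) > score(garbled))  # → True: same answer, lower score
```

The disadvantage here comes entirely from the transcription stage, before any human ever sees the answer, which is how an accent can quietly depress a candidate’s ranking.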

Opaque and unaccountable systems

Even if you suspect you were discriminated against, you might have a hard time proving it. Employers often give little or no explanation for a rejection, and they may not even know how their tools sort candidates. The decision-making of AI algorithms can be opaque. Unfortunately, that means a tool might be using criteria that don’t select the best candidates, or it might be discriminating illegally.

Real research, real risks

Research indicates that bot-based biases are real. One investigation found that speech transcription accuracy varies widely by accent; the bias is not theoretical, it’s measurable. A University of Melbourne study found that AI tools trained on US-centric datasets excluded international voices. While HR professionals raised concerns over these findings, vendors could offer no proof that their tools promote fairness.

Despite these issues, HireVue says 72% of employers used AI hiring tools in 2025. As a result, biased bots could be affecting millions of candidates across the globe, including in the US, where strict anti-discrimination laws are meant to prevent such outcomes.

Legal liability: Who’s responsible when bots are biased?

Under US law, employers are responsible, and they can be liable even if they outsource hiring to an AI vendor. The vendors who create those tools may also be liable. Importantly, what matters legally is the outcome, not how the tool reaches its decisions.

The onus is on employers to make sure they’re selecting tools that comply with laws like Title VII. Employers should be proactive about requesting the training data, auditing systems, and including human review in their processes. Transparency about the use of AI tools will also continue to be important.

What can applicants do if they suspect accent bias?

Spotting signs that you were rejected because of accent bias can be challenging. You might notice that you were rejected almost immediately after submitting your video interview. Perhaps you’ve realized that you aren’t alone, and that others with similar backgrounds are also unable to secure a face-to-face interview with a human.

You can protect yourself by keeping video interview confirmations, timestamps, and rejection emails. Reach out to the company and request feedback or transparency from the employer or vendor. If you strongly suspect something, consult with an employment attorney about whether you might have been discriminated against in violation of Title VII or the Americans with Disabilities Act.

Buckley Bala Wilson Mew LLP is watching this issue closely

Our firm has a long history of standing up for workers facing workplace discrimination. Today, we’re watching how AI is changing, and in some ways worsening, hiring biases, and we are leading the conversation on algorithmic bias and AI in employment law. We’re committed to protecting job seekers from being penalized based on their protected status, including how they speak.

If you believe your accent or speech pattern led to unfair rejection due to your national origin or a disability in an AI job interview, contact Buckley Bala Wilson Mew LLP. We’re here to help. Our attorneys offer confidential consultations for clients across Georgia.