Can a Job Ad Be Discriminatory Before You Even Apply? How Targeted Advertising and Algorithmic Steering May Violate Your Rights
Most of us are aware that advertisements are targeted at us. Sometimes it’s a brand you already like; other times it’s a similar brand that suspects you’re a likely customer. And sometimes the ads seem to be based on something less personal, such as your age, sex, or location. Targeted advertising is nothing new, and while it can feel unsettling, most of us take it for granted.
But what if those ads are for job opportunities? The same types of algorithms that target ads for products and services are also used by job platforms to determine who sees which postings. If those ads are targeted by gender, age, or other protected characteristics, the platform could be discriminating against job seekers before they ever send out a resume.
According to the Equal Employment Opportunity Commission (EEOC), this type of job ad targeting could violate laws such as Title VII and the Age Discrimination in Employment Act (ADEA). What can you do to avoid being screened out of relevant job postings before you ever see them? And how can you protect yourself if you believe algorithms are hiding ads from you because of who you are?
What is algorithmic steering in job ads?
Algorithmic steering is meant to make advertising more efficient: the idea is to get job posts in front of the people most likely to be a good fit for the role. However, algorithms on Facebook, Google, and other platforms often use broad categories to filter who sees what. They might target an age bracket such as 20 to 35, or weight interests the algorithm considers more common among men than women.
As a result, a 40-year-old woman with every skill the role requires may never see that an opening matching her interests and experience exists. Part of the problem is that she may never know she was discriminated against; she simply misses opportunities because she doesn’t fit an algorithm’s idea of the right candidate demographics.
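To make the mechanics concrete, here is a deliberately simplified sketch in Python. The rule, the interest labels, and the profile are all invented for illustration; no platform publishes its actual targeting code. It shows how one broad demographic rule can quietly hide a posting from a qualified candidate:

```python
# Illustrative only: a toy audience filter with made-up criteria,
# not any platform's actual targeting logic.
from dataclasses import dataclass

@dataclass
class Profile:
    age: int
    inferred_interests: set[str]  # interests the platform guesses from browsing behavior

def sees_job_ad(profile: Profile) -> bool:
    """Hypothetical targeting rule: show the ad only to users aged 20-35
    whose inferred interests include 'software'."""
    in_age_bracket = 20 <= profile.age <= 35
    matches_interests = "software" in profile.inferred_interests
    return in_age_bracket and matches_interests

# A 40-year-old engineer with every relevant skill never sees the posting.
candidate = Profile(age=40, inferred_interests={"software", "cloud computing"})
print(sees_job_ad(candidate))  # False -- excluded on age alone, with no notice to her
```

The candidate is filtered out before any human reviews anything, and nothing in her experience tells her the posting ever existed.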
The ProPublica exposé and EEOC rulings
The EEOC determined that several companies using Facebook’s targeted advertising to display job openings had excluded certain groups, including women and older workers. Those findings followed discrimination investigations by ProPublica and The New York Times. The EEOC concluded that the targeting violated civil rights laws and left Capital One, Edward Jones, Enterprise Holdings, and the other companies involved to choose between settling or going to court over their use of tools that discriminated based on age and gender.
This case shows how AI can steer opportunities away from certain people and eliminate job prospects before those individuals ever have a chance to apply. Following those EEOC findings, Facebook said it would change its advertising policies so that employment, credit, and housing ads could no longer target or exclude specific demographics based on discriminatory criteria.
How does this practice violate anti-discrimination laws?
Several anti-discrimination laws prohibit hiring practices based on protected characteristics. Title VII of the Civil Rights Act prohibits hiring discrimination based on sex, race, color, religion, and national origin, while the ADEA protects workers age 40 and older from discrimination based on age.
It’s important to point out that the tools involved need not intentionally discriminate for a company to face legal consequences. Capital One, for example, did not have to express a preference for younger or male workers for its job advertising to be considered a violation. The fact that a facially neutral algorithm had a discriminatory impact can be enough to support a legal claim.
Why is this so dangerous for workers?
If you’re seeking a new job, you may never know what you didn’t see. That job that would have been a perfect fit might never have popped up on your screen. It’s not like you’ll receive a notice informing you that you were excluded. And there won’t be any paper trail you can take to an attorney to prove that a company discriminated against you. That information will be hidden away in algorithms and platform metrics.
The practical effect is that older workers and women could be shut out of high-paying roles in tech, STEM, and certain trades, while minority candidates may be filtered out because of preferences or interests the algorithm disfavors. The result could be large-scale, highly efficient discrimination that reinforces existing biases or creates new ones.
Targeted ads vs. algorithmic screening
In a way, targeted ads are the front end of screening: by choosing who sees a job posting, the algorithm is also deciding who doesn’t. But even if a candidate does see the ad, the risk of algorithmic discrimination isn’t over. Resume-screening bots may then decide who gets in the door for a first interview. In other words, job seekers now face layers of automated decisions based on opaque criteria, some of which may be screening them out unfairly and illegally, as the simplified sketch below illustrates.
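As a rough illustration only, and with criteria invented for the example rather than taken from any employer’s or platform’s real system, the two layers might interact like this:

```python
# Illustrative only: a toy two-stage pipeline with invented criteria,
# showing how a candidate can be dropped before or after applying,
# with no record ever shown to them.

def ad_stage(candidate: dict) -> bool:
    """Stage 1 (hypothetical): the ad platform decides who even sees the posting."""
    return candidate["age"] <= 35

def resume_stage(candidate: dict) -> bool:
    """Stage 2 (hypothetical): a resume bot decides who gets a first interview."""
    required_keywords = {"python", "sql"}
    return required_keywords.issubset(candidate["resume_keywords"])

def outcome(candidate: dict) -> str:
    if not ad_stage(candidate):
        return "never sees the ad"      # screened out before applying
    if not resume_stage(candidate):
        return "resume filtered out"    # screened out before any human review
    return "reaches a human recruiter"

print(outcome({"age": 52, "resume_keywords": {"python", "sql", "java"}}))  # never sees the ad
```

A fully qualified applicant can be eliminated at either stage, and from the outside both rejections look identical: silence.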
Legal remedies: What are your rights?
You may be able to fight back if you’ve been discriminated against in the job market, but proving disparate impact or algorithmic discrimination can be challenging. Evidence might include screenshots showing that ads were displayed to some user profiles but not to yours. With that evidence, you may be able to file a charge with the EEOC and potentially pursue litigation based on disparate impact under Title VII or the ADEA, or denial of equal opportunity under other civil rights laws. If you believe you were discriminated against, contact an attorney to discuss your concerns and potential legal options.
Contact Buckley Bala Wilson Mew LLP
At Buckley Bala Wilson Mew LLP, we advocate for fairness, transparency, and justice in the workplace. We’re closely following changes in hiring and employment practices as technology, especially artificial intelligence, reshapes how companies find and select employees. When new forms of discrimination emerge, we need new legal strategies to fight back.
We’re ready to take the lead and protect the rights of job seekers and employees who are struggling with unfair hiring tools and algorithms. The more widespread these AI hiring practices become, the more determined we become to find ways to hold bad actors accountable.
If you suspect your job opportunities were limited by algorithmic steering or discriminatory targeting, contact our Atlanta-based team at Buckley Bala Wilson Mew LLP. We offer confidential consultations for prospective clients across Georgia.