Oakland Tech Company Hiring Discrimination Attorney

Standing up for job seekers facing AI discrimination

You spent hours perfecting your resume. You carefully crafted your cover letter to highlight your qualifications. You clicked submit on the job application, feeling confident about your chances. Then, within minutes or hours, you received an automated rejection email. No interview. No human review. Just an instant “no” generated by an algorithm you never saw.

Tech companies across the Bay Area are increasingly relying on automated systems to screen job applicants, promising efficiency and objectivity. But these tools often perpetuate and intensify discrimination based on age, race, disability and other protected characteristics. What was supposed to eliminate human bias has, in many cases, baked discrimination directly into the hiring process.

If you’ve applied for numerous jobs only to face rejection after rejection from automated systems, you may not be imagining things. Not only is discrimination in the hiring process real, but it may also be illegal.

AI Hiring Tools Are Everywhere

As of 2025, around 87 percent of companies use AI for recruitment. Major applicant tracking systems like Workday, Workable, Pinpoint ATS, Rippling and BambooHR rely heavily on AI to manage and automate the hiring process. In 2024, 492 of the Fortune 500 companies used applicant tracking systems to streamline recruitment.

Automated tools promise to help employers handle the flood of applications they receive for every open position. They scan resumes for keywords and rank candidates using algorithms, rejecting applicants who don’t meet certain criteria. Such systems have become standard practice, particularly in tech-heavy regions like the Bay Area.

Oakland’s Tech Industry and AI Hiring

Oakland has developed a growing tech sector, attracting both established companies and innovative startups. Major employers include Pandora, Marqeta and VSCO. Beyond tech companies, the city is home to employers in healthcare, professional services, transportation and other industries that are increasingly adopting AI hiring tools. Kaiser Permanente, headquartered in Oakland, employs thousands of workers.

Many of these companies are increasingly turning to AI and automated systems to manage hiring. While AI can handle hundreds of applications quickly and efficiently, it can also unfairly filter out qualified candidates.

How Do AI Systems Discriminate?

Artificial intelligence sounds neutral and objective. How could a computer program discriminate? The reality is more complicated. AI systems learn from data, and if that data reflects existing biases and inequalities, those same patterns may be perpetuated. The result may be discrimination against job seekers based on age, race, gender and other legally protected characteristics.

  • Data Bias: AI systems train on historical data, often using information about an employer’s current workforce or past successful hires. If a company’s existing employees are predominantly white, male or young, the AI may conclude that the “best” candidates should share those characteristics. The system doesn’t understand why this correlation exists. It simply identifies patterns and replicates them.

In 2024, the University of Washington examined AI resume screening systems using 500 applications that spanned nine occupations. The results were stark. Resumes with white-associated names were preferred 85 percent of the time, while resumes with female-associated names were preferred only 11 percent of the time. Some tests showed Black male candidates consistently received worse outcomes than white male candidates with identical qualifications.

  • Algorithmic Bias: Sometimes the bias comes from how developers code the algorithm itself. A developer’s own biases, whether conscious or unconscious, can become embedded in the system. For example, an AI tool designed to flag leadership skills might prioritize resumes containing terms like “club president,” “team captain” or “committee chair.” This approach may screen out candidates from less privileged backgrounds or minority groups whose leadership skills don’t fit traditional markers.

Amazon discovered this problem firsthand in 2018 when its AI recruiting tool was found to discriminate against women. The company stopped using the tool, but many other organizations continue using similar systems without recognizing the built-in bias.

  • Proxy Data Bias: Even when AI systems don’t directly use legally protected characteristics like race or age, they often rely on proxy data that correlates with these characteristics. For example, an algorithm that prioritizes candidates from Ivy League universities may systematically exclude applicants who attended historically Black colleges and universities, community colleges or state schools. The system never mentions race, but the discriminatory impact is clear.

The Workday Lawsuit Sets Important Precedent

In 2024, a job applicant filed a groundbreaking discrimination lawsuit against Workday, Inc., an HR software company. The plaintiff, a Black man over 40 with anxiety and depression, alleged that Workday’s AI-powered screening system discriminated against him based on his race, age and disabilities. He claimed he applied for at least 80 jobs over seven years and was rejected every time, often within minutes or hours of submitting his application.

Four more plaintiffs joined the lawsuit, all over age 40, with similar experiences. They described receiving automated rejections for hundreds of positions, sometimes within hours or even at odd times outside business hours, suggesting no human ever reviewed their applications. In some cases, rejection emails erroneously stated they didn’t meet the minimum requirements for roles they were clearly qualified for.

In May 2025, a California district judge granted preliminary approval for the case to proceed as a collective action, allowing other affected individuals to join the claim. The lawsuit alleges that Workday’s algorithm disproportionately disqualifies applicants over 40 from job opportunities when screening and ranking candidates. With the growing use of AI in hiring, the outcome could help define how employers use algorithms to make employment decisions.

California Law Protects Workers from Discrimination

California has some of the strongest employment discrimination protections in the country. The Fair Employment and Housing Act (FEHA) prohibits employers from discriminating against job applicants and workers based on protected characteristics, including age (40 and over), race, national origin, disability, gender and more.

Employment laws are trying to keep up with AI’s evolving role in hiring. In October 2024, California’s Civil Rights Department (CRD) issued new regulations specifically addressing discrimination through “automated decision systems.” These rules make it clear that existing antidiscrimination laws, including FEHA, apply to AI and other automated hiring tools. They are meant to prevent AI systems from reinforcing or worsening existing biases.

The regulations make employers responsible for the outcomes of their AI systems. Companies cannot hide behind technology if their tools produce discriminatory results. Employers can be held liable even if they use software from a third-party vendor, and FEHA applies whether discrimination comes from a human or an automated system.

Common Types of AI Discrimination

A 2025 AARP survey found that nearly 74 percent of older job seekers believe their age could prevent them from being hired. About 34 percent of workers over age 50 fear that AI could affect their job security, while about two-thirds said they had seen or experienced age discrimination in the workplace. AI hiring discrimination can affect anyone in a protected class, but data and lawsuits have identified several groups facing particular risk:

  • Workers over 40: Age discrimination is one of the most common forms of AI bias. Systems trained on younger workforces may view extensive experience as a negative rather than an asset. The tech industry has a median worker age of 38, compared to 43 in other sectors. As a result, it may be more prone to discrimination in AI hiring.
  • Racial minorities: Studies consistently show AI systems favor white-associated names over names belonging to Black, Latino, Asian and other minority applicants. A Northwestern University study found that employers were 36 percent more likely to call back white applicants than Black applicants and 24 percent more likely than Latino applicants, even when the resumes were identical.
  • Women: AI systems can learn gender bias from historical data showing male-dominated workplaces. For example, Amazon’s scrapped recruiting tool actively penalized resumes containing the word “women’s,” such as “women’s chess club captain.”
  • People with disabilities: Automated systems may screen out candidates with employment gaps that resulted from medical treatment or recovery. They may also discriminate against job seekers who need accommodations or reject those who disclose disabilities in their applications.

What to Do If You Suspect AI Hiring Discrimination

With AI hiring systems, proving discrimination can be challenging because the decision-making process is often opaque, even to the employers using the technology. However, California’s new rules make employers responsible for proving their AI hiring tools don’t discriminate.

  • Document everything: Start keeping detailed records of every job application. Note the company name, position, date you applied and when you received a response. Save all email communications, including rejection notices. Screenshot job postings before they’re taken down. If the rejection email provides a reason, save that information even if it seems inaccurate.
  • Pay attention to patterns: Are you being rejected by companies using the same applicant tracking system? Do rejections happen unusually quickly? This information can be valuable evidence.
  • Request information from employers: You can file a complaint with the CRD if you suspect discrimination. You can also request documentation from employers about their hiring processes. While employers may not readily volunteer information about their AI tools, an attorney can look into how the system works, what data it was trained on and whether it has been audited for bias.
  • Work with an experienced employment lawyer: AI hiring discrimination cases can be complex. The technology is new, the legal landscape is evolving and employers have significant resources to defend themselves. You need an experienced lawyer who understands both employment discrimination law and the technical aspects of how AI systems work.

Contact an Oakland Employment Attorney Today

At Erlich Law Firm, we believe in using the power of the law to right workplace injustices. If you’ve been rejected from numerous jobs and suspect an AI system discriminated against you based on your age, race, disability or another protected characteristic, we can help.

Our Oakland employment attorneys understand the complexities of AI discrimination cases. We know how to gather evidence, build strong cases and fight for the compensation you deserve. Contact us today for a free initial consultation about your AI hiring discrimination claim.


My parents were heavily involved in community organizing. Seeing neighbors and friends’ parents struggle in the workplace gave me a sense of purpose that I wanted to help others. Many employees feel helpless and powerless in the workplace, and helping them vindicate their rights is the right thing to do.

- Jason Erlich

Client testimonials
Jason took the time to explain and guide me through the challenging process, and went the extra steps in consistently providing guidance and putting my concerns and questions high on their list.

Jeff V., Oakland

Jason Erlich made me feel like I was in the right place right away. I had some serious problems with a previous employer and he took care of everything. From the start he helped with my concerns and fears going up against a big corporation.

Susan W., Pacifica

With Jason’s expertise, commitment and aggressiveness, the case is now over and it’s only been 6 months!! I would HIGHLY recommend Jason Erlich to anyone that needs an employment lawyer.

Carla, Petaluma

I can highly recommend Jason Erlich Esq. for any employment law matter. He is an outstanding lawyer, embodies a mix of honesty, knowledge, client care and tough mindset.

Roger J., Oakland

When employees' legal rights are violated, we take time to explain their legal options, listen to their goals, and aggressively argue their case until we achieve the resolution they deserve.
