In July, a state court in Washington, D.C., issued a decision that broke new ground in algorithmic accountability, building on legal doctrine dating back to the Freedom Riders to challenge racial segregation on Meta-owned social media platforms Facebook and Instagram. Almost no one noticed, but the decision provides a model for how states and advocates can use state laws to protect civil rights in the era of artificial intelligence (AI).
Court advances novel algorithmic discrimination case against Meta
Equal Rights Center v. Meta is a case about discriminatory advertising on social media. The case, led by the Lawyers’ Committee for Civil Rights Under Law, where I formerly worked, centers on the allegation that Meta’s algorithms deliver ads for colleges and universities based on race. Last year, Princeton researchers published a study showing that Meta steers ads for public colleges disproportionately to white users and ads for for-profit colleges disproportionately to Black users—even when the advertiser uses neutral targeting criteria. The study reaffirmed similar findings in other sectors: Meta does not just give advertisers tools to discriminate; its own algorithms discriminate based on race.
Meta is providing separate but equal service to its users based on their race and gender. The company claims it is just giving people the ads most relevant to them and that any discrepancies will even out at scale.
Researchers have documented Meta’s algorithmic discrimination for years. ProPublica found in 2019 that Meta delivered construction job ads to men 87% of the time, even though the company that purchased the ads did not target candidates based on gender. A construction workers union bought ads featuring women and used neutral targeting, yet Meta still delivered them to a two-thirds male audience. And Meta delivered an ad for an engineering job with no age or gender targeting to an audience that was almost 70% male, predominantly between the ages of 18 and 34.
A study by researchers at Upturn and Northeastern University found “ads for jobs in the lumber industry reach[ed] an audience that is 72% white and 90% male,” while Meta delivered supermarket cashier job ads to an 85% female audience and taxi driver ads to a 75% Black audience—despite identical targeting criteria for all three. Carnegie Mellon researchers found that for 58% of neutrally targeted credit ads, men made up a greater share of the audience than women, even though women received more ads on average and had higher Facebook usage. When University of Southern California researchers ran the same employment ads on Facebook and LinkedIn, Meta’s ad delivery showed statistically significant gender bias while LinkedIn’s did not. In 2022, the Department of Justice sued Meta for violating the Fair Housing Act after an investigation showed major racial disparities in Meta’s delivery of housing ads. Meta settled that case by agreeing to implement an intervention system designed to mitigate discriminatory steering in housing, employment, and credit ads—but notably not in other protected opportunities like education, health care, or insurance. New research shows that even that limited intervention has faults.
Rather than giving users what they purportedly want, Meta is actually stereotyping users based on their race, gender, and age, thereby reinforcing redlining, segregation, and other longstanding barriers to equal opportunity.
Applying state civil rights and consumer protection laws to algorithmic decisions
ERC v. Meta is the first case challenging Meta’s discrimination in education advertising. It is also one of the few cases focused solely on ad delivery algorithms rather than ad targeting. Notably, it combines state civil rights and consumer protection laws in ways that are both new and very, very old. In so doing, ERC provides a road map for algorithmic fairness and accountability at a time when the federal government is abandoning civil rights protections.
In its recent decision, the D.C. Superior Court characterized ERC’s allegations as “archetypal” discrimination claims and denied Meta’s motion to dismiss on all counts. ERC alleges that Facebook and Instagram are public accommodations (i.e., businesses that must serve everyone without discrimination), that receiving valuable targeted ads is part of the benefit of the bargain a user makes with Meta, and that by discriminating in ad delivery, Meta is providing different quality service for the same price. The court held that ERC had properly alleged that Meta’s conduct, if proven, would constitute both unlawful disparate treatment (i.e., intentional discrimination) and unlawful disparate impact (i.e., unjustified disparities from facially neutral policies or practices) under the D.C. Human Rights Act. In D.C., a violation of the Human Rights Act is a per se unfair trade practice under the D.C. Consumer Protection Procedures Act (D.C. CPPA).
The court also held that if Meta failed to disclose accurately how its advertising system works, that could constitute deceptive trade practices violating the D.C. CPPA in multiple ways. First, Meta engages in misrepresentation when it publicly claims it is giving users the ads that are most relevant and valuable to each individual based on their individual traits, when actually Meta is steering those ads based on stereotyped group traits. Second, Meta says its product—its advertising system—is of one quality or nature, when in fact it is of another. Third, Meta fails to disclose its racial discrimination, which is an omission of a material fact. Users might not trust that the advertisements they receive are good fits for them if they knew the ads were the product of discrimination.
A century of using consumer protection law to fight segregation
What makes the pending case even more interesting is that the court’s holding—that covert racial discrimination constitutes a deceptive trade practice—harks back to Federal Trade Commission (FTC) cases combatting housing discrimination in the 1960s. In two 1968 cases, E.G. Reinsch, Inc. and First Buckingham Community, Inc., the FTC found that landlords engaged in deceptive practices violating the FTC Act when they failed to disclose that they would not rent to Black tenants. “An advertiser’s failure to disclose material facts in circumstances where the effect of nondisclosure is to deceive a substantial segment of the public is as much deception as if it were accomplished through affirmative misrepresentations. ‘To tell less than the whole truth is a well-known method of deception,’” the FTC wrote in First Buckingham.1
In fact, there is a long history of civil rights advocates using consumer protection law to combat segregation, dating back decades before Brown v. Board of Education.
The Interstate Commerce Act (ICA), enforced by the Interstate Commerce Commission (ICC), was pivotal in the fight for racial integration, specifically the desegregation of interstate transportation. Section 3 of the ICA, described by the Supreme Court as the “unjust discrimination” provision, made it unlawful for common carriers “to make or give any undue or unreasonable preference or advantage to any particular person … or to subject any particular person … to any undue or unreasonable prejudice or disadvantage in any respect whatsoever.” The Supreme Court held as early as 1914 that the ICA was “certainly sweeping enough to embrace all the discriminations of the sort described which it was within the power of Congress to condemn.” In 1941, the Supreme Court held in Mitchell v. United States that the ICA prohibited racial segregation in railroad cars. Notably for considering algorithmic discrimination today, the court wrote, “[T]he comparative volume of traffic cannot justify the denial of a fundamental right of equality of treatment.” In other words, even if most people of a certain race are more inclined to prefer advertisement A to advertisement B, it is not permissible to treat every person of that race as preferring A to B; everyone is entitled to equal treatment as an individual, regardless of race.
The Supreme Court emphasized this point again in Henderson v. United States in 1950, another railroad desegregation case brought under the ICA. It held that “limited demand” cannot justify discrimination because “it is no answer to the particular passenger who is denied service … that, on the average, persons like him are served.” In the modern context, those deploying probabilistic algorithms that gatekeep economic opportunities cannot justify race or gender steering by claiming that relatively fewer members of the excluded class are interested in the opportunity.
The ICC’s consumer protection authorities played a central role in the Civil Rights Movement. In 1955, the ICC issued a decision prohibiting segregation on interstate buses after Sarah Keys, a Black private in the Women’s Army Corps, refused to give up her seat at the front of a bus to a white soldier.2 Then in 1960, the Supreme Court relied on the ICA again to compel desegregation of bus terminals in the landmark case Boynton v. Virginia. The Freedom Riders put Boynton to the test in 1961.
The ICA’s unjust discrimination provision was also the basis for an almost identical provision in the Communications Act of 1934, known as Section 202. The courts and the Federal Communications Commission (FCC) have applied this consumer protection language to discrimination based on race, national origin, and even income. More recently, the Biden administration repeatedly enforced the FTC Act against car dealers and pharmacies that engaged in discriminatory practices.
State attorneys general and civil rights advocates should use consumer protection law
For over a century, federal consumer protection law has prohibited discrimination. Meanwhile, state consumer protection laws routinely use the same language and statutory structure as federal law, and some state statutes even incorporate FTC Act precedent to help define their own scope. This means that as the federal government walks away from civil rights enforcement, especially against tech companies, there is an open lane for state law to pick up the slack.
State attorneys general and civil rights advocates can and should use their state consumer protection laws to investigate algorithmic discrimination as an unfair or deceptive trade practice. Moreover, an unfair trade practice claim can provide an additional avenue for liability even as some courts begin to chip away at disparate impact liability. Unfair trade practice is a fundamentally different legal doctrine with a different legal test from disparate impact, even if both can apply to the same situation.
The D.C. Superior Court’s decision in ERC v. Meta acknowledges what many have known for a long time: Algorithmic discrimination is just another form of discrimination. Employing probabilistic systems and opaque AI models does not transform racial steering into a special circumstance beyond the scope of our civil rights and consumer protection laws. As the Supreme Court held 75 years ago, it is no answer to a particular user who is denied service that, on the average, persons like him are served.
Acknowledgements and disclosures
The author formerly led the legal team at the Lawyers’ Committee for Civil Rights Under Law that investigated and developed this case.
Meta is a general, unrestricted donor to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the author and are not influenced by any donation.
Footnotes
1. After passage of the Fair Housing Act in 1968 and the creation of the Department of Housing and Urban Development, the FTC deprioritized these types of housing discrimination cases, although its legal authority under the FTC Act remained unchanged.
2. The U.S. Army website used to have an article about Pfc. Keys. The Internet Archive shows that it was removed between July 2 and July 7, 2025. A copy of that webpage is archived here.