In The Atlantic, we argue that digital platforms, which deliver vastly more ads than their newsprint predecessors, are making core civil-rights laws increasingly difficult to enforce.
Corrine and Aaron write: "Facebook must redouble its efforts to address all facets of potential discrimination in its ad system. As a part of that, the company should provide the public with far more detail about how its advertising system works, especially more information about the ads it runs, including aggregate demographic statistics about the groups that ultimately saw them. Facebook has taken some small steps in this direction, mostly limited to political ads and the commission of a civil-rights audit, but there is much more to do."
Related Work
We filed a legal brief arguing that Section 230 should not fully immunize Facebook’s Ad Platform from liability under California and D.C. law prohibiting discrimination. We describe how Facebook itself, independently of its advertisers, participates in the targeting and delivery of financial services ads based on gender and age.
Across the Field

Our empirical research showed that Facebook’s “Special Audiences” ad targeting tool can reflect demographic biases. We provide experimental proof that removing demographic features from a real-world algorithmic system’s inputs can fail to prevent biased outputs.
Our empirical research showed that Facebook’s ad delivery algorithms effectively differentiate the price of reaching a user based on their inferred political alignment with the advertised content, inhibiting political campaigns’ ability to reach voters with diverse political views.
In a paper presented at the 2020 ACM Conference on Fairness, Accountability, and Transparency, we describe how and when private companies collect or infer sensitive attribute data, such as a person’s race or ethnicity, for antidiscrimination purposes.