Testimony on DC's Stop Discrimination by Algorithms Act of 2021
Emily Paul, Natasha Duarte, and Urmila Janardan
We submitted the following testimony to DC Council's Committee on Government Operations and Facilities regarding B24-0558, the Stop Discrimination by Algorithms Act of 2021.
Chair White and members of the Committee on Government Operations and Facilities,
Thank you for the opportunity to testify on the Stop Discrimination by Algorithms Act (SDAA). This bill represents a positive step toward acknowledging and addressing technology’s role in determining DC residents’ access to basic economic needs and opportunities. Our testimony provides some concrete examples of discrimination we believe the Council must address — through the SDAA and other legislation.
Upturn is a DC-based research and advocacy organization whose mission is to advance justice in the design, governance, and use of technology. We study and challenge the systems that mediate people’s access to essential opportunities, like housing, jobs, and health care. Our team includes computer and data scientists, lawyers, researchers, and policy experts. We often work in partnership with community-based organizations.
Our work aims to uncover and fight the types of discriminatory harm that the SDAA seeks to address. For example, our research has exposed how Facebook’s ad delivery algorithm showed users different job ads based on their race and gender; how job applicants in DC are screened using ableist assessments when they apply for hourly positions at companies like Walmart, CVS, and Starbucks; and how an algorithm proposed in Missouri would cut or take away in-home care for many people in the state. We recently launched the Benefits Tech Advocacy Hub, a toolkit and community of practice for challenging the systems used to determine people’s access to public benefits programs. We write to share with the Council what we’ve learned from doing this work over the years. These lessons should inform the Council’s approach to the SDAA and other important legislation for combatting discrimination.
1. Technology's role in discrimination is an important and timely issue for DC Council to address because it is already affecting people in DC.
The SDAA seeks to address discrimination that is driven, exacerbated, or obscured by automated decision systems. The discrimination that many DC residents experience when they apply for housing, jobs, loans, or public benefits is not new. However, discriminatory outcomes can scale quickly and evade detection when these systems are standardized, automated, and outsourced to third-party vendors.
As researchers analyzing these systems, we and others in our community of practice have uncovered several examples of this problem. More examples can be found in the new Blueprint for an AI Bill of Rights released by the White House Office of Science and Technology Policy this week.
A. Screening job applicants in DC
For a research study last year, we completed and documented the online application process for 15 hourly, entry-level jobs in DC at large employers like Walmart, CVS, and Starbucks. These employers used standardized applicant tracking systems, which allow them to integrate assessments — such as multiple-choice tests and resume screeners — from different vendors into one application process.
We found that large employers are using ableist personality tests at scale to screen job applicants in DC. People who can perform the essential functions of a job — such as ringing up customers, counting change, or stocking shelves — but don’t fit a particular personality model could find themselves repeatedly knocked out of applicant pools. The scoring of personality tests is often calibrated based on a disproportionately white and middle-class population. These personality tests are not new — they’ve been around since people applied for jobs using a pencil and paper. But online job application systems allow these tests to scale more easily, so that someone applying to multiple cashier positions may see the same personality questions over and over.
Some hiring assessments have a history of being used or developed to weed out job applicants who may be more likely to organize, including Black workers. Today’s personality tests still include questions that may be part of a union-avoidance strategy. For example, we saw questions that asked if we questioned authority, or prioritized our well-being over our performance at the job. One question asked if we preferred a job where “there are high performance expectations” or where we are “highly compensated for [our] work.” These questions were not clearly related to performing the essential functions of the jobs we were applying for.
Job application systems ask candidates to provide their availability and pay preferences, without telling candidates what shifts the employer seeks to fill, what salary they offer for the job, or how the information will be used to assess the applicant. As applicants, we could not see how this information was being used to score or disqualify candidates. These practices may pressure workers to overstate their availability and can disadvantage people with caretaking or other responsibilities, like school or a second job.
B. Background checks and digital records as a barrier to housing and employment
You may not think of housing or employment background checks when you think of automated decision systems. However, background checks are one of the most widespread and racially discriminatory applications of data and algorithms that impede DC residents’ access to basic needs like jobs and housing every day.
In our job application research, we found that the applicant tracking systems employers used made it easy for them to integrate background checks from third-party vendors into the job application process. Many applications required us to agree to background checks, but did not disclose what information would be checked or how it would be used in hiring decisions.
Similarly, almost everyone who searches for housing in DC must undergo a tenant screening process for each unit they apply for. Landlords usually purchase reports from tenant screening companies (the DC Housing Authority contracts with RentGrow to screen tenants for public housing). These companies use algorithms to match housing applicants with eviction, credit, criminal, or other records, which may be used to create a numerical score, a risk assessment, and/or a recommendation about whether to accept the tenant, reject them, or charge them a higher security deposit.
Background checks and the records that populate them — especially criminal, credit, and eviction histories — have become overwhelming barriers to housing and employment that especially harm Black, brown, low-income, and disabled people in DC. For example, evictions in DC are disproportionately concentrated in Wards 7 and 8, because landlords in those neighborhoods serially file evictions against their tenants as a first resort to collect rent and avoid making repairs. Tenant screening companies collect those eviction records and translate them into high-risk ratings or lower scores on tenant screening reports. In turn, landlords tend to reject or charge higher security deposits to any tenant who has an eviction history. The result of this system — which generates great profits for data brokers and tenant screening companies — is that Black residents are disproportionately locked out of access to housing.
C. Race and gender discrimination in online advertising
In 2019 and 2020, Upturn partnered with academic researchers to conduct several studies on how Facebook ads were targeted and delivered to users. Previous studies had shown that advertisers who actively wanted to use Facebook ads to discriminate could do so. But we also found that even when advertisers tried to avoid targeting their ads to a particular type of audience, Facebook’s algorithm still (at the time) delivered ads to audiences with significant race and gender skews. For example, even when researchers directed ads to all US users, ads for jobs in the lumber industry were more likely to be delivered to white men, while ads for janitor jobs were more likely to be delivered to Black women. These studies, along with research by others in the field, helped support litigation that ultimately led to Meta making several significant changes to its targeting and delivery of housing, credit, and employment ads.
D. Alternative data and "educational redlining" in lending decisions
Over the last two years, Upturn, along with the NAACP Legal Defense Fund and the Student Borrower Protection Center (SBPC), has been part of an effort to assess the fair lending outcomes of the machine learning models used by lending platform Upstart. The investigation into Upstart’s model initiated by SBPC led Upstart to make changes to its lending model, which was penalizing loan applicants based on the average SAT and ACT scores of the colleges they went to. Research shows that standardized test scores are not correlated with academic merit or success, but they are correlated with race and socioeconomic status. But it was only after significant advocacy and an inquiry from several US Senators that SBPC was able to uncover this discriminatory use of educational background. Upstart disputed the results of SBPC’s investigation, stating that they were invalid because Upstart changed its model during the course of the investigation. This is a common response by companies when external researchers uncover discrimination, and it’s hard to verify because we don’t have visibility into these model changes.
The ongoing investigation of algorithmic discrimination in Upstart’s lending model shows both the benefits of independent research and the significant information asymmetry between the developers of algorithms and those trying to identify and address algorithmic discrimination. The SDAA is a positive step toward making it easier to do these investigations. Making the audits provided to the OAG publicly available would further enable the type of independent research that has played a significant role in identifying algorithmic discrimination in the past.
E. Using algorithms to cut home care hours for people with disabilities
Many states and DC use or are considering using an algorithm to assess whether people are eligible to receive care in their homes and how many hours of care people receive. In many cases, the hours of care people receive have been cut, sometimes dramatically, after these algorithmic assessments are deployed. The assessments may operationalize policy changes to the maximum amount of care available, and may also substantially change which conditions are considered when allocating care to people. The people impacted by these cuts, along with advocates like legal aid attorneys, have sought to challenge these systems by demonstrating that they don’t account for many people’s needs, perpetuate austerity policies, and push people into institutions instead of home-based care. People are sometimes forced to litigate and/or file public records requests to try to find out the factors and formulas the assessments use to calculate care hours or determine eligibility. Many of these lawsuits have revealed that unconscionably restrictive and arbitrary algorithmic assessments have affected people with disabilities across the country. Upturn is currently engaged in research to try to learn more about the factors used to screen and assess people for home care eligibility and hours in DC.
In 2018, the Missouri Department of Health and Senior Services proposed and published a new home care assessment algorithm for public comment (other states have not had this type of public process). Legal aid organizations, home- and community-based service providers, and Upturn tested the algorithm and showed that it could disqualify as many as 66% of currently eligible people. It contained basic errors and fundamentally failed to assess people’s needs. For example, the algorithm considered people’s mobility issues with getting in and out of bed, but not with getting up and down stairs.
Public scrutiny of Missouri’s home care algorithm has helped to at least slow its implementation and ensure that Missouri residents who wouldn’t qualify under the new assessment system are not currently cut off from their benefits.
2. These discriminatory harms are not new, but they often go unaddressed because of gaps in civil rights laws and enforcement.
Technology’s role in discrimination has not been adequately addressed by existing civil rights enforcement and litigation for several reasons, including:
A. Existing civil and human rights laws don’t always explicitly cover technology vendors that create, sell, and/or administer the systems that determine people’s access to essential economic needs and opportunities.
When our federal and DC civil and human rights laws were drafted, they did not contemplate that so many decisions about access to housing, jobs, credit, and other economic opportunities would be mediated by systems created by technology vendors. While these laws clearly regulate first-party decision-makers, such as employers, landlords, and banks, the laws are often much less clear about the liability of third parties like tenant screening companies, hiring assessment vendors, and online advertising platforms.
For example, Title VII, which protects against employment discrimination, covers employers and employment agencies, but there is no guidance as to whether platforms like ZipRecruiter, LinkedIn, or Meta qualify as employment agencies. Vendors routinely disclaim civil rights liability by stating that they do not make decisions about who ultimately gets a job, a loan, or an apartment — even though their products are designed and marketed to help make and standardize those decisions at scale.
B. A lack of information hinders enforcement.
Much of civil rights enforcement relies on impacted people or advocates to file complaints. But the automated or standardized processes used to help make life-altering decisions about people are often obscured or invisible. For example, as applicants to entry-level retail jobs in DC, we were aware that we were taking standardized hiring assessments like personality tests, but we couldn’t see whether employers were using the scores on those assessments to rank candidates, or whether they were rejecting all candidates below a certain score. We could see that we were asked to provide our availability and pay preferences, but we couldn’t see whether we were disqualified based on our stated salary preference.
As another example, DC has struggled to enforce its tenant protections because of the opacity of tenant screening. DC law prohibits landlords from doing a criminal background check on potential tenants until they’ve extended a conditional offer of housing, and then it limits the types of criminal records landlords can use to deny applicants. But some tenant screening tools may not even reveal to landlords, let alone tenants, the specific criminal records they use to produce scores and recommendations. The US District Court for the District of Connecticut is currently hearing a fair housing case brought by Carmen Arroyo, whose disabled son was denied permission to move into her apartment based on a tenant screening report that simply stated a “disqualifying record” had been found, without revealing any of the underlying information about the records to the property manager.
Even when people are able to find out that they were subject to an automated decision system, it’s usually after the decision has been made, and too late to recover the benefit or opportunity they were denied. By the time people are able to gather enough information to file a complaint about a hiring assessment or tenant screening process, the job or apartment has already gone to someone else. Few people in that position have the time and resources to research and challenge the decision-making process that left them without income or shelter.
In some cases, litigation fails because courts expect plaintiffs’ prima facie cases to include statistical evidence of discrimination that plaintiffs have no good way of obtaining. For example, the EEOC has said that national statistics support a finding that excluding job candidates based on criminal records will have a racially disparate impact. However, claims by plaintiffs relying on national statistics to challenge employment background checks have been dismissed for failing to show disparate impact statistics for the specific applicant pool for the job.
The audit reports and adverse action notices required under the SDAA would help impacted people and DC agencies find out about and enforce against civil and human rights violations. Requiring entities to assess and disclose information about their systems before deploying them could help prevent more people from unfairly losing opportunities and benefits in the first place, when effective remedies are still possible.
3. Independent research to investigate the role of technology in discrimination can make a difference.
External research into automated decision systems has been a catalyst for the following positive developments:
External research into Facebook’s ad targeting and delivery system fed directly into several fair housing lawsuits, which ultimately led to Meta announcing that it would discontinue discriminatory ad targeting and delivery tools for housing, credit, and job ads.
SBPC’s study and report on Upstart’s lending model brought about a Congressional inquiry, prompted Upstart to change its model, and eventually led to a monitorship of Upstart designed to test for disparate impacts.
After gaining access to and testing public benefits eligibility and care allocation algorithms, advocates and beneficiaries have been able to slow, alter, or in some cases stop the use of these systems to cut people’s benefits.
Research into payday lenders’ harmful advertising and lead generation practices prompted Google to ban payday loan ads.
In 2015, Quirtina Crittenden documented that Airbnb hosts repeatedly denied her booking requests until she shortened her name to Tina and changed her profile picture so hosts couldn’t tell she was Black. Crittenden’s advocacy — she launched the #Airbnbwhileblack hashtag which inspired many similar accounts — eventually led Airbnb to undergo a civil rights audit, make changes to its system to hide profile pictures from hosts until after booking, and launch a new research program to test its products for discrimination.
These are just a few of the many examples where external research has catalyzed important changes to technologies and systems that impact people’s daily lives and opportunities. This research has complemented, supported, and often prompted regulatory enforcement and litigation.
4. Public access to information about how these systems work is critical for enforcing the law.
The SDAA has the potential to enable external researchers and advocates like Upturn, as well as impacted people in DC, to scrutinize automated decision systems and identify discrimination against DC residents. While we applaud OAG for its attention to these problems, we know that one public agency cannot investigate and litigate every case of discrimination. External research will continue to be critical for discovering, focusing attention on, and challenging the harms the SDAA is designed to address. Complaints and litigation from impacted people are also a critical enforcement mechanism — not only for enforcing SDAA but also for existing DC human rights laws. However, to achieve this potential, the SDAA must facilitate some public disclosure of information about the technologies that impact DC residents.
Currently, the SDAA provides for some disclosure of information to the public and to impacted people. It requires covered entities to disclose whether and how they use personal information in covered automated decision systems, and requires them to provide adverse action notices. These are both positive steps, but more disclosure may be needed to effectively enforce the law. A disclosure on a company’s website can help people who already know where to look for information. But it might not help someone who is preparing to apply for public benefits and wants to know ahead of time what system(s) will be used to screen them. However, the SDAA as written would only require the results, methods, and other documentation of audits to be disclosed to OAG. The Council should consider making some subset or version of this information available to the public so that, for example, external researchers can help scrutinize the legitimacy and soundness of the audit reports and DC residents can better identify systems that may have adversely impacted them.
5. The Council must address the technologies that people encounter every day even if they’re not novel.
As the examples in this testimony demonstrate, the problems the SDAA describes are not limited to big tech companies or complex algorithms that use machine learning or other sophisticated techniques. In our work, we often see simple standardized tools and checklists used to make decisions that deny people access to resources at scale. The builders are often small companies making purpose-specific software with simple logic and data matching: for example, tenant screening companies that purchase eviction and criminal records from data brokers and rely on basic name matching to link these records to housing applicants, often erroneously.
It’s also important to note that DC residents experience material harms, such as denials of housing, jobs, healthcare, and other essential needs, even when they are not interacting with an entity online and even when the data used to discriminate doesn’t come from their online presence or past activities. Companies do collect massive amounts of data about our online activities, but algorithmic discrimination does not depend on that data. For example, a person applying for housing could fill out a paper application or a simple online form, and the landlord could still use tenant screening software to pull court records about that person and make a discriminatory decision not to offer them housing.
6. The Council shouldn’t overlook other policies that are needed to address these problems.
As the SDAA acknowledges, technology is deeply embedded in all systems that mediate access to basic needs and impact civil rights. However, while algorithms add a new vector for discrimination, they are not the root cause of discrimination. The SDAA is an important step, but Upturn is also advocating for other interventions that are complementary to the SDAA and critical for addressing technology’s role in discrimination.
For example, one source of discrimination the Council must address is the use of data (such as court records and information held by credit reporting agencies) to lock DC residents out of jobs, housing, and other resources. In May, Council passed the Eviction Record Sealing Authority and Fairness in Renting Amendment Act of 2022, which implemented automatic sealing of eviction records that did not result in a judgment after 30 days. This is a significant step in limiting the use of eviction records to deny housing to DC residents. However, as the DC Council Office of Racial Equity (CORE) has acknowledged, eviction records must be sealed immediately upon filing in order to improve the racially inequitable status quo. Data brokers scrape court websites and gather eviction filings as soon as they are posted, and those filings can remain in circulation long after they’re sealed. Moreover, all eviction records in DC are products of racial injustice, and using them to produce tenant screening scores or make housing decisions only deepens that injustice. Automatically sealing all eviction records at the point of filing, and limiting the types of information tenant screening companies can report, are important steps the Council can take toward addressing algorithmic discrimination.
For similar reasons, Council should also move to limit access to criminal records by passing the RESTORE Amendment Act. Criminal records are one way discriminatory policing practices are codified into data, and algorithmic decision-making tools can use them both to replicate the surveillance and suspicion of marginalized communities and to limit people’s housing and job opportunities. As discussed above, background checks are easily integrated into employment and tenant screening processes. DC has some of the weakest criminal record sealing laws in the country. Non-conviction arrests can appear on a background check for seven years, and convictions of any kind can appear on a background check indefinitely. This means arrests or convictions that have nothing to do with a person’s ability to perform a job or uphold their lease can still keep them from getting a job or securing housing for years or even decades after an encounter with the criminal legal system. Of the criminal record sealing reform bills currently introduced, the RESTORE Amendment Act is the strongest. The Act would automatically seal non-conviction records from public view and provide a streamlined process and shorter waiting periods for sealing certain convictions. Passing the RESTORE Amendment Act would offer significant relief to DC residents and keep criminal records from being used in algorithmic decision-making systems.
7. Many of the algorithmic decisions that impact DC residents’ wellbeing are decisions made by government agencies.
When a District resident applies for home-based care through Medicaid, they are subject to an eligibility decision based on a scoring algorithm that does not consider the impact of cognitive issues on the amount of care that they need. In this case the technology is relatively simple: there is a 286-question assessment conducted by a nurse in-person. The responses to these questions are scored and entered into a software system. This software then uses a small subset of these questions to calculate an eligibility score and, if the person is found eligible, to decide how many hours of care should be allocated to that person. This algorithmic decision has serious implications. It can dictate whether someone is able to stay in their community to get the care they need or is forced into an institution or to go without care. The failings of this algorithmic decision system also mean that people are forced to appeal the decisions, get legal support, and spend the time to make their case in a hearing in order to have a chance at getting the care they need. Whether in the SDAA or other legislation, the Council must address DC agencies’ discriminatory use of technology, as well as the underlying policies that limit residents’ access to the care they need.
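As an illustration of that structure, here is a minimal, hypothetical Python sketch. The scored questions, weights, cutoff, and hours formula are invented for this example; they are not DC's actual assessment or formula. The sketch only shows the shape of the decision: a small subset of answers produces a score, the score gates eligibility, and the score determines hours, while answers outside that subset never matter.

```python
# Hypothetical sketch of the structure described above, not DC's actual formula:
# a small subset of assessment answers feeds the score, the score is compared to
# an eligibility cutoff, and eligible scores are converted into care hours.

SCORED_QUESTIONS = ["bed_mobility", "eating", "toileting", "bathing"]  # small subset of the full assessment

def eligibility_score(assessment):
    # Answers are coded 0 (independent) through 4 (fully dependent).
    return sum(assessment.get(q, 0) for q in SCORED_QUESTIONS)

def allocated_hours(assessment):
    score = eligibility_score(assessment)
    if score < 3:               # below the cutoff: no home care at all
        return 0
    return min(40, score * 4)   # otherwise hours scale with the score, capped per week

# Answers outside SCORED_QUESTIONS (for example, cognition) never affect the
# result, no matter how much care the person actually needs.
example = {"bed_mobility": 1, "eating": 0, "toileting": 1, "bathing": 1, "cognition": 4}
print(eligibility_score(example), allocated_hours(example))
```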
8. Demographic testing is essential to civil rights enforcement. However, collecting and inferring demographic data for antidiscrimination testing requires careful planning and safeguards.
In several civil rights domains, demographic testing has been a historically important (and in some cases legally mandated) means of rooting out discrimination. Fair housing testers investigate whether landlords treat potential tenants differently based on their race or source of income. Mortgage lenders are required to collect demographic data from borrowers and analyze their lending practices for disparities. Many employers are required to ask job applicants and employees to answer voluntary demographic questions and to submit reports to government agencies on the aggregate demographic makeup of their workforce, broken down by race and gender categories.
In recent years, in response to pressure from civil rights groups, some technology firms have begun to acknowledge the need to collect or infer demographic data to test products and algorithms for discriminatory impacts. As Upturn wrote in a paper on demographic testing, “Organizations cannot address demographic disparities that they cannot see.” Thus, it’s important that the audits under the SDAA include testing for discrimination.
Testing algorithms for discrimination will require covered entities to collect or infer demographic data. In many cases, covered entities may not already have access to such data. Choosing a methodology for collecting or inferring this data and for conducting discrimination testing is a sensitive and context-specific process. There is no one-size-fits-all approach. For example, when Airbnb tests for discrimination by its hosts against potential guests, it wants to measure guests’ “perceived” race (i.e. how hosts perceive guests). However, when analyzing whether homes in neighborhoods of color are systematically undervalued in appraisals, it may be reasonable to infer neighborhood demographics using census data. While we should expect covered entities to self-test their systems, we cannot assume that they already possess the data or expertise needed to do it responsibly.
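As one concrete illustration of the appraisal example, the following is a minimal, hypothetical Python sketch of inferring neighborhood demographics from census data to test for undervaluation. The tract shares, appraisal figures, and 50% threshold are invented; a real analysis would use actual census and appraisal files and a more careful methodology.

```python
# Hypothetical sketch: using (invented) census tract demographics to compare how often
# homes appraise below the contract price in majority-of-color tracts versus other tracts.

census_tract_shares = {      # invented share of residents of color, by census tract
    "tract_001": 0.82,
    "tract_002": 0.15,
}

appraisals = [               # invented appraisal vs. contract price, by tract
    {"tract": "tract_001", "appraised": 310000, "contract": 350000},
    {"tract": "tract_002", "appraised": 355000, "contract": 350000},
]

def undervaluation_rates(rows, tract_shares, threshold=0.5):
    """Rate of appraisals below contract price, grouped by inferred tract demographics."""
    groups = {"majority_of_color": [], "other": []}
    for row in rows:
        group = "majority_of_color" if tract_shares[row["tract"]] >= threshold else "other"
        groups[group].append(row["appraised"] < row["contract"])
    return {g: sum(vals) / len(vals) if vals else None for g, vals in groups.items()}

print(undervaluation_rates(appraisals, census_tract_shares))
```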
Finally, the process of collecting and using demographic data for anti-discrimination purposes must be subject to safeguards. Of course, demographic data about an individual, like race, can be very sensitive and potentially harmful if it’s shared or used in the wrong way, particularly in the process of obtaining employment or housing. Covered entities should be required to store such data separately from other data, and should only access and use this data for antidiscrimination purposes. The Council should consider including these safeguards in the SDAA.
We would be happy to meet with you and your offices to share more about the harms of algorithmic decision-making that we mentioned, as well as how the Council can tackle these issues through both the SDAA and other legislation.
Emily Paul, Project Director (emily@upturn.org)
Natasha Duarte, Project Director (natasha@upturn.org)
Urmila Janardan, Policy Analyst (urmila@upturn.org)
1.
Muhammed Ali et al., Discrimination Through Optimization: How Facebook’s Ad Delivery Can Lead to Skewed Outcomes, Proceedings of the ACM on Human-Computer Interaction 2019, https://arxiv.org/abs/1904.02095.
2.
Aaron Rieke et al., Essential Work: Analyzing the Hiring Technologies of Large Hourly Employers, July 2021, https://www.upturn.org/work/essential-work/.
3.
See Benefits Tech Advocacy Hub, Case Study Library, Missouri Medicaid Home and Community Based Services Eligibility Issues, https://www.btah.org/case-study/missouri-medicaid-home-and-community-based-services-eligibility-issues.html.
4.
Benefits Tech Advocacy Hub, https://www.btah.org/.
5.
White House Office of Science and Technology Policy, Blueprint for an AI Bill of Rights: A Vision for Protecting Our Civil Rights in the Algorithmic Age, Oct. 4, 2022, https://www.whitehouse.gov/ostp/news-updates/2022/10/04/blueprint-for-an-ai-bill-of-rightsa-vision-for-protecting-our-civil-rights-in-the-algorithmic-age/.
6.
Rieke et al., supra note 2.
7.
Id. at 11.
8.
Id. at 19–20, 25–27.
9.
See Id. at 25–26 (citing Susan T. Stabile, The Use of Personality Tests as a Hiring Tool: Is the Benefit Worth the Cost?, 4 U. Pa. J. Labor & Employment L. 279, 304, 2002, https://www.law.upenn.edu/journals/jbl/articles/volume4/issue2/Stabile4U.Pa.J.Lab.&Emp.L.279(2002).pdf; Gabriel Salvendy & Douglas Seymour, Prediction & Development of Industrial Work Performance 252, 1973; Frank J. Cavico et al., Personality Tests in Employment: A Continuing Legal, Ethical, & Practical Quandary, 2 Advances in Soc. Sci. Research J. 60, 70, https://www.researchgate.net/publication/277621154_Personality_Tests_in_Employment_A_Continuing_Legal_Ethical_and_Practical_Quandary; Margaret Talbot, The Rorschach Chronicles, N.Y. Times, Oct. 17, 1999).
10.
See Id. at 26 (citing Staff of Subcomm. On Labor-Management Relations of H. Comm. on Educ. & Labor, 96th Cong., Rep. on Pressures in Today’s Workplace 7–8, Comm. Print 1981).
11.
See Id. at 26–27 (citing John Logan, The Union Avoidance Industry in the United States, 44:4 British J. Industrial Relations 651 (2006), https://www.jwj.org/wp-content/uploads/2014/03/JohnLogan12_2006UnionAvoidance.pdf).
12.
Id. at 27.
13.
Id. at 12–13, 27–28.
14.
Id. at 21–22.
15.
See DC Housing Auth. 2019 Oversight & Performance Hearing, Comm. on Housing & Neighborhood Revitalization, Responses to Pre-Hearing Questions 28, https://dcha.us/img/guest_uploads/temp_Uf9tOu36yq1550855713Q8mBF4DZk9upGMGzt6LI.pdf.
16.
See, e.g., Tinuola Dada & Natasha Duarte, How to Seal Eviction Records: Guidance for Legislative Drafting 13–18, July 2022, https://www.upturn.org/static/files/how-to-seal-eviction-records-071322.pdf. See also, e.g., Eric Dunn & Marina Grabchuk, Background Checks and Social Effects: Contemporary Residential Tenant-Screening Problems in Washington State, 9 Seattle Journal for Social Justice 319, 334–37, 2010; Kaveh Waddell, How Tenant Screening Reports Make it Hard for People to Bounce Back from Tough Times, Consumer Reports, Mar. 11, 2021, https://www.consumerreports.org/algorithmic-bias/tenant-screening-reports-make-it-hard-to-bounce-back-from-toughtimes-a2331058426/; Bureau of Consumer Financial Protection, Bulletin 2021–03, Consumer Reporting of Rental Information, 86 Fed. Reg. 35595, 35597–98, 2021.
17.
Brian J. McCabe & Eva Rosen, Eviction in Washington, DC: Racial and Geographic Disparities in Housing Instability 14–21, 2020, https://georgetown.app.box.com/s/8cq4p8ap4nq5xm75b5mct0nz5002z3ap. See also Kyle Swenson, A Small Group of Landlords is Behind Nearly Half of D.C.’s Evictions, Report Says, Wash. Post, Oct. 8, 2020, https://www.washingtonpost.com/dc-md-va/2020/10/08/small-group-landlords-is-behind-most-dcs-evictions-report-says/.
18.
See, e.g., Dada & Duarte, supra note 16, at 15–18 (discussing one example of this, National Tenant Network’s sample DecisionPoint tenant screening report).
19.
Wonyoung So, Which Information Matters? Measuring Landlord Assessment of Tenant Screening Reports, Housing Policy Debate, 2022, https://www.tandfonline.com/doi/citedby/10.1080/10511482.2022.2113815?scroll=top&needAccess=true.
20.
Ali et al., supra note 1; Piotr Sapiezynski et al., Algorithms that “Don’t See Color”: Comparing Biases in Lookalike and Special Ad Audiences, Proceedings of the 2022 AAAI/ACM Conference on AI, Ethics, & Society, https://arxiv.org/abs/1912.07579; Muhammed Ali et al., Ad Delivery Algorithms: The Hidden Arbiters of Political Messaging, 2019, https://arxiv.org/abs/1912.04255.
21.
See, e.g., Julia Angwin, Ariana Tobin & Madeleine Varner, Facebook (Still) Letting Housing Advertisers Exclude Users by Race, ProPublica, Nov. 21, 2017, https://www.propublica.org/article/facebook-advertising-discrimination-housing-race-sex-national-origin.
22.
Ali et al., supra note 1.
23.
Id.
24.
See, e.g., National Fair Housing Alliance et al. v. Facebook, Inc., 18 Civ. 02689 (JGK) (S.D.N.Y.); Complaint, U.S. v. Meta Platforms, Inc., 1:22-cv-05187 (S.D.N.Y. 2022), https://www.justice.gov/opa/press-release/file/1514026/download.
25.
US Dep’t of Justice, Justice Dep’t Secures Groundbreaking Settlement Agreement with Meta Platforms, Formerly Known as Facebook, to Resolve Allegations of Discriminatory Advertising, June 21, 2022, https://www.justice.gov/opa/pr/justice-department-secures-groundbreaking-settlement-agreement-meta-platforms-formerly-known (“Under the settlement, Meta will stop using an advertising tool for housing ads (known as the ‘Special Ad Audience’ tool) that . . . relies on a discriminatory algorithm. Meta will also develop a new system to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads. That system will be subject to Department of Justice approval and court oversight.”).
26.
See Relman Colfax, Fair Lending Monitorship of Upstart Network’s Lending Model, https://www.relmanlaw.com/cases-406.
27.
Fair Lending Monitorship of Upstart Network’s Lending Model, Initial Report of the Independent Monitor 3, 22–23, Apr. 14, 2021, https://www.relmanlaw.com/media/cases/1088_Upstart%20Initial%20Report%20-%20Final.pdf.
28.
See Id. at 23 n.100.
29.
See Id. at 21–23.
30.
Id. at 21.
31.
See, e.g., Benefits Tech Advocacy Hub, Case Study Library, https://www.btah.org/case-studies.html.
32.
See generally Benefits Tech Advocacy Hub, btah.org.
33.
See Benefits Tech Advocacy Hub, Case Study Library, Missouri Medicaid Home and Community Based Services Eligibility Issues, https://www.btah.org/case-study/missouri-medicaid-home-and-community-based-services-eligibility-issues.html.
34.
Id.
35.
Id.
36.
Id.
37.
See, e.g., US EEOC, CM-631: Employment Agencies, Dec. 1990, https://www.eeoc.gov/laws/guidance/cm-631-employment-agencies.
38.
See, e.g., Dada & Duarte, supra note 16, at 16–18 n.33.
39.
See Id. at 16–17.
40.
See, e.g., Rieke et al., supra note 2, at 34 (“The EEOC’s enforcement of Title VII and the ADA largely relies on individuals to file charges of discrimination.”).
41.
See Id. at 23–24, 28.
42.
See Cohen Milstein, Connecticut Fair Housing Ctr. et al. v. CoreLogic Rental Property Solutions, https://www.cohenmilstein.com/case-study/connecticut-fair-housing-center-et-al-v-corelogic-rental-property-solutions (“[Carmen] Arroyo, whose son Mikhail was injured in a July 2015 accident that left him unable to speak, walk, or care for himself, is her son’s conservator. . . . Arroyo asked her landlord for permission to move Mikhail into her home . . . . But, his application was denied. CoreLogic’s ‘CrimSAFE’ background check stated that Mikhail had a ‘disqualifying [criminal] record.’ Arroyo claims that CoreLogic’s criminal background report did not provide the landlord with any details about Mikhail’s underlying criminal history—only a computer-generated notation that the application did not meet the landlord’s criteria.”).
43.
Conn. Fair Housing Ctr. v. CoreLogic Rental Property Solutions, LLC, No. 3:18-CV-705 at 4–6, 15 (D.Conn. 2020), https://www.cohenmilstein.com/sites/default/files/RULING%20-%20CoreLogic%20Summary%20Judgment%2008072020.pdf.
44.
US EEOC, Enforcement Guidance on the Consideration of Arrest and Conviction Records in Employment Decisions under Title VII of the Civil Rights Act, 2012, https://www.eeoc.gov/laws/guidance/enforcement-guidance-consideration-arrest-and-conviction-records-employment-decisions.
45.
Mandala v. NTT Data, Inc., No. 19-2308 (2d Cir. 2021).
46.
See supra text accompanying notes 20–25.
47.
See supra text accompanying notes 26–30.
48.
See generally Benefits Tech Advocacy Hub, Case Study Library, https://www.btah.org/case-studies.html.
49.
Aaron Rieke & Logan Koepke, Led Astray: Online Lead Generation and Payday Loans, Oct. 2015, https://www.upturn.org/work/led-astray-online-lead-generation-and-payday-loans/.
50.
See Aaron Rieke, Google was Right to Get Tough on Payday Loan Ads, May 13, 2016, https://www.upturn.org/work/google-was-right-to-get-tough-on-payday-loan-ads/.
51.
Aja Romano, Airbnb Has a Discrimination Problem. Ask Anyone Who’s Tried to #Airbnbwhileblack.
52.
Laura W. Murphy, Airbnb’s Work to Fight Discrimination and Build Inclusion: A Report Submitted to Airbnb, Sept. 8, 2016, https://blog.atairbnb.com/wp-content/uploads/2016/09/REPORT_Airbnbs-Work-to-Fight-Discrimination-and-Build-Inclusion.pdf.
53.
Airbnb, Update on Profile Photos, Oct. 22, 2018, https://news.airbnb.com/update-on-profile-photos/.
54.
Airbnb, A New Way We’re Fighting Discrimination On Airbnb, June 15, 2020, https://www.airbnb.com/resources/hosting-homes/a/a-new-way-were-fighting-discrimination-on-airbnb-201.
55.
D.C. Law 24–115 Sec.3(b)(a), https://lims.dccouncil.gov/downloads/LIMS/46603/Signed_Act/B24-0096-Signed_Act.pdf.
56.
D.C. Council Office of Racial Equity (CORE), Racial Equity Impact Assessment of Bill 24–0096, Eviction Record Sealing Authority and Fairness in Renting Amendment Act of 2021 9–11, Nov. 30, 2021, https://www.dropbox.com/s/nejlhn7ljj8js3y/B24-0096%20Eviction%20Record%20Sealing%20Authority%20Amendment%20Act%20of%202021.pdf?dl=0.
57.
See Dada & Duarte, supra note 16.
58.
B24-0180, Record Expungement Simplification to Offer Relief and Equity (RESTORE) Amendment Act of 2021, https://lims.dccouncil.gov/Legislation/B24-0180.
59.
See, e.g., Jenn Rolnick Borchetta, Curbing Collateral Punishment in the Big Data Age: How Lawyers and Advocates can Use Criminal Record Sealing Statutes to Protect Privacy and the Presumption of Innocence, 98 Boston U. L. Rev. 915, 2018, https://www.bu.edu/bulawreview/files/2018/06/BORCHETTA.pdf; Cameron Kimble & Ames Grawert, Collateral Consequences and the Enduring Nature of Punishment, Brennan Center for Justice, June 21, 2021, https://www.brennancenter.org/our-work/analysis-opinion/collateral-consequences-and-enduring-nature-punishment; Ariel Nelson, Broken Records Redux: How Errors by Criminal Background Check Companies Continue to Harm Consumers Seeking Jobs and Housing, Nat’l Consumer Law Ctr., 2019, https://www.nclc.org/resources/report-broken-records-redux/.
60.
See DC Justice Lab, Seal the Deal, https://dcjusticelab.org/sealthedeal/.
61.
15 U.S.C. § 1681c.
62.
We also believe that the RESTORE Act should move to further limit law enforcement access to criminal records by expunging non-convictions, not merely sealing them.
63.
DC Office on Aging, What are Long-term Services and Supports, and Could They be Right for You?, https://dhcf.dc.gov/sites/default/files/dc/sites/dhcf/page_content/attachments/LTSS%20overview_091318_CLEAN.pdf.
64.
Id. (“What is the InterRAI assessment?”).
65.
See Benefits Tech Advocacy Hub, Case Study Library, https://www.btah.org/case-studies.html.
66.
See Miranda Bogen, Aaron Rieke & Shazeda Ahmed, Awareness in Practice: Tensions in Access to Sensitive Attribute Data for Antidiscrimination, Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT*) 2020, https://arxiv.org/pdf/1912.06171.pdf.
67.
See US Dep’t of Justice, Fair Housing Testing Program, https://www.justice.gov/crt/fair-housing-testing-program-1.
68.
See Bogen et al., supra note 66 (citing Joseph M. Kolar & Jonathan D. Jerison, The Home Mortgage Disclosure Act: Its History, Evolution, and Limitations, 2006, https://buckleyfirm.com/uploads/36/doc/HistoryofHMDAapr06.pdf).
69.
See, e.g., Rieke et al., supra note 2, at 33–34; Bogen et al., supra note 66.
70.
See, e.g., Airbnb, A New Way We’re Fighting Discrimination On Airbnb, June 15, 2020, https://www.airbnb.com/resources/hosting-homes/a/a-new-way-were-fighting-discrimination-on-airbnb-201.
71.
Bogen et al., supra note 66.
72.
See Id.
73.
Airbnb, Measuring Discrimination on the Airbnb Platform, June 15, 2020, https://news.airbnb.com/measuring-discrimination-on-the-airbnb-platform/.