Algorithms and the Community
Government agencies are increasingly employing automated decision-making processes to decide who gets access to housing, who is stopped by police, and who should receive benefits. Most people whose lives are affected by these systems don’t even know they exist. A panel during NNIP Month in May 2021 aimed to educate partners on how these processes affect communities, particularly people of color, and spur a discussion on what roles NNIP partners can play in promoting government transparency and advancing productive community conversations.
After an introduction by moderator Mychal Cohen of the Urban Institute, Chris Kingsley of the Annie E. Casey Foundation spoke about how this issue is increasingly of interest to the foundation's staff, since the use of advanced analytic tools in decisions about policing, school assignment, and tenant screening disproportionately affects children and families experiencing poverty. The foundation developed "Four Principles to Make Advanced Data Analytics Work for Children and Families" to guide how these systems should be ethically developed, operated, and assessed.
Katurah Topps from the NAACP Legal Defense and Educational Fund described her experience as part of the coalition that organized a late 2019 event in New York City, "Automating Bias: How Computers Are Making Decisions About Your Life" (see agenda). The city government was using automated decision-making technology in many domains (housing, employment, child welfare) for surveillance, tracking, and making decisions about people's lives without their knowledge. A report released at the same time, "Confronting Black Boxes: A Shadow Report of the New York City Automated Decision System Task Force," documented the opaque nature of these government processes and data-driven technologies. The event was intended for public education and presented easily digestible information so everyone could understand how people's data were being used. For Ms. Topps, the event's goal was to empower community members with resources and tools to engage in conversations with government agencies about their use of algorithms.
She called for a collective effort to educate the public around the country about the impact of algorithms on communities. Most conversations about these systems, even among governments trying to be transparent, include only policy and technical professionals. Residents, particularly those with low incomes who are most affected by these systems, should always be at the table so their lived experience can guide priorities, highlight concerns, and shed light on what information is needed to determine whether the benefits of automated decision-making outweigh the risks.
Anthony Galvan, from the NNIP Partner at the University of Texas at Dallas, described their analysis of the PRAISE Texas Pretrial Tool, a risk assessment that predicts a person's likelihood of appearing in court or reoffending. The tool was touted as providing reliable and neutral information, even though the data used to build it reflected past patterns of unequal treatment of people of color and people with low incomes.
He noted that NNIP partners often promote ways data can be used to advance equity, but they also have a role in highlighting bias in data and advocating for how data should not be used. They can push for transparency when agencies adopt these tools and for including people of color at the table to pose questions about fair treatment and potential harm.