Princeton prof. warns of racial bias in technology at Annenberg talk

Princeton African American studies professor Ruha Benjamin said, “Race neutrality, it turns out, can be a deadly force.”

Princeton professor Ruha Benjamin discussed underlying racial biases in technology at a talk at the Annenberg School for Communication on Monday.

Benjamin, who teaches African American studies, discussed the “New Jim Code,” the idea that technology developers can encode their own racial biases without users of that technology knowing.

“Race neutrality, it turns out, can be a deadly force,” Benjamin said. “This combination of coded bias and imagined objectivity is what I term the New Jim Code: innovation that enables social containment while appearing fairer than discriminatory practices of an earlier era.”

People should be skeptical of the technologies they use, Benjamin said, because algorithms that appear neutral can carry implicit biases. Developers, she said, may bring established racial biases with them when they create technology.

She cited several examples of racially biased technology throughout her presentation, including Citizen, a mobile application intended to alert people to dangerous neighborhoods. Because the app relies on user reports to locate unsafe areas, Benjamin said, it is susceptible to racial profiling. She also showed a clip from Sleep Dealer, a dystopian sci-fi film about how technology could be used to oppress and overwork migrant workers.

Kinjal Dave, a first-year Ph.D. student at Annenberg, said the talk was engaging and that students felt passionate about the topics Benjamin discussed.

“The fact that it’s the middle of finals, it was terrible weather, it was a really cloudy day, and this room is packed completely just shows how much energy there is for her work,” Dave said.

To address racial bias in technology, Benjamin said, developers must be conscious of the problem and deliberately work to avoid it. She said organizations like Our Data Bodies, which created the Digital Defense Playbook, help set guidelines developers can follow to avoid building racially biased systems.

“I love how she ended with talking about how we have to question what we are trying to solve and how we bring the power back to the people,” said Christopher R. Rogers, an education doctoral candidate who attended the event.

Benjamin’s presentation was a collaboration among the Penn Program on Race, Science, and Society (PRSS), the Department of History and Sociology of Science Colloquium, and the Control Societies Speaker Series.

Ezelle Sanford III, a PRSS postdoctoral fellow in the Center for Africana Studies, coordinated logistics for the event. He said Benjamin’s talk concluded the series of speakers the History and Sociology of Science Department hosted this semester.

Sanford said PRSS’s next project, Penn Medicine and the Afterlives of Slavery, will be the subject of a symposium in the spring.
