Programming bias

M.I.T. Media Lab researcher Joy Buolamwini

Despite artificial intelligence often being perceived as inherently forward-thinking and socially neutral, M.I.T. Media Lab researcher Joy Buolamwini, a Black woman, discovered otherwise. While working with facial-recognition software, she realized that the algorithm couldn't detect her face unless she donned a white mask. She tells her story in Coded Bias, a documentary directed by Shalini Kantayya that explores how algorithms can perpetuate racial and gender bias. A discussion of the film's themes, led by Melanie Moses, a UNM computer science professor, and Christopher Moore, a member of the Santa Fe Institute's Interdisciplinary Working Group for Algorithmic Justice, takes place at 6 p.m. on Tuesday, March 9, on Zoom. The cost is $12; all registrants receive a link to stream Coded Bias prior to the discussion. Co-presented by the Santa Fe Institute and the Center for Contemporary Arts; 505-982-1338,
