Documentary Examines Bias Behind Artificial Intelligence, Facial Recognition

By its very nature, science is supposed to be an impartial judge. But is it really? In her thought-provoking documentary “Coded Bias,” director Shalini Kantayya questions the neutrality of technology, arguing that computers have a built-in bias that reflects the faulty assumptions of the people (usually men) who program them. Her emphasis is on the impact that such bias has on marginalized communities via corporate business and law enforcement.

The film was sparked by the work of Joy Buolamwini, a Ph.D. student at MIT who conducted facial-recognition experiments using A.I. and had difficulty getting the technology to accurately process her face. Investigating further, she discovered that these programs fail to register women’s faces far more often than men’s. Delving into the root causes of these problems, “Coded Bias” serves as both a wake-up call (to invasive practices the public doesn’t yet realize are being implemented) and a call to action.

At the Sundance Film Festival, where the film premiered, Kantayya explained that she isn’t trying to scare people, but rather to “inform them on things they should know.” But even that goal could quickly overwhelm audiences, as she crams an awful lot of information into the film’s 90-minute running time. That’s where animation and familiar science fiction references serve to keep technical concepts accessible to lay viewers.

Once Buolamwini establishes a gender bias, she proceeds to investigate the same programs for evidence of race-based bias as well. She suggests that the problem with A.I. algorithms can be traced to the lazy or egocentric thinking of the coders themselves. The homogeneous culture of Silicon Valley is a dead giveaway: The software under scrutiny is designed almost exclusively by men, who don’t necessarily take other identities into consideration when setting the basic parameters of their programs. Are the creators aware of the prejudice within their tech? Do they even care?

Tranae Moran describes what it’s like living with facial-recognition bias on a smaller scale. She lives at the Atlantic Plaza Towers in the Brownsville section of Brooklyn, N.Y. According to Moran, those in charge of the property use facial recognition to monitor whatever they deem suspicious, harassing residents flagged by the software. She believes management aims to take it a step further, using the system to gain entry to private apartments under the pretext of safety.

Across the Atlantic in London, Silkie Carlo serves as director of Big Brother Watch, an organization that monitors the use of facial-recognition A.I. by British law enforcement. Carlo, a former advocacy officer, understands how this technology violates civil liberties, and she has tracked the growing number of citizens it misidentifies. For example, Big Brother Watch found that the Metropolitan Police’s use of photo biometrics correctly identified people just 2% of the time, while the South Wales Police fared only slightly better at 8%.

Mistakes can result in unlawful searches of individuals on the street when the A.I. wrongly labels someone a criminal. In one scene, Carlo witnesses a young Black boy surrounded by police and asks the officers why they’re detaining the child. They answer that the program they’re using identified the child as a threat. Baroness Jenny Jones, a member of the House of Lords who works on police oversight, sees this and gets involved. She demands to know why the officers aren’t more knowledgeable. Their answer: “We don’t know.” No stranger to facial-recognition networks herself, the baroness was wrongly placed on a watch list for domestic extremism.

According to the documentary, nine companies — including Amazon, IBM and Facebook — are building the future of A.I. At present, 117 million Americans have already had their photos uploaded to a facial-recognition network that police can search using algorithms that aren’t audited for accuracy. A small number of corporations profit from creating this dystopian future, while agencies that don’t fully understand the software put it to use. It’s a scary thought: you could walk down the street, be labeled a terrorist, and be rounded up, arrested, and traumatized before anyone has a chance to prove your innocence.

This scenario seems to come straight out of George Orwell’s novel “1984,” in which “Big Brother” (in this case, corporations and law enforcement) already controls vast amounts of our data and can use it as it sees fit. This looming threat of authoritarianism — especially against the poor — motivates Buolamwini to compile all of her research into what she calls the “Gender Shades” study, which uses an intersectional lens to test the algorithms for bias.

For the moment, neither the companies nor the software they distribute is being regulated, and there is no one watching how these programs are made. With so much information to process, how can ordinary people protect themselves from facial-data networks?

According to the film, the technology’s spread may already be unavoidable, but Congress is aware of the danger that coded bias poses if the government doesn’t bring it under control. For those who weren’t already aware of these new threats to personal privacy, “Coded Bias” opens their eyes to this invasive phenomenon, identifying the blind spots in the systems that may already be watching you.

This article was written by Peter Debruge from Variety and was legally licensed via the Tribune Content Agency through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.
