
Coded Bias Documentary – Facial Recognition and A.I. Bias.


Coded Bias is an American documentary film directed by Shalini Kantayya that premiered at the 2020 Sundance Film Festival. The documentary includes contributions from notable artificial intelligence and facial recognition researchers, including Joy Buolamwini, Timnit Gebru, Cathy O’Neil, Deborah Raji, Zeynep Tufekci, Safiya Noble, Meredith Broussard, and Virginia Eubanks, among others.

Coded Bias highlights our collective misconceptions about artificial intelligence and facial recognition, and it argues that there is an urgent need for legislative protection through regulation and oversight.

The documentary begins with a voice-over of “Tay,” Microsoft’s artificial intelligence chatterbot, which Microsoft released via Twitter on March 23, 2016.

“Hello World! Can I just say that I’m stoked to meet you. Humans are super cool. The more humans share with me, the more I learn.”

Synopsis

When MIT researcher, poet and computer scientist Joy Buolamwini uncovers racial and gender bias in AI systems sold by big tech companies, she embarks on a journey alongside pioneering women sounding the alarm about the dangers of unchecked artificial intelligence that impacts us all. Through Joy’s transformation from scientist to steadfast advocate and the stories of everyday people experiencing technical harms, Coded Bias sheds light on the threats A.I. poses to civil rights and democracy.

Interviewed

  • Joy Buolamwini – Ph.D. Candidate, MIT Media Lab
  • Meredith Broussard – Author, Artificial Unintelligence
  • Cathy O’Neil, Ph.D. – Author, Weapons of Math Destruction
  • Silkie Carlo – Director, Big Brother Watch UK
  • Zeynep Tufekci, Ph.D. – Author, Twitter and Tear Gas
  • Amy Webb – Futurist; Author, The Big Nine
  • Virginia Eubanks, Ph.D. – Author, Automating Inequality
  • Ravi Naik – UK Human Rights Lawyer
  • Deborah Raji – Research Fellow, Partnership on A.I.
  • Timnit Gebru, Ph.D. – Technical Co-lead, Ethical A.I. Team at Google
  • Safiya Umoja Noble, Ph.D. – Author, Algorithms of Oppression

Joy Buolamwini

Over 117 million people in the US have their faces in a facial-recognition network that can be searched by the police.

  • These searches are unregulated and unwarranted, using algorithms that haven’t been audited for accuracy.
  • They can create a mass surveillance society with the tools that already exist.
  • The Algorithmic Justice League (AJL) is a digital advocacy organization based in Cambridge, Massachusetts. AJL aims to raise awareness of the social implications of artificial intelligence through art and research.

Racism is becoming mechanized and robotized.
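Buolamwini’s Gender Shades research made this concrete by measuring a face classifier’s error rate separately for each demographic group rather than reporting one aggregate accuracy number. Below is a minimal sketch of that kind of disaggregated audit; the groups, labels, and predictions are entirely made up for illustration.

```python
from collections import defaultdict

# Hypothetical audit records: (demographic_group, true_label, predicted_label).
# In Gender Shades the groups were intersections of skin type and gender;
# these rows and outcomes are invented.
records = [
    ("lighter_male",   "male",   "male"),
    ("lighter_female", "female", "female"),
    ("darker_male",    "male",   "male"),
    ("darker_female",  "female", "male"),    # a misclassification
    ("darker_female",  "female", "female"),
]

totals, errors = defaultdict(int), defaultdict(int)
for group, truth, prediction in records:
    totals[group] += 1
    if truth != prediction:
        errors[group] += 1

# Report a per-group error rate instead of one aggregate number,
# which is what exposes bias hidden behind a high overall accuracy.
for group in sorted(totals):
    print(f"{group}: {errors[group] / totals[group]:.0%} error "
          f"({errors[group]}/{totals[group]})")
```

An audit like this is what revealed that systems scoring well on average were failing dramatically on darker-skinned women.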

Cathy O’Neil

  • Algorithm – using historical information to make a prediction about the future.
  • Machine Learning – a scoring system that scores the probability of what you’re about to do: Are you going to pay back this loan? Are you going to get fired from this job? (See the sketch below.)
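In code, a scoring system of this kind is usually a model fit to historical outcomes and then asked for a probability on a new case. Here is a minimal sketch using scikit-learn; the loan features, labels, and applicant are entirely hypothetical.

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical history: [income_in_thousands, years_employed] for past
# borrowers, and whether each one repaid (1) or defaulted (0).
X_history = [[40, 1], [85, 6], [30, 0], [120, 10], [55, 3], [25, 1]]
y_repaid = [0, 1, 0, 1, 1, 0]

model = LogisticRegression().fit(X_history, y_repaid)

# Score a new applicant: historical information making a prediction
# about the future, with any bias in that history baked in.
applicant = [[50, 2]]
print(f"P(repay) = {model.predict_proba(applicant)[0][1]:.2f}")
```

O’Neil’s point is that the score can only be as fair as the historical data behind it.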

Power is being wielded through data collection, through algorithms, through surveillance.

  • China’s Social Credit Score – algorithmic obedience training.

Silkie Carlo

According to a 2018 Big Brother Watch UK report:

  • South Wales Police store photos of all innocent people incorrectly matched by facial recognition for a year, without their knowledge, resulting in a biometric database of over 2,400 innocent people
  • Home Office spent £2.6m funding South Wales Police’s use of the technology, although it is “almost entirely inaccurate”
  • Metropolitan Police’s facial recognition matches are 98% inaccurate, misidentifying 95 people at last year’s Notting Hill Carnival as criminals – yet the force is planning 7 more deployments this year
  • South Wales Police’s matches are 91% inaccurate – yet the force plans to target the Biggest Weekend and a Rolling Stones concert next
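Those inaccuracy figures are less paradoxical than they sound: when almost nobody in a scanned crowd is actually on a watchlist, even a reasonably accurate matcher produces mostly false alarms. Here is a back-of-the-envelope base-rate calculation, with every number assumed for illustration (none of them come from the Big Brother Watch report):

```python
# Assumed numbers: a crowd of 100,000 containing 50 watchlist faces; a matcher
# that flags 90% of true watchlist faces and wrongly flags 0.5% of everyone else.
crowd, on_list = 100_000, 50
hit_rate, false_positive_rate = 0.90, 0.005

true_alerts = on_list * hit_rate                        # 45
false_alerts = (crowd - on_list) * false_positive_rate  # ~500

share_wrong = false_alerts / (true_alerts + false_alerts)
print(f"Share of alerts that are wrong: {share_wrong:.0%}")  # ~92%
```

Under these assumptions roughly nine out of ten alerts are misidentifications, which is why audits of real deployments keep finding match rates like the ones above.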


Six Million CCTV Cameras in the United Kingdom

The British Security Industry Association (BSIA) estimates that the total number of CCTV cameras in the United Kingdom is around 4 million to 6 million. That equates to roughly 7.5 cameras for every 100 people in the country – the third-highest total on the planet behind the US and China.

Zeynep Tufekci, PhD

There are two ways in which you can program a computer.

  • The first way is more like a recipe: You tell the computer, “Do this, do this, do this.” And that’s been the way we’ve programmed computers almost from the beginning.
  • The other way is feeding the computer lots of data, and then the computer learns to classify by digesting this data. This method didn’t really catch on until recently because there was not enough data. But once we all got smartphones that collect data on us, once billions of people went online and you had the Googles and Facebooks sitting on giant amounts of data, all of a sudden it turned out that you can feed a lot of data to these machine-learning algorithms and say, “Here, classify this,” and it works really well. But we don’t really understand why it works, and it has errors that we don’t really understand.

And the scary part is that, because it’s machine learning, it’s a black box to even the programmers.
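Tufekci’s two ways of programming can be put side by side. The sketch below solves the same toy spam-flagging task first as an explicit recipe of hand-written rules, then by letting a model infer a rule from labeled examples; the keywords and training messages are invented.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Way 1: a recipe. The programmer spells out the rule by hand.
def is_spam_by_rules(message: str) -> bool:
    return any(word in message.lower() for word in ("winner", "free money"))

# Way 2: machine learning. Hand the computer labeled examples and let it
# digest the data into its own classification rule.
messages = ["winner! claim your free money", "meeting moved to noon",
            "free money for every winner", "lunch tomorrow?"]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = not spam (invented training set)

vectorizer = CountVectorizer()
model = MultinomialNB().fit(vectorizer.fit_transform(messages), labels)

test = "you are a winner"
print(is_spam_by_rules(test))                                # rule-based verdict
print(bool(model.predict(vectorizer.transform([test]))[0]))  # learned verdict
```

Nobody wrote the learned rule down; it lives in the model’s fitted weights. That opacity is the “black box” Tufekci describes.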

Amy Webb

  • There are currently nine companies (Facebook, Apple, Amazon, IBM, Google, Microsoft, Alibaba, Baidu, and Tencent) building the future of artificial intelligence. Six are in the United States and three are in China.
  • AI is being developed along two very, very different tracks.
  • China – Unfettered access to everyone’s data. For a Chinese citizen to get access to internet service, they have to submit to facial recognition.
  • United States – AI is not being developed for what’s best in the public interest, but rather, it’s being developed for commercial applications to earn revenue.

A key difference between US and Chinese surveillance is that China is transparent about it.

Virginia Eubanks

“The future is already here – it’s just not very evenly distributed.” – William Gibson

  • Rich people get the fancy tools first; they reach the poor last.
  • “What I found is that the most punitive, most invasive, most surveillance-focused tools that we have, they go into poor and working communities first. And then, if they work, after being tested in this environment where there’s low expectation that people’s rights will be respected, then they get ported out to other communities.”

Safiya Umoja Noble

Nearly 4 million Americans lost their homes in the 2008 financial crisis.

  • During the mortgage crisis, you had the largest wipeout of Black wealth in the history of the United States.

Amazon

Less than 14% of A.I. researchers are women.

UnitedHealth

“You had to live – did live, from habit that became instinct – in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized.”

“On each landing, opposite the lift-shaft, the poster with the enormous face gazed from the wall. It was one of those pictures which are so contrived that the eyes follow you about when you move. BIG BROTHER IS WATCHING YOU, the caption beneath it ran.” – 1984, George Orwell

Facebook

  • In 2010, Facebook decided to experiment on 61 million people. You either saw “It’s Election Day” text, or you saw the same text with tiny thumbnails of the profile pictures of your friends who had clicked “I Voted.” Facebook then matched people’s names to voter rolls. The message was shown once, and by showing that slight variation just once, Facebook moved 300,000 more people to the polls.
  • The 2016 US election was decided by about 100,000 votes. One Facebook message, shown just once, could easily turn out three times the number of people who swung the US election in 2016.


Movies

  • Minority Report (2002) – Tom Cruise

All the Best in your quest to get Better. Don’t Settle: Live with Passion.


