
Wednesday, November 1, 2023

Literary Hub

"AI, Ain't I A Woman?" On the Blindness and Limitations of Artificial Intelligence

Joy Buolamwini

Nov 1

I sensed an opening. Research papers could reach academics and industry practitioners focused on AI, but I needed something more to reach everyday people. I also needed to reach decision-makers like elected officials who might be seduced by the promises of AI to bring increased efficiency without being aware of racial, gender, and other types of bias. Did the government officials in India exploring the adoption of the Aadhaar system know about the potential for bias in the biometric solutions being offered as answers for efficient distribution of government resources and persistent identification? Did they know algorithmic bias might deny benefits to the very people they sought to help? What about the police departments adopting facial recognition technologies? What did they know about algorithmic bias, if anything? I knew I couldn't leave it to the companies selling these systems to reveal their flaws. There was no incentive to put technological shortcomings in a sales pitch. I needed to humanize the harms and biases of AI systems and bring a perspective that tech companies were likely to shy away from. How might I use my knowledge to help people see beyond the headlines now being written about my work, "Facial Recognition Is Accurate, If You're a White Guy," and feel the impact on a specific person?

I decided one way to humanize AI biases and make the topic more mainstream than an academic paper was to test the faces of the Black Panther cast. Since my research had shown that the systems I tested worked worst on the faces of darker-skinned females, I decided to focus on the faces of the women of Wakanda: Lupita Nyong'o as Nakia, Letitia Wright as Shuri, Angela Bassett as Queen Ramonda, and Danai Gurira as fearless General Okoye. I brought on Deborah Raji as my research intern to carry out a small-scale audit running the Black Panther cast's faces across the AI systems of five companies. This exploration became known as the Black Panther Face Scorecard project. The project revealed some commonalities with my own experience. Like me, some of their faces were misgendered, not detected at all, or in some cases mis-aged. Angela Bassett, who was in her late fifties at the time of the photo, was estimated by IBM's system to be between eighteen and twenty-four years old. (Maybe not all algorithmic bias was that bad.)


The results were amusing. The Black Panther Face Scorecard drew smiles from colleagues and visitors from member companies of the MIT Media Lab. These fictional characters, played by actors whose faces had reached billions of people, still felt safely removed from everyday life. While more women were rocking shaved heads, not many people were walking around with vibranium undershirts or bracelets with ammunition to keep superhero relatives safe. At least, this wasn't happening in my social circles.

The performance metrics on the women of Wakanda kindled my curiosity. How would these AI systems work on the faces of not just fictional dark-skinned women but iconic women of today and yesterday? How might AI read the faces of highly photographed women like Michelle Obama, Serena Williams, and Oprah Winfrey?

Screenshot of Oprah Winfrey image misclassification, from the visual poem "AI, Ain't I A Woman?" Youtu.be/QxuyfWoVV98?t=133.

And how would it do on historic figures like Sojourner Truth, who escaped slavery by buying her freedom and pushed for women's rights and the abolition of slavery? I was also eager to try the faces of Shirley Chisholm, the first Black congresswoman, and fearless journalist Ida B. Wells. I searched online for popular, widely used images of these women, which Deborah Raji ran through systems that included IBM, Amazon, and Microsoft. When she shared the results, I was astonished.

Screenshot of Sojourner Truth image misclassification, from the visual poem "AI, Ain't I A Woman?" Youtu.be/QxuyfWoVV98?t=39.

Looking at just the names with the results in a spreadsheet was one thing. Seeing the faces of women I admired and respected next to labels containing wildly incorrect descriptions like "clean shaven adult man" was a different experience. I kept shaking my head as I read over the results, feeling embarrassed that my personal icons were being classified in this manner by AI systems. When I saw Serena Williams labeled "male," I recalled the questions about my own gender when I was a child ("Are you a boy or a girl?"). When I saw an image of a school-aged Michelle Obama labeled with the descriptor "toupee," I thought about the harsh chemicals put on my head to straighten my kinky curls, until I decided to embrace my natural hair. And seeing the image of a young Oprah labeled with no face detected took me back to my white mask experience.

For a while, I tried to remain detached from my research findings, which indicated that all systems tested worked worst for dark-skinned females. The research touched on other groups that also warranted attention, like darker-skinned males and lighter-skinned females. With the latest exploration of women I admired, I had an opportunity to bring dark-skinned women like me to the center stage. I had the power to put faces to what might otherwise be story-less silhouettes.

My first instinct was to create an explainer video like the one I made for the "Gender Shades" research paper. Doing that was familiar and comfortable. It allowed me to show some of the outrageous results from the position of an analyst explaining how the results reflected misogynoir, the term coined by Dr. Moya Bailey meaning the ways Black women, specifically, are insulted or discriminated against.

After writing the draft script for an explainer video on these iconic women, I showed it to a teaching assistant in a film class I visited periodically and asked how I could improve it. "What motivated you to work on it?" he asked me.

"The research paper is the beginning of a conversation, but the results are abstract. I do not want to subtract the humanity of the feeling of being misgendered, being labeled in ways beyond your control. I want people to see what it means when systems from tech giants box us into stereotypes we hoped to transcend with algorithms. I want people to bear witness to the labels and peer upon the coded gaze for themselves."

As I spoke, he nodded his head.

"Have you considered making a poem about this instead of a script?"

For years, there was a form of art I indulged in but kept largely hidden. I had notebooks and digital diaries filled with verses and phrases. Snippets of my poetry dwelled in shadowy places. I enjoyed writing, but it was mostly a private, vulnerable exercise: I'd intended to keep my poetry mostly to myself and a tight circle of sympathetic ears.


When the sunlight warmed me awake the next morning, the following phrase sat in my head, capturing how I felt about witnessing the cultural impact of Serena Williams, Michelle Obama, and Oprah Winfrey walking in their paths:

My heart smiles as I bask in their legacies

knowing their lives have altered many destinies.

As I brushed my teeth and looked into a fogged mirror, more words came into focus:

In her eyes, I see my mother's poise

In her face, I glimpse my auntie's grace

As I ruminated on the work more lines came to me:

Can machines ever see my queens as I view them?

Can machines ever see our grandmothers as we knew them?

My poem "AI, Ain't I A Woman?" was born. The piece held the emotions I had long suppressed. When I spoke the words of the poem aloud, my anguish and disappointment emerged. But for the full impact, the words needed to be paired with the images and disheartening labels that were slapped onto these iconic women by AI systems from leading tech companies. Part of what made the white mask demo more powerful than words alone was seeing me alter myself by donning a white mask to be made visible to a machine.

Until making the white mask fail demo, I thought of tech demonstrations as celebrations of what machines could do. If a demonstration included a failure, the demo gods had failed you. I thought of the way Steve Jobs, robed in a black turtleneck, not only talked about the possibilities of an iPhone but demonstrated the capabilities with carefully selected examples to tantalize onlookers and change the conception of what a cellphone could be. His words mattered, and so did seeing a simple gesture opening an application or switching screen views. Showcasing what his words meant completed the seduction. The Apple demos were a pathway into transforming existing beliefs about technology.

I was doing something similar but in the opposite direction. There were plenty of examples to show the possibilities of tech. I was collecting examples to show the limitations. My collection of failure demonstrations provided a counterpoint to the celebrations that accompanied technological advances.

The white mask failure I recorded was an example of what I call a counter-demo. But what exactly is a counter-demo countering? With the case of the white mask, I was providing a counter-narrative to the research and accompanying headlines lauding advances in computer vision. With "AI, Ain't I A Woman?" I decided to record screencasts to create counter-demos. These demonstrations countered the supposed sophistication of AI systems being eagerly sold. I assumed commercially sold products from these companies would perform fairly well on most people's faces if they were being sold to a wide market.

At the time, these companies had online demos of their AI product capabilities that were publicly available so anyone with some time, an internet connection, and a photo could upload an image and see how the demos worked. To make counter-demos, I screen recorded my visits to these websites and sat through loading animations of rotating wheels that preceded the display of results. Some included colored boxes that would be used to locate a head in an image. All had some type of description about what the uploaded images contained. When I uploaded an image of Sojourner Truth to Google's system, it returned the label "gentleman." Truth had fought to be treated on equal footing with a gentleman but was also vocal in saying that she too was a woman. Her famous 1851 "Ain't I A Woman?" speech inspired the name of my spoken word algorithmic audit. Truth was also in the business of sharing counter-demos to large audiences to demolish dangerous narratives.

__________________________________

From the book Unmasking AI: My Mission to Protect What Is Human in a World of Machines by Joy Buolamwini. Copyright © 2023. Published by Random House, an imprint and division of Penguin Random House LLC. All rights reserved. 
