"AI, Ain't I A Woman?" On the Blindness and Limitations of Artificial Intelligence
By Joy Buolamwini, via Literary Hub
I sensed an opening. Research papers could reach academics and industry practitioners focused on AI, but I needed something more to reach everyday people. I also needed to reach decision-makers like elected officials who might be seduced by the promises of AI to bring increased efficiency without being aware of racial, gender, and other types of bias. Did the government officials in India exploring the adoption of the Aadhaar system know about the potential for bias in the biometric solutions being offered as answers for efficient distribution of government resources and persistent identification? Did they know algorithmic bias might deny benefits to the very people they sought to help? What about the police departments adopting facial recognition technologies? What did they know about algorithmic bias, if anything? I knew I couldn't leave it to the companies selling these systems to reveal their flaws. There was no incentive to put technological shortcomings in a sales pitch. I needed to humanize the harms and biases of AI systems and bring a perspective that tech companies were likely to shy away from. How might I use my knowledge to help people see beyond the headlines now being written about my work, "Facial Recognition Is Accurate, If You're a White Guy," and feel the impact on a specific person?
I decided one way to humanize AI biases and make the topic more mainstream than an academic paper was to test the faces of the Black Panther cast. Since my research had shown that the systems I tested worked worst on the faces of darker-skinned females, I decided to focus on the faces of the women of Wakanda: Lupita Nyong'o as Nakia, Letitia Wright as Shuri, Angela Bassett as Queen Ramonda, and Danai Gurira as fearless General Okoye. I brought on Deborah Raji as my research intern to carry out a small-scale audit running the Black Panther cast's faces across the AI systems of five companies. This exploration became known as the Black Panther Face Scorecard project. The project revealed some commonalities with my own experience. Like mine, some of their faces were misgendered, not detected at all, or in some cases mis-aged. Angela Bassett, who was in her late fifties at the time of the photo, was estimated by IBM's system to be between eighteen and twenty-four years old. (Maybe not all algorithmic bias was that bad.)
The results were amusing. The Black Panther Face Scorecard drew smiles from colleagues and visitors from member companies of the MIT Media Lab. These fictional characters, played by actors whose faces had reached billions of people, still felt safely removed from everyday life. While more women were rocking shaved heads, not many people were walking around with vibranium undershirts or bracelets with ammunition to keep superhero relatives safe. At least, this wasn't happening in my social circles.
The performance metrics on the women of Wakanda kindled my curiosity. How would these AI systems work on the faces of not just fictional dark-skinned women but iconic women of today and yesterday? How might AI read the faces of highly photographed women like Michelle Obama, Serena Williams, and Oprah Winfrey?
And how would it do on historic figures like Sojourner Truth, who escaped slavery by buying her freedom and pushed for women's rights and the abolition of slavery? I was also eager to try the faces of Shirley Chisholm, the first Black congresswoman, and fearless journalist Ida B. Wells. I searched online for popular, widely used images of these women, which Deborah Raji ran through systems that included IBM, Amazon, and Microsoft. When she shared the results, I was astonished.
Looking at just the names with the results in a spreadsheet was one thing. Seeing the faces of women I admired and respected next to labels containing wildly incorrect descriptions like "clean shaven adult man" was a different experience. I kept shaking my head as I read over the results, feeling embarrassed that my personal icons were being classified in this manner by AI systems. When I saw Serena Williams labeled "male," I recalled the questions about my own gender when I was a child ("Are you a boy or a girl?"). When I saw an image of a school-aged Michelle Obama labeled with the descriptor "toupee," I thought about the harsh chemicals put on my head to straighten my kinky curls, until I decided to embrace my natural hair. And seeing the image of a young Oprah labeled with no face detected took me back to my white mask experience.
For a while, I tried to remain detached from my research findings, which indicated that all systems tested worked worst for dark-skinned females. The research touched on other groups that also warranted attention, like darker-skinned males and lighter-skinned females. With the latest exploration of women I admired, I had an opportunity to bring dark-skinned women like me to the center stage. I had the power to put faces to what might otherwise be story-less silhouettes.
My first instinct was to create an explainer video like the one I made for the "Gender Shades" research paper. Doing that was familiar and comfortable. It allowed me to show some of the outrageous results from the position of an analyst explaining how the results reflected misogynoir, the term coined by Dr. Moya Bailey for the specific ways Black women are insulted or discriminated against.
After writing the draft script for an explainer video on these iconic women, I showed it to a teaching assistant in a film class I visited periodically and asked how I could improve it. "What motivated you to work on it?" he asked me.
"The research paper is the beginning of a conversation, but the results are abstract. I do not want to subtract the humanity of the feeling of being misgendered, being labeled in ways beyond your control. I want people to see what it means when systems from tech giants box us into stereotypes we hoped to transcend with algorithms. I want people to bear witness to the labels and peer upon the coded gaze for themselves."
As I spoke, he nodded his head.
"Have you considered making a poem about this instead of a script?"
For years, there was a form of art I indulged in but kept largely hidden. I had notebooks and digital diaries filled with verses and phrases. Snippets of my poetry dwelled in shadowy places. I enjoyed writing, but it was mostly a private, vulnerable exercise: I'd intended to keep my poetry mostly to myself and a tight circle of sympathetic ears.
When the sunlight warmed me awake the next morning, the following phrase sat in my head, capturing how I felt about witnessing the cultural impact of Serena Williams, Michelle Obama, and Oprah Winfrey walking in their paths:
My heart smiles as I bask in their legacies
knowing their lives have altered many destinies.
As I brushed my teeth and looked into a fogged mirror, more words came into focus:
In her eyes, I see my mother's poise
In her face, I glimpse my auntie's grace
As I ruminated on the work, more lines came to me:
Can machines ever see my queens as I view them?
Can machines ever see our grandmothers as we knew them?
My poem "AI, Ain't I A Woman?" was born. The piece held the emotions I had long suppressed. When I spoke the words of the poem aloud, my anguish and disappointment emerged. But for the full impact, the words needed to be paired with the images and disheartening labels that were slapped onto these iconic women by AI systems from leading tech companies. Part of what made the white mask demo more powerful than words alone was seeing me alter myself by donning a white mask to be made visible to a machine.
Until making the white mask failure demo, I thought of tech demonstrations as celebrations of what machines could do. If a demonstration included a failure, the demo gods had failed you. I thought of the way Steve Jobs, robed in a black turtleneck, not only talked about the possibilities of an iPhone but demonstrated its capabilities with carefully selected examples to tantalize onlookers and change the conception of what a cellphone could be. His words mattered, and so did seeing a simple gesture opening an application or switching screen views. Showcasing what his words meant completed the seduction. The Apple demos were a pathway into transforming existing beliefs about technology.
I was doing something similar but in the opposite direction. There were plenty of examples to show the possibilities of tech. I was collecting examples to show the limitations. My collection of failure demonstrations provided a counterpoint to the celebrations that accompanied technological advances.
The white mask failure I recorded was an example of what I call a counter-demo. But what exactly is a counter-demo countering? In the case of the white mask, I was providing a counter-narrative to the research and accompanying headlines lauding advances in computer vision. With "AI, Ain't I A Woman?" I decided to record screencasts to create counter-demos. These demonstrations countered the supposed sophistication of AI systems being eagerly sold. I assumed commercially sold products from these companies would perform fairly well on most people's faces if they were being sold to a wide market.
At the time, these companies had online demos of their AI product capabilities that were publicly available so anyone with some time, an internet connection, and a photo could upload an image and see how the demos worked. To make counter-demos, I screen recorded my visits to these websites and sat through loading animations of rotating wheels that preceded the display of results. Some included colored boxes that would be used to locate a head in an image. All had some type of description about what the uploaded images contained. When I uploaded an image of Sojourner Truth to Google's system, it returned the label "gentleman." Truth had fought to be treated on equal footing with a gentleman but was also vocal in saying that she too was a woman. Her famous 1851 "Ain't I A Woman?" speech inspired the name of my spoken word algorithmic audit. Truth was also in the business of sharing counter-demos to large audiences to demolish dangerous narratives.
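For readers curious about the mechanics, here is a minimal sketch of the kind of audit loop described above. It assumes access to Amazon Rekognition (one of the commercial systems audited) through the boto3 library, and the portrait filenames are hypothetical stand-ins for the publicly available images; the counter-demos themselves were made by screen-recording the companies' public web demos, not by calling APIs like this.

# A minimal sketch, not the author's actual method: it assumes AWS
# credentials are configured and that local copies of the widely used
# portraits exist under the hypothetical filenames below.
import boto3

PORTRAITS = ["sojourner_truth.jpg", "ida_b_wells.jpg", "shirley_chisholm.jpg"]

rekognition = boto3.client("rekognition")

for path in PORTRAITS:
    with open(path, "rb") as f:
        image_bytes = f.read()

    # Request all facial attributes, including gender and age estimates.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    faces = response["FaceDetails"]
    if not faces:
        # "No face detected" -- one of the failure modes the audits surfaced.
        print(f"{path}: no face detected")
        continue

    for face in faces:
        gender = face["Gender"]["Value"]  # e.g. "Male" or "Female"
        age = face["AgeRange"]            # e.g. {"Low": 18, "High": 24}
        print(f"{path}: labeled {gender}, estimated age {age['Low']}-{age['High']}")

Logging each returned label next to the portrait it was assigned to, as in the spreadsheet described earlier, is what turns scattered misclassifications into an audit.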