Racial Equity in Everyday Products
Google’s AIUX team on creating more equitable AI-related products
I remember the day I got my first smart assistant. I couldn’t believe that technology was now so advanced that a device could hear my voice, answer questions, and respond to my demands!
I activated it. I asked in my normal, Black girl voice:
“What’s da weather gonna be like today?”
Silence.
Then it replied: “I’m sorry. I’m having difficulty understanding what you’re saying.”
That’s weird. Checking the weather was one of the main things this was advertised to do.
I tried again and again. I enunciated differently. I changed my pitch. In a final exasperated attempt, I asked the assistant in the most stereotypically white-passing American voice I could muster:
“What’s the weather going to be like today?”
Then it worked! My smart assistant confidently told me my location, the weather, and even to have a nice day.
Despite this minor triumph, something about the experience was deeply upsetting. In order to use this product, I had to code-switch, shifting my style of speaking and ultimately changing who I was. This was the moment I realized that technology isn’t accessible to everyone—that’s still just an aspiration. Even now, certain people can easily be left out if they don’t speak, act, or look a certain way.
Growing up in a biracial household, I’ve always been interested in the ways race plays a role in our daily interactions, relationships, and experiences. At Google, I now lead several efforts to understand and address the ways in which AI can amplify existing forms of inequality, discrimination, and bias.
Looking back to that first day with my smart assistant, I recognize that my poor experience with the speech technology driving voice-controlled applications—Automated Speech Recognition (ASR)—is not uncommon. Recent research suggests that, in certain situations, ASR systems can have word error rates up to two times higher for Black speakers (who speak African-American Vernacular English, or AAVE) than for white speakers (who typically speak Standard American English).
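To make that statistic concrete, here is a minimal sketch of how word error rate (WER), the standard ASR metric behind comparisons like “two times higher,” is typically computed: the word-level edit distance between what the speaker actually said and what the system transcribed, divided by the number of words spoken. The transcripts in the usage example are invented for illustration, not drawn from the research.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance (substitutions,
    insertions, deletions) divided by the reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# Hypothetical example: an AAVE utterance "standardized" by the recognizer.
print(wer("what's da weather gonna be like today",
          "what's the weather going to be like today"))  # ~0.43
```

A system that routinely rewrites or misses a speaker’s words this way will rack up a far higher WER for that speaker, which is exactly the kind of gap the research above describes.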
In a diary study I co-led with Senior UX Researcher Michal Lahav, Black users of voice technology in the United States explicitly called out these unfair experiences, describing the technology as “racially insensitive” and “not culturally inclusive.” African-Americans already face a long history of linguistic discrimination, even though AAVE is a celebration of African-American resilience, strength, and culture. Against that backdrop, these alienating experiences amplify feelings of “otherness” in Black users’ own homes and spaces.
Unfortunately, voice technology is not the only place where Black users experience bias. Examples of companies creating AI without thinking of downstream effects are not rare—like how risk assessment algorithms designed to predict recidivism can wrongfully send more Black people to prison, or how AI credit ratings for calculating loan eligibility can inadvertently recreate a high-tech version of redlining. Each instance furthers bias. As industry practitioners, we can and must do better. Incorporating a racial equity lens into our design practice is one way to start.
Nothing about us without us
The most important way to build equitable products is to involve a diverse group of teammates and participants. On the AIUX team, we’re constantly striving to do better.
“Nothing about us without us” describes how no policy or product should be decided without the direct participation of those affected by it. Too often, our industry tries to design global products with teams that don’t represent a global perspective. We must strive to hire teammates with a range of backgrounds and experiences, as well as engage and consult with communities throughout the course of research, design, and prototyping.
One example of how the AIUX team[1] puts this principle into practice is our Automated Speech Recognition (ASR) study. In partnership with Carnegie Mellon University’s Human-Computer Interaction Institute, Stanford’s Department of Linguistics, and Black community members, we’re engaged in community-based participatory research to address problems with voice technology and bring Black voices to the forefront. By actively integrating the community’s expertise to address the disparities we see in ASR systems, all partners can contribute, share in decision-making, and take ownership of the work.
Own impact as much as intention
Even with the best intentions to make products inclusive, teams can commit missteps.
Take facial recognition, for example. To ensure an engaging, positive experience for everyone, datasets must contain representative samples of all the people using the product. For a smart camera, that means datasets should include a wide range of darker skin tones, so that darker-skinned individuals can experience every device feature just as non-marginalized groups do. That’s good, right? But collecting that data can also raise ethical concerns. The same technology that enables can cause harm in the wrong hands, and can even contribute to existing problems like over-policing and surveillance.
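One practical way teams make these representation gaps visible is to report evaluation metrics disaggregated by group, rather than as a single aggregate number that can hide unequal performance. Here is a minimal sketch of that idea; the group labels and `results` records below are hypothetical, not from any real product evaluation.

```python
from collections import defaultdict

# Hypothetical evaluation records: (skin-tone group, prediction correct?).
results = [
    ("lighter", True), ("lighter", True), ("lighter", True), ("lighter", False),
    ("darker", True), ("darker", False), ("darker", False), ("darker", False),
]

totals = defaultdict(int)
correct = defaultdict(int)
for group, ok in results:
    totals[group] += 1
    correct[group] += ok  # True counts as 1, False as 0

# Report accuracy per group instead of one aggregate number.
for group in sorted(totals):
    acc = correct[group] / totals[group]
    print(f"{group}: n={totals[group]}, accuracy={acc:.2f}")
```

In this toy data, the aggregate accuracy of 0.50 would hide that the “darker” group sees 0.25 while the “lighter” group sees 0.75—precisely the kind of disparity an aggregate-only report never surfaces.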
Even if teams don’t intend to harm marginalized communities, it’s essential to recognize that it’s not just about intentions. As author Ruha Benjamin points out in Race After Technology, it’s not that programmers are sinister and racist themselves; it’s that the desire for objectivity, efficiency, profitability, and progress fuels the pursuit of technical fixes that inadvertently create more inequity.
Doing right by users requires taking responsibility for impact as much as intention, and being mindful at each step about potential downstream effects.
Redistribute power
Often, UX processes are rooted in approaches that create a top-down power dynamic: a small group of people designs what’s used at massive scale and dictates norms accordingly. When we think of “good design” or “good products,” we often simply replicate the tastes of those we consider “experts” in the industry (tastes that are canonically Western-centric and homogenous) and the cultural inputs we’ve been given. Design in particular has struggled with a creative savior complex, rather than actually aligning our talents with vulnerable communities’ needs and increasing their power.
Instead, consider how a team can open up its design process to empower users, and even let those communities influence or own the final product. Within the design process itself, include co-creation or participatory design methods, and make sure that people from communities tech doesn’t typically serve first are included in, or co-guiding, the process. For example, in our own work in agriculture, we engage directly with smallholder farmers in the Global South to understand their changing needs and challenges. In doing so, we try to ensure the resulting products, services, and experiences reflect the needs of the local community.
Design with broader systems in mind
At its heart, racism isn’t just about individual discrimination. Conversations about racism and privilege are about power—who has historically wielded the power to shape the world we live in, who has set our defaults around us and created the systems we operate in, and who hasn’t benefited from those defaults.
As employees at a tech company, we have the privilege to build for a wide audience. But just like that first experience with my assistant, it’s evident that not all audiences are always represented in the final product. We can’t expect technologists to truly build for everyone until we address and examine whose needs we default to serving, and whose needs we end up ignoring.
Antionette Carroll said it best: “Think of our work as systems-building rather than object-building.” Before every project, our team asks: Is what we’re making, and how we’re making it, upholding the existing system, or challenging it to create more equitable outcomes? How can we make it more of the latter?
The entire industry and all our processes were created by people; have inherited the biases of people; and ultimately, can be changed by people. By all of us. It’s an ongoing process, and a conversation our team will continue, so that we can truly build for everyone.
Further reading
- Building For Everyone: Expand Your Market With Design Practices From Google's Product Inclusion Team (2020), Annie Jean-Baptiste
- Building Socially-Inclusive Design Systems (2019), Tatiana Mac
- Race After Technology (2019), Ruha Benjamin
- Algorithms of Oppression (2018), Safiya Noble
- Field Guide: Equity-Centered Community Design (2018), Creative Reaction Lab
- Weapons of Math Destruction (2016), Cathy O’Neil
- Design Justice (est. 2016), Sasha Costanza-Chock
NOTES
1. This study is a collaborative effort between Google, Stanford, and Carnegie Mellon. It’s led by Courtney Heldreth and Michal Lahav with support from Zion Mengesha, Tiffanie Horne, Aaron Donsbach, and Tiffany Deng.