Although it might’ve seemed like something out of The Jetsons a decade ago, many of us now casually hold up our smartphones to ask Siri a question. These days, intelligent virtual assistants — or artificial intelligence (A.I.) assistants — are commonplace. Using an A.I. system, these assistants emulate human interactions and complete tasks for us. For example, Microsoft’s Cortana can sift through search engine results and set reminders; nameless, voice-activated assistants read off our in-vehicle GPS directions; and Amazon’s ever-popular Alexa can emcee your living room entertainment setup.
Needless to say, everyone from Apple to BMW is investing in this tech, making virtual assistants rather ubiquitous. “We’re at the point now where these artificial intelligence-driven interactions are more than just a feature – they’re the products themselves,” Phil Gray, executive vice president of business development at Interactions, writes. “Consumers are comfortable speaking with ‘a robot’ when the productivity and convenience value is there.” That is, asking Siri to look something up no longer holds that Jetsons-esque novelty — nor is it the latest party trick. Instead, voice-activated assistants are expected.
Some of the most popular intelligent virtual assistants — Siri, Cortana and Alexa — have something else in common: Their default English-language voices are all made possible thanks to women. The woman behind Apple’s Siri is veteran voice actor Susan Bennett. Cortana, which is based on an artificial intelligence character from Microsoft’s Halo franchise, is voiced by Jen Taylor, just as in the video game series. And, although Alexa isn’t any real person’s voice — it’s generated instead through text-to-speech (TTS) and artificial intelligence technology — Amazon’s assistant has, arguably, the most traditionally gendered name of all the popular A.I. assistants out there. And this raises several red flags.
From Siri to Alexa: Gender Bias Exists in A.I.
While names and the sound of one’s voice shouldn’t be gendered in the first place, there’s no denying that our society, mired in gender binarism, has allowed gender bias to seep into A.I. More than half of American smart speaker users have an Alexa device, and 47% of all smartphone users in the U.S. currently have iPhones (that number jumps to 83% for teenagers) — and, by extension, access to Siri. How have tech companies addressed this trend of coding intelligent virtual assistants as female? Well, Siri and Google Assistant can be changed to use a male-coded voice. (That is, a voice that expresses what are typically thought of in Western society as masculine traits.) Meanwhile, Amazon is employing celebrities, like Michael B. Jordan, to lend their vocal talents to Alexa. But that’s simply not enough.
“An algorithm is an opinion expressed in code,” Ivana Bartoletti, founder of the networking group Women Leading in A.I., told The Guardian. “If it’s mostly men developing the algorithm, then of course the results will be biased… You’re teaching the machine how to make a decision.” All of this is to say: intelligent voice assistants are doing the not-so-smart thing of continuing to normalize the subservience of women, especially since the bulk of our exchanges with these assistants boils down to commands.
“This is a powerful socialization tool that teaches us about the role of women, girls, and people who are gendered female to respond on demand,” USC sociology professor Dr. Safiya Umoja Noble told New York Magazine. Fortunately, researchers have created a way to subvert this bias. Enter: Q, the first genderless voice.
Meet Q: The First Genderless Voice
Developed by a team of researchers, sound designers and linguists — along with the folks behind Copenhagen Pride week — Q is part of an initiative called Equal A.I., which hopes to “write and right the future.” Part of that goal means undoing implicit bias and creating algorithms that don’t reflect human bigotry. “One of our big goals with Q was to contribute to a global conversation about gender, and about gender and technology and ethics,” Julie Carpenter, an expert in human behavior and emerging technologies, told NPR. “[We at Project Q want to understand] how to be inclusive for people that identify in all sorts of different ways.”
According to Carpenter, users might not even be aware of their bias but will nonetheless show an affinity for giving commands to female-coded A.I. while preferring male-coded A.I. when they need something authoritative. Here, we’re thinking of IBM’s question-answering computer Watson, which notably competed on Jeopardy! and, through its intelligence, was meant to exude a kind of authority; unlike those in-pocket assistants, Watson has a voice that’s traditionally read as male.
In a blog post titled “Robots Should Not Be Gendered,” Alan Winfield, co-founder of the Bristol Robotics Laboratory at the University of the West of England, Bristol, wrote that “Whether we like it or not, we all react to gender cues. So whether deliberately designed to do so or not, a gendered robot will trigger reactions that a non-gendered robot will not.” For Winfield and others in the field, gender is a construct — one that brings about numerous ethical concerns, most of which are related to gender bias.
All of this brings us back to Q. While Q isn’t available on your phone or smart home device yet, you can listen online to what’s being touted as “the first genderless voice.” So, how did the folks behind Project Q settle on their sound? The first step was to record dozens of people — nonbinary folks, trans and cis women, and trans and cis men. At first, Nis Nørgaard, one of the sound designers, thought they might layer the voices, thus merging them into “some kind of average,” but this method didn’t pan out.
Instead, Nørgaard homed in on a voice that registered between what our binary-cultured brains would consider masculine and feminine — something based largely on frequency, or pitch. “It was really tricky, because your brain can tell if the voice has been pitched up and down,” Nørgaard told Wired. “It was difficult to work with these voices without destroying them.” Four variants of Q were played for a survey of 4,500 people and, based on their feedback, the developers landed on what is now Q’s genderless, or gender-neutral, voice.
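To make the pitch part of that process concrete, here is a rough, hypothetical sketch of the kind of analysis described above: estimate a recording’s fundamental frequency and check whether it falls in the band frequently reported as gender-ambiguous, roughly 145 to 175 Hz. This is not Project Q’s actual tooling; the open-source librosa library, the input file name and the exact frequency band are all assumptions made for illustration.

```python
# Illustrative only: estimate a voice recording's median pitch and check whether
# it sits in the roughly 145-175 Hz band often described as gender-ambiguous.
# Assumes the open-source librosa library and a hypothetical "voice_sample.wav".
import librosa
import numpy as np

GENDER_AMBIGUOUS_HZ = (145.0, 175.0)  # assumed "in-between" pitch band

def median_pitch_hz(path: str) -> float:
    """Return the median fundamental frequency (f0) over voiced frames."""
    y, sr = librosa.load(path, sr=None)          # keep the native sample rate
    f0, voiced_flag, _ = librosa.pyin(
        y,
        fmin=librosa.note_to_hz("C2"),           # ~65 Hz lower bound
        fmax=librosa.note_to_hz("C6"),           # ~1047 Hz upper bound
        sr=sr,
    )
    return float(np.nanmedian(f0[voiced_flag]))  # ignore unvoiced frames

if __name__ == "__main__":
    pitch = median_pitch_hz("voice_sample.wav")  # hypothetical input file
    low, high = GENDER_AMBIGUOUS_HZ
    verdict = "inside" if low <= pitch <= high else "outside"
    print(f"Median pitch: {pitch:.1f} Hz ({verdict} the {low:.0f}-{high:.0f} Hz band)")
```

Nørgaard’s point that listeners can hear when a voice “has been pitched up and down” is also why simply shifting a recording into that band (for example, with librosa.effects.pitch_shift) tends to sound processed, which presumably pushed the team toward voices that already sat close to the target range.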
While Q and other gender-neutral intelligent virtual assistants have yet to be integrated into mainstream tech, it’s clear that the move away from the gender binary can be bolstered by advances in technology. “[Q] plays with our urge to put people into boxes,” Project Q linguist Kristina Hultgren told Wired, “and therefore has the potential to push people’s boundaries and broaden their horizons.”