Deutsche Welle (English edition)

The Smart Wife: Is your home voice assistant sexist?

Use a voice assistant? Is "she" a slave to white male prejudice? Researchers Yolande Strengers and Jenny Kennedy say AI-driven voice devices in the home reinforce sexist stereotypes. But they've got a plan.

- Zulfikar Abbany conducted the interview.

DW: There's been a fair bit of writing and discussion about gender stereotyping and other bias with voice assistants and similar artificial intelligence systems. And your book The Smart Wife: Why Siri, Alexa and Other Smart Home Devices Need a Feminist Reboot addresses that. Before we get into your recommendations, tell us why voice devices seem so prone to bias?

Yolande Strengers: It's not so much bias as it is a deliberate strategy to help people learn to like these devices, to accept them and to welcome them into our homes and into our lives. It just so happens that when feminine stereotypes are attached to objects that are designed to perform traditional feminine tasks, and that take the form of an AI in our homes, we're more likely to be comfortable with them if they have that female form. So, it's not completely unintentional, which is what unconscious bias often assumes. It's more of a [commercial] strategy, which makes a lot of sense, really, as to why this is happening.

So, the idea of using male voices or genderless, mid-frequency range voices, how does that affect the situation from your perspective?

Jenny Kennedy: It doesn't really address one of the main problems, which is the type of work that these devices are brought into the home to do and the most appropriate ways of doing that kind of labor.

It's very calculated and algorithmically managed. But it's also about how that kind of "wife work" is valued in the home, and the way in which we have previously valued or undervalued the people that are doing that work.

In the book, we mention voice assistants used in other contexts. But we're looking specifically at voice assistants in the home. And there's still this very rigid and limited ideal of what the home is, and the roles people play. We still operate on the basis of 2.4 people and assume heteronormative relations between the adult couple. It's that context that makes all these devices really problematic, because they are reenergizing that outdated ideal.

And you're calling for a "feminist reboot." What is a feminist reboot in practical terms and how would it help here?

YS: The feminist reboot is a set of proposals that Jenny and I make for how we can improve the situation. We look at this across the spectrum of industries and ideas, from how the devices are designed and the personalities they have, right through to how they are represented in the media. We talk about the way we often blame the feminine device rather than the companies that make them, and how that reinforces negative stereotypes towards women.

But we also look at the sci-fi industry and how the representation of smart wives on screen provides the inspiration for what we end up with in our homes. We also need to change and challenge the film industry to come up with new imaginings of what these kinds of creations or helpers could be, to help inspire the roboticists, AI developers and the whole computing industry.

The European Commission is introducing new default requirements under its Horizon Europe funding program from 2021 that all grant recipients include — what they are calling — the "gender dimension." The responsible commissioner, Mariya Gabriel, says that "integrating sex and gender-based analysis into research and innovation, and [considering] intersecting social categories such as ethnicity, age or disability, is a matter of producing excellent research to the benefit of all European citizens." That covers health, urban planning, climate change, AI and machine learning, facial recognition and "virtual assistants and chatbots: analysing gender and intersectionality in social robots." What do you think about that? Will it help?

YS: It's a fantastic move. We know that so often gender just isn't considered, along with a range of other important intersectional issues. It reminds me of the book Invisible Women by Caroline Criado-Perez, which documents at length all the various ways in which women are invisible in research and data analysis.

But it can't just be a case of having a diverse range of people on a team, or can it? Take the European Space Agency, for example: Among the top 11 management positions, there's just one woman. Is that necessarily a problem?

JK: You can't really claim to have explored all perspectives if you have a limited number of people who are able to put forward their perspective. It's about having a diversity of people at all levels. So, I do think it's a problem when, in a group of 11, there's only one woman. And it's not just that group; it's everything that feeds into there being only one woman able to place themselves in that position.

It's about paying attention to matters of diversity and gender from the very beginning, rather than building a product and then having the gender and diversity team come in to do a usability patch up on it.

The other issue is that of trust. And European politics wants stronger standards in that area. How do standards help to improve trust in technology?

JK: One of the proposals we have in the reboot section is a form of verification to provide users with some confidence in the design process behind the type of device they're using, extending to the kinds of representations a device is going to perpetuate in their household.

For instance, is this device capable of assisting all persons in the household, or is it only really suitable for a particular user, namely a white man?

And this is where the female persona and the voice comes in, because often what people are talking about is whether they trust the device in the home. But the bigger question is whether they trust the corporation behind the woman, the corporation that is harvesting all their data — that is, a larger commercial machine that the female voice helps to obscure.

The Smart Wife: Why Siri, Alexa and Other Smart Home Devices Need a Feminist Reboot, by Yolande Strengers, associate professor in the Department of Human-Centred Computing at Monash University, and Dr. Jenny Kennedy of RMIT, Melbourne, is published by MIT Press (2020).

[Photo caption] Intelligent home assistants are practical, but are they reinforcing negative gender stereotypes?
