Gulf News

Tech has become a way for men to oppress women

We act as if technology were neutral but it’s not. The challenge now is to highlight and remove the gender bias, as technological development is undermining the cause of women’s equality

- By Lizzie O’Shea

“Most women in the San Francisco Bay Area are soft and weak, cosseted and naive, despite their claims of worldliness, and generally full of (expletive),” wrote former Facebook product manager Antonio Garcia Martinez in 2016. “They have their self-regarding entitlement feminism, and ceaselessly vaunt their independence. But the reality is, come the epidemic plague or foreign invasion, they’d become precisely the sort of useless baggage you’d trade for a box of shotgun shells or a jerry can of diesel.” This is from his insider account of Silicon Valley, Chaos Monkeys. The book was a best-seller. The New York Times called it “an irresistible and indispensable 360-degree guide to the new technology establishment”. Anyone who is surprised by the recent revelations of sexism spreading like wildfire through the technology industry has not been paying attention.

When Susan Fowler wrote about her experience of being sexually harassed at Uber, it prompted a chain of events that seemed unimaginable months ago, including an investigation led by former attorney general Eric Holder, and the departure of a number of key members of the company’s leadership team. Venture capitalist Justin Caldbeck faced allegations of harassing behaviour, and when he offered an unimpressive denial, companies funded by his firm banded together to condemn his tepidity. He subsequently resigned, and the future of his former firm is unclear. Since then, dozens of women have come forward to reveal the sexist culture in numerous Silicon Valley technology and venture capital firms.

At least this issue is being discussed in ways that open up the possibility that it will be addressed. But the problem of sexism in the tech industry goes much deeper and wider.

American academic Melvin Kranzberg’s first law of technology tells us that technology is neither inherently good nor bad, nor is it neutral. As a black mirror it reflects the problems that exist in society — including the oppression of women. Millions of people bark orders at Alexa every day, but rarely are we encouraged to wonder why the domestic organiser is voiced by a woman. The entry system for a women’s locker room in a gym recently refused entry to a female member because her title was “Dr”, and it categorised her as male.

But the issue is not only that technology products reflect a backward view of the role of women. They often also appear ignorant or indifferent to women’s lived experience. As the internet of things expands, more devices in our homes and on our bodies are collecting data about us and sending it to networks, a process over which we often have little control. This presents profound problems for vulnerable members of society, including survivors of domestic violence.

Threats by abusers

Unsurprisingly, technology is used by abusers: In a survey of domestic violence services organisations, 97 per cent reported that the survivors who use them have experienced harassment, monitoring, and threats by abusers through the misuse of technology. This often happens on phones, but 60 per cent of those surveyed also reported that abusers have spied or eavesdropped on the survivor or children using other forms of technology, including toys and other gifts.

Products that are more responsive to the needs of women would be a great start. But we should also be thinking bigger: We must avoid reproducing sexism in system design. Spatial relationships between words, learned as so-called word embeddings, are used in natural language processing so that computers can engage with us conversationally. By reading a lot of text, a computer can learn that Paris is to France as Tokyo is to Japan. It develops a dictionary by association.

But this can create problems when the world is not exactly as it ought to be. For instance, researchers have experimented with one of these word-embedding models, Word2vec, a popular and freely available model trained on three million words from Google News. They found that it produces highly gendered analogies. For instance, when asked “Man is to woman as computer programmer is to?”, the model will answer “homemaker”. Or for “father is to mother as doctor is to?”, the answer is “nurse”. It is not hard to imagine how this model could also be racially biased.
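The arithmetic behind these analogy queries can be sketched in a few lines of Python. The tiny hand-made vectors below are illustrative assumptions standing in for a real trained model such as Word2vec, with one dimension crudely playing the role of a learned “gender” association:

```python
# Toy sketch of the vector-offset analogy test ("A is to B as C is to ?")
# used with word-embedding models such as Word2vec. These 2-D vectors are
# illustrative assumptions, not real trained embeddings: dimension 0
# stands in for "gender", dimension 1 for "occupation-ness".
from math import sqrt

emb = {
    "man":        (1.0, 0.0),
    "woman":      (-1.0, 0.0),
    "programmer": (1.0, 1.0),
    "homemaker":  (-1.0, 1.0),
    "nurse":      (-1.0, 0.9),
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def analogy(a, b, c):
    """Answer 'a is to b as c is to ?' via the offset vector b - a + c."""
    target = tuple(eb - ea + ec for ea, eb, ec in zip(emb[a], emb[b], emb[c]))
    candidates = [w for w in emb if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(emb[w], target))

print(analogy("man", "woman", "programmer"))  # prints "homemaker"
```

In a real model the same query runs over vectors learned from billions of words of text, which is exactly where the biased associations the researchers found come from: the model faithfully reproduces the patterns in its training data.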

These biases can be amplified during the process of language learning. As the MIT Technology Review points out: “If the phrase ‘computer programmer’ is more closely associated with men than women, then a search for the term ‘computer programmer CVs’ might rank men more highly than women”. When this kind of language learning has applications across fields including medicine, education, employment, policymaking and criminal justice, it is not hard to see how much damage such biases can cause.

Removing such gender bias is a challenge, in part because the problem is inherently political: Word2vec entrenches the world as it is, rather than what it could or should be. But if we are to alter the models to reflect aspirations, how do we decide what kind of world we want to see?
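One published approach to this, sometimes called “hard debiasing” (Bolukbasi et al., 2016, the study behind the “homemaker” example), identifies a gender direction in the embedding space and removes its component from words that ought to be gender-neutral, such as occupations. A toy sketch, again using hand-made 2-D vectors as stand-ins for trained embeddings:

```python
# Toy sketch of debiasing by projection ("hard debiasing"): subtract the
# component of an occupation word along a learned gender direction. The
# 2-D vectors are illustrative assumptions; dimension 0 stands in for
# the "gender" association a real model would learn from text.
man = (1.0, 0.5)
woman = (-1.0, 0.5)
programmer = (0.8, 1.0)

# Gender direction: the normalised difference between "man" and "woman".
diff = tuple(m - w for m, w in zip(man, woman))
norm = sum(d * d for d in diff) ** 0.5
g = tuple(d / norm for d in diff)

# Neutralise: remove the component of "programmer" along that direction.
proj = sum(p * gi for p, gi in zip(programmer, g))
debiased = tuple(p - proj * gi for p, gi in zip(programmer, g))
print(debiased)  # gender component (dimension 0) is now 0.0
```

The technical step is simple; the political question of which words should be neutralised, and towards what kind of world, is the hard part the author raises.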

Digital technology offers myriad ways to put these understandings to work. It is not bad, but we have to challenge the presumption that it is neutral. Its potential is being explored in ways that are sometimes promising, often frightening and amazing. To make the most of this moment, we need to imagine a future without the oppressions of the past. We need to allow women to reach their potential in workplaces where they feel safe and respected. But we also need to look into the black mirror of technology and find the cracks of light shining through.

Lizzie O’Shea is a human rights lawyer, broadcaster and writer.

