Bangkok Post

Auction of Muslim women on Indian app stokes concern

Disturbing discovery shows tech being weaponised for abuse, writes Rina Chandran

- THOMSON REUTERS FOUNDATION

Six months ago, pilot Hana Khan saw her picture on an app that appeared to be auctioning scores of Muslim women in India.

The app was quickly taken down, no one was charged, and the issue shelved — until a similar app popped up on New Year’s Day.

Ms Khan was not on the new app, called Bulli Bai — a slur for Muslim women — which was hawking activists, journalists, an actor, politicians and Nobel Laureate Malala Yousafzai as maids.

Amid growing outrage, the app was taken down, and four suspects arrested this week.

The fake auctions that were shared widely on social media are just the latest examples of how technology is being used — often with ease, speed and little expense — to put women at risk through online abuse, theft of privacy or sexual exploitation.

For Muslim women in India, who are often abused online, it is an everyday risk, even as they use social media to call out hatred and discrimination against their minority community.

“When I saw my picture on the app, my world shook. I was upset and angry that someone could do this to me, and I became angrier as I realised this nameless person was getting away with it,” said Ms Khan.

She filed a police complaint against the first app, Sulli Deals, another pejorative term for Muslim women.

“This time, I felt so much dread and despair that it was happening again to my friends, to Muslim women like me. I don’t know how to make it stop,” Ms Khan, a commercial pilot in her 30s, told the Thomson Reuters Foundation.

Mumbai police said they were investigating whether the Bulli Bai app was “part of a larger conspiracy”.

A spokesperson for GitHub, which hosted both apps, said it had “longstanding policies against content and conduct involving harassment, discrimination, and inciting violence.

“We suspended a user account following the investigation of reports of such activity, all of which violate our policies.”

Advances in technology have heightened risks for women across the world, whether through trolling, doxxing that exposes their personal details, surveillance cameras, location tracking, or deepfake pornographic videos made from doctored images.

Deepfakes — synthetic media generated by artificial intelligence — are used to create porn, with apps that let users strip clothes off women or swap their faces into explicit videos.

Digital abuse of women is pervasive because “everybody has a device and a digital presence”, said Adam Dodge, chief executive of EndTAB, a United States-based nonprofit tackling tech-enabled abuse.

“The violence has become easier to perpetrate, as you can get at somebody anywhere in the world. The order of magnitude of harm is also greater because you can upload something and show it to the world in a matter of seconds,” he said.

“And there is a permanency to it because that photo or video exists forever online,” he added.

The emotional and psychological impact of such abuse is “just as excruciating” as physical abuse, with the effects compounded by the virality, public nature, and permanence of the content online, said Noelle Martin, an Australian activist.

At 17, Ms Martin discovered her image had been photoshopped into pornographic images and distributed. Her campaign against image-based abuse helped change the law in Australia.

But victims struggle to be heard, she said.

“There is a dangerous misconception that the harms of technology-facilitated abuse are not as real, serious, or potentially lethal as abuse with a physical element,” she said.

“For victims, this misconception makes speaking out, seeking support, and accessing justice much more difficult.”

Tracking lone creators and rogue coders is hard, and technology platforms tend to shield anonymous users who can easily create a fake email or social media profile.

Even lawmakers are not spared: in November, the United States’ House of Representatives censured Republican Paul Gosar for posting a photoshopped anime video that showed him killing Democrat Alexandria Ocasio-Cortez; he later retweeted the video.

“With any new technology we should immediatel­y be thinking about how and when it will be misused and weaponised to harm girls and women online,” said Mr Dodge.

“Technology platforms have created a very imbalanced atmosphere for victims of online abuse,” he said.

“And the traditiona­l ways of seeking help when we are harmed in the physical world are not as available when the abuse occurs online,” he added.

Some technology firms are taking action.

Following reports that its AirTags — locator devices that can be attached to keys and wallets — were being used to track women, Apple launched an app to help users shield their privacy.

In India, the women on the auction apps are still shaken.

Ismat Ara, a journalist showcased on Bulli Bai, called it “nothing short of online harassment”.

It was “violent, threatening and intending to create a feeling of fear and shame in my mind, as well as in the minds of women in general and the Muslim community,” Ms Ara said in a police complaint that she posted on social media.


Photo: REUTERS — Police detain a 20-year-old man on Thursday in Assam state for creating an app that shared pictures of Muslim women for a virtual ‘auction’.
