Hindustan Times (Lucknow)

Data governance needs a gender lens

- Mayank Mishra and Aparajita Bharti work at The Quantum Hub. The views expressed are personal

The rise of big data and machine learning has fuelled immense growth in powerful technologies and applications. But, simultaneously, the same technologies have become a privacy nightmare for their users. The algorithms behind these technologies amass huge amounts of data from individuals, which is then used (or sold to other firms) to target, persuade, reward, or penalise users. While privacy issues have been extensively debated, a discussion on how data governance laws might impact women differently from men and affect their agency on the internet has been mostly missing.

Women and men use digital technologies differently. According to a 2017 survey, women use social media (such as Facebook and Instagram) significantly more than men. At the same time, there appears to be a huge disparity in mobile ownership. The National Family Health Survey-5 (2019-20) indicates significant variation across states and Union Territories (UTs) in the percentage of women who own a mobile phone, with figures ranging from 49% in Gujarat and Andhra Pradesh to 91% in Goa. Lower phone penetration among women points to the shared use of mobile phones within Indian families, which, in turn, shapes women's behaviour on the internet.

Women also face a higher risk of reputational loss online. Between 2017 and 2018, cases of cyberstalking or bullying of women or children increased by 36%, while the conviction rate fell from 40% to 25%. Such incidents can damage victims' mental health through humiliation, diminished self-esteem, and social isolation. They also reinforce the perception of the internet as an unsafe place for women.

Given these sensitivities around women's data and its impact on their ability to use the internet, India's various data governance proposals currently under discussion must be evaluated through a gender lens.

For example, the proposed Personal Data Protection (PDP) Bill, 2019 imposes a blanket requirement for parental consent for processing the personal data of anyone below the age of 18 years. This effectively gives parents control over teenagers' access to any internet platform. While the protection of minors' data is indeed important, a blanket requirement such as this, coupled with the shared usage of mobile phones, may compromise the agency of teenage girls far more than that of boys, as families exert control over their usage. Most other countries set this age threshold at 13 years, recognising that teenagers use the internet to learn new skills, build new relationships and explore their identities.

Another example is the governance of non-personal data, a framework that will facilitate the use of aggregate data to build Artificial Intelligence (AI) that delivers better services to Indian citizens. The Kris Gopalakrishnan Committee on non-personal data has come out with two different frameworks to facilitate this sharing of data. However, a larger discussion around algorithmic biases against women has been missing. AI algorithms learn the patterns in their training datasets and use that learning for predictive analytics, among other things. There are multiple ways in which this could lead to discriminatory outcomes for women.

First, through the underlying bias in the training datasets: if an algorithm is trained on outcomes that are unfavourable to women, it will replicate that bias in its predictions. Second, if women are underrepresented in the training dataset (very likely, given the existing digital divide), the result will be products that are not designed for women, furthering the digital divide over time. For instance, if an automated speech recognition system is trained on a dataset with disproportionately fewer voice snippets of women speaking, it will make more errors when trying to comprehend women's voices. A policy on AI development is therefore perhaps even more urgent, and it needs guardrails to ensure that the underlying datasets are not biased.
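
To make the second point concrete, here is a minimal sketch in Python (a toy simulation: the groups, sample sizes and feature distributions are invented assumptions, not real speech or demographic data). A simple classifier is trained on data in which one group is heavily underrepresented and follows a slightly different pattern; tested on balanced data, it typically makes far more errors on the underrepresented group, which is the mechanism described above.

```python
# Toy illustration of representation bias; all numbers and group names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Each group's label depends on a slightly different decision boundary,
    # standing in for systematic differences (e.g. pitch in voice data).
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

# Training data: group A dominates, group B is underrepresented.
Xa, ya = make_group(5000, shift=0.0)
Xb, yb = make_group(250, shift=1.5)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb])
)

# Evaluate on balanced held-out sets for each group; the underrepresented
# group typically shows a much higher error rate.
for name, shift in [("group A (well represented)", 0.0),
                    ("group B (underrepresented)", 1.5)]:
    Xt, yt = make_group(2000, shift)
    print(f"{name}: error rate = {1.0 - model.score(Xt, yt):.3f}")
```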

Further, from a privacy perspective, the risk of identification by piecing together different sets of non-personal data is far higher for women than for men. For example, non-personal data from women's health apps, when combined with shopping data, may reveal their identities and their reproductive health issues.
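
A hypothetical sketch of how such linkage can happen (the datasets, columns and values below are invented for illustration): two datasets that each look "anonymous" on their own can, when joined on shared attributes such as age and pin code, tie a sensitive health attribute to an identifiable customer record.

```python
# Invented example of a linkage risk; no real data or real firms are involved.
import pandas as pd

# "Anonymised" export from a period-tracking app: no names, only coarse attributes.
health = pd.DataFrame({
    "age": [29, 34, 29],
    "pin_code": ["226001", "226010", "226022"],
    "cycle_irregularity_flag": [True, False, True],
})

# Loyalty-card shopping data from another firm, also without names,
# but tied to a customer ID the retailer can resolve to a person.
shopping = pd.DataFrame({
    "customer_id": ["C-101", "C-102", "C-103"],
    "age": [29, 34, 41],
    "pin_code": ["226001", "226010", "226001"],
})

# Joining on the shared quasi-identifiers (age + pin code) links a sensitive
# health attribute to an identifiable customer record.
linked = health.merge(shopping, on=["age", "pin_code"], how="inner")
print(linked)
```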

As India moves towards an increasingly digital society, how privacy and data governance laws may impact women's safety and agency on the internet should not come as an afterthought. We need to have these discussions front and centre, as these regulations can be a key building block of women's agency on the internet and their participation in the economy of the future. If we do not get this right, we risk deepening existing chasms in an increasingly digital world.

India's various data governance proposals that are currently under discussion must be evaluated through a gender lens. (Photo: Shutterstock)
