Hindustan Times (Chandigarh)

Does artificial intelligence have a gender?

- Deborah Richards

Artificial intelligence is using advances to help medical professionals detect and treat cancers; emergency responders predict and prepare for impending natural disasters; police identify criminals and safely disarm bombs; organisations improve products, services and processes; and school children receive tailored help from virtual teachers suited to their learning styles. Through the use of robots and software agents, the machine may even perform these tasks alone, or as a team member collaborating with humans. If we are going to build machines that play roles simulating human reasoning, behaviour and activities, we as a society should ensure that those machines benefit all members of society, regardless of their age, gender, religion or status, rather than replicate human biases, perpetuate disparities or widen the gap between the haves and have-nots.

If AI is a simulation of human intelligence, who does it simulate and does it have a gender? Whether you view gender as socially constructed by one's environment and culture, as a biologically determined factor as in the essentialist perspective, or adhere to the theory of individual differences, gender plays a role in who we are.

All too often it affects how we are perceived and what we can do. That can vary from the opportunities to pursue a certain career to whether our car navigation system recognises or ignores our voice commands.

In my area of AI research, female avatars are most commonly used to play virtual assistants and companions. This perpetuates the perspective that helping roles are best performed by women.

These characters are friendly and empathic, but also submissive, and there are no negative consequences for users who ignore or even verbally abuse them. More often, though, AI represents males.

I recall, earlier this year at a Digital Health conference in Melbourne, a medical specialist confessing that 25 years ago, when he was a rural GP, he misdiagnosed a female patient, which nearly cost her life, because he had never seen that condition in a female.

The dataset he was operating from, his experience, was biased. Similarly, the bias within AI is due to the inherent bias in our world. It exists in the expertise we capture in knowledge-based systems, in the datasets from which we develop predictive models, and in the software and hardware designed for and tested by (mostly) men, who naturally operate from their own experiences and needs. To make matters worse, because the AI is doing the task, the bias becomes more hidden, particularly in methods like deep learning that are difficult for humans to interpret or understand.

For AI technology to meet the needs of both men and women, both genders should be the target of innovations, involved in the design of these systems, and represented in datasets and evaluations. For example, we need to avoid unconscious bias in deciding what features to include or exclude in the training of predictive models. But how can we deliver inclusive solutions given current gender gaps?
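The rural GP's story above can be made concrete with a deliberately tiny sketch. The scenario, the "symptom score" feature and all numbers below are hypothetical, invented purely for illustration: a trivial model learns a diagnostic threshold from its training records, and because those records contain only male patients, it misses the same condition when it presents differently in a female patient. The flaw lies in the dataset, not in the code.

```python
# Hypothetical illustration (not from the article): a diagnostic
# "model" that learns a decision threshold from its training data.
# If the dataset contains only male patients, the learned threshold
# can miss how the same condition presents in female patients.

def train_threshold(records):
    """Learn a threshold as the midpoint between the mean symptom
    scores of sick and healthy patients in the training records."""
    sick = [r["score"] for r in records if r["sick"]]
    healthy = [r["score"] for r in records if not r["sick"]]
    return (sum(sick) / len(sick) + sum(healthy) / len(healthy)) / 2

# Training data drawn only from male patients (invented numbers).
male_records = [
    {"score": 8.0, "sick": True}, {"score": 9.0, "sick": True},
    {"score": 2.0, "sick": False}, {"score": 3.0, "sick": False},
]
threshold = train_threshold(male_records)  # midpoint of 8.5 and 2.5 = 5.5

# A female patient whose condition presents with a lower score is
# classified as healthy: the bias of the dataset, not of the algorithm.
female_patient_score = 5.0
print(female_patient_score > threshold)  # False: the condition is missed
```

The fix is not a cleverer threshold but a representative dataset; the same logic, trained on records that include female presentations, would learn a different boundary.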

Globally, women are underrepresented in engineering and information technology classrooms and workplaces, with representation around 30% in India and significantly lower in other countries; as a result, products and technology are mostly designed with men in mind. In AI research, that percentage is closer to 10%, as I observed in 2018 at the joint AI conference held in Stockholm with thousands of delegates, where another woman and I had the rare experience of walking straight into a toilet cubicle following the keynote speeches and watching, with some amusement, the long and winding queue emanating from the men's bathroom.

Governments, universities, industry and wider society need to work together to develop ethical frameworks that harness the benefits of AI without ignoring concerns such as cognitive degeneration, threats to autonomy, accountability, privacy, security, discrimination, societal implications and economic impacts.

Five key principles found across existing frameworks mandate that AI technology should benefit the common good (beneficence); do no harm (non-maleficence); maintain human agency (autonomy); promote diversity and fairness (justice); and ensure accountability, responsibility and transparency (explicability) with respect to the other principles. Particularly relevant to the gender question, the principle of justice aims to eliminate discrimination, minimise data bias and promote shared benefits.

So back to our question. Does AI represent or favour a particular gender? Yes, currently it mirrors our world dominated by data, decisions and designs for and by males.

Explainable AI, AI that can explain its goals, beliefs, reasoning and knowledge boundaries, provides a fresh opportunity to make this bias transparent. Bringing the female voice to AI is another key solution, achievable through initiatives such as the women in STEM programs at Macquarie University in Sydney, Australia. With a commitment to follow ethical principles, together we can build AI that exposes bias and does not discriminate based on gender; in so doing, artificial intelligence can transform human intelligence and our society.


Istockphoto: All stakeholders must come together to work on AI
