Let’s take a moral stance on robots

By Dan Crimston

When psychologists talk about a “moral circle” they are referring to how far we extend our moral consideration towards others. That is, whether we care about the well-being of others, and act accordingly.

For most of us, the continuum of our moral circle is pretty straightforward: we include our loved ones, and we aren’t all that concerned about rocks or the villains of society. But the middle ground between the obvious ins and the obvious outs is not quite as clear-cut.

Moral circles are a surprisingly multifaceted and impressionable element of our moral cognition. Historical trends suggest they are expanding, meaning the future of our moral circles may be vastly different from today. Could they one day include robots?

The moral circle is an intuitive concept. We are concerned about the welfare of those inside our moral circle and feel a sense of moral obligation over how they are treated. Those on the outside can be subject to indifference at best, and horrific treatment at worst.

Therefore, our assessment of who is in and who is out is incredibly consequential, and we are confronted with the reality of these decisions every day. Do you feel an obligation to help a homeless person you pass? Are you concerned about the plight of refugees? Or the survival of the great apes?

These issues are frequently presented to us as direct tradeoffs. How we respond to these ethical challenges is in large part determined by the makeup of our moral circle.

Whether you include someone or something within your moral circle is more complicated than you may think. When pressed, you may be able to identify whether an entity is worthy of moral consideration, but can you explain why?

We tend to possess a larger moral circle if our moral instincts centre around the reduction of harm, rather than a priority for our in-group. People who identify with all humanity are likely to show greater concern for out-group members, while those who possess a sense of oneness with nature feel a strong moral obligation toward non-human animals and the environment.

Beyond individual differences, your moment-to-moment motivations have the power to manipulate your moral circle. For example, if you love animals, but you also love eating meat, in the moment you are about to tuck into a steak you are likely to deny the moral standing of animals.

Our perceptions of others are also crucial to their inclusion within the moral circle. First and foremost is the possession of a mind. Can they feel pain, pleasure or fear? If we perceive the answer is yes, we are far more likely to grant them moral inclusion.

Equally, if groups are dehumanised and perceived to lack fundamental human traits, or objectified and denied personhood, we are far less likely to include them within our moral circle.

Finally, our moral circles can be shaped by subtle cognitive forces beyond our conscious awareness. The simple cognitive switch of adopting an inclusion versus an exclusion mindset can have a substantial impact: looking for evidence that something is worthy of moral inclusion produces a smaller moral circle than looking for evidence that it is unworthy. Similarly, how an entity is framed can be of tremendous consequence.

History shows that humanity trends toward moral expansion. Time and again, generations consider the moral standing of entities beyond the scope of their ancestors.

In the coming years we will face yet another novel ethical challenge due to the inevitable rise of artificial intelligence. Should robots be granted moral inclusion?

Indeed, some are already beginning to ask these questions. Robots have been awarded citizenship status, and their perceived mistreatment can elicit an emotional response.

The estimation of robots as worthy of moral consideration could depend on whether they meet many of the criteria outlined above. Do we perceive them to feel pain, pleasure or fear? Are they framed as human-like or entirely artificial? Are we looking for evidence that they should be included in our moral circle, or evidence that they shouldn’t be? And do their needs conflict with our own? While this issue is guaranteed to be divisive, one cannot deny that it presents a fascinating ethical challenge for our species.

— The Conversation

Dan Crimston is a postdoctoral researcher in morality and social psychology at The University of Queensland.
