The New Zealand Herald

How your pet’s feeling

Reading animal emotions from expressions becoming possible

- Mirjam Guesgen is a Postdoctoral Fellow in Animal Welfare, University of Alberta.

Scientists are starting to be able to accurately read animal facial expressions and understand what they communicate.

Facial expressions project our internal emotions to the outside world. Reading other people’s faces comes naturally and automatically to most of us. Without your best friend saying a word, you know — by seeing the little wrinkles around her eyes, her rounded, raised cheeks and upturned lip corners — that she got that promotion she wanted.

What if we could just as easily read the faces of other living beings? Will there come a day when we can hold up a smartphone to our cat and know how he’s feeling?

Researchers are developing coding systems that enable them to objectively read animal facial expressions rather than inferring or guessing at their meaning. A coding system precisely describes how different facial features change when an animal feels a particular emotion, such as squinting an eye or pursing lips. By looking at photographs and scoring how much each of these features or “action units” change, we can determine how strongly an emotion is felt.
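As a rough illustration of how such scoring works — the action-unit names, rating scale and example values below are invented for this sketch, not taken from any published grimace scale — each feature is rated for how strongly it appears, and the ratings are combined into an overall score:

```python
# Toy sketch of action-unit scoring: each facial "action unit" is rated
# 0 (absent), 1 (moderately present) or 2 (obviously present), and the
# ratings are averaged into an overall intensity score for the emotion.
# The unit names here are hypothetical, not from a published scale.

ACTION_UNITS = ["orbital_tightening", "cheek_bulge", "ear_position", "mouth_tension"]

def grimace_score(ratings: dict) -> float:
    """Average the 0-2 ratings across action units (0 = no sign, 2 = strong)."""
    for unit, value in ratings.items():
        if unit not in ACTION_UNITS:
            raise ValueError(f"unknown action unit: {unit}")
        if value not in (0, 1, 2):
            raise ValueError(f"rating must be 0, 1 or 2, got {value}")
    return sum(ratings.values()) / len(ratings)

# A photo showing tightened eyes and a tense mouth, but normal cheeks and ears:
score = grimace_score({"orbital_tightening": 2, "cheek_bulge": 0,
                       "ear_position": 1, "mouth_tension": 2})
print(score)  # 1.25
```

A higher average means the emotion is more strongly expressed; real grimace scales work similarly, though the features and thresholds are validated experimentally for each species.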

So far, only pain coding systems (grimace scales) for non-primate animals have been scientifically developed. Despite their different anatomy, mice, rats, rabbits, horses and sheep (including lambs) all pull a similar pain-face. They tighten their eyes, bulge or flatten their cheeks, change the position of their ears and tense their mouths.

The push to develop grimace scales has largely come from our desire and ethical duty to assess and improve the welfare of animals used in labs or for food products.

Ideally, we want a way to accurately and reliably know how an animal is feeling by simply looking at them, rather than by drawing blood for tests or monitoring heart rates. By knowing their emotional states, we can help to reduce pain, boredom or fear and, ideally, foster curiosity or joy.

Animals, particularly social ones, may have evolved facial expressions for the same reason we did — to communicate with one another or, in the case of dogs, with us.

Particularly for prey animals, subtle cues that other members of their group (but not predators) can pick up on are useful for safety, for example. A pain behaviour cue may trigger help or comfort from other group members, or serve as a warning to stay away from the source of pain.

If we can decipher grimacing, we should also, theoretically, be able to understand facial expressions for other emotions such as joy or sadness. We would also likely want to comprehend facial expressions for the animals closest to our hearts: our pets.

One day, pet owners, farmhands or veterinarians could hold up a smartphone to a dog, sheep or cat and have an app tell them the specific emotion the animal is showing.

However, getting to an automated emotion-identification system requires many steps. The first is to define emotions in a testable, non-species-specific way.

The second is to gather descriptive baseline data about emotional expression in a controlled, experimental environment. One way to do this might be to put animals in situations that will elicit a particular emotion and see how their physiology, brain patterns, behaviour and faces change. Any changes would need to occur reliably enough that we could call them a facial expression.

We already have some hints to go on: Depressed horses close their eyes, even when not resting. Fearful cows lay their ears flat on their heads and open their eyes wide. Joyful rats have pinker ears that point more forward and outward.

Once we have gathered this data, we would then need to turn that scientific information into an automated, technological system. The system would have to be able to extract the key facial action units from an image and calculate how those features differ from a neutral baseline expression.

The system would also need to be able to deal with individual differences in facial features as well as subtle differences in how individuals express emotion. The process of feature extraction and calculation also becomes difficult or fails when a face is poorly lit, on an angle or partially covered.
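One very simplified way to picture the baseline-comparison step — every feature name, number and emotion template below is invented for illustration, not a real model — is to represent a face as a vector of measured intensities, subtract the animal's own neutral baseline (which also absorbs individual differences in facial features), and pick the nearest stored emotion pattern:

```python
# Toy sketch: represent a face as a vector of measured action-unit
# intensities, subtract this animal's own neutral baseline to account
# for individual differences, then choose the stored emotion template
# closest to the resulting deviation. All values are illustrative.

def deviation(face, baseline):
    """How far each measured feature is from this animal's neutral face."""
    return [f - b for f, b in zip(face, baseline)]

def classify(face, baseline, templates):
    """Return the name of the emotion template nearest the deviation."""
    d = deviation(face, baseline)
    def dist(name):
        return sum((a - b) ** 2 for a, b in zip(d, templates[name]))
    return min(templates, key=dist)

# Hypothetical features: [eye_openness, ear_forwardness, mouth_tension]
templates = {
    "neutral": [0.0, 0.0, 0.0],
    "pain":    [-0.5, -0.3, 0.6],   # squinted eyes, ears back, tense mouth
    "fear":    [0.7, -0.6, 0.2],    # wide-open eyes, flattened ears
}

this_animal_baseline = [0.5, 0.2, 0.1]   # its own neutral expression
observed = [1.1, -0.3, 0.3]              # wide eyes, ears pulled back

print(classify(observed, this_animal_baseline, templates))  # fear
```

Real systems would extract the feature vector from an image with computer vision rather than hand-entered numbers, which is exactly where poor lighting, odd angles or occlusion make the pipeline fail.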

While we are making progress in automated human facial expression identification, we are still a long way off when it comes to animals. A more realistic short-term goal would be to better understand which emotions non-human animals express and how. The answers could be staring us right in the face.

