Weekend Herald

If we could talk to the animals: What a neat achievement that would be

- Michelle Dickinson

Language has been considered by many to be an exclusively human ability. Previously it was thought that other species merely communicated by instinct, not intention. However, new research using artificial intelligence to help translate animal vocalisations has found that some animals might be chatting to each other more than we thought. While many of us might think we know what our pet is telling us, the ability to talk to the animals is still a long way off.

Animal sounds involve a wide range of distinct patterns and frequencies, some of which can be challenging for human ears to differentiate. Microphones combined with software have proven much better at detecting and classifying these sounds, and can be used to collect large amounts of data over short periods of time.

This data can then be sorted and interpreted using artificial intelligence to help analyse thousands of sounds and detect subtle differences in relation to the animals' behaviour or environment. While initially needing a human to help it identify and classify, the strength of artificial intelligence, once the rules are set, is its ability to learn and apply those rules to other relevant situations.

Dr Michelle Dickinson, creator of Nanogirl, is a nanotechnologist who is passionate about getting Kiwis hooked on science and engineering. Tweet her your science questions @medickinson.

Recently, AI has been used to help identify key communication traits in a range of different social animals. After analysing thousands of calls, marmoset monkeys were found to have a vocabulary of 10 to 15 calls, described as twitters, trills, chirps and peeps, all with their own meaning.

In addition to sound, video footage has also been used to distinguish different facial expressions in animals. Sheep, for example, express pain through five main facial movements: they tighten their cheeks, narrow their eyes, fold their ears forwards, pull their lips down and back, and make a "V" shape with their normally "U"-shaped nostrils. By using artificial intelligence to analyse the degree of these changes on a sheep's face, the severity of its pain can now be assessed against a newly created scale known as the Sheep Pain Facial Expression Scale.

This research inspired a new study, the results of which have the potential to help with interpretation, and so improved welfare, for the animals crucial to our dairy industry: cows. A healthy and happy cow is more likely to produce more milk and therefore create more income for a business.

That can make a big difference. As New Zealand's largest export sector, dairy contributes 3.5 per cent of New Zealand's total GDP and plays an important role in our regional economic development.

New research published in the journal Scientific Reports followed 13 free-range Holstein-Friesian cows living in the same herd, and recorded the sounds that they made in both positive and negative contexts.

The study found every cow had its own distinct moo, which it used to make two types of sounds.

One was a low-frequency, nasalised sound made when a cow was standing close to the others it was communicating with, or when in a low-stress environment.

The other was a high-frequency oral call made when the cow was highly aroused, such as at feeding time, or when further away from the rest of the herd.

The software created in the study is being used to get a better idea of what the cows are telling each other when exposed to specific negative situations such as being separated from the rest of the herd or being denied food at feeding time, as well as specific positive ones.

Though there is still no direct animal-to-human translator, this body of research is bringing us one step closer to a universal understanding app. And, thanks to the power of AI, humans really are having to do (relatively) little.

