Fish Farmer

‘Fingerprinting’ fish is welfare tool of the future


NOT so long ago, trying to identify individual fish in a pen of hundreds of thousands of salmon might have been dismissed as unachievable.

But thanks to progress in machine learning and underwater camera technology, the industry not only understands the value of such an endeavour, but the science now exists to make it possible – one day.

Christian Schellewald of the Norwegian research institute Sintef explained how far scientists have come in the automatic individual characterisation of farmed salmon.

In the project INDISAL, funded by the Research Council of Norway, high-quality video data was collected from full-scale industrial fish cages with camera systems made by Sealab AS and analysed by advanced computer vision algorithms.

The aim of the study is to develop an individual biometric ‘fingerprint’ for each salmon, and to use the information gathered to improve animal welfare and productivity.

‘The images have to be of good quality when you collect them with underwater cameras, and when we process such real-world data one has to cope with a lot of varying conditions,’ said Schellewald.

‘The main idea is to record the fish in the cage and try first to identify parts of salmon. When we find a good visible head, which is suitable for detailed observation, then we have to track this candidate to analyse it further.’

The researchers tracked the best visible heads of the salmon in underwater fish cage video streams and analysed the relative motion of the fish mouth.

‘We made quite good progress by exploiting state-of-the-art machine learning methods,’ said Schellewald. ‘In addition to this, you need a very large amount of annotated data.’

As well as the head, other fish parts were selected to be annotated in the data, including the eyes, the whole fish, the mouth (jaws), the top fin (dorsal fin), and the tail fin (caudal fin).
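Such part annotations are typically stored in a structured label format. The snippet below is a minimal sketch of how the six part classes might be laid out in a COCO-style annotation file; the file name, IDs and bounding box are illustrative only and not taken from the INDISAL dataset.

```python
# Illustrative COCO-style layout for the salmon parts named in the article:
# head, eye, whole fish, mouth (jaws), dorsal fin (top fin), caudal fin (tail fin).
# The image name, IDs and bounding box below are made up for illustration.
annotations = {
    "categories": [
        {"id": 1, "name": "head"},
        {"id": 2, "name": "eye"},
        {"id": 3, "name": "whole_fish"},
        {"id": 4, "name": "mouth"},        # jaws
        {"id": 5, "name": "dorsal_fin"},   # top fin
        {"id": 6, "name": "caudal_fin"},   # tail fin
    ],
    "images": [
        {"id": 1, "file_name": "cage_frame_000001.jpg", "width": 1920, "height": 1080},
    ],
    "annotations": [
        # One labelled head, as [x, y, width, height] in pixels
        {"id": 1, "image_id": 1, "category_id": 1, "bbox": [640, 380, 210, 160]},
    ],
}
```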

‘We hung the camera in the fish cage and then checked if we had a recording that allowed us to go one step further,’ he said.

A deep neural network (a machine learning approach) was then trained with labelled data representing a large variety of scenes in order to work robustly in many different lighting conditions.
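The article does not name the network architecture used in INDISAL, but the general recipe of fine-tuning an off-the-shelf object detector on such labelled boxes looks roughly like the following sketch (PyTorch/torchvision, with seven classes: background plus the six annotated parts; the dummy frame and box stand in for real annotated video frames).

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Start from a detector pre-trained on COCO and swap in a new prediction head
# for 7 classes: background + the six annotated salmon parts.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=7)

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)

# One illustrative training step on a dummy frame; real training would loop
# over many annotated underwater frames covering varied lighting conditions.
model.train()
images = [torch.rand(3, 540, 960)]                          # one RGB frame
targets = [{
    "boxes": torch.tensor([[400.0, 250.0, 610.0, 410.0]]),  # [x1, y1, x2, y2]
    "labels": torch.tensor([1]),                            # 1 = "head"
}]
loss_dict = model(images, targets)
loss = sum(loss_dict.values())

optimizer.zero_grad()
loss.backward()
optimizer.step()
```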

After the training phase, real-time video streams or recordings can be analysed, and clearly detected fish heads (where the mouth and eye are also found) are tracked.
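The tracking method itself is not described in the article; one generic way to link detected head boxes across consecutive frames is a simple greedy intersection-over-union (IoU) association, sketched below purely as an illustration.

```python
from itertools import count

_next_id = count(1)  # hands out fresh track ids

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def update_tracks(tracks, detections, min_iou=0.3):
    """Greedily match this frame's detected head boxes to existing tracks.

    `tracks` maps track id -> last seen box; unmatched detections start new
    tracks, and tracks with no match this frame are dropped for simplicity.
    """
    updated = {}
    unmatched = list(detections)
    for track_id, last_box in tracks.items():
        if not unmatched:
            break
        best = max(unmatched, key=lambda box: iou(last_box, box))
        if iou(last_box, best) >= min_iou:
            updated[track_id] = best
            unmatched.remove(best)
    for box in unmatched:
        updated[next(_next_id)] = box
    return updated

# Toy example: the same head, slightly shifted between two consecutive frames,
# keeps its track id; a box appearing elsewhere would get a new id.
tracks = update_tracks({}, [(400, 250, 610, 410)])
tracks = update_tracks(tracks, [(410, 256, 622, 418)])
print(tracks)
```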

A subsequent computer vision-based analysis of the motion of the mouth allows the team to determine the ‘mouth opening frequency’.
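How the mouth motion is turned into a frequency is not spelled out; one straightforward approach, sketched below, treats the per-frame mouth opening of a tracked fish (for example, the pixel distance between the jaws) as a signal and counts its peaks over time. The frame rate and the synthetic signal are assumptions standing in for real measurements.

```python
import numpy as np
from scipy.signal import find_peaks

fps = 25.0                       # assumed camera frame rate
t = np.arange(0, 10, 1 / fps)    # 10 seconds of video, one sample per frame

# Synthetic per-frame mouth opening (in pixels) for one tracked fish: roughly
# 1.5 openings per second plus measurement noise. In practice this signal
# would come from the tracked mouth of a single salmon.
rng = np.random.default_rng(0)
opening = 5 + 4 * np.sin(2 * np.pi * 1.5 * t) + rng.normal(0, 0.5, t.size)

# Count each mouth opening as one peak and convert the count to a frequency.
peaks, _ = find_peaks(opening, distance=fps / 4, prominence=2.0)
mouth_opening_frequency = len(peaks) / t[-1]
print(f"mouth opening frequency: {mouth_opening_frequency:.2f} per second")
```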

Schellewald said while he does the measuring (of the mouth opening and closing), a biologist interprets the data and decides what it means for the welfare of the fish.

‘It’s not perfect yet but I think it’s a very good step towards an individual characteri­sation of salmon.

‘You have to extract the data in a very robust way; we wish to identify 200,000 fish in a cage and this means our biometric algorithms need very high precision.

‘We are exploring a huge bunch of algorithms and one step that is currently missing is field trials, where we can observe the same fish over a longer period of time. But we will probably have field experiments this autumn or in spring.’

They are half-way through the three-year project, which will hopefully provide video technology and algorithms good enough to automatically extract measurable welfare indicators.


Above: How the technology works
Left: Christian Schellewald
