Smart system tracks individual farm animals

WUR researcher Suresh Neethirajan developed ChickTrack for monitoring individual farm animals.
ChickTrack can distinguish chickens from each other and track them individually as they move through the barn. This is done on the basis of the animal's dimensions: small differences in shape and size that a person hardly sees, if at all, but the computer does. Photo: Shutterstock

Technology for keeping an eye on individual farm animals is developing rapidly. WUR researcher Suresh Neethirajan is making great strides with ChickTrack.

Good farmers know their animals and can see whether a cow, chicken or pig is feeling well and happy. They might even know their animals by name, if they have names. ‘With a cow, for example, you only have to look at three things,’ explains Neethirajan. ‘The eyes, the ears and the mouth. How do the eyes look, and how much of the whites of the eyes can you see? Are the ears back or do they hang? What about the position of the mouth? But it takes training to see these things, and it is rather subjective.’

And what about a chicken farmer with a barn containing thousands of birds? How can he see which chickens are sick, weak, or feeling off-colour? This is the kind of farmer who stands to benefit from ChickTrack: a system using cameras and sensors that follow and monitor chickens individually, providing the farmer with the information he needs for managing the farm. The chicken’s wellbeing is the main focus, Neethirajan emphasizes. But the farmer will benefit too, of course. A lot of animals are lost in livestock farming. They may stay too small to be slaughtered for their meat, or they end up with bruises, fractures and wounds. And that is bad for both the animal’s welfare and the farmer’s bank balance.

Suresh Neethirajan got his inspiration for ChickTrack from microbiology. ‘I had two years of training in that field at the Oak Ridge National Laboratory in the USA. I understand how to analyse the social networks of micro-organisms. I wondered whether that could be upscaled to the macro level of chickens. How do they talk to each other? What is the pattern of their movements? What kinds of noises do they make when displaying certain behaviour? And how could we use technology to measure those welfare indicators?’

Heat map

ChickTrack works with video cameras that film from different corners of the barn. This footage is then analysed using specialized computer software (YOLO, You Only Look Once) and deep-learning technology. This enables ChickTrack to differentiate between chickens and track their movements around the barn individually. The recognition is based on the animal’s measurements: small differences in shape and size that the human eye barely detects, but the computer does. And all this takes place without touching the chicken.
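To give a feel for what this kind of detection and tracking looks like in code, here is a minimal sketch using the open-source Ultralytics implementation of YOLO. It is not the ChickTrack software; the model file, video name and confidence threshold are placeholders chosen for illustration.

```python
# Minimal sketch: detecting and tracking individual birds in barn footage
# with an off-the-shelf YOLO model. Not the ChickTrack code; the model file,
# video path and confidence threshold below are illustrative assumptions.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # in practice, a model fine-tuned on chickens would be needed

# stream=True yields one result per frame; persist=True keeps track IDs
# stable between frames, so each bird keeps the same identity over time.
for frame_result in model.track("barn_camera_01.mp4", stream=True, persist=True, conf=0.4):
    if frame_result.boxes.id is None:
        continue  # no tracked birds in this frame
    for track_id, box in zip(frame_result.boxes.id.tolist(),
                             frame_result.boxes.xyxy.tolist()):
        x1, y1, x2, y2 = box
        # Width and height of the bounding box are crude proxies for the
        # "shape and size" differences the article describes.
        print(f"bird {int(track_id)}: size {x2 - x1:.0f} x {y2 - y1:.0f} px")
```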

The algorithm doesn’t come with the ability to do this; it has to go to school first. Neethirajan: ‘That is the beauty of artificial intelligence. There are a variety of systems available these days, which work like a neural network and learn fast. Of course, it depends on the available computing power. But thanks to the fast developments in electronics, computing speeds are increasing enormously, while costs are decreasing.’

Besides the standard cameras, Neethirajan also used thermal cameras and microphones. ‘Microphones for recording the sounds birds make, and thermal infrared cameras for measuring body temperatures to see how that temperature changes through the day and in response to the animal’s welfare situation. We have a heat map of a chicken’s body. Situations that arouse anxiety, for instance, cause the temperature of a chicken’s comb and throat to change much faster than that of the rest of its body.’
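A simple way to picture how such a heat map could be used is sketched below: comparing the mean temperature of the comb region with that of the rest of the body in a calibrated thermal frame. The region coordinates, the fake data and the alert threshold are all assumptions for illustration, not values from the research.

```python
# Illustrative only: comparing comb temperature with body temperature in a
# thermal frame. Region coordinates would in practice come from detection.
import numpy as np

def region_mean(thermal_frame: np.ndarray, box: tuple[int, int, int, int]) -> float:
    """Mean temperature (deg C) inside a bounding box of a calibrated thermal image."""
    x1, y1, x2, y2 = box
    return float(thermal_frame[y1:y2, x1:x2].mean())

frame = np.random.normal(39.5, 0.5, size=(480, 640))  # fake calibrated frame, deg C
comb_temp = region_mean(frame, (300, 80, 340, 120))
body_temp = region_mean(frame, (250, 150, 400, 350))

# The article notes that comb and throat temperature react faster to stress
# than the rest of the body; a simple rule could flag large divergences.
if abs(comb_temp - body_temp) > 1.0:  # threshold is an assumption
    print("possible stress response: comb temperature deviates from body")
```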

The distribution of heat over the body is a good gauge of the chicken’s emotional state and its welfare, reckons Neethirajan. ‘In this context, you could see emotion as e-motion, with the “e” standing for energy. So, energy in motion: how the energy spreads around the body. That’s how I look at it.’

Besides eyes (cameras) and ears (microphones), there are also artificial noses (odour sensors) that register the chickens’ metabolisms. ‘They breathe out different mixtures of chemical substances, depending on the food they get. In theory we can measure that for each bird separately. We’re talking here about very small quantities, so the sensors have to be super-sensitive to pick them up.’

Fast decision-making

ChickTrack integrates all these signals to get a picture of the chickens’ welfare. Neethirajan: ‘Within a few seconds, the algorithm can make a decision and pass it on to the farmer. And the farmer can then adjust things like the light or humidity in the barn, or change the composition of the feed. Decisions with which you can influence the behaviour or emotional state of the animals.’
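What that integration step might look like, in very simplified form, is shown in the sketch below: per-bird signals from the cameras, thermal imaging and microphones are combined into a short piece of advice for the farmer. The field names, thresholds and advice texts are hypothetical and not ChickTrack's actual decision logic.

```python
# A deliberately simple, hypothetical sketch of the "integration" step:
# combining per-bird signals into an alert the farmer can act on.
from dataclasses import dataclass

@dataclass
class BirdSnapshot:
    bird_id: int
    activity: float         # e.g. metres moved in the last hour (from camera tracking)
    comb_temp_delta: float   # comb temperature minus body temperature, deg C
    distress_calls: int      # distress-like vocalisations in the last hour

def welfare_alert(s: BirdSnapshot) -> str | None:
    """Return a short advice string if this bird needs attention, else None."""
    if s.activity < 1.0 and s.distress_calls > 5:
        return f"bird {s.bird_id}: inactive and vocalising - check for injury or illness"
    if s.comb_temp_delta > 1.5:
        return f"bird {s.bird_id}: heat-stress pattern - consider adjusting ventilation"
    return None

for snap in [BirdSnapshot(12, 0.4, 0.2, 9), BirdSnapshot(37, 8.1, 1.8, 0)]:
    message = welfare_alert(snap)
    if message:
        print(message)
```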

‘I started out as a bio-engineer,’ says Neethirajan. ‘When I see a problem, I think about a solution. If you want to improve welfare in the poultry sector, you need to detect animal diseases early. Small differences in behaviour can be indicators of major welfare issues. But this is not just about measuring the negative indicators of welfare. Positive indicators are important too. Are there signals we are not yet aware of, for example? And can we use them to improve welfare through technology?’

According to Neethirajan, we are living in a ‘VUCA’ world that is volatile, uncertain, complex and ambiguous. ‘We never know what’s going to happen next. There are so many uncertainties, and everything is so dynamic. Artificial intelligence and big data enable us to think up possible solutions to problems that are not yet fully visible to the human eye.’

Happy cow
With ChickTrack still under development, Suresh Neethirajan has already embarked on his next eye-catching project: reading the emotions on the faces of cows and pigs. ‘The idea is to understand the animals’ emotional make-up using video footage of their faces,’ he explains. To this end, a software program is being developed that can analyse the images. During the study, the animals are also fitted with skin sensors that measure vital biomedical signals. Neethirajan: ‘We use a kind of sticker with sensors that measure the animals’ heartbeat, breathing rate and activity. It’s a question of non-invasive measurements: we won’t be taking any blood or hair samples. The sticker only weighs a few grams and sits on the skin like a tattoo. The signals are sent to a base station in real time.’ The project is called Solaria. Its overarching goal is to create tools that will help farmers improve the welfare of their animals.

Strava
Keeping track of chickens is popular. PhD student Malou van der Sluis is finishing off her study in which she tracks individual chickens via a sensor on their leg. The chickens are kept in a run with special flooring fitted with antennae. The system works, says Van der Sluis. ‘The idea was to find out whether you can monitor the activities of individual chickens this way. And you can: it works on a small scale in a research setting. It’s still a big step to apply it in a real-life setting, though.’ Follow-up research is also needed to correlate the recorded activity of chickens with their health and welfare. Van der Sluis: ‘More activity seems to go together with a good life for the chicken, but the relationships between these things are not crystal-clear.’
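As a rough illustration of how such leg-tag data could be turned into an activity score, the sketch below counts how often each tagged bird is picked up by a different floor antenna. The data format and the position-change metric are my own assumptions, not the method used in the study.

```python
# Sketch of turning antenna detections into a crude per-bird activity score.
# The (timestamp, tag, antenna) format and the sample data are illustrative.
from collections import defaultdict

detections = [  # (timestamp_s, bird_tag, antenna_id)
    (0, "A17", 3), (30, "A17", 3), (60, "A17", 5), (90, "A17", 8),
    (0, "B02", 1), (30, "B02", 1), (60, "B02", 1), (90, "B02", 1),
]

moves = defaultdict(int)
last_antenna = {}
for _, tag, antenna in sorted(detections):
    if tag in last_antenna and last_antenna[tag] != antenna:
        moves[tag] += 1  # the bird was registered by a different floor antenna
    last_antenna[tag] = antenna

for tag, count in moves.items():
    print(f"{tag}: changed position {count} times")  # crude activity score
```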
