ChickTrack - A Quantitative Tracking Tool for Measuring Chicken Activity

The automatic detection, counting, and tracking of individual and flocked chickens in the poultry industry is of paramount importance for enhancing farming productivity and animal welfare. Due to methodological difficulties, such as the complex background of images, varying lighting conditions, and occlusions from, e.g., feeding stations, water nipple stations, and barriers on the chicken rearing production floor, automatically recognizing and tracking birds with computer software is a challenging task. Here, a deep learning model based on You Only Look Once (YOLOv5) is proposed for detecting domesticated chickens in videos with varying complex backgrounds. A multiscale feature is adapted to the YOLOv5 network for mapping modules in the counting and tracking of the trajectories of the chickens. The YOLOv5 network was trained and tested on our dataset, which resulted in enhanced tracking precision. Using a Kalman filter, the proposed model was able to track multiple chickens simultaneously, with the focus on associating individual chickens across video frames for real-time and online applications. By detecting chickens amid diverse background interference, counting them precisely, tracking their movement, and measuring their travelled path and direction, the proposed model provides excellent performance for on-farm applications. Artificial-intelligence-enabled automatic measurement of chicken behavior on-farm using cameras offers continuous monitoring of the chickens' ability to perch, walk, and interact with other birds and the farm environment, as well as assessment of dustbathing, thigmotaxis, and foraging frequency, which are important indicators of their ability to express natural behaviors.
This study highlights the potential of automated monitoring of poultry through the use of the ChickTrack model as a digital tool for enabling science-based animal husbandry practices, thereby promoting positive welfare for chickens in animal farming.


Introduction
positive experiences for the animals [4]. However, such traits are challenging to record objectively, efficiently, and in a timely manner on farms containing thousands of individual animals.

To support increasing agricultural demand and to ensure biosecurity adherence and operational efficiency for the animals, the market for farm video surveillance systems is expected to grow to US $3.6 billion by 2027 [5]. Remote monitoring, physiological and behavioral phenotyping data collection

chicks that are immediately killed after hatching in the egg industry, which is simply the system's design as they are "useless" for egg production, on an annual basis before they are processed for meat [3]. In the poultry meat industry, chickens are often rejected at the slaughterhouse due to insufficient meat quality and bruises, skin injuries, fractures, or other lesions on the chicken bodies. This loss of life is of significant concern for animal welfare, agricultural efficiency, and economic impact [3]. Positive, negative, and neutral chicken welfare indicators based on video and image analysis can be derived from early-life stress due to mother-chick separation, very high stocking density, poor air circulation, poor hygiene leading to respiratory issues, foot injuries due to ammonia build-up on the ground, poor housing environments, the absence of positive or rewarding stimuli (playful behavior), behavioral problems such as pecking or cannibalism, chronic stress, stress peaks before slaughter, suffering when the slaughtering method is inefficient, unnatural lighting conditions, and others. The link between poultry health and poultry product quality emanates from the human risk of disease if the animals have been

In the poultry sector, machine-vision-focused research has developed tools for behavioral detection based on the quantification of brightness patterns within two-dimensional video [22]. However, there is a need for models and tools that allow multiple chickens to be detected and monitored.

The best way to prevent missing animals or poultry bodily features due to occlusions and

classification in avian species has only recently been growing (Table 1). However, no research has yet been published on the detection, counting, and tracking of chickens under occlusion conditions, nor using the You Only Look Once software (YOLOv5). Tracking of chicken movement is achieved by taking an initial set of chicken shape and contour detections, creating a unique ID based on the coordinates in the image (frames from videos) for each of the initial detections, and then tracking the chickens as they move around frames in the video, continuing the ID assigned.

Table 1 were relatively small, and although the accuracy was shown to be high, the performance accuracy would be insufficient in real time outside of controlled conditions.

There is no classification involved in this study, as the goal is to detect the chickens, count and track

Table 2.

and validation dataset is shown in Figure 3.
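The ID-assignment step described above can be sketched as follows. This is a minimal illustration only, not the actual ChickTrack implementation; the `ChickenRegistry` and `register` names are hypothetical.

```python
import itertools

class ChickenRegistry:
    """Minimal sketch: assign a unique ID to each initial detection,
    keyed by the centroid of its bounding box (x1, y1, x2, y2)."""

    def __init__(self):
        self._ids = itertools.count()   # monotonically increasing IDs
        self.tracks = {}                # id -> last known centroid

    @staticmethod
    def centroid(box):
        x1, y1, x2, y2 = box
        return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

    def register(self, box):
        """Create a new track for a newly detected chicken."""
        new_id = next(self._ids)
        self.tracks[new_id] = self.centroid(box)
        return new_id

registry = ChickenRegistry()
first_frame_boxes = [(10, 10, 50, 60), (200, 120, 260, 180)]
ids = [registry.register(b) for b in first_frame_boxes]
print(ids)                 # [0, 1]
print(registry.tracks[0])  # (30.0, 35.0)
```

In later frames the registry's stored centroids would be compared against new detections so that an existing ID is carried forward rather than re-registered.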

Detecting and tracking poultry using optical flow and video-based automatic assessment is a challenging task, the outcome of which is to create meaningful insight for intervention or

Optical flow has been demonstrated as a way to identify vehicles for driver assistance systems

in frame 1. In the subsequent frames, the same ID of that chicken was carried forward. As the frames change, if a new chicken appears, the old ID is dropped and a new ID is assigned.
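As a much-simplified illustration of frame-to-frame motion estimation (a single global translation, not the dense optical flow discussed here), phase correlation can recover the dominant shift between two grayscale frames. The function name and parameters below are illustrative.

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the dominant (dy, dx) translation of frame_b relative
    to frame_a via phase correlation (normalized FFT cross-correlation)."""
    F = np.fft.fft2(frame_a)
    G = np.fft.fft2(frame_b)
    cross = np.conj(F) * G
    cross /= np.abs(cross) + 1e-12       # keep phase information only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = frame_a.shape
    if dy > h // 2:                      # wrap large shifts to negatives
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

rng = np.random.default_rng(0)
frame = rng.random((32, 32))
moved = np.roll(frame, (2, 3), axis=(0, 1))  # shift down 2, right 3
print(estimate_shift(frame, moved))          # (2, 3)
```

Dense optical flow, as used in the literature cited above, instead produces a per-pixel motion field; this global estimate only conveys the underlying idea of correlating successive frames.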

Hence, tracking a chicken becomes challenging because the bird may appear or disappear between frames of the video, or occlusions may hide the bird in later frames.

This challenge was overcome by frame-to-frame centroid assessment, in which the distance from the previous centroid is calculated.
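The frame-to-frame centroid association can be sketched as a greedy nearest-centroid match. This is a minimal sketch, not the exact ChickTrack logic; the `max_dist` threshold value is an assumed illustrative parameter.

```python
import math

def match_centroids(prev_tracks, new_centroids, max_dist=50.0):
    """Greedily associate new centroids with existing track IDs by
    smallest Euclidean distance; unmatched centroids get fresh IDs.

    prev_tracks: dict id -> (x, y) centroid from the previous frame
    new_centroids: list of (x, y) centroids in the current frame
    """
    next_id = max(prev_tracks, default=-1) + 1
    assigned, free_ids = {}, set(prev_tracks)
    for c in new_centroids:
        best = min(free_ids,
                   key=lambda i: math.dist(prev_tracks[i], c),
                   default=None)
        if best is not None and math.dist(prev_tracks[best], c) <= max_dist:
            assigned[best] = c          # carry the existing ID forward
            free_ids.discard(best)
        else:
            assigned[next_id] = c       # new chicken enters the frame
            next_id += 1
    return assigned

prev = {0: (30.0, 35.0), 1: (230.0, 150.0)}
now = [(33.0, 37.0), (400.0, 90.0)]     # track 0 moved; a new bird appears
print(match_centroids(prev, now))
# {0: (33.0, 37.0), 2: (400.0, 90.0)}
```

Note that track 1 receives no match and is dropped, while the distant new centroid is given a fresh ID, mirroring the appear/disappear handling described above.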

The Kalman filter models the future position and velocity using Gaussians. By using probability, the Kalman filter assigns the measurement to its prediction and updates itself.

Table 3 shows the overlap success rate at a threshold of 0.5 for the three videos analyzed.

The installation height of the camera in the poultry barn, and the multiple viewpoints and angles used in capturing the bird images, will lead to specific characteristics of the chicken and its movement with

Supplementary material, namely sample videos S1 to S6 and the YOLOv5 model, is available for download.
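The predict/update cycle described above can be illustrated with a one-dimensional constant-velocity Kalman filter. This is a generic textbook sketch, not the filter configuration used in ChickTrack; the noise matrices and frame step are assumed values for illustration.

```python
import numpy as np

# State: [position, velocity]; constant-velocity motion model.
dt = 1.0                                  # one frame step (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition
H = np.array([[1.0, 0.0]])                # we only measure position
Q = np.eye(2) * 0.01                      # process noise (assumed)
R = np.array([[1.0]])                     # measurement noise (assumed)

x = np.array([[0.0], [0.0]])              # initial state estimate
P = np.eye(2) * 500.0                     # large initial uncertainty

def predict(x, P):
    """Propagate the Gaussian state estimate one frame ahead."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Fuse a position measurement z with the prediction."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Feed noiseless position measurements of a bird moving 2 px per frame.
for z in [2.0, 4.0, 6.0, 8.0]:
    x, P = predict(x, P)
    x, P = update(x, P, np.array([[z]]))

print(float(x[0, 0]), float(x[1, 0]))  # close to position 8, velocity 2
```

Because position and velocity are both observable from successive position measurements, the estimate converges to the true motion within a few frames, which is what lets the tracker predict where an occluded bird should reappear.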

The author, being the sole contributor, declares that there are no known competing financial interests or personal relationships that could have appeared to influence the work reported in this article.