
A machine vision system to detect and count laying hens in battery cages

Published online by Cambridge University Press:  14 July 2020

O. Geffen
Affiliation:
Precision Livestock Farming (PLF) Lab, Institute of Agricultural Engineering, Agricultural Research Organization (A.R.O.) – The Volcani Center, 68 Hamaccabim Road, P.O.B 15159, Rishon Lezion 7505101, Israel; Electro Optical Engineering Department, School of Electrical and Computer Engineering, Ben-Gurion University of the Negev, 1 Ben Gurion Avenue, P.O.B 653, Be’er Sheva 8410501, Israel; Animal Science Institute, Agricultural Research Organization (A.R.O.) – The Volcani Center, 68 Hamaccabim Road, P.O.B 7505101, Rishon Lezion 7505101, Israel
Y. Yitzhaky
Affiliation:
Electro Optical Engineering Department, School of Electrical and Computer Engineering, Ben-Gurion University of the Negev, 1 Ben Gurion Avenue, P.O.B 653, Be’er Sheva 8410501, Israel
N. Barchilon
Affiliation:
Precision Livestock Farming (PLF) Lab, Institute of Agricultural Engineering, Agricultural Research Organization (A.R.O.) – The Volcani Center, 68 Hamaccabim Road, P.O.B 15159, Rishon Lezion 7505101, Israel; Animal Science Institute, Agricultural Research Organization (A.R.O.) – The Volcani Center, 68 Hamaccabim Road, P.O.B 7505101, Rishon Lezion 7505101, Israel
S. Druyan
Affiliation:
Animal Science Institute, Agricultural Research Organization (A.R.O.) – The Volcani Center, 68 Hamaccabim Road, P.O.B 7505101, Rishon Lezion 7505101, Israel
I. Halachmi*
Affiliation:
Precision Livestock Farming (PLF) Lab, Institute of Agricultural Engineering, Agricultural Research Organization (A.R.O.) – The Volcani Center, 68 Hamaccabim Road, P.O.B 15159, Rishon Lezion 7505101, Israel

Abstract

Manually counting hens in battery cages on large commercial poultry farms is a challenging task: it is time-consuming and often inaccurate. The aim of this study was therefore to develop a machine vision system that automatically counts the number of hens in battery cages. Automatic counting can help a regulatory agency or inspecting officer estimate the number of living birds in a cage, and thus the animal density, to ensure conformity with government regulations or quality-certification requirements. The test hen house was 87 m long and contained 37 battery cages stacked in rows six tiers high on both sides of the structure. Each cage housed 18 to 30 hens, for a total of approximately 11 000 laying hens. A feeder moves along the cages, and a camera was installed on an arm connected to the feeder that was developed specifically for this purpose. A wide-angle lens was used so that an entire cage fit within the field of view. Detection and tracking algorithms were designed to detect hens in cages: the recorded videos were first processed with a convolutional neural network (CNN) object-detection algorithm, Faster R-CNN, whose input consisted of multi-angular, view-shifted images. After the initial detection, each hen’s relative location along the feeder was tracked and stored by a tracking algorithm, and information was accumulated with every additional frame as the camera arm moved along the cages. The algorithm’s count was compared with that of a human observer (the ‘gold standard’). On a validation dataset of about 2000 images, the system achieved 89.6% accuracy at the cage level, with a mean absolute error of 2.5 hens per cage. These results indicate that the model developed in this study is practicable for obtaining fairly good estimates of the number of laying hens in battery cages.
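
To make the pipeline outlined in the abstract concrete, the following is a minimal sketch (not the authors' code) of its three stages: per-frame hen detection with a Faster R-CNN detector, accumulation of unique detections along the feeder axis as the camera arm advances, and evaluation against a manual count. The pre-trained COCO weights, the score threshold, the position-matching tolerance and the relative-error definition of accuracy are illustrative assumptions; the study trained its own detector on multi-angular, view-shifted images and defines its own metrics.

    # Minimal sketch of the counting pipeline described in the abstract.
    # Assumptions (not from the paper): a generic torchvision Faster R-CNN with
    # COCO weights stands in for the study's fine-tuned hen detector, and a simple
    # nearest-position rule stands in for the study's tracking algorithm.
    import torch
    import torchvision
    from torchvision.transforms.functional import to_tensor

    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
    model.eval()

    BIRD_CLASS = 16  # COCO class index for 'bird'; stand-in for a trained hen class.

    def detect_hens(frame, score_threshold=0.7):
        """Return the horizontal centres of confident detections in one frame."""
        with torch.no_grad():
            output = model([to_tensor(frame)])[0]
        keep = (output["scores"] > score_threshold) & (output["labels"] == BIRD_CLASS)
        return [((box[0] + box[2]) / 2).item() for box in output["boxes"][keep]]

    def count_hens_in_cage(frames, shift_per_frame, match_tolerance=30.0):
        """Count unique hens while the camera arm moves along the cage.

        Detections from each frame are shifted into a cage-fixed coordinate
        system using the known camera displacement per frame; a detection is
        treated as a new hen only if no previously tracked position lies
        within the matching tolerance (in pixels).
        """
        tracked = []
        for i, frame in enumerate(frames):
            offset = i * shift_per_frame
            for x in detect_hens(frame):
                cage_x = x + offset
                if all(abs(cage_x - p) > match_tolerance for p in tracked):
                    tracked.append(cage_x)
        return len(tracked)

    def evaluate(predicted, manual):
        """Mean absolute error in hens per cage, and cage-level accuracy
        computed as 1 minus the mean relative error (one plausible definition;
        the paper's exact metric may differ)."""
        mae = sum(abs(p - m) for p, m in zip(predicted, manual)) / len(manual)
        accuracy = 1.0 - sum(abs(p - m) / m for p, m in zip(predicted, manual)) / len(manual)
        return accuracy, mae

Under this relative-error definition, counting errors of roughly 2.5 hens on cages holding 18 to 30 hens correspond to about 90% accuracy, which is consistent with the reported 89.6%, although the paper's precise accuracy definition may differ from this sketch.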

Type
Research Article
Copyright
© The Author(s), 2020. Published by Cambridge University Press on behalf of The Animal Consortium


Supplementary material

Geffen et al. Supplementary Materials 1 (File, 14.7 KB)

Geffen et al. Supplementary Materials 2 (Video, 1.5 MB)