Abstract

Cow behavior is a crucial indicator for monitoring health, reproductive status, and welfare in livestock management. However, methods that rely on wearable devices often face significant challenges, including high costs, maintenance difficulties, and potential impacts on animal welfare. To address these limitations, this study explored the potential of YOLOv8, a state-of-the-art computer vision model, for noninvasive monitoring of cow behavior. The research methodology involved four key steps: data collection, preliminary data processing, model training, and validation. The findings show that YOLOv8 can accurately detect and localize four key cow behaviors (lying, standing, eating, and ruminating), achieving a mean average precision of 0.778 at an intersection-over-union threshold of 0.5. Despite these promising results, the model's performance is notably affected by occlusion, which remains a primary challenge. Nevertheless, the outcomes indicate that YOLOv8 is a viable tool for recognizing cow behavior, offering a significant step forward in precision livestock farming and addressing the growing need for efficient, welfare-oriented livestock management practices.
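
The sketch below illustrates how a YOLOv8 detector of this kind could be fine-tuned and validated with the open-source Ultralytics library. The dataset configuration file, model variant, and training settings are illustrative assumptions for a four-class cow-behavior dataset, not details reported in the paper.

```python
# Minimal sketch: fine-tune and validate a YOLOv8 detector on a
# four-class cow-behavior dataset (lying, standing, eating, ruminating).
from ultralytics import YOLO

# "cow_behavior.yaml" is a hypothetical dataset config listing the
# train/val image folders and the four behavior classes.
model = YOLO("yolov8n.pt")  # COCO-pretrained weights as a starting point
model.train(data="cow_behavior.yaml", epochs=100, imgsz=640)

# Validation reports standard detection metrics, including mAP@0.5,
# the metric quoted in the abstract (0.778 for the authors' model).
metrics = model.val()
print(f"mAP@0.5: {metrics.box.map50:.3f}")

# Inference on a new barn image returns per-behavior bounding boxes
# with confidence scores.
results = model("barn_frame.jpg")
results[0].show()
```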

Original language: English
Pages (from-to): 190-197
Number of pages: 8
Journal: Turkish Journal of Veterinary and Animal Sciences
Volume: 48
Issue number: 5
DOIs
Publication status: Published - 2024

Keywords

  • Cow behavior
  • YOLOv8
  • eating
  • lying
  • ruminating
  • standing
