bioinfoUQAM/cow-detection-and-identification-model
Cow Detection and Identification Using YOLOv8

Project Overview

This project involves training a computer vision model to detect and identify individual cows in a tie-stall barn, under the McGill-UQAM Research and Innovation Chair in Animal Welfare and Artificial Intelligence, in the bioinformatics lab at the University of Quebec at Montreal (UQAM). The project consists of two main parts: training a cow detection model, then using this model to train an individual cow identification model.

Part 1: Training the Cow Detection Model

Step 1: Dataset Generation

We generate a dataset from video files taken at the barn with different views: center, front, and rear, each with three orientations: center, left, and right. Frames are extracted from these videos and labeled to create train, validation, and test datasets. Data augmentation techniques are used to enhance the dataset. This task is performed using Roboflow.
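The README does not show how frames are pulled from the videos before labeling; as a minimal sketch, frames can be extracted locally with ffmpeg and then uploaded to Roboflow (the file names and the one-frame-per-second rate below are illustrative assumptions, not the repo's actual settings).

```shell
# Extract one frame per second from a barn video for labeling
# (center_left.MP4 and the output pattern are illustrative names)
mkdir -p frames
ffmpeg -i center_left.MP4 -vf fps=1 frames/center_left_%04d.jpg
```

Repeating this per camera view and orientation yields the raw images that Roboflow then splits and augments.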

Step 2: Training the Detection Model

The dataset is used to train the cow detection model with Ultralytics YOLOv8 for 50 epochs at a 640-pixel image size.

# Start training from a pretrained *.pt model on Apple Silicon (MPS)
yolo detect train data=<path/to/data.yaml> model=yolov8n.pt epochs=50 imgsz=640 device=mps

Step 3: Model Validation

The trained model (best.pt) is validated on the validation split, reporting metrics such as mAP, precision, and recall, along with the confusion matrix and confidence curves.

yolo detect val model=<path/to/best.pt> data=<path/to/data.yaml>

Step 4: Inference

We perform inference to detect the cow of interest in the barn, achieving 85% to 95% accuracy on the initial videos.

yolo predict model=path/to/best.pt source="path/to/*.MP4"

Step 5: Model Export

The trained model is exported to ONNX format for deployment across various platforms and devices.

yolo export model=path/to/best.pt format=onnx  # export custom trained model

Step 6: Cow Tracking

The custom-trained model is used to track the cow of interest in the videos, maintaining the same 85% to 95% detection accuracy.

yolo track model=path/to/best.pt source="path/to/*.MP4" tracker="bytetrack.yaml"

Step 7: Cropping Tracked Frames

During tracking, crops of the cow's positions in the videos are saved, with outputs organized by camera orientation.

yolo task=detect mode=predict model=path/to/best.pt source="path/to/*.mp4" save=True save_txt=True save_crop=True hide_labels=True hide_conf=True conf=0.8

Step 8: Generating XML Labels

LabelImg is used to generate Pascal VOC XML annotation files for each image in the train, validation, and test datasets, for later use in training the cow identification model.

pip3 install lxml                          # LabelImg dependency
pyrcc5 -o libs/resources.py resources.qrc  # build the Qt resource file
python3 labelImg.py                        # launch the annotation tool

Part 2: Training the Individual Cow Identification Model

Step 1: Model Evaluation on Unknown Data

The custom cow detection model is evaluated on new, unseen data to assess its performance with unknown cows.
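The README does not record the command used for this evaluation; one minimal sketch with the Ultralytics CLI is to run validation against a separate dataset config that points at the labeled frames of the unseen cows (`unseen_data.yaml` is a hypothetical file name, not one from the repo).

```shell
# Validate the custom detector on previously unseen cows
# (unseen_data.yaml is an assumed config pointing at the new frames)
yolo detect val model=<path/to/best.pt> data=<path/to/unseen_data.yaml> split=test
```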

Step 2: Dataset Generation for Identification

A larger dataset of individual cows is created by saving crops of each tracked cow as images, labeled with the cow's name. This is done by processing videos of each cow through the detection model.

yolo task=detect mode=predict model="path/to/best.pt" source="path/to/*.MP4" save=True vid_stride=10 conf=0.8 save_txt=True save_crop=True hide_labels=True hide_conf=True

Step 3: Training the Identification Model

The new dataset is used to train the cow identification model using YOLOv8.
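The README leaves the exact training command unspecified. Since the crops are saved under each cow's name, one plausible setup (a sketch under that assumption, not the repo's confirmed command) is to frame identification as YOLOv8 image classification over per-cow folders:

```shell
# Train an identification model on cropped images organized as
# dataset/train/<cow_name>/*.jpg and dataset/val/<cow_name>/*.jpg
yolo classify train data=<path/to/identification_dataset> model=yolov8n-cls.pt epochs=50 imgsz=224
```

Here each cow name becomes a class, so the classifier's output directly names the individual detected by the Part 1 model.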


Conclusion

This project successfully develops a robust cow detection and identification system using state-of-the-art computer vision techniques. The models trained in this project can significantly enhance the monitoring and management of cows in a barn environment, contributing to improved animal welfare.

Acknowledgments

This work is part of the research activities at the McGill-UQAM Research and Innovation Chair in Animal Welfare and Artificial Intelligence, conducted in the bioinformatics lab at the University of Quebec at Montreal.

For more information, visit well-e.org.

Contact

Francois Gonothi Toure
Graduate Student Researcher, McGill-UQAM Research and Innovation Chair in Animal Welfare and Artificial Intelligence
E-mail: [email protected]
