High-Speed Machine Vision System Decodes Chicken Feeding Biomechanics at 300 FPS

Date
03/13/2026

An Allied Vision EoSens camera paired with a cascaded YOLOv8–SAM pipeline achieves 0.95 precision in non-invasive kinematic phenotyping, unlocking the next frontier of Precision Livestock Farming

Validation of the background subtraction algorithm: (1) original frame; (2) intermediate processing removing the bird instance; (3) final binary mask of the feed ROI used for displacement tracking (photo courtesy of Paulista University)
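The three steps in the caption, original frame, bird-instance removal, binary feed-ROI mask, can be sketched with simple background differencing. This is an illustrative reconstruction, not the study's exact algorithm: the grayscale inputs, the `thresh` value, and the function name are assumptions.

```python
import numpy as np

def feed_roi_mask(frame: np.ndarray, background: np.ndarray,
                  thresh: int = 25) -> np.ndarray:
    """Binary mask of the feed ROI: pixels that differ from a bird-free
    background reference by more than `thresh` gray levels."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > thresh).astype(np.uint8)  # 1 = feed ROI, 0 = background
```

Summing such a mask frame-to-frame is one simple way to quantify feed displacement over time.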

Researchers at Paulista University in São Paulo, Brazil, have published a study validating a scalable, automated sensing framework capable of quantifying broiler feeding biomechanics in real time. Published under the title Smart Farming Innovation: Automated Biomechanical Monitoring of Broilers Using a Hybrid YOLO-SAM Pipeline, the research establishes the hardware-software stack required to transform beak kinematics into actionable digital data streams.

A 70% Cost Driver Under the Microscope

A broiler is any chicken bred and raised specifically for meat production. Feed expenditure consumes up to 70% of total broiler production costs, yet the biomechanical interface between bird and feeder has remained a blind spot in commercial monitoring. Existing methods such as invasive markers, retrospective growth metrics, and low-throughput manual annotation are incompatible with the real-time data demands of modern Cyber-Physical Systems (CPS). Without high-fidelity behavioral data streams, constructing credible "digital twins" of the feeding process is impossible.

Optical Precision Meets Production Reality

At the core of the sensing framework is an Allied Vision (formerly Mikrotron) EoSens™ CoaXPress high-speed industrial camera equipped with a Nikon 50 mm f/1.4 lens, positioned 1.0–1.5 m from the feeder to capture unobstructed lateral kinematic profiles. Lighting was standardized using a 500 W, 6500 K LED source providing an illuminance of 3,000–5,000 lux to ensure a high signal-to-noise ratio for image capture.

Operating at 300 frames per second (fps) with spatially calibrated resolution — referenced against a physical scale placed at feeder level — the optical configuration delivers the temporal resolution necessary to resolve rapid beak kinematics that are entirely invisible to standard video equipment. The system was validated against a biological dataset of nine broiler chickens, strategically stratified across three growth phases to ensure representative coverage of biomechanical variation across the production cycle.
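As a hedged illustration of how the physical reference scale and the 300 fps frame rate turn pixel displacements into physical kinematics (the helper names and the example values are hypothetical, not taken from the paper):

```python
FPS = 300  # camera frame rate reported in the study

def mm_per_pixel(scale_length_mm: float, scale_length_px: float) -> float:
    """Spatial calibration factor from a reference scale imaged at feeder level."""
    return scale_length_mm / scale_length_px

def gape_velocity_mm_s(displacement_px: float, mm_px: float) -> float:
    """Convert a per-frame beak displacement (pixels) to mm/s at 300 fps."""
    return displacement_px * mm_px * FPS
```

For example, if a 100 mm scale spans 400 pixels, the calibration is 0.25 mm/pixel, and a 2-pixel inter-frame displacement corresponds to 150 mm/s.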

Three-Stage Computer Vision Pipeline — 0.95 Precision

Raw high-speed footage from the EoSens camera is processed through a cascaded three-stage hybrid architecture implemented in Python 3.10, leveraging PyTorch 2.10.0 for deep learning inference and OpenCV 4.13.0.92 for image pre-processing. The pipeline integrates the YOLOv8 (You Only Look Once) computer vision model for rapid object detection with the Meta AI-developed Segment Anything Model (SAM) for precise anatomical segmentation, maintaining robust tracking performance in unstructured farm environments characterized by variable lighting and frequent partial occlusions. 
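The detect-then-segment cascade can be sketched structurally as follows. Here `detect` and `segment` are stand-ins for the YOLOv8 detector and the SAM promptable segmenter; the confidence threshold, data layout, and function names are assumptions for illustration, not the paper's implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Box = Tuple[int, int, int, int]  # x1, y1, x2, y2 in pixels

@dataclass
class Detection:
    box: Box
    score: float

def cascade(frame,
            detect: Callable[[object], List[Detection]],  # stage 1: YOLOv8-style detector
            segment: Callable[[object, Box], object],     # stage 2: SAM-style segmenter
            min_score: float = 0.5) -> list:
    """Stage 1 proposes coarse bounding boxes; stage 2 refines each confident
    box into an anatomical mask; the mask list feeds stage 3 (tracking)."""
    masks = []
    for det in detect(frame):
        if det.score >= min_score:
            masks.append(segment(frame, det.box))
    return masks
```

Cascading a fast detector in front of a heavyweight segmenter is a common design: the segmenter only runs on the few confident regions per frame, which is what makes high-frame-rate processing tractable.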

Inference was assessed on a high-performance workstation featuring an Intel Core Ultra 9 275HX processor, 64 GB of DDR5-6400 RAM, and an NVIDIA GeForce RTX 5070 GPU running Windows 11 Pro, demonstrating the feasibility of real-time deployment. The integrated system achieved a precision of 0.95, a level of accuracy that eliminates the need for manual frame-by-frame annotation.
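For context, precision is the standard detection metric: the fraction of reported detections that are correct. A minimal illustration (the counts below are made up, not the study's actual confusion matrix):

```python
def precision(true_positives: int, false_positives: int) -> float:
    """Precision = TP / (TP + FP): how often a reported detection is correct."""
    return true_positives / (true_positives + false_positives)

# A precision of 0.95 means 95 of every 100 reported detections are correct,
# e.g. precision(95, 5) == 0.95
```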

Feed Particle as a Biomechanical Variable

The automated analysis delivered a first-of-its-kind finding: feed granulometry (particle size) directly and quantifiably modulates the biomechanical demands of broiler feeding. Coarser feed particles produced measurably larger gape amplitudes and more efficient ingestion dynamics — a relationship previously inferred from retrospective physiological studies but never captured in real time. This directly links feed structure engineering to biomechanical effort, opening a new control loop in precision nutrition management.

Technological Bridge to Poultry Digital Twins

Beyond academic validation, the system's dual-function architecture simultaneously monitors production efficiency via the Beak Efficiency Index (BEI) and tracks animal welfare indicators, thereby enhancing its commercial value for smart farming platforms. As international standards for poultry production continue to evolve, the system provides producers with an objective, automated compliance monitoring capability. This framework represents the technological bridge required to generate the continuous behavioral data streams that will power the next generation of Precision Livestock Farming (PLF) platforms and full digital-twin implementations of the broiler production process.

Future work will focus on Edge AI deployment and testing the system’s robustness in commercial barns, where occlusion and variable lighting are more prevalent.
