Thermal imaging & vision systems DECODING CHICKEN FEEDING BIOMECHANICS
An Allied Vision EoSens camera paired with a cascaded YOLOv8–SAM pipeline achieves 0.95 precision in non-invasive kinematic phenotyping, unlocking the next frontier of Precision Livestock Farming
Researchers at Paulista University in São Paulo, Brazil, have published a landmark study validating a scalable, automated sensing framework capable of quantifying broiler feeding biomechanics in real time. Titled Smart Farming Innovation: Automated Biomechanical Monitoring of Broilers Using a Hybrid YOLO-SAM Pipeline, the research establishes the hardware-software stack required to transform beak kinematics into actionable digital data streams.
A 70 PER CENT COST DRIVER UNDER THE MICROSCOPE
A broiler is any chicken bred and raised specifically for meat production. Feed expenditure consumes up to 70 per cent of total broiler production costs, yet the biomechanical interface between bird and feeder has remained a blind spot in commercial monitoring. Existing methods, such as invasive markers, retrospective growth metrics, and low-throughput manual annotation, are incompatible with the real-time data demands of modern Cyber-Physical Systems (CPS). Without high-fidelity behavioural data streams, constructing credible “digital twins” of the feeding process is impossible.
OPTICAL PRECISION MEETS PRODUCTION REALITY
At the core of the sensing framework is an Allied Vision (formerly Mikrotron) EoSens CoaXPress high-speed industrial camera equipped with a Nikon 50 mm f/1.4 lens, positioned 1.0–1.5 m from the feeder to capture unobstructed lateral kinematic profiles. Lighting was standardised using a 500 W, 6500 K LED source delivering an illuminance of 3,000–5,000 lux at the feeder, ensuring a high signal-to-noise ratio for image capture. Operating at 300 frames per second (fps) with spatially calibrated resolution, referenced against a physical scale placed at feeder level, the optical configuration delivers the temporal resolution necessary to resolve rapid beak kinematics that are invisible to standard video equipment. The system was validated against a biological dataset of nine broiler chickens, stratified across three growth phases to ensure representative coverage of biomechanical variation across the production cycle.
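The spatial calibration described above can be sketched as a simple scale-factor conversion. The following is a minimal illustration, not the published code; the 100 mm reference length and pixel span are assumed example values, since the article does not state the actual scale bar dimensions.

```python
# Minimal sketch: pixel-to-millimetre calibration against a physical
# scale placed at feeder level. Reference values are illustrative.

def mm_per_pixel(ref_length_mm: float, ref_length_px: float) -> float:
    """Scale factor derived from a reference object of known length."""
    if ref_length_px <= 0:
        raise ValueError("reference span in pixels must be positive")
    return ref_length_mm / ref_length_px

def px_to_mm(distance_px: float, scale: float) -> float:
    """Convert an image-plane distance in pixels to millimetres."""
    return distance_px * scale

# Example: a 100 mm scale bar imaged across 400 pixels
scale = mm_per_pixel(100.0, 400.0)   # 0.25 mm per pixel
gape_mm = px_to_mm(48.0, scale)      # 12.0 mm
```

Because the scale is measured at feeder level, the conversion is only valid for motion in that plane; the lateral camera placement keeps the beak close to it.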
THREE-STAGE COMPUTER VISION PIPELINE — 0.95 PRECISION
Raw high-speed footage from the EoSens camera is processed through a cascaded three-stage hybrid architecture implemented in Python 3.10, leveraging PyTorch 2.10.0 for deep learning inference and OpenCV 4.13.0.92 for image pre-processing. The pipeline integrates the YOLOv8 (You Only Look Once) computer vision model for rapid object detection with the Meta AI-developed Segment Anything Model (SAM) for precise anatomical segmentation, maintaining robust tracking performance in unstructured farm environments characterised by variable lighting and frequent partial occlusions.
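One concrete piece of such a cascade is the handoff between the two models: YOLO detectors emit normalised centre-based boxes, while SAM's box prompt expects absolute corner coordinates. The sketch below illustrates that conversion under stated assumptions; the function name, padding factor, and coordinate conventions are illustrative, not taken from the published implementation.

```python
# Minimal sketch of the detector-to-segmenter handoff in a cascaded
# YOLO -> SAM pipeline. YOLO outputs normalised (xc, yc, w, h) boxes;
# a SAM box prompt takes absolute (x1, y1, x2, y2) pixel coordinates.

def yolo_xywh_to_sam_box(xc, yc, w, h, img_w, img_h, pad=0.05):
    """Convert a normalised YOLO box to a padded absolute xyxy prompt."""
    # Denormalise centre and size to pixels
    bw, bh = w * img_w, h * img_h
    cx, cy = xc * img_w, yc * img_h
    # Pad the box slightly so the segmenter sees full anatomical context
    bw *= 1 + 2 * pad
    bh *= 1 + 2 * pad
    # Clamp to the image bounds
    x1 = max(0.0, cx - bw / 2)
    y1 = max(0.0, cy - bh / 2)
    x2 = min(float(img_w), cx + bw / 2)
    y2 = min(float(img_h), cy + bh / 2)
    return [x1, y1, x2, y2]

# A centred detection covering 20% x 10% of a 1920x1080 frame
box = yolo_xywh_to_sam_box(0.5, 0.5, 0.2, 0.1, 1920, 1080)
```

The small padding is a common practical choice when prompting a segmenter from a tight detection box, since a box that clips the anatomy tends to clip the resulting mask as well.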
Inference was assessed on a high-performance workstation featuring an Intel Core Ultra 9 275HX processor, 64 GB of DDR5-6400 RAM, and an NVIDIA GeForce RTX 5070 GPU running Windows 11 Pro, demonstrating the feasibility of real-time deployment. The integrated system achieved a precision of 0.95, a level that sharply reduces the need for manual annotation.
FEED PARTICLE AS A BIOMECHANICAL VARIABLE
The automated analysis delivered a first-of-its-kind finding: feed granulometry (particle size) directly and quantifiably modulates the biomechanical demands of broiler feeding. Coarser feed particles produced measurably larger gape
amplitudes and more efficient ingestion dynamics — a relationship previously inferred from retrospective physiological studies but never captured in real time. This directly links feed structure engineering to biomechanical effort, opening a new control loop in precision nutrition management.
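A gape-amplitude measurement of this kind can be sketched as a per-frame distance between upper and lower beak-tip landmarks. The keypoint layout and the five-frame toy trace below are assumptions for illustration; the published pipeline derives its landmarks from SAM segmentation masks.

```python
# Minimal sketch: gape amplitude from per-frame beak-tip keypoints
# captured at 300 fps. Coordinates are illustrative pixel values.
import math

FPS = 300  # camera frame rate reported in the article

def gape_series(upper_tips, lower_tips):
    """Per-frame gape: Euclidean distance between beak-tip keypoints."""
    return [math.dist(u, l) for u, l in zip(upper_tips, lower_tips)]

def gape_amplitude(series):
    """Peak-to-baseline amplitude of one feeding event."""
    return max(series) - min(series)

# Toy trace of one open-close cycle (x, y pixel coordinates)
upper = [(100, 50), (100, 48), (100, 45), (100, 48), (100, 50)]
lower = [(100, 60), (100, 62), (100, 65), (100, 62), (100, 60)]
series = gape_series(upper, lower)    # [10.0, 14.0, 20.0, 14.0, 10.0]
amp = gape_amplitude(series)          # 10.0 px
duration_s = len(series) / FPS        # cycle duration in seconds
```

Applying a pixel-to-millimetre calibration to the amplitude, and the frame rate to the cycle duration, yields the physical quantities a particle-size comparison would operate on.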
TECHNOLOGICAL BRIDGE TO POULTRY DIGITAL TWINS
Beyond academic validation, the system’s dual-function architecture simultaneously monitors production efficiency via the Beak Efficiency Index (BEI) and tracks animal welfare indicators, thereby enhancing its commercial value for smart farming platforms. As international standards for poultry production continue to evolve, the system provides producers with an objective, automated compliance monitoring capability. This framework represents the technological bridge required to generate the continuous behavioural data streams that will power the next generation of Precision Livestock Farming (PLF) platforms and full digital-twin implementations of the broiler production process.
Future work will focus on Edge AI deployment and testing the system’s robustness in commercial barns, where occlusion and variable lighting are more prevalent.
Allied Vision
www.alliedvision.com
April 2026 Instrumentation Monthly