Press "Enter" to skip to content

Cross-Species Communication: Humans Crack the Chicken Clucking Code

Researchers at the University of Queensland have discovered that people can understand chickens’ emotional states from their clucking sounds. This ability, unaffected by prior experience with chickens, has significant implications for improving poultry welfare and could aid in developing AI-based monitoring systems.

A study reveals that humans can accurately discern chickens’ emotions from their clucks, a finding that could enhance chicken welfare and inform consumer choices.

A University of Queensland-led study has found humans can tell if chickens are excited or displeased, just by the sound of their clucks.

Professor Joerg Henning from UQ’s School of Veterinary Science said researchers investigated whether humans could correctly identify the context of calls or clucking sounds made by domestic chickens, the most commonly farmed species in the world.

Study Methodology and Findings

“In this study, we used recordings of chickens vocalizing in all different scenarios from a previous experiment,” Professor Henning said.

“Two calls were produced in anticipation of a reward, which we called the ‘food’ call and the ‘fast cluck’.

“Two other call types were produced in non-reward contexts, such as food being withheld, which we called the ‘whine’ and ‘gakel’ calls.”

The researchers played the audio files back to test whether humans could tell in which context the chicken sounds were made, and whether various demographics and levels of experience with chickens affected their correct identification.

Implications for Chicken Welfare

“We found 69 percent of all participants could correctly tell if a chicken sounded excited or displeased,” Professor Henning said.

“This is a remarkable result and further strengthens evidence that humans have the ability to perceive the emotional context of vocalizations made by different species.”

Professor Henning said the ability to detect emotional information from vocalization could improve the welfare of farmed chickens.

“A substantial proportion of participants being able to successfully recognize calls produced in reward-related contexts is significant,” he said.

“It provides confidence that people involved in chicken husbandry can identify the emotional state of the birds they look after, even if they don’t have prior experience.”

Future Research and Applications

“Our hope is that in future research, specific acoustic cues that predict how humans rate arousal in chicken calls could be identified, and these results could potentially be used in AI-based detection systems to monitor vocalizations in chickens,” Professor Henning said.

“This would allow for the development of automated assessments of compromised or good welfare states within poultry management systems.

“Ultimately this could enhance the management of farmed chickens to improve their welfare, while helping conscientious consumers to make more informed purchasing decisions.”
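To give a sense of what such an automated monitoring pipeline might involve, the sketch below trains a toy classifier to separate reward-related calls from non-reward calls using standard audio features. It is a hypothetical illustration, not the study’s method: the file paths, the MFCC features, and the random-forest model are all assumptions made for the example.

```python
# Hypothetical sketch of an acoustic call classifier (not the study's method).
# Assumes recordings are sorted into folders by context label.
import glob

import librosa  # audio loading and feature extraction
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def call_features(path: str) -> np.ndarray:
    """Summarise one recording as the mean of its MFCC coefficients."""
    audio, sample_rate = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=audio, sr=sample_rate, n_mfcc=13)
    return mfcc.mean(axis=1)


# Illustrative directory layout: one folder per call context.
contexts = {
    "reward": "calls/reward/*.wav",
    "non_reward": "calls/non_reward/*.wav",
}

features, labels = [], []
for label, pattern in contexts.items():
    for path in glob.glob(pattern):
        features.append(call_features(path))
        labels.append(label)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(features), np.array(labels), test_size=0.25, random_state=0
)

# A simple off-the-shelf classifier stands in for whatever model a real
# welfare-monitoring system would use.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

In practice, a welfare-monitoring system would also need to detect and segment individual calls in continuous barn audio before any classification step, which the sketch above leaves out.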

This research has been published in the journal Royal Society Open Science.

Reference: “Humans can identify reward-related call types of chickens” by Nicky McGrath, Clive J. C. Phillips, Oliver H. P. Burman, Cathy M. Dwyer and Joerg Henning, 3 January 2024, Royal Society Open Science.
DOI: 10.1098/rsos.231284

Source: SciTechDaily