
Entropy rate as a measure of animal vocal complexity

Arik Kershenbaum (2014). Entropy rate as a measure of animal vocal complexity. Bioacoustics 23(3): 195–208.

 

Abstract: 

Vocal complexity is an important concept for investigating the role and evolution of animal communication and sociality. However, no one definition of ‘complexity’ appears to be appropriate for all uses. Repertoire size has been used to quantify complexity in many bird and some mammalian studies, but it is impractical in cases where vocalizations are highly diverse and repertoire size is essentially non-limited at realistic sample sizes. Some researchers have used information-theoretic measures, such as Shannon entropy, to describe vocal complexity, but these techniques are descriptive only, as they do not address hypotheses of the cognitive mechanisms behind vocal signal generation. In addition, it can be shown that simple measures of entropy, in particular, do not capture syntactic structure. In this work, I demonstrate the use of an alternative information-theoretic measure, the Markov entropy rate, which quantifies the diversity of transitions in a vocal sequence and is thus capable of distinguishing sequences with syntactic structure from those generated by random, statistically independent processes. I use artificial sequences generated from different stochastic mechanisms, as well as real data from the vocalizations of the rock hyrax Procavia capensis, to show how different complexity metrics scale differently with sample size. I show that entropy rate provides a good measure of complexity for Markov processes and converges faster than repertoire size estimates, such as the Lempel–Ziv metric. The commonly used Shannon entropy performs poorly in quantifying complexity.
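
To illustrate the distinction the abstract draws between Shannon entropy and the Markov entropy rate, the following is a minimal Python sketch (not the author's code; the function names and toy sequences are assumptions for illustration). It estimates the marginal Shannon entropy and a first-order Markov entropy rate from a symbol sequence, and compares a strictly alternating (syntactically structured) sequence with an independently generated one.

```python
import math
import random
from collections import Counter, defaultdict

def shannon_entropy(seq):
    """Shannon entropy (bits/symbol) of the marginal symbol distribution."""
    n = len(seq)
    counts = Counter(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def markov_entropy_rate(seq):
    """First-order Markov entropy rate (bits/symbol):
    H_rate = -sum_i pi_i * sum_j p_ij * log2(p_ij),
    with pi_i estimated from symbol frequencies and p_ij from observed
    transitions. A crude plug-in estimator, adequate for illustration."""
    n = len(seq)
    counts = Counter(seq)
    transitions = defaultdict(Counter)
    for a, b in zip(seq, seq[1:]):
        transitions[a][b] += 1
    rate = 0.0
    for a, row in transitions.items():
        pi_a = counts[a] / n
        total = sum(row.values())
        for c in row.values():
            p = c / total
            rate -= pi_a * p * math.log2(p)
    return rate

# Two toy sequences with (near-)identical symbol frequencies:
alternating = "AB" * 50                                   # strict syntactic structure
random.seed(0)
iid = "".join(random.choice("AB") for _ in range(100))    # statistically independent draws

for name, s in [("alternating", alternating), ("i.i.d.", iid)]:
    print(name, round(shannon_entropy(s), 3), round(markov_entropy_rate(s), 3))
```

The alternating sequence scores 1 bit of Shannon entropy but an entropy rate of 0, because every transition is fully predictable; the independent sequence scores close to 1 bit on both measures. This reproduces the abstract's point that plain Shannon entropy cannot distinguish syntactic structure from random generation, whereas the entropy rate can.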

Keywords: 

complexity, entropy rate, Lempel–Ziv, Markov process, renewal process, Shannon entropy, syntax