Scientists prove dolphin communication is quite sophisticated

We humans generally think of ourselves as qualitatively more intelligent than all other animals… so much more intelligent, in fact, that many of us refuse to accept that we are animals.

And one of the most commonly cited “proofs” of our intellectual superiority is our rich, complex language.

Ironically, our arrogance about our language dominance may actually reflect our intellectual inability to grasp other animals' languages.

This article reports fascinating research that uses information theory to analyze dolphin communication, showing that dolphins communicate in sophisticated ways humans, despite decades of trying, still cannot comprehend. Scientists haven’t yet collected enough data to determine exactly how rich dolphin communication is, but we already know dolphins communicate using quite expressive language:

As Carl Sagan once famously said, “It is of interest to note that while some dolphins are reported to have learned English – up to 50 words used in correct context – no human being has been reported to have learned dolphinese.”

…[A]ccording to information theory, dolphin communication is highly complex with many similarities with human languages, even if we don’t understand the words they are saying to one another.

Information theory was developed in the 1940s by the mathematician and cryptologist Claude Shannon, mainly to be applied to the then-burgeoning technology of telecommunications. It operates on the knowledge that all information can be broken down into ‘bits’ of data that can be rearranged in myriad ways. George Zipf, a linguist at Harvard, realized that language is just the conveyance of information, and therefore could be broken down too.

Think of all the different sounds human beings make as they speak to each other, the different letters and pronunciations. Some, such as the letters ‘e’ and ‘t’ or words such as ‘and’ or ‘the’, will occur far more frequently than ‘q’ or ‘z’ or longer words such as ‘astrobiology’. Plot these on a graph, in order of the most frequently occurring letters or sounds and with both axes on logarithmic scales, and the points form a slope with a –1 gradient.
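The test described above can be sketched in a few lines of Python: count how often each symbol occurs, rank the counts, and fit a least-squares slope on log-log axes. This is an illustrative sketch (word-level tokenization and a toy corpus are my assumptions here, not the researchers’ actual pipeline):

```python
import math
from collections import Counter

def zipf_slope(text):
    """Fit the log-log slope of the rank-frequency distribution (a Zipf plot)."""
    counts = Counter(text.lower().split())
    freqs = sorted(counts.values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    # Ordinary least squares slope of log(frequency) against log(rank).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var
```

A corpus where the nth most common word appears roughly 1/n times as often as the most common one fits a slope near –1, while a stream in which every symbol is equally frequent fits a slope near zero, which matches the flat “babble” case the article describes.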

A toddler learning to speak will have a steeper slope – as they experiment with words they use fewer sounds but say them more often. At the most extreme a baby’s babble is completely random, and so any slope will be nearly level with all sounds occurring fairly evenly. It doesn’t matter which human language is put through the information theory test – be it English, Russian, Arabic or Mandarin – the same result follows.

What is remarkable is that putting dolphin whistles through the information theory blender renders exactly the same result: a –1 slope, with a steeper slope for younger dolphins still being taught how to communicate by their mothers, and a horizontal slope for baby dolphins babbling. This tells us that dolphins have structure to how they communicate.

Meanwhile, another feature of information theory, called Shannon entropy, can tell us how complex that communication is…. Write down 100 words on 100 pieces of paper and throw them into the air and they can be arranged in myriad ways. Impose rules on them, such as sentence structure, and your choices automatically narrow. It is a bit like playing hangman; you have a five-letter word where the first letter is ‘q’, so the rule structure of English necessitates that the second letter is ‘u’. From there on, there is a limited number of letters that can follow ‘qu’, so you may have ‘que’ or ‘qui’ or ‘qua’, and you can predict that the word is ‘quest’ or ‘quick’ or ‘quack’. Shannon entropy measures this imposition of order on data and the resulting predictability of that order.

“It turns out that humans go up to about ninth order Shannon entropy,” said Doyle. “What that means is, if you are missing more than nine words then there is no longer a conditional relationship between them – they become random and pretty much any word will do.” In other words, there are conditional probabilities, imposed by the rule structures of human languages, up to nine words away.
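The “order” Doyle describes can be approximated as conditional entropy: how uncertain the next token is given the previous n tokens. The n-gram estimator below is an illustrative sketch under that simplified reading, not Doyle’s actual method, and a real analysis needs vastly more data than a toy sequence:

```python
import math
from collections import Counter

def conditional_entropy(tokens, order):
    """Entropy (bits) of the next token given the previous `order` tokens."""
    ctx_counts = Counter(tuple(tokens[i:i + order])
                         for i in range(len(tokens) - order))
    ngram_counts = Counter(tuple(tokens[i:i + order + 1])
                           for i in range(len(tokens) - order))
    total = sum(ngram_counts.values())
    h = 0.0
    for ngram, c in ngram_counts.items():
        p_ngram = c / total                          # joint probability
        p_next = c / ctx_counts[ngram[:order]]       # P(next | context)
        h -= p_ngram * math.log2(p_next)
    return h
```

When every context fully determines the next symbol, the conditional entropy drops to zero; Doyle’s point is that in human language this dependence between words persists up to roughly nine words back before the estimate flattens out into randomness.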

Doyle has analyzed many forms of communication with information theory, from the chemical signals of plants to the rapid-fire radio transmissions of air traffic control. How do dolphins fare? “They have a conditional probability between signals that goes up to fourth order and probably higher, although we need more data,” said Doyle.

Posted by James on Sunday, September 11, 2011