Talking to whales

Top photo by Mike Korostelev. Bottom: visualizations of sperm whale clicks – top, a standard spectrogram; bottom, a wavelet display by Mark Fischer/Aguasonic.
A number of articles have come across my screen over the last year or so speculating about – and in some cases actually deploying – “Artificial Intelligence” to figure out whether we might be able to understand, and thus communicate with, whales.
This sounds like a lot of fun, but I’m going to open up this gambit by paraphrasing the old Wittgenstein trope: ‘If we could understand what whales were saying, we wouldn’t understand what they were talking about.’
Probably the most meaningful AI efforts are focused on sperm whales, likely because these critters communicate using a collection of repeated click patterns that have been identified as “codas.” It would stand to reason that repeated phonation patterns might signify something.
All that needs to be done, then, is to compare the circumstances, dispositions, and actions of the various communicating group members, and listen for any correlations between conditions, associations, and behaviors that would give rise to ‘meaning’ in their identified “codas.”
The problem we have had up to this point is that if you have a group of 20 sperm whales doing “something” with each other, there are far too many variables for observing humans to grasp. But because AI is good at synthesizing gobs of data, it would be a good candidate to sort out all of the variables, identify patterns, distinguish a “vocabulary,” and voilà! Sperm whale chat groups!
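To make that concrete, here is a minimal sketch of the kind of pattern-sorting involved, assuming each recorded coda has already been reduced to a vector of inter-click intervals. The coda_icis values below are made up for illustration, and k-means clustering is only one of many ways such grouping might be attempted:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical data: each row is one coda, described by the gaps
# (in seconds) between its first five clicks. Real data would come
# from hydrophone recordings and careful click detection.
coda_icis = np.array([
    [0.20, 0.21, 0.20, 0.19],   # evenly spaced clicks
    [0.21, 0.20, 0.20, 0.20],
    [0.10, 0.10, 0.30, 0.31],   # two quick clicks, then a pause
    [0.11, 0.09, 0.29, 0.30],
])

# Group similar rhythm patterns together. If sperm whales really do
# reuse a repertoire of codas, recurring clusters should emerge.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coda_icis)
print(labels)   # e.g. [0 0 1 1]: two candidate "coda types"
```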
The problem with this narrative is that “vocabulary” implies representational, or symbolic, language. Humans are epistemologically adapted to symbolic, representational language. Loren Eiseley gets into this with his “Time Binding” idea of human language development – that given our early, dark subterranean origins, we needed to translate visual experiences into strategic acoustical instructions conveyed over time – like “don’t go to the south entry of our den because there is a predator waiting there.”
We hear this in other subterranean animals – prairie dogs and mole rats, for example, that live in dark warrens and need to convey experiences over time. Whales, on the other fin, don’t seem to have any adaptive reason to use symbolic, or representational “language.” They only need to know where they are in time and space in the context of their dynamic social and geophysical circumstances.
So how do these animals coordinate their complex relationships merely with a set of clicks and codas? These recognizable patterns are often compared to “Morse code.” But while these patterns may have contextual meaning, such as acoustical signatures identifying who is making the sound, there is no phenomenological reason for these sounds to mean “things.”
I believe there are a lot of elements missing from the whale-communication inquiry, this “symbolic” or “representational” language track being just one of them. But I suggest that some of the fundamental tools of signal analysis are also “missing in action.”
The two visualizations in the header illustration above are two ways of looking at sperm whale signals. The top one shows a series of clicks in the frequency domain, produced with a common analysis tool called a “spectrogram.” The other visualization is in the time domain. (These are not the same signals, but they would “sound” similar to us humans.)
There are clearly two levels of granular detail in these two “visual translations.” The first lays out frequencies, low to high, along the “y” axis (color being equated to amplitude) against a fixed temporal “x” axis. This helps humans recognize sperm whale “coda” patterns, and it pattern-matches the way humans serialize “vocabulary” into coherent streams of meaning. It is also how researchers are training AI to “interpret” these acoustical signals.
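As a rough illustration of that first view, a frequency-domain spectrogram can be produced in a few lines. The click train below is a synthetic placeholder, not data from the image above:

```python
import numpy as np
from scipy import signal
import matplotlib.pyplot as plt

# Placeholder signal: a short train of broadband "clicks" sampled at 48 kHz.
# A real analysis would load a hydrophone recording instead.
fs = 48_000
t = np.arange(0, 2.0, 1 / fs)
x = np.zeros_like(t)
for click_time in (0.2, 0.5, 0.7, 1.3):            # seconds
    idx = int(click_time * fs)
    x[idx:idx + 48] = np.random.randn(48)           # ~1 ms burst

# Frequency-domain view: energy per frequency band over time.
f, tt, Sxx = signal.spectrogram(x, fs=fs, nperseg=1024, noverlap=512)
plt.pcolormesh(tt, f, 10 * np.log10(Sxx + 1e-12))   # color = amplitude (dB)
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.show()
```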
The second visualization displays amplitude and fine time-domain detail, with the y axis showing how the signal correlates or interferes with a known time-domain reference – in this case a “wavelet” – across an x-axis temporal window. This may be closer to how sperm whales decipher information, in terms of data density (or not).
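For the second kind of view, a continuous wavelet transform gives a time-domain picture of how a click lines up against a reference wavelet at different scales. This is a generic scalogram sketch using the PyWavelets library and a synthetic click, not a reconstruction of the Aguasonic display:

```python
import numpy as np
import pywt
import matplotlib.pyplot as plt

fs = 48_000
t = np.arange(0, 0.05, 1 / fs)                      # a 50 ms window
x = np.zeros_like(t)
x[400:448] = np.random.randn(48)                    # one broadband click

# Correlate the signal with scaled copies of a Morlet wavelet.
scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)

# Time stays on the x axis; the y axis shows how strongly the signal
# matches the wavelet at each scale (fine temporal detail preserved).
plt.imshow(np.abs(coefs), aspect="auto",
           extent=[t[0], t[-1], scales[-1], scales[0]])
plt.xlabel("Time (s)")
plt.ylabel("Wavelet scale")
plt.show()
```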
So the top visual translation suggests the idea of a “vocabulary” implied by pulse patterns in the frequency domain, while the visual translation below depends on data density in the time domain.
AI may be able to sort this out once we understand what data is being conveyed in the time domain, but we don’t even have a perceptual context for knowing what it means. When a sperm whale phonates to another sperm whale, are the signals interpreted in a spatial/temporal version of a “homunculus” embodied in the receiver? Or are they merely a spatial synthesis of what the receiver would otherwise perceive as a biosonar reflection, had it emitted the originating signal? (Or perhaps a bit of both.)
And this is without integrating the most important component of communication – empathy. We humans have empathy with others around us in the context of our identity habitat: “I feel for you because I identify with your sensations in the context of our shared environment.”
Human shared perceptions include visuality, hearing, smell, sentiment (‘vibes’), and to a lesser degree, physical vibrations (because unlike whales in water, our bodies are separated in a compressible space and subject to gravity).
Whales inhabit a perceptual space that includes their bodies, because they live in a nearly incompressible medium in which there is only a minor difference between the acoustical compliance of their bodies and that of their aquatic surroundings. They inhabit a space in which acoustical and physical energy influences their bodies in a manner similar to all others around them. All of their bodies are subject to the same vibrations and physical torments of their habitat – separated only by the distance between each other in the water.
We have no way of interpreting this level of empathy (although I’d suggest that if we did, we would not be messing up our planet the way we are).
So while using AI to translate sperm whale symbolic communication is a clever intrigue, we – and AI – are so far from that possibility that I suspect a lot of money will be expended on this endeavor before we understand.

 
