The whale song translation project—An experiment to assess humpback whale response to voice-selected visual feedback cues

Bibliographic Details
Published in: The Journal of the Acoustical Society of America, Vol. 150, No. 4, p. A163
Main Author: Pines, Howard S.
Format: Journal Article
Language: English
Published: 01.10.2021
ISSN: 0001-4966; 1520-8524
DOI: 10.1121/10.0007991

Summary: To better understand the behavioral and communication capabilities of Megaptera novaeangliae, the findings of a recent whale song study suggest an intriguing experiment to assess humpback whale response to acoustically selected visual-feedback cues. The analysis of high-complexity, frequency-modulated song units indicates a Shannon-Hartley-compliant sub-unit architecture similar to human vowel generation. Like constant-pitch English vowels, which are differentiated by their two most energetic peak resonance frequencies, humpbacks exhibit precise vocal control in producing a variety of sub-units with distinct and differentiable harmonic frequency combinations. Humans navigate informational, gaming, and adaptive-learning apps on mobile phones and tablet PCs using visual feedback from tactilely selected touchscreen icons and hyperlinks. In lieu of tactile manipulation, an alternative approach to touchscreen control is vocal selection of icons and links, keyed to the generation of specific vowel resonance frequencies or sub-unit harmonic frequencies. A software prototype of a voice-controlled “touchscreen” gaming experiment demonstrates how humans and humpback whales could conceivably interact, or how humpbacks could engage in informational transactions. The prototype also incorporates video “training” examples designed to guide subjects in the voiced selection of visual symbols assigned to the sub-regions of a big-screen display.
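As a minimal illustrative sketch of the selection mechanism the summary describes (not the author's actual prototype), the snippet below picks the two most energetic spectral peaks of a vocalization and maps that frequency pair to a cell of a screen grid. The function names, frequency bin edges, and the synthetic two-tone test signal are all hypothetical assumptions for illustration only.

```python
import numpy as np

def dominant_peaks(signal, fs, n_peaks=2):
    """Return the n_peaks most energetic frequency components (Hz), ascending."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    # Indices of the strongest spectral bins, highest energy first
    idx = np.argsort(spectrum)[::-1][:n_peaks]
    return sorted(freqs[idx])

def select_icon(f1, f2, f1_bins, f2_bins):
    """Map a (f1, f2) resonance pair to a screen-grid cell (row, col)."""
    row = int(np.digitize(f1, f1_bins))
    col = int(np.digitize(f2, f2_bins))
    return row, col

# Hypothetical example: a synthetic "vocalization" with energy at 700 Hz and 1200 Hz
fs = 16000
t = np.arange(0, 0.5, 1.0 / fs)
signal = np.sin(2 * np.pi * 700 * t) + 0.8 * np.sin(2 * np.pi * 1200 * t)

f1, f2 = dominant_peaks(signal, fs)
cell = select_icon(f1, f2, f1_bins=[500, 900], f2_bins=[1000, 1500])
```

In this toy setup the two detected peaks (about 700 Hz and 1200 Hz) fall into the middle bins of each axis, selecting cell (1, 1); a real system would of course need robust formant estimation rather than raw FFT peak picking.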