Can we talk to animals with artificial intelligence?

Move over Eliza Thornberry, an organisation in California claims that machine learning is decoding communication with dolphins and more…

photo: The Wild Thornberrys

By Eve Walker, 02 Aug 2022
3 mins read time

The 1970 album Songs of the Humpback Whale bolstered the movement that eventually banned the barbaric practice of commercial whaling, by getting listeners to empathise with an animal whose elaborate vocalisations suggested real intelligence. So what would the world be like (what we buy, how we eat, how we treat the animal kingdom) if we all had the powers of Dr Dolittle?

Animal communication remains a subject of wonder: primates produce different warning calls depending on the predator, birds change elements of their song to alter the intended message, and dolphins and whales communicate across the ocean with a range of whistles. Yet no animal is currently known to communicate in a way that meets all the criteria of a true language.

Aza Raskin, co-founder and president of the Earth Species Project (ESP), claims that the California-based non-profit is getting close to decoding non-human communication using machine learning.

“Along the way and equally important is that we are developing technology that supports biologists and conservation now,” Raskin explained in an interview with the Guardian.

ESP’s first scientific paper was published last December. The organisation will make all its findings public as part of its mission to deepen our connection to, and understanding of, the animal kingdom, so that we can better protect the other species on our planet.

Decoding animal communication has previously relied on direct observation, which takes a huge amount of time and patience. Machine learning, however, allows scientists to process the vast quantities of data that can be collected by animal-borne sensors.
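What that processing looks like varies from lab to lab, but a common first step is compressing each recorded call into a fixed-length acoustic fingerprint that a model can compare at scale. The sketch below is a minimal illustration of that step, assuming the Python libraries librosa and NumPy; the synthetic tone is a stand-in for a real recording, since no actual tag data is reproduced here.

```python
# A minimal sketch of bioacoustic feature extraction, assuming librosa and NumPy.
# Real pipelines would load recordings from animal-borne tags; here a synthetic
# tone stands in for a recorded call so the example runs on its own.
import numpy as np
import librosa

SR = 22050  # sample rate in Hz

def summarise_call(audio: np.ndarray, sr: int = SR) -> np.ndarray:
    """Reduce one vocalisation to a fixed-length feature vector (mean MFCCs)."""
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)  # shape (13, frames)
    return mfcc.mean(axis=1)  # one summary value per coefficient

# Stand-in for a recorded call: a 0.5-second tone with a little noise.
t = np.linspace(0, 0.5, int(SR * 0.5), endpoint=False)
call = np.sin(2 * np.pi * 440 * t) + 0.01 * np.random.randn(t.size)

features = summarise_call(call.astype(np.float32))
print(features.shape)  # (13,)
```

In a real pipeline the same summary would be computed for millions of sensor recordings, which is exactly the scale at which human observation breaks down.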

“People are starting to use it,” said Elodie Briefer, a professor at the University of Copenhagen who specialises in vocal communication in mammals and birds. “But we don’t really understand yet how much we can do.”

Briefer co-developed an algorithm that analyses pig grunts to decipher whether the animal is feeling happy or sad. Similarly, a tool called DeepSqueak can tell whether rodents are stressed from their ultrasonic calls. Another initiative, Project CETI (the Cetacean Translation Initiative), plans to use machine learning to translate the communication of sperm whales.
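Briefer's published method is not reproduced here, but the general idea behind such valence classifiers is simple enough to sketch: given acoustic features for calls with known emotional labels, train a standard classifier to predict the label for new calls. The toy example below assumes scikit-learn and uses randomly generated feature vectors in place of real grunt recordings.

```python
# Illustrative only: a toy 'happy vs sad' call classifier, assuming scikit-learn.
# The feature vectors are random stand-ins for real acoustic features; Briefer's
# actual algorithm and training data are not reproduced here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend features: 200 calls x 13 acoustic features, shifted by valence.
happy = rng.normal(loc=0.5, size=(100, 13))
sad = rng.normal(loc=-0.5, size=(100, 13))
X = np.vstack([happy, sad])
y = np.array([1] * 100 + [0] * 100)  # 1 = positive valence, 0 = negative

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

With real data, the features would come from recordings (for instance, MFCC summaries like those sketched above), and accuracy would be validated far more carefully than with a single held-out split.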

ESP, however, aims to decode all animal communication rather than focusing on a single species. While the animal experience remains something of a mystery to us, we know animals feel emotions such as grief and joy.

Raskin said: “I don’t know which will be the more incredible – the parts where the shapes overlap and we can directly communicate or translate, or the parts where we can’t.”

But the project has drawn serious doubt. Robert Seyfarth, professor emeritus of psychology at the University of Pennsylvania, remains sceptical. He has studied social behaviour and vocal communication in primates in their natural habitat for over 40 years.

“I just think these AI methods are insufficient,” Seyfarth concludes. “You’ve got to go out there and watch the animals.”

Raskin acknowledges that AI alone may not be enough to talk to animals. But he points to recent research showing that many species communicate in ways “more complex than humans have ever imagined”.

“These are the tools that let us take off the human glasses and understand entire communication systems,” he said.