Welcome to the fourth edition of AI for Animals! This newsletter brings you the latest news and research on AI, animals, and digital minds.
Each edition also homes in on a specific topic relating to AI’s potential impact on animals. This time, we explore how AI could facilitate communication between humans and other animals.
If you have any questions or feedback, our email is contact@aiforanimals.org. You can subscribe to this newsletter here.
Thanks to Allison Agnello, Yolanda Eisenstein, and Constance Li for their contributions to this month’s edition!
Max Taylor
Index
How could AI enable interspecies communication?
In late 2023, researchers ‘conversed’ with a humpback whale they called Twain. While this interaction only really involved Twain and the SETI researchers greeting each other several times before Twain swam away, to many it seemed a crucial step towards meaningful human communication with another species.
Several organizations are dedicated to making such interspecies communication a reality. For Project CETI (Cetacean Translation Initiative), the sole objective is to communicate with whales. (This makes sense: whales have enormous brains, are extremely social, are known for their complex vocalizations, and – as fellow mammals – are relatively likely to share some linguistic similarities with humans.) The goal of Earth Species Project and Interspecies Internet, meanwhile, is to communicate with all species.
Machine learning, with its knack for spotting patterns in a sea of messy data, is fundamental to the approach of all three organizations. If you too want to use machine learning to communicate with other species, here’s a step-by-step guide, using whales as an example.
1. Gather the data
As with most AI applications, any human-whale communication tool requires a huge amount of data. Recordings of the whales’ vocalizations are top of the list: by one estimate, you’ll need roughly four billion recordings if you want to deploy a really advanced model. However, to interpret those vocalizations, you’ll also need to understand what context they occurred in: Who was speaking? Whom were they speaking to? Where? Were their predators nearby? Had they recently eaten? What sort of emotional state did they seem to be in?
This entails recording years’ worth of visual footage to accompany the acoustic data. Fortunately, recording tech has become radically cheaper and more versatile in recent years. Aerial and underwater drones, cameras on tethered buoys, and wearable ‘bio-logger’ devices (such as cameras designed to attach to the whale like suckerfish) collectively allow researchers to identify and track individual whales, gradually building a picture of their lives, personalities, and social structures. Mother-calf interactions are particularly revealing, shedding light on how whales first acquire language.
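To make the later analysis tractable, each recording needs to be paired with its context in a structured way. Below is a minimal sketch of what such an annotation record might look like in Python; the field names and example values are hypothetical, not drawn from any particular project’s data format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AnnotatedRecording:
    """One acoustic clip plus the behavioural context needed to interpret it."""
    audio_path: str                   # path to the raw hydrophone recording
    whale_ids: list                   # individuals identified in the clip
    location: tuple                   # (latitude, longitude)
    predators_nearby: Optional[bool] = None   # e.g. orcas sighted during the clip
    recently_fed: Optional[bool] = None
    behaviour_notes: str = ""         # free-text observations from the video footage

# Hypothetical example: a mother and calf recorded from a tethered buoy
clip = AnnotatedRecording(
    audio_path="recordings/2023-11-04_buoy3.wav",
    whale_ids=["whale-041", "whale-112"],
    location=(15.3, -61.4),
    predators_nearby=False,
    recently_fed=True,
    behaviour_notes="mother and calf surfacing together",
)
```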
2. Clean up the data
So now you have all of your data, but chances are that your acoustic recordings will be a mess of underwater noise. Whale sounds will typically be infrequent, indistinct, obscured by other animals, or drowned out by ships. Once you’ve isolated the whale vocalizations, you’ll still need to separate out each individual’s vocalizations from the overlapping chatter and distinguish communicational calls from echolocation clicks.
Luckily, machine learning is adept at addressing this ‘cocktail party problem’. Researchers developed a model nicknamed ‘Deep Karaoke’ by training it to recognize the sounds of individual instruments and then separate complex musical recordings back into those constituent parts. While this model was designed for music, similar algorithms have proved successful at separating animal vocalizations from unwanted environmental noise.
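To make the idea concrete, here is a toy spectral-gating sketch of the masking principle behind such source-separation models. It is not Deep Karaoke itself (which learns its masks with a neural network); it simply estimates a per-frequency noise floor from a noise-only clip and mutes the time-frequency cells below it. The threshold value is an arbitrary placeholder.

```python
import numpy as np
from scipy.signal import stft, istft

def suppress_background(mixture, noise_clip, sample_rate):
    """Toy spectral gating: mute time-frequency cells dominated by background noise.

    Real 'cocktail party' separators learn the mask with a neural network; here
    the noise floor is simply estimated from a clip containing only background
    sound (e.g. ship noise) and used as a threshold.
    """
    _, _, mix_spec = stft(mixture, fs=sample_rate, nperseg=1024)
    _, _, noise_spec = stft(noise_clip, fs=sample_rate, nperseg=1024)

    # Average noise magnitude per frequency bin
    noise_floor = np.mean(np.abs(noise_spec), axis=1, keepdims=True)

    # Keep only cells well above the noise floor
    mask = (np.abs(mix_spec) > 2.0 * noise_floor).astype(float)

    _, cleaned = istft(mix_spec * mask, fs=sample_rate, nperseg=1024)
    return cleaned
```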
Once you’ve removed unwanted background noise and worked out which whale is saying what, you can start breaking down the vocalizations into their individual constituents. For sperm whales, these constituents are sequences of clicks known as ‘codas’. AI’s adeptness at pattern recognition makes it much easier to identify commonly recurring sequences, gradually revealing whales’ coda ‘alphabet’. It’s not just whales: researchers and citizen scientists are making progress in deciphering the alphabets (and to a lesser extent, dictionaries) of all kinds of animals, including elephants, macaques, orangutans, meerkats, bats, and crows.
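As a rough illustration of that pattern-finding step, the sketch below groups a train of detected clicks into codas and clusters them by their rhythm (inter-click intervals). It assumes click times have already been extracted for a single whale; the gap threshold and the number of cluster types are arbitrary placeholders rather than values used by any real project.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_codas(click_times, gap=0.5, n_types=5):
    """Split a click train into codas and cluster them into recurring rhythmic types."""
    click_times = np.asarray(click_times)

    # A new coda starts wherever the silence between clicks exceeds `gap` seconds
    breaks = np.where(np.diff(click_times) > gap)[0] + 1
    codas = [c for c in np.split(click_times, breaks) if len(c) > 1]

    # Describe each coda by its inter-click intervals, zero-padded to equal length
    max_len = max(len(c) - 1 for c in codas)
    features = np.array([
        np.pad(np.diff(c), (0, max_len - (len(c) - 1))) for c in codas
    ])

    # Cluster into a small 'alphabet' of coda types
    labels = KMeans(n_clusters=n_types, n_init=10, random_state=0).fit_predict(features)
    return codas, labels
```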
3. Decode the data
To translate the data into human language, you have two main options:
Option 1: Train algorithms on labeled data.
The labeling entails poring through the audio and video recordings and suggesting, based on your expertise in whale behavior, what the whales might have been trying to communicate in different contexts. You then feed this labeled dataset into a supervised learning algorithm so that it can pinpoint particular sounds that correspond to those likely communicative intentions and, over time, refine its ability to generalize these patterns to new, unlabeled recordings. While this approach provides expert human supervision for the AI models, it is limited by the biases and constraints of human understanding. There will always be a high degree of uncertainty and subjectivity in how researchers label the data, which can introduce noise and error into the training process.
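A minimal sketch of this supervised route is below. It assumes each coda has already been turned into a fixed-length feature vector and given an expert label; the file names, feature choices, and label set are all hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical inputs: one feature vector per coda (e.g. inter-click intervals)
# and one expert-assigned label per coda (e.g. 'greeting', 'alarm', 'foraging').
X = np.load("coda_features.npy")   # shape: (n_codas, n_features)
y = np.load("coda_labels.npy")     # shape: (n_codas,)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# A random forest is a simple baseline for small labeled bioacoustic datasets
clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_train, y_train)

# Held-out accuracy gives a rough sense of how well the expert labels generalize --
# but it inherits all the uncertainty baked into those labels in the first place.
print("held-out accuracy:", clf.score(X_test, y_test))
```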
Option 2: Train algorithms on unlabeled data.
In theory, models could exploit underlying structural similarities between human and whale ‘language’ without needing any labels at all. This is sometimes depicted as a constellation of words, connected by gravitational pulls of varying strengths. For example, in most human languages, the relationship between the words ‘king’ and ‘queen’ is similar to the relationship between ‘man’ and ‘woman’, and the relationship between ‘man’ and ‘king’ is similar to that between ‘woman’ and ‘queen’. These shared connections give languages a common ‘shape’, allowing you to map one language onto another and understand which words in the different languages refer to the same concept. (See this Vox video for a visualization.)
Human languages generally share the same core concepts, so this method has proven very successful for translation between many language pairs. The big unresolved question is whether whales and other non-human animals share enough of those concepts for such a mapping exercise to be feasible.
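The mapping step can be sketched with a classic embedding-alignment trick. The toy below assumes you already have embedding matrices for both ‘languages’ and a handful of seed pairs believed to refer to the same concepts (truly unsupervised alignment, with no seed pairs at all, is considerably harder); all file names and variable names are hypothetical.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

# Hypothetical seed embeddings: row i of each matrix is assumed to refer to
# the same concept in the human and whale embedding spaces.
human_seed = np.load("human_seed_embeddings.npy")   # shape: (n_seeds, d)
whale_seed = np.load("whale_seed_embeddings.npy")   # shape: (n_seeds, d)

# Best rotation mapping the whale space onto the human space (Procrustes alignment)
rotation, _ = orthogonal_procrustes(whale_seed, human_seed)

def nearest_human_word(whale_vec, human_vocab_vecs, human_vocab_words):
    """Map a whale embedding into the human space and return the closest word."""
    mapped = whale_vec @ rotation
    sims = (human_vocab_vecs @ mapped) / (
        np.linalg.norm(human_vocab_vecs, axis=1) * np.linalg.norm(mapped)
    )
    return human_vocab_words[int(np.argmax(sims))]
```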
Some have also pointed to other linguistic rules that seem to hold across most human languages and might also underpin various non-human forms of communication. One such example is Zipf’s law, which holds that a word’s frequency is inversely proportional to its rank: the most common word in a language occurs roughly twice as often as the second most common, three times as often as the third most common, and so on. However, the extension of such rules to animal communication is contested.
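As a worked example of the kind of statistical check involved, the snippet below compares observed token frequencies against the 1/rank prediction of Zipf’s law. The same check could, with suitable caution, be run over coda types instead of English words; the sample sentence is just a toy input.

```python
from collections import Counter

def zipf_check(tokens, top_n=10):
    """Compare observed frequencies with the 1/rank prediction of Zipf's law."""
    counts = Counter(tokens).most_common(top_n)
    top_freq = counts[0][1]
    for rank, (token, freq) in enumerate(counts, start=1):
        predicted = top_freq / rank   # Zipf: frequency of rank-r word ~ top frequency / r
        print(f"{rank:>2}  {token!r:<10} observed={freq:<4} predicted≈{predicted:.1f}")

# Works on any sequence of discrete units: English words, or whale coda types
zipf_check("the cat sat on the mat and the dog sat on the rug".split())
```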
4. Talk back
Congratulations! You’ve worked out what the whales are saying and are now ready to talk back to them. You’re faced with a few decisions on how to communicate:
Source: You can either play back a recording of real-life whale vocalizations, or use synthetically generated ones. The former risks causing confusion if you use a familiar voice (or the whale’s own), while the latter risks sounding unnatural.
Medium: Sounds can be played from a disembodied speaker, from a speaker close to a human, or by an animatronic whale. In principle the last option stands the greatest chance of engagement, but in practice robotics would need to be more advanced (and much cheaper).
Environment: This presents a dilemma. Speak to animals in their natural habitat, and you stand a greater chance of causing havoc if you accidentally say something off-kilter. But interaction with animals in captivity can be ethically dubious, and findings might not carry over to the wild.
Finally, and most importantly, you need to decide what to actually talk about and what your intentions are. Maybe you want to satisfy your scientific curiosity, help break down the perceived boundaries between humans and other species, or warn the whales away from shipping routes and drilling operations. Or maybe you want to misuse this information by luring whales in so you can harpoon them and sell their flesh. The next edition will explore the huge opportunities, risks, and ethical conundrums that human-animal communication could bring.
📚 Resources
For more information on AI and human-animal communication, check out:
The work of the Earth Species Project, Project CETI, and Interspecies Internet
Toward understanding the communication in sperm whales (Andreas et al., iScience)
Cetacean conversation: AI could let us talk to whales. Experts question if that's a good idea (Salon)
ESP Technical Roadmap (Earth Species Project)
The Hive Community Slack (sign up to join) has several channels dedicated to discussion of AI and animals, including #c-ai-discussion for broad discussions and #s-ai-coalition for project collaboration.
If you want to dig deeper, the aiforanimals.org website has a list of relevant articles, papers, and other materials giving an overview of the AI and animals space.
🌏 Opportunities
Planning is underway for the 2025 AI, Animals, and Digital Minds Conference (Bay Area, February 2025, exact details TBD). The conference will be a multidisciplinary gathering of people who want to steer AI and other advanced technologies towards the benefit of nonhumans. To make suggestions for content, speakers, or anything else, please fill out this form.
Open Paws is looking for volunteers for their human feedback collection project. To get involved, read more and sign up here.
🚨 Updates
September AI for Animals SF Meetup: Our monthly meetup in San Francisco focused on advances being made in communication with dogs and what this could mean for gaining their legal personhood. We had fun with a special dog theme! The speaker was Praful Mathur, founder of Sarama, a tech company building AI to enable multimodal interspecies communication. The event was featured on Luma’s discovery list and attracted 60+ attendees.
International Animal Rights Conference: Sam Tucker, head of Open Paws, gave a talk on ‘Advancing animal rights in the age of AI’ in Warsaw, Poland. Sam highlighted the need to 1) restrict the adoption of AI in animal agriculture, 2) make the AI industry more animal-friendly, and 3) accelerate the adoption of AI in animal advocacy. The recording is available here.
CARE Conference: AI Coalition members Constance Li (Hive/AI for Animals), Yip Fai Tse (Peter Singer’s Research Assistant), karol orzechowski (Faunalytics), and Sam Tucker (Open Paws) took part in a panel discussion on ‘Artificial Intelligence and Animal Advocacy – Navigating the Ethical Frontier’. The panel covered the risks and opportunities of Precision Livestock Farming, how advocates can most effectively use AI, and the possibility of AI-caused catastrophes for animals.
8th Africa Animal Welfare Conference: Yolanda Eisenstein (UIA Animal Law Commission) presented on ‘AI Collaborations to Protect Animals and the Environment’ in Nairobi, Kenya. Yolanda highlighted the promising applications of AI for wildlife conservation and animal welfare, such as Wildme/Conservation X Lab's project, but also noted that AI can bring many risks for humans and animals. She stressed the need for collaboration between animal advocates and AI researchers, AI developers, lawmakers, and the public.
EU AI Act Code of Practice Consultation: Five coalition members were accepted to become stakeholders in the consultation process. We dedicated our monthly meeting to discussing and coordinating strategy for achieving the most favorable outcomes for animals in this policy.
🗞️ News & Research
🗣️ Understanding animals
How machine learning is helping us probe the secret names of animals (MIT Technology Review)
Researchers at Hebrew University in Israel have used machine learning and audio analysis to discover that marmoset monkeys use specific vocal labels, akin to names, to identify and communicate with individual monkeys. By employing a statistical technique called "random forest," the team was able to classify and analyze these sounds, providing strong evidence that these vocal labels are a form of primitive naming, similar to what has been observed in humans, dolphins, elephants, and parrots.
Danish researchers use AI to understand pigs (The Local)
Danish researchers from the University of Copenhagen used AI to decode 19 distinct pig vocalizations, identifying emotions like happiness, sadness, fear, frustration, and stress. The study, based on over 15,000 recorded pig grunts, found that pigs in conventional farms expressed distress more frequently (25%) compared to pigs in outdoor environments (8%), highlighting welfare differences.
Whistles, songs, boings, and biotwangs: Recognizing whale vocalizations with AI (Google)
Google Research has introduced a new whale bioacoustics AI model capable of identifying eight distinct whale species, including multiple call types for some species and the recently attributed "Biotwang" sound of Bryde's whales. The model, which converts audio data into spectrograms for classification, has demonstrated high performance across various whale vocalizations and has been used to label over 200,000 hours of underwater recordings, providing new insights into whale ecology and migration patterns.
The world's first robotic AI shark can do this (Times of India)
A Chinese company has developed a five-meter-long robotic whale shark that uses AI to navigate its environment. Researchers plan to use it to improve our understanding of marine animals and ecosystems.
[Audio] Translating whale, with the help of AI (Wisconsin Public Radio)
Project CETI uses AI and autonomous submarines to record and analyze sperm whale vocalizations, identifying distinct "codas" that vary by family and region. These codas are believed to convey cultural information, like clan identity, behavior, and habitat use, and the project's machine learning models help process vast amounts of audio to classify these patterns, revealing nuanced communication that may change with context.
🐔 Chicken farming
New £3m project to address health and welfare issues in poultry production (Food Manufacture)
Nottingham Trent University in the UK has partnered on a £3m project exploring the use of insect protein and artificial intelligence to address significant health and welfare issues in poultry production.
Using AI, EWG maps 357M poultry on North Carolina’s factory farms (Environmental Working Group)
The Environmental Working Group used AI to map 357 million chickens and turkeys on North Carolina factory farms, revealing a 43% increase since 2007 and significant environmental concerns. The investigation highlights the lack of state oversight and potential health risks associated with dense concentrations of poultry operations, including air and water pollution from the estimated 3.2 million tons of manure produced annually.
Animal Welfare and Artificial Intelligence: A Combination of the Poultry Present or Future? (AviNews)
AI integration in poultry farming enhances the detection of welfare issues like lameness and health problems by analyzing subtle changes in behavior, gait, and environmental conditions. Automated systems can track weight gain and stress indicators with greater precision, minimizing human error and invasive handling. While these technologies offer more objective welfare assessments, their effectiveness depends on factors like environmental complexity and the degree of system integration on farms.
🐟 Aquaculture
Op-ed: AI data modeling and nanotechnology are innovations every farmer should be taking note of (Seafood Source)
Norwegian aquaculture startups like Manolin and Biofeyn are using AI and advanced technologies to improve fish farming efficiency and sustainability. Manolin applies AI-driven predictive modeling to optimize salmon farming by helping farmers predict health issues, optimize breeding and genetics, and improve treatments for sea lice; while Biofeyn uses nanotechnology to enhance fish nutrition and deliver oral vaccines via feed.
Advanced biomass camera proves a success for New Zealand salmon farms (The Fish Site)
Ace Aquatec's A-BIOMASS camera, using AI and machine learning, has proven successful at Mount Cook Alpine Salmon's New Zealand farm, providing accurate real-time biomass estimates in challenging conditions. The technology has improved harvest results and reduced fish stress, prompting Ace Aquatec to pursue expansion across Oceania's aquaculture industry.
A Look at Science, Technology, and Artificial Intelligence for Sustainable Aquaculture (Environmental Defense Fund)
Advanced AI and robotics are improving aquaculture sustainability through automated feeding, disease detection, and remote monitoring of ocean farms. Meanwhile, innovations in feed formulas, pen designs, and traceability systems aim to reduce environmental impacts and improve efficiency in open ocean fish farming.
The AI tool that aims to make bottom trawling smarter and prevent bycatch and discards (Global Seafood Alliance)
Smartrawl, an AI-driven tool, helps reduce bycatch in bottom trawling by using cameras and smart gate systems to sort fish by species and size, releasing non-target species back into the ocean. This technology aims to improve sustainability in fisheries by reducing waste and helping fishers comply with regulatory catch limits. While promising, the system is still undergoing field trials to assess its effectiveness compared to traditional methods.
Fish-AI project develops artificial intestine platform to revolutionize aquafeed trials (AquaFeed)
The Fish-AI project has developed an in vitro platform using fish intestinal cell lines to test aquafeeds, reducing the need for live animal trials. It offers efficient early-stage feed evaluation, though it currently focuses only on gut health and requires highly specialized personnel.
Abu Dhabi AI-powered project aims to boost sustainable fish farming, says expert (The National)
Abu Dhabi's AI-powered floating sea cages aim to make fish farming more sustainable by optimizing feed strategies and minimizing environmental impact. The project is expected to improve fish health, reduce resource usage, and develop new protocols for environmentally responsible aquaculture.
China accelerates big data, AI application in ocean industry, anticipating revolutionary changes (Global Times)
China is leveraging AI and big data to transform marine aquaculture, optimizing feed usage and reducing environmental impact by precisely monitoring and adjusting feeding practices. This technology is also being shared with developing countries, helping them enhance their aquaculture efficiency and sustainability as part of broader efforts to build ‘blue economies’.
🐖 Animal farming: General
The Next Revolution in Animal Agriculture (Asterisk)
Precision livestock farming (PLF) uses AI-driven sensor technologies to automate monitoring of farm animals, offering potential improvements in efficiency and animal welfare. However, challenges such as high costs, data privacy concerns, and low adoption rates among farmers, compounded by the lack of financial incentives and technological complexity, limit its widespread implementation despite its potential benefits.
The Future of Meat & Poultry Processing: New Trends (Meat + Poultry)
AI and robotics are revolutionizing the meat and poultry industry by enhancing precision in processing, with AI-driven systems optimizing butchering and supporting predictive maintenance.
Animal-counting drones help farmers speed up stocktake (1News)
AI-powered drones are revolutionizing livestock counting on farms in New Zealand, significantly speeding up the process by quickly identifying and counting animals from aerial footage. This innovation reduces human error, enhances efficiency, and provides farmers with rapid and accurate stocktake results, improving operational management.
AIMS questions the use of AI to influence food purchases (Meat Management)
The Association of Independent Meat Suppliers (AIMS) has voiced concern at suggestions by Tesco’s CEO that AI could be used to nudge consumers into making healthier and more sustainable purchases. AIMS says it is worried that AI could perpetuate programmers’ biases by pushing ‘specific narratives’, such as promoting plant-based alternatives.
EuroTier 2024: Trends in livestock technology (EuroTier)
EuroTier 2024, the world's leading livestock technology fair, will showcase AI-powered solutions for animal monitoring, automated feeding systems, and emission-reducing barn designs. The November event will highlight innovations in robotic farm assistance, climate control technologies, and data integration to improve animal welfare and farm efficiency.
🐦⬛ Wild animals
Alstom and Flox trialling AI wildlife detection system in Sweden (Railway Technology)
Alstom and Flox are trialing an AI wildlife detection system on Swedish railways to reduce train-animal collisions, with the technology identifying animals and using tailored sound signals to deter them. The project, funded by a Swedish government grant, aims to enhance both rail safety and wildlife protection by preventing the approximately 5,000 annual animal collisions on Sweden's tracks.
Trump Calls Wind Turbines Bird Killers. New AI Tech Saves Them From The Blades (Forbes)
New AI-powered radar systems can detect bird flocks and automatically halt wind turbines to prevent collisions, with successful trials in Portugal and upcoming implementations in Dutch wind farms. While wind turbines do kill birds, studies show that domestic cats, buildings, and vehicles are far greater threats to bird populations.
Valiance Solutions has launched WildlifeIQ, an AI-powered platform for intelligent wildlife monitoring and conservation, featuring automated species classification, tiger identification, and conflict prediction. The system speeds up the analysis of camera trap images from months to days, offering real-time insights to help manage human-wildlife conflicts more effectively.
Using Artificial Intelligence to Combat Wildlife Crime (Wilson Center)
AI technologies are becoming essential in combating wildlife crime, like poaching and illegal logging, by enabling real-time detection through tools such as machine learning-powered microphones and cameras, which improve patrol efficiency and target illegal activity more accurately. However, these advancements also pose risks for under-resourced rangers, potentially increasing confrontations with criminals, emphasizing the need for proper training, equipment, and support alongside the AI tools.
🍔 Alternative proteins
How generative AI could boost plant-based protein functionality (Food Navigator)
AI Bobby's generative AI technology specifically designs and optimizes plant-based proteins by analyzing vast datasets to identify the most effective protein structures for desired functionalities, such as gelation, which is crucial for achieving the right texture and mouthfeel in plant-based products. The AI models are trained to predict and enhance the gelling properties of proteins, enabling the creation of plant-based meats and dairy analogues with improved consistency and reduced need for additional ingredients, thus lowering production costs and improving product quality.
How AI can help design proteins for the food industry (Food Navigator)
Start-ups like Cradle are using generative AI to design and optimize proteins for various applications in the food industry, such as improving the stability of growth factors in cultivated meat or enhancing the clarity of fruit juices. This technology allows for the creation of proteins with specific properties that may not naturally occur, significantly speeding up the development process and enabling new functionalities, like extending shelf life or removing off-tastes in plant-based products.
How AI is Powering the Next Generation of Plant-Based Food (VegNews)
Three companies are using AI to revolutionize plant-based foods: Climax Foods uses AI to analyze animal-based foods at a molecular level, creating hyper-realistic vegan cheeses with plant-based casein; Meati employs AI to optimize the nutrition of mycelium-based meat alternatives; and NotCo's AI, Giuseppe, mimics the molecular structure of animal products, generating innovative plant-based recipes, such as using pineapple and cabbage to replicate dairy milk. These innovations aim to improve taste, texture, and nutritional value while reducing production costs.
🤖 Digital minds
Understanding the moral status of digital minds (80,000 Hours)
80,000 Hours wants to see more people focusing their careers on the moral status of digital minds, building a field of researchers to improve our understanding of this topic and getting ready to advise key decision makers in the future.
Can AI feel distress? Inside a new framework to assess sentience (Nature)
Jonathan Birch's book "The Edge of Sentience" proposes a framework for assessing and protecting potentially sentient entities, from AI to animals, using a precautionary approach based on scientific meta-consensus and citizen panels. The book grapples with philosophical and scientific uncertainties surrounding sentience across various domains, advocating for proportionate protective measures while acknowledging the challenges in definitively establishing sentience in non-human entities.
🐀 Animals used for research
Testing toxicity using stem cells and AI (Nature)
Researchers at Yokohama University of Pharmacy are developing a system, StemPanTox, that combines stem cells and AI to predict chemical toxicity, aiming to reduce reliance on animal testing. This approach offers more accurate toxicity assessments by analyzing gene expression in human stem cells, overcoming the limitations of species differences in traditional animal models. The technology could lead to safer chemical and drug development while advancing policies toward reducing animal experiments.
How AI is helping to bridge the research gap between animals and humans (Nachrichten Informationsdienst Wissenschaft)
Researchers have developed an AI model that narrows the gap between animal models and human disease studies by translating molecular patterns from animals, like hamsters with COVID-19, into corresponding human patterns. Unlike other AI efforts aimed at finding alternatives to animal testing, this approach seeks to enhance the relevance of animal experiments for human clinical research by improving the accuracy of disease modeling.
Could AI replace animal research? (Understanding Animal Research)
AI is increasingly being used to improve the efficiency and precision of animal research, particularly in fields like toxicology, where AI-driven models replicate human biological systems to predict responses to substances, complementing traditional animal testing. While AI shows promise in enhancing data analysis, experimental design, and outcome prediction, it is not yet capable of fully replacing animal models due to the complexity of living organisms; instead, it primarily serves to refine current methods and reduce the number of animals used.
🐾 …and more
AI Caught In ‘Tug-Of-War’ Between Animal Agriculture And Advocacy (Plant Based News)
AI is creating a "tug-of-war" in animal agriculture, with potential to intensify factory farming or empower animal advocacy efforts. Open Paws founder Sam Tucker argues now is the critical time to direct AI towards compassionate purposes, helping advocacy organizations while pushing for restrictions on AI use in animal farming.
Animals in the machine: why the law needs to protect animals from AI
AI has the potential to both help and harm animals, benefiting areas like veterinary care but also facilitating illegal wildlife trade and perpetuating animal cruelty. Australia's renewed Animal Welfare Strategy should address these AI-related risks to ensure technology does not amplify existing animal harms or create new ones.
Tackling Veterinarian Burnout With AI For Good (Forbes)
AI tools like LAIKA are being introduced to help veterinarians manage burnout by streamlining tasks such as diagnostics, disease detection, and workflow management. These AI copilots can reduce time spent on routine tasks, allowing vets to focus more on complex decision-making and patient care, ultimately improving work-life balance and reducing stress. While AI offers significant support, ethical considerations and the veterinarian's final judgment remain critical in its implementation.
📨 That’s it for this edition — as always, please feel free to get in touch at contact@aiforanimals.org with any ideas and feedback!