Think Out Loud

How AI could help us locate and study threatened birds

By Rolando Hernandez (OPB)
Oct. 31, 2024 1 p.m. Updated: Oct. 31, 2024 8:18 p.m.

Broadcast: Thursday, Oct. 31

File photo from March 5, 2021. Birds such as the Marbled Murrelet can be difficult to study in the wild. New research shows AI may be able to help identify these birds.

Stephani Gordon / OPB

For researchers, endangered and threatened birds like the Northern Spotted Owl and the Marbled Murrelet can be incredibly hard to find and study in the wild. Now, new research shows that artificial intelligence may be able to help. As first reported in Hakai Magazine, researchers have been training AI to correctly identify elusive species of birds in the wild through sound. Matthew Betts is a professor and chairs the Department of Forest Ecosystems and Society at Oregon State University. He coauthored the study and joins us to share more about the development of this AI and its implications for future research.

Note: The following transcript was transcribed digitally and validated for accuracy, readability and formatting by an OPB volunteer.

Dave Miller: This is Think Out Loud on OPB. I’m Dave Miller. We start today with a tool that’s being used to solve a vexing problem. Endangered and threatened birds like the Northern Spotted Owl and the Marbled Murrelet can be incredibly hard to find and to study in the wild. New research has found that artificial intelligence may be able to help.

As first reported in Hakai Magazine, researchers have been training AI to correctly identify elusive species of birds in the wild through sound. Matthew Betts is a professor in the Department of Forest Ecosystems and Society at Oregon State University. He co-authored the study and he joins us now. Welcome back to Think Out Loud.

Matthew Betts: Thanks, Dave. Thanks for having me.

Miller: Before we get to the new piece of this, the AI piece of this, how have you and other researchers traditionally done surveys of birds like Spotted Owls or Murrelets?

Betts: Yeah. Historically, it involves going out into the forest at ridiculously early times of day, standing around and listening, and identifying these species by sound. So I’ve probably spent about 4,000 hours doing what we call point counts; these are standardized listening surveys.

Miller: What are those hours like?

Betts: Well, it’s strangely meditative being in the forest at 5:30 in the morning, listening and recording all the different birds and even small mammals that you hear. But it also can be a bit mind numbing. In a typical morning you do twelve 10-minute counts and it can be somewhat dangerous depending on the terrain that you’re in. And for species like Marbled Murrelet, it’s a totally different story. This is a really hard species to study. It’s uncommon and it flies into the forest from the ocean at about 60 miles an hour. So it’s really difficult to detect. So that’s where some of these new techniques come in.

Miller: Let’s listen to audio of a Marbled Murrelet.

[Audio playing: Marbled Murrelet calls]

Miller: How many birds can you or other people, say, in your lab, identify just by ear?

Betts: So in Oregon, in the temperate zone, there are probably about 200 species you can identify by ear. I mean, typically I’ll have field assistants in one particular ecosystem, so it might be as low as 70, but it’s a remarkable task that all these field assistants learn. Moving into the tropics, you’re talking about 300 or 400 different species that are making sounds and that you need to learn to ID.

Miller: Wow. So you mentioned that Murrelets are particularly challenging. Part of it, I guess, is just how rare they are. What are the other challenges in terms of having a human listen for these birds?

Betts: Well, as I said, they’re an interesting species and they’re actually a seabird, related to, say, Puffins, that many people will have seen. And their behavior is to come inland up to 50 miles and nest in old forests. So there’s not really much of a nest, but they scrape a little area out on moss. They lay a single egg and then they go back into the ocean to forage. And so really interesting life history, but it makes it extremely challenging to study them because they’re not always making noise. And as you said, they’re rare and they’re flying so fast, it makes it really difficult to detect where they are.

So, every year in Oregon, hundreds of people go out and try to survey the species. It’s actually required under the Endangered Species Act that people survey for them in older stands that they’re planning on cutting, to make sure they’re not there. We were funded by the state actually to study the species about 10 years ago, and it’s just been an epic adventure of going out in ships, putting tags on the species, following them inland, finding their nests via plane, climbing trees. So anything we could do to make our lives a little bit easier was very welcome.

Miller: Can you describe what started to change in 2017 with passive acoustic monitoring, a whole network of microphones set up in forests?

Betts: Well, and I can claim zero credit for this. But the backstory to this, Dave, is that, as you might know, the Spotted Owl has been declining across the Northwest for quite some time, really, over 30 or 40 years. We’ve known about its declines and that’s mainly due to this competitor, a Barred Owl. And it got to the point, especially in the northern study areas, Washington and Oregon, the species had declined so much, it was getting difficult to do the standard monitoring – which involved catching birds, banding them and finding nests.

There were study areas across the whole Northwest. And Spotted Owls were so rare, it was difficult to get any useful scientific data. So a colleague of mine who worked for the Pacific Northwest Research Station with the U.S. Forest Service took a gamble and he thought, well, maybe the way to do this is to have a spatially extensive network of thousands of microphones out across the forest. And from that, we can do a broad scale quantification or measurement of how the species is doing. As you said, that was 2017. It was quite a risk to do that. But in the end, it’s proved to be quite effective for monitoring that particular species.

Miller: My understanding though is that at the beginning, all these microphones were out there collecting thousands of hours of recordings, but then it was up to humans to retrieve the recordings from each of these sites and then to listen to each of them?

Betts: Yeah. Again, the study remained quite massive. This is Damon Lesmeister’s project and he would have people go out, grab the SD cards – just like you might have in your phone or in any recording device you have – bring them back to the lab. And these were thousands of terabytes of data, and then go through them by hand and look for the signature of these Spotted Owls calling, initially. So it’s still a huge effort.

Miller: So that brings us to the change here. The new version of this is using artificial intelligence, machine learning, to actually train it first to say, “This is what a Spotted Owl sounds like. This is what a Marbled Murrelet sounds like.” How does that work?

Betts: It sounds like a tricky project, doesn’t it? But the first step is to change these audio recordings into basically a 3D space, or an image you could even print on a page, called a spectrogram, and that’s done by something called a Fourier transform. So you have basically a picture of the bird call. And then once you’ve got that picture, this is where something called a convolutional neural network comes in. And the computer, by itself, comes up with a series of features that can describe that call and then tries to automatically recognize it in a massive data set of these other calls.
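
To make that pipeline a bit more concrete, here is a minimal sketch in Python of the general approach Betts describes: an audio clip is turned into a spectrogram with a Fourier transform, then scored by a small convolutional neural network. The clip length, class names and network shape are illustrative assumptions, not the actual PNW-Cnet model the researchers use.

```python
# A minimal sketch of the spectrogram + CNN idea described above.
# The class names, clip length, and network shape are illustrative
# assumptions, not the real PNW-Cnet model.

import numpy as np
from scipy import signal
import torch
import torch.nn as nn


def clip_to_spectrogram(waveform: np.ndarray, sample_rate: int = 22050) -> np.ndarray:
    """Convert a mono audio clip into a log-scaled spectrogram 'picture'."""
    freqs, times, spec = signal.spectrogram(waveform, fs=sample_rate,
                                            nperseg=512, noverlap=256)
    return np.log1p(spec)  # compress dynamic range so quiet calls stay visible


class TinyCallClassifier(nn.Module):
    """Toy CNN mapping one spectrogram to scores for a few hypothetical classes."""
    def __init__(self, n_classes: int = 3):  # e.g. spotted owl / barred owl / background
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Linear(16 * 4 * 4, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))


if __name__ == "__main__":
    # Stand-in for a 5-second field recording; a real pipeline would read a WAV file.
    fake_clip = np.random.randn(5 * 22050).astype(np.float32)
    spec = clip_to_spectrogram(fake_clip)
    batch = torch.from_numpy(spec).unsqueeze(0).unsqueeze(0).float()  # (1, 1, freq, time)
    scores = TinyCallClassifier()(batch)
    print(scores.shape)  # one score per hypothetical class
```

In a real system, a model like this would first be trained on many labeled clips of each species before it could be trusted to tag new recordings.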

Miller: And then is it the case where, what might have taken a team of humans years can be done, I don’t know, nearly instantaneously?

Betts: So I would say that’s still optimistic, unfortunately. Yes, a couple of calls can be identified instantaneously. But the thing is, over time we collect a petabyte – I didn’t even know what a petabyte was until this project; it’s 1,000 terabytes of data. For a given year, it can still take two months for a computer to churn through these recordings and automatically identify the calls as Spotted Owls or Marbled Murrelets, as the case might be.

Miller: So, I mean, how accurate is the algorithm at this point in identifying a Marbled Murrelet?

Betts: Good question. So initially, some of these algorithms were quite bad at doing it. But thankfully, things have improved substantially. The initial challenge was that Barred Owls and Spotted Owls sound pretty darn similar. And there are a number of species that can be confused for Marbled Murrelets as well. But after quite a bit of development, these CNNs can predict with about 90% accuracy.

Miller: And I suppose it’s also worth wondering how accurate humans are, if you’re 300 feet away and, I don’t know, 200 feet below some nesting Murrelet. How good are you at identifying that that’s a Murrelet?

Betts: That’s an extremely good question. And one we spent quite a long time developing statistical techniques to try and figure out. But you’re right, the nice thing about these recordings is that they’re inherently archivable. If anyone questioned whether or not Matt Betts heard a Murrelet, they can go back to that exact minute when I claim to have heard it and listen to it themselves. So storing these data is an additional challenge, but it’s wonderful because we can go back and review.

Miller: My guess is that there are birders or just bird lovers out there listening right now, who are saying, “I have an app on my phone that I use already to identify bird calls.” How different is what you’re talking about with this research-based AI from what people can just download from their app store?

Betts: Yeah, I will say those apps – Merlin is one of them, out of Cornell – are pretty fantastic. So I recommend them to anyone listening. The difference is that those algorithms, very similar algorithms, are trained on a broad suite of data from potentially all across North America. And this particular one, called PNW-Cnet, is trained much more regionally on the species we get here, and even the dialects within species that we might get, so the accuracy will be higher. But that said, Merlin is getting better and better all the time. It does make mistakes, I’ve learned, but it’s certainly quite good.

Miller: So let’s turn from the technology piece of this to what you can do with it. I mean, what have you and others already learned about where these birds are, or when they’re there, or what their behavior is like because of this powerful new tool?

Betts: Yeah, I would say we’re very much at the front end of some exciting things that we can learn from this. So far, we’ve used the data very much the way we would have used point count data that I described earlier, where you’re only in a spot for 30 minutes over the whole year. But now that we have hundreds of hours of recordings for every single spot in the landscape, we can potentially learn a lot more about birds.

So one obvious one is just, how are their population trends doing, right? Every single year, at relatively low cost, we have about 5,000 sites across the Northwest being developed by the Forest Service to monitor all these species. So that’s fantastic. But we can do a lot more than monitor. We can learn, for example, when a migrant arrives, flying all the way from Brazil – the minute it starts singing at a site, we know it’s arrived. And so we can ask questions like, how does climate change affect arrival times of these species? And is that affecting their population trends? And even old, old questions that haven’t been addressed well, going back to Darwin – how do we have so many species in one place? How do they coexist? With these kinds of real-time, super-long audio recordings, we can try to get at that fine temporal resolution. So it’s a really exciting time, I think, to be interested in bioacoustics and birds in particular.
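
One simple way to picture the arrival-time analysis Betts mentions: once the classifier produces a table of detections, the first detection date per site and year can serve as a rough proxy for when a migrant arrived. The column names and tiny example table below are assumptions for illustration, not the project’s actual data format.

```python
# Sketch: estimate migrant arrival as the first automated detection per site/year.
# The detection table layout here is a hypothetical example.

import pandas as pd

detections = pd.DataFrame({
    "site": ["A", "A", "A", "B", "B"],
    "species": ["wilsons_warbler"] * 5,
    "date": pd.to_datetime(["2024-04-18", "2024-04-19", "2024-05-02",
                            "2024-04-25", "2024-04-26"]),
})

detections["year"] = detections["date"].dt.year

# First detection per species, site, and year approximates arrival timing,
# which could then be compared across years or against climate variables.
arrival = (detections.groupby(["species", "site", "year"])["date"]
           .min()
           .rename("first_detection"))
print(arrival)
```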

Miller: What’s an example of a question you have about that last piece – the interactions of different animals? So, let’s say, and we’ve heard a lot about Barred Owls and Spotted Owls. I mean, what might the audio recording, as sort of understood by AI, tell us humans about these birds?

Betts: I mean, at its very most basic, you can imagine having these hours of recording … say you hear a Spotted Owl for the first like 20 days of recordings, and then in comes a Barred Owl, and then after that Barred Owl is there, you no longer hear the Spotted Owl. And of course, there are two possibilities. One is that the Spotted Owl is still there and just shut up. But the more likely one is that it’s taken off from the site. And so by having those high resolution data, we can test for those kinds of negative interactions.

And actually, we’re testing for not just negative interactions but positive ones. It’s becoming better known that some birds use other species as cues for where to settle – well, if a MacGillivray’s Warbler is there and I’m a Wilson’s Warbler, maybe I think this spot is great for me too. So using these audio recordings, we can test for those sorts of attractions as well.
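
As a rough illustration of the negative-interaction idea, one could scan each site’s detection log and compare how often a Spotted Owl is heard before versus after the first Barred Owl detection. The column names and the simple before/after count below are assumptions for illustration only; a real analysis would need more careful statistical treatment than this sketch.

```python
# Sketch: compare Spotted Owl detections before and after the first Barred Owl
# detection at each site. Column names and data are hypothetical examples.

import pandas as pd

detections = pd.DataFrame({
    "site": ["A"] * 6,
    "species": ["spotted_owl", "spotted_owl", "spotted_owl",
                "barred_owl", "barred_owl", "spotted_owl"],
    "date": pd.to_datetime(["2024-03-01", "2024-03-05", "2024-03-12",
                            "2024-03-20", "2024-03-28", "2024-04-02"]),
})

for site, d in detections.groupby("site"):
    barred = d.loc[d["species"] == "barred_owl", "date"]
    if barred.empty:
        continue  # no Barred Owl at this site, nothing to compare
    first_barred = barred.min()
    spotted = d[d["species"] == "spotted_owl"]
    before = (spotted["date"] < first_barred).sum()
    after = (spotted["date"] >= first_barred).sum()
    print(f"site {site}: {before} Spotted Owl detections before Barred Owl arrival, {after} after")
```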

Miller: You’d mentioned being able to know when a migration is happening, and if I understood correctly, it seemed like you were saying in real time. But could these microphones send data back to an OSU office or a Forest Service office in real time?

Betts: That is the dream, Dave. But unfortunately, a lot of these sites are offline. So until we either expand our cell network or shrink down that communication, it’s going to remain a dream. But it’s possible that at least at smaller scales, like the H.J. Andrews Experimental Forest where I work, we could be beaming data down to a computer in real time.

Miller: I want to go back to where we started. You’d said that, because you didn’t have this kind of technological ability, this new tool, at your disposal throughout your career, you spent – by your estimate – 4,000 hours quietly sitting there, often early in the morning, just listening, which you called sort of meditative. Would you be the same ecologist that you are today if you hadn’t had that time in the woods?

Betts: That’s a really good question. And I’ve thought a lot about it. I would hate to see an era where we don’t have young ecologists, young people going out into the forest, and really paying attention and learning about what they see. And that’s definitely how I became interested in nature in general.

But I would say it’s not necessarily a replacement. We can save the time trudging through the forest and standing in one place, and do what I would think are actually more exciting things, like actually watching birds and understanding their behavior; or catching them; or banding them and learning about their survival rates; or finding nests and learning about how well they’re reproducing. So there’s a lot more we can do that I don’t see one of Elon Musk’s robots doing for quite a long time, if we can be freed up from doing what I now consider to be slightly mind-numbing point counts.

Miller: Matt Betts, thanks very much.

Betts: Thanks very much for the invitation, Dave.

Miller: Matthew Betts is a professor in the Department of Forest Ecosystems and Society at Oregon State University. He joined us to talk about the use of AI to help researchers find threatened or endangered species – and the hope that, in the coming years, it will teach us much more about their lives.

Contact “Think Out Loud®”

If you’d like to comment on any of the topics in this show or suggest a topic of your own, please get in touch with us on Facebook, send an email to thinkoutloud@opb.org, or you can leave a voicemail for us at 503-293-1983. The call-in phone number during the noon hour is 888-665-5865.
