Marine animals can be hard to research, and whales especially so – they dive deep, travel far and can be elusive. To understand their biology, researchers aboard vessels that cost tens of thousands of dollars a day to operate must get close to a whale as it surfaces so that one of them can shoot the animal with a dart to collect tissue samples. It is laborious, inefficient and expensive. But that may all soon change, thanks to the snotbot.
As a whale surfaces, it exhales through its blowhole, shooting a spout of water and mucus skyward. Researchers can pilot the drone – called the snotbot – into this spray and catch the mucus in midair.
Iain Kerr, chief executive officer of Ocean Alliance, says that from this snot researchers can collect DNA, hormones, bacteria and other vital information on the health of the whale. The small drone is also far less intrusive than a huge research vessel.
Since its initial development, the snotbot has become an even more powerful research tool. Thanks to a partnership with Intel, it now uses machine learning to identify individual animals and review their history; soon it may be able to study the whales’ health in real time. Oceans Deeply recently spoke with Kerr about the origins of the snotbot, the kinds of data collected from whale mucus and what tools the drone might use to study whales and other wildlife in the near future.
Oceans Deeply: What made you think of studying whale mucus with drones?
Iain Kerr: I was on our boat one day with my little [biopsy] dart and this whale snot sort of pummeled over my body; as you can imagine, from a biologist’s perspective the stinkier it is, the more gross it is, typically the more productive and more interesting it is.
I then did a little bit of research, and actually found out that somebody had taken [a mucus sample] from a cetacean blow in captivity. So I said: “OK, let’s develop this tool that can actually collect DNA, microbiomes, stress hormones, pregnancy hormones. Let’s try to see if we can develop this tool to do a health assessment of an animal.”
It’s gone so far beyond that now. When you look at a lot of big oceanographic groups, or even a lot of ocean research, it’s almost a prerogative of the privileged. You need big boats and now you’ve got Scripps and Woods Hole [oceanographic institutions]. And when you look around the world, a lot of these countries really can’t afford a $15 million research vessel that will cost $50,000 a day to run.
But if you actually say, “Hey! You can get really cutting-edge video of manatee behavior, dolphin behavior, shark behavior; you can look into the ocean with an eyesight you never had before,” I believe it’s akin to the development of the microscope for cellular biologists. We are now seeing the world in a whole different way, a way that is affordable, replicable, scalable, safer for the animals and safer for the humans.
Oceans Deeply: With the old method of big boats and biopsy darts, how much data were you able to collect and how does that compare to what you’re able to collect from the mucus?
Kerr: We would have a 90ft [27m] research vessel, and honestly that boat was probably costing a minimum of $100,000 a month to run. We would be lucky potentially to get five biopsies a day.
If you’re in an area with whales, and a whale comes up a mile away, and it’s up for 10 minutes, it’s probably going to take you eight minutes to get over there. Well then, these drones do 50mph [80km/h]; it’s almost over the whale by the time the damn boat is turned around and pointing in that direction.
Certainly, we were getting a toxicological legacy and DNA from the biopsy, but with the drone in the air, you’re seeing behaviors we’ve never seen before. Let me be blunt. I’ve been in this business almost 30 years; rarely does someone come along and say: “OK, by the way, here’s this tool that costs five grand that’s going to change everything.”
What’s interesting about the drone is we’ve got all this biological information. DNA, all these different hormones, microbiomes.
[Before the drones] we would get over there, and maybe we biopsy [the whale] as quickly as we could, and they would then dive. So, how much time did I have with the animal? A minute? Now, I could be over this animal 25 to 30 minutes just looking down and maybe I’d collect more than one blow from the same animal. We’re living in almost a tsunami of data.
Scientists on one level have this awful problem nowadays – you want all the data you can get, and then you’re buried under it, and sometimes it’s very hard to get out the pieces you need.
Oceans Deeply: To collect and analyze all of that data, I understand that the snotbot team recently partnered with Intel. How has that impacted your research?
Kerr: I’ve done one expedition with Intel. We’re doing another one at the end of September. That first expedition was the first time I managed to take the Intel team into the field.
Of course, being that they’re Intel, they’ve now taken it to the next level up already. For example, a photo ID. You get a photograph of an animal. You look and you catalog it. Basically, I was in Alaska flying over a whale, and the guy just held up a little computer screen next to me while I was flying, and it said: “This whale is Dylan. The whale was last seen eight years ago, and it was seen 12 miles [19km] from here.”
There are, generally speaking, two populations that are actually heading into Alaska: there are the animals coming from Hawaii and then there are animals coming up from Mexico. The Mexican animals are more endangered, so on our permit it says, “pay special attention if you’re dealing with these Mexican whales.” Guess what: we never know if we’re dealing with the Mexican whales until a month and a half after the expedition. Until now.
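The photo-ID matching Kerr describes – recognizing “Dylan” from a single aerial photograph – typically works by comparing a feature vector extracted from the new image against a catalog of known individuals. The sketch below illustrates the matching step only, using a nearest-neighbor search by cosine similarity; the catalog entries, whale names and feature vectors are all hypothetical placeholders, and a real system (such as the one Intel built) would derive these vectors from fluke or dorsal-fin imagery with a trained model.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical catalog: individual whale -> feature vector that an
# upstream image model would have extracted from reference photos.
CATALOG = {
    "Dylan": [0.9, 0.1, 0.3],
    "Misty": [0.2, 0.8, 0.5],
}

def identify(query, threshold=0.95):
    """Return the best catalog match for a query vector,
    or None if no known whale is similar enough."""
    best_name, best_score = None, threshold
    for name, vec in CATALOG.items():
        score = cosine(query, vec)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

A match above the similarity threshold lets the field team pull up that individual’s sighting history on the spot – which is what makes it possible to know, in real time, whether a whale belongs to the more endangered Mexican population.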
The other challenge they’ve taken on, and they had some success with it in Alaska, is real-time volumetrics. Basically, with you and me or anybody, you go to the doctor and they weigh you, and they say: “You know, for your group, you’re a little light or you’re a little heavy.” What we’re really now doing with Intel is more real-time stuff. [For example,] that animal’s looking a little underweight. There’s a scar on the animal. Could that scar be infected? OK, we’ve got an infra-red camera here that can tell us if the wound is hot or cold, and if it’s infected.
These animals don’t know we’re even there, and we’re doing a health assessment of them.
Oceans Deeply: By recording from a drone, which whale behaviors have you been able to see that you weren’t able to see before?
Kerr: One is just a really intimate time between a mother and a calf. Quite literally, the mother is stroking the calf with her pectoral fin, and the calf is just nuzzling up. Is this changing the field of biology? No. But does it make it easier for us to empathize with this creature? I think it does.
These are things that we haven’t even looked for. Then there’s just how they’re using their bodies to forage or catch prey – how the humpback whales in Alaska were pushing their pectoral fins forward and lunging right before they lunged up to get groups of herring or krill.
We’re now at this point where I will turn my video camera on as soon as a drone takes off, and it will record for 30 minutes and then land. In that 30 minutes, there’s probably 10 minutes of interesting video, and one minute of stunning video.
So even if I’m only doing 10 flights a day, that’s 300 minutes of video that’s got to be gone through. If I do a two-week expedition, you do the math. This is where machine learning and [artificial intelligence] are so good. [Machines] could say: “Oh. That’s water, that’s water, that’s water, that’s a whale. Oh! That’s two whales. I think this could be a scenario worth looking at.”
[You can] tell the machine: “Yes, that’s what I like.” Then it knows next time to look for more of that. The more of this video we have, the better the machines are going to get at interpreting what we think is interesting or unique, and then I know it’s going to happen: I know at some point I’m going to get an alert on the behavior that I didn’t even know existed.
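The triage Kerr describes – a machine scanning hours of footage and surfacing only the frames with whales in them – can be sketched as a simple filtering pass over per-frame classifier output. Everything below is a hypothetical illustration: the frame format, the 0/1 pixel labels (assumed to come from an upstream water-vs-whale classifier) and the threshold are placeholders, not the actual Intel pipeline.

```python
# Hypothetical triage pass: 'frames' is a list of (timestamp_s, grid) pairs,
# where grid is a 2D list of per-pixel labels from an upstream classifier
# (0 = water, 1 = whale). Frames where whales cover enough of the image
# get flagged for human review.

def flag_frames(frames, min_whale_fraction=0.05):
    """Return timestamps of frames worth a human reviewer's attention."""
    flagged = []
    for ts, grid in frames:
        total = sum(len(row) for row in grid)
        whale = sum(sum(row) for row in grid)
        if total and whale / total >= min_whale_fraction:
            flagged.append(ts)
    return flagged
```

Feeding the reviewer’s “yes, that’s what I like” verdicts back in as labeled training data is what lets the classifier improve over time – and, eventually, flag behaviors no one thought to look for.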