Deep Learning in Prince William Sound

Research from Prince William Sound Science Center and published in ICES Journal of Marine Science shows that deep learning can turn millions of photos into a snapshot of zooplankton health.  

Plankton are a vast drifting community of plant-like organisms and animals that float along with the water. They serve as the base of the marine food web and are critical to commercially harvested fish. They also perform invaluable services such as producing oxygen and moving carbon. But measuring the health and abundance of these tiny sea creatures is challenging.

Large-scale trends in plankton health can be monitored by satellite imaging and remote sensing, while up close analysis of water samples takes place under the microscope. But what’s missing is real time information about life in the ecosystem.

Immersed video and audio equipment is changing that, providing scientists with a daunting amount of data in the form of millions of pictures and audio files. Can deep learning help shed light on life 60 meters down?

Data Collecting and Analysis in Prince William Sound

Depiction of a submersed profiler. Photo source: ICES Journal of Marine Science.


In Prince William Sound, site of the Exxon Valdez oil spill, a long-term monitoring program is taking place: Gulf Watch Alaska. As part of that program, an autonomous moored profiler was deployed in 2013 in the middle of the sound. In 2015 it was fitted with an in situ zooplankton camera system. Twice per day from 2016 to 2018, that system collected pictures from 60 meters below the surface. More than 2 million images of individual plankton, called plankters, were collected, far too many for manual analysis.

The researchers turned to deep learning, using convolutional neural networks (CNNs), a type of machine learning well suited to image recognition, to classify these images. To "teach" the system to recognize individual species, the researchers used a training set of more than 18,000 images of identified zooplankton spanning 43 different classes.
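To give a sense of what such a classifier looks like, here is a minimal sketch of a CNN that maps plankton images to one of 43 classes. The layer sizes, input resolution, and grayscale assumption are illustrative choices, not the authors' published architecture; only the 43-class output comes from the article.

```python
import torch
import torch.nn as nn

class PlankterCNN(nn.Module):
    """Toy CNN classifier; architecture details are assumptions for illustration."""

    def __init__(self, num_classes: int = 43):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # grayscale camera frames
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, num_classes),        # assumes 64x64 inputs
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = PlankterCNN()
# A dummy batch of eight 64x64 single-channel images stands in for real frames.
logits = model(torch.randn(8, 1, 64, 64))
predictions = logits.argmax(dim=1)  # predicted class index per image
```

In practice the network would be trained on the labeled set of ~18,000 identified zooplankton images, after which it can assign each of the 2 million unlabeled frames to a class automatically.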

The result? The system was able to identify plankters within certain groups with an accuracy of 80% to 100%.

“The value and usefulness of automatically classified imagery depends on the questions at hand. Simple information from zooplankton imagery such as abundance and size is easily determined with high confidence,” the researchers said.

But, they say, more work would be needed to delineate certain copepod species or to gain insights beyond classification. Over time, this approach could be used to assess plankton populations and provide data for marine ecosystem management.

