

Funding secured for automated image-analysis biodiversity monitoring project

13 June 2024

The Natural Environment Research Council (NERC) has announced funding for a new project proposed by our Dr James Clark and collaborators, to develop software systems to improve biodiversity monitoring by automating the analysis of plankton and sea floor image data. 
Image courtesy of the NERC-funded Deep links project 2016, University of Plymouth, Oxford University, BGS, JNCC.

Plankton may be small, but these microscopic creatures form the base of the marine food web, supporting the majority of marine ecosystems and biodiversity of our ocean.  


The health and biodiversity of plankton and sea floor communities are indicative of the wider health of our ocean. Monitoring what is happening in the world of plankton is critical to understanding the impacts that may be felt down the line by other marine organisms, and by us humans, who rely on the critical goods and services our ocean provides.

As such, our Dr James (Jim) Clark and team have proposed and secured funding for a new project to develop tools that automate the analysis of plankton and sea floor image data for biodiversity monitoring.

The project comes as a natural extension of the existing Automated, in situ Plankton Imaging and Classification System (APICS) project led by Dr Clark, which involves deploying revolutionary technology at our L4 sampling station (about 6 nautical miles south of Plymouth) that automatically gathers plankton data in situ. 

The APICS platform will be installed imminently, and once deployed, will generate thousands of images of plankton from the ocean each day. The images will then be automatically shared and classified back at the laboratory, and this data will be used to track changes in the composition of plankton communities in the English Channel.   


Above: A whale shark swimming among a school of fish. Whale sharks are one of the many animals reliant on plankton as a food source - these filter-feeding fish can process more than 6,000 litres of water an hour through their gills, eating small shrimp, fish and plankton. [Source: WWF]

“Biodiversity loss is one of the most pressing environmental issues of our generation, and if we are to find solutions, we desperately need to improve observations and monitoring. We can only protect what we understand,” said Dr Clark. 

“Until recently, we have been limited in how we can monitor marine biodiversity, with data collection and sampling methods heavily reliant on infrequent (and expensive) ship-based observations. We’ve not been able to gather enough consistent, periodic data to build a real picture of what is happening in our ocean.”

“That is, until now. Significant advances have been made in recent years to develop marine autonomous imaging platforms that collect data for specific organisms - including microscopic plankton - such as the technology we are using within the APICS project. These platforms can generate millions of images that have the potential to revolutionise our understanding of marine biodiversity, and to facilitate a step change in marine biodiversity monitoring, allowing fine-scale spatial and temporal trends to be resolved.”

“However, for this revolution to be realised, high-throughput, high-efficacy classification and analysis tools must first be developed. We have access to more data than it has ever been possible to collect before - enormous datasets that would simply be impossible for humans to interpret manually. As such, we need to develop robust machine learning models to help us process and interpret this data.”
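
As a rough illustration of the kind of automated classifier Dr Clark describes, the sketch below trains a small convolutional network to assign plankton images to taxa. It is not code from the DEAL or APICS projects: the folder layout, backbone, class labels and hyperparameters are all assumptions chosen for brevity.

# Illustrative sketch only: a minimal supervised plankton-image classifier in
# PyTorch. Paths, class labels and hyperparameters are assumptions, not
# details of the DEAL or APICS pipelines.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Plankton images resized to a fixed size; greyscale expanded to 3 channels
# so an off-the-shelf backbone can be used.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Assumed folder layout: plankton_images/train/<taxon_name>/<image>.png
train_data = datasets.ImageFolder("plankton_images/train", transform=preprocess)
train_loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Small backbone with a new classification head, one output per taxon.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):  # a handful of epochs is enough for the sketch
    for images, labels in train_loader:
        optimiser.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimiser.step()

A classifier like this is only as good as the labelled images it is trained on, which is exactly the limitation described in the quotes that follow.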

“We are delighted to have secured funding for our new DEAL project - ‘DEcentrAlised Learning for automated image analysis and biodiversity monitoring’ - which aims to meet these needs, and also to address some of the inefficiencies and challenges currently faced by the many research teams already building and using bespoke automated image classifiers.”

“At present, work on automated image classifiers is highly fragmented, and there is an absence of common standards. Many research teams working on plankton operate in isolation, building bespoke classifiers that are trained against limited image data, often from a single instrument. The problem with this approach is that individual researchers are ultimately limited by the amount of data they have been able to collate and label, which leads to more biased results and inhibits their incorporation into operational biodiversity monitoring platforms. The models are also incapable of detecting new or rare organisms that weren’t present in the original training data, meaning they are unsuitable for studying important processes such as the emergence of invasive species.”

“Meanwhile, centralised services that require users to share their image data with a single custodian, uploading it to a central server before the images are classified, raise concerns over the privacy and ownership of data. The process is also highly inefficient, as it relies on the transfer of potentially large volumes of data and its duplication on multiple servers.”

"DEAL will address many of these challenges by using the HPE Swarm Learning Framework, developed by our project partner Hewitt Packard Enterprise, allowing users to participate in a decentralised collaborative network without a central server. It makes it possible for users to benefit from each other’s data and learnings, and to collaborate in the building of better classification models with lower biases. At the same time, the system preserves data privacy and reduces inefficiencies and carbon costs associated with the transfer and duplication of large volumes of data.” 

“By further partnering with world leaders in plankton and sea floor imaging, we will deliver two operational networks for classifying plankton and sea floor image data. We intend the tool and the initial networks we form to act as catalysts, helping to build communities in which data producers coalesce around a set of shared standards, and cooperate in making marine image data suitable for operational biodiversity monitoring.” 

Dr Simon Gardner, Head of Digital Environment at NERC, said:

“The development of software systems for improved image analysis, both in the laboratory and in the field, will allow researchers to interpret a diversity of biodiversity data of increasing scale and complexity. The DEAL project led by Plymouth Marine Laboratory will take a highly collaborative approach to deliver much-needed automated classification and analysis tools that can handle the high volumes of data associated with plankton and sea floor images.” 

Related information

Automated, in situ Plankton Imaging and Classification System (APICS) 

Dr James Clark – Staff Spotlight 

Western Channel Observatory (WCO) 
