
Automating workflows for processing and classification of drone imagery to monitor grassland conservation easements

The University of Idaho’s Drone Lab has worked with the Montana Chapter of The Nature Conservancy to develop and refine workflows for automated processing of drone imagery into indicators of ecosystem health.

Monitoring of site conditions is an important part of adaptive management (Walters and Holling 1990, Allan 2009) and a critical step in the administration of conservation easements (Merenlender et al. 2004). Beyond basic verification of compliance with easement terms, however, monitoring of the ecological benefits of conservation easements and the health of lands under easement is often difficult to sustain, unevenly applied, and inconsistent (Rissman et al. 2007, Rissman and Sayre 2012). The time required to collect monitoring data in the field remains an impediment to broader implementation of monitoring programs (Pyke et al. 2003, Booth and Cox 2011). This has led to the development of image-based techniques for collecting vegetation monitoring data (e.g., Booth and Cox 2008, Karl et al. 2014, Gillan et al. 2020).

While the adoption of image-based monitoring techniques can greatly decrease the time and effort required to collect data in the field, it shifts time and workload to the office, where monitoring indicators must be extracted from the images. In addition, inconsistencies in how image products are created or analyzed can produce spurious differences in the monitoring indicators. For image-based monitoring techniques to be widely used in monitoring programs, therefore, consistent, repeatable, and easy-to-use methods for processing the data need to be developed.
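One way to meet this need is to encode the processing steps in scripts with fixed, documented parameters, so that the same inputs always yield the same indicator values. The sketch below is only an illustration of that idea, not the project's actual workflow: it assumes an RGB orthomosaic GeoTIFF read with rasterio and uses the Excess Green index with an arbitrary fixed threshold as a stand-in vegetation-cover indicator; the file name, index, and threshold are all hypothetical.

```python
# Minimal sketch of a scripted, repeatable indicator-extraction step.
# The index, threshold, and file name are illustrative assumptions only.
import numpy as np
import rasterio

EXG_THRESHOLD = 0.10  # fixed, documented parameter so repeated runs are comparable


def fractional_green_cover(orthomosaic_path: str) -> float:
    """Return the fraction of valid pixels classified as green vegetation."""
    with rasterio.open(orthomosaic_path) as src:
        red, green, blue = (src.read(i).astype("float32") for i in (1, 2, 3))
    total = red + green + blue
    total[total == 0] = np.nan  # treat all-zero (nodata) pixels as invalid
    # Excess Green index on chromatic coordinates: ExG = 2g - r - b
    exg = (2 * green - red - blue) / total
    valid = np.isfinite(exg)
    return float(np.mean(exg[valid] > EXG_THRESHOLD))


if __name__ == "__main__":
    # "plot_orthomosaic.tif" is a hypothetical file name
    cover = fractional_green_cover("plot_orthomosaic.tif")
    print(f"Fractional green cover: {cover:.3f}")
```

Because the index and threshold are defined once as named parameters, any change to them is explicit and applies uniformly across sites and dates, which is what keeps indicator values comparable from one monitoring event to the next.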

The Montana Chapter of The Nature Conservancy (MT-TNC) has made substantial investments in unpiloted aerial vehicle (UAV, i.e., drone) technology as a core part of its grassland monitoring programs. The University of Idaho’s Drone Lab has worked with MT-TNC to develop and refine workflows for automated processing of drone imagery into indicators of ecosystem health. MT-TNC and the UI Drone Lab will continue to collaborate on technologies for using drones more effectively in conservation monitoring, developing a new drone imaging workflow and new image classification models. The goal of this collaboration is to develop techniques that meet MT-TNC’s monitoring needs but are also transferable to other locations, ecosystems, and monitoring situations.
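As a rough illustration of what an image classification model in such a workflow might look like (the cover classes, features, and choice of a random forest here are assumptions for the sketch, not the models being developed by MT-TNC and the UI Drone Lab), a supervised classifier can be trained on labeled pixel spectra and then applied to every pixel of an orthomosaic:

```python
# Minimal sketch of a supervised pixel classifier for a multiband orthomosaic.
# The classes, model choice, and training data are hypothetical.
import numpy as np
import rasterio
from sklearn.ensemble import RandomForestClassifier

# Hypothetical cover classes; labels passed to train_classifier would use these values
CLASSES = ("grass", "shrub", "bare_ground")


def train_classifier(samples: np.ndarray, labels: np.ndarray) -> RandomForestClassifier:
    """Fit a random forest on (n_samples, n_bands) pixel spectra and class labels."""
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(samples, labels)
    return model


def classify_orthomosaic(model: RandomForestClassifier, path: str) -> np.ndarray:
    """Predict a cover class for every pixel of a multiband orthomosaic."""
    with rasterio.open(path) as src:
        bands = src.read().astype("float32")  # shape: (n_bands, rows, cols)
    n_bands, rows, cols = bands.shape
    pixels = bands.reshape(n_bands, -1).T     # shape: (rows * cols, n_bands)
    return model.predict(pixels).reshape(rows, cols)
```

Per-class pixel counts from the resulting map could then be summarized into cover indicators for a site, and reapplying the same trained model across sites or monitoring dates is one way to keep the classification step consistent.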