
Vincent Le Falher

Independent research visitor - Université de Montréal
Research topics
Active learning
Applied machine learning
Deep learning
Biodiversity
Full-stack development
Applied AI
Satellite remote sensing
Computer vision

Publications

Seeing the forest and the trees: a workflow for automatic acquisition of ultra-high resolution drone photos of tropical forest canopies to support botanical and ecological studies
Guillaume Tougas
Helene C. Muller-Landau
Gonzalo Rivas-Torres
Thomas R. Walla
Mélvin Hernandez
Adrian Buenaño
Anna Weber
Jeffrey Q. Chambers
Jomber Chota Inuma
Fernando Araúz
Jorge Valdes
Andrés Hernández
David Brassfield
P. Sérgio
Vicente Vasquez
Adriana Simonetti
Daniel Magnabosco Marra
Caroline de Moura Vasconcelos
Jarol Fernando Vaca
Geovanny Rivadeneyra
José Illanes
Luis A. Salagaje-Muela
Jefferson Gualinga
Tropical forest canopies contain many tree and liana species, and foliar and reproductive characteristics useful for taxonomic identification are often difficult to see from the forest floor. As such, taxonomic identification often becomes a bottleneck in tropical forest inventories. Here we present a drone-based workflow to automatically acquire large volumes of close-up, ultra-high resolution photos of selected tree crowns (or specific locations over the canopy) to support tropical botanical and ecological studies (https://youtu.be/80goMEifpc4).

Our workflow is built around the small, easy-to-use DJI Mavic 3 Enterprise (M3E) drone, which is equipped with a wide-angle and a telephoto camera. On day one, the pilot maps a forest area of up to ∼200 ha with the wide-angle camera to generate a high-resolution digital surface model (DSM) and orthomosaic using structure-from-motion (SfM) photogrammetry. On subsequent days, the pilot acquires close-up photos with the telephoto camera from up to 300 selected canopy trees per day. These close-up photos are acquired from 6 m above the canopy and contain a high level of visual detail that allows botanists to reliably identify many tree and liana species. The photos are geolocated with survey-grade accuracy using RTK GNSS, thus facilitating spatial co-registration with other data sources, including the photogrammetry products.

The primary operational challenge of our workflow is the need to maintain RTK corrections with the drone to ensure that close-up photos are acquired exactly at the predefined locations. The maximum operational range we achieved was 3 km, which would allow the pilot to reach any tree within a ∼2800 ha area from the take-off point. Although our workflow was developed to support taxonomic identification of tropical trees and lianas, it could be extended to any other forest or vegetation type to support botanical, phenological, and ecological studies.
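The waypoint-planning step of the workflow (close-up photos acquired 6 m above the canopy at predefined locations) can be sketched as follows. This is a minimal illustration, not the harpia API: `plan_waypoint`, the toy DSM grid, and all coordinate values are hypothetical, assuming the DSM is a georeferenced elevation grid sampled at the target crown's map coordinates.

```python
import math

# Toy digital surface model (DSM): canopy-top elevation in metres on a
# small grid with a known origin and cell size (hypothetical values).
DSM = [
    [30.0, 32.5, 31.0],
    [29.5, 35.0, 33.0],
    [28.0, 34.5, 36.5],
]
ORIGIN_X, ORIGIN_Y = 500_000.0, 9_950_000.0  # grid origin (UTM metres)
CELL = 10.0                                   # cell size in metres
CLEARANCE = 6.0                               # photo altitude above canopy

def plan_waypoint(x, y):
    """Return (x, y, altitude) for a close-up photo 6 m above the canopy
    at map coordinates (x, y), sampling the DSM cell that contains them."""
    col = int((x - ORIGIN_X) // CELL)
    row = int((y - ORIGIN_Y) // CELL)
    return x, y, DSM[row][col] + CLEARANCE

# Waypoint over a crown near the grid centre: canopy at 35.0 m, so the
# photo is taken at 41.0 m.
print(plan_waypoint(500_015.0, 9_950_012.0))

# Sanity check on the stated coverage: a 3 km radius from the take-off
# point sweeps pi * 3000^2 m^2, i.e. roughly the ~2800 ha quoted above.
print(round(math.pi * 3_000**2 / 10_000))  # → 2827
```

In the real workflow the DSM comes from the day-one SfM photogrammetry flight, and the RTK GNSS corrections mentioned above are what make flying to such waypoints with survey-grade accuracy possible.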
We provide harpia, an open-source Python library to program these automatic close-up photo missions with the M3E drone (https://github.com/traitlab/harpia). Drone imagery and labelled close-up photo data are not yet publicly available because they were acquired with the goal of publishing benchmark machine-learning datasets and models for tree and liana species classification, and prior publication of the data would jeopardize this future publication.