August 21, 2022
Canada/Eastern timezone

Invited Speakers

Enabling Ecologically Curious Robots for Monitoring Coral Reefs

Yogesh A. Girdhar | Woods Hole Oceanographic Institution (WHOI)

Abstract:

Coral reef ecosystems host some of the highest diversity of life per unit area on Earth and harbor about one quarter to one third of all marine animals. Yet coral reefs worldwide are under tremendous pressure. We have lost more than a quarter of the world's coral reefs in the last four decades, and coral reef loss continues at an accelerating rate of 1-2 percent per year. There is a need to monitor these reefs, evaluate changes in their biodiversity, and detect the spread of disease early so that rapid interventions can prevent further loss. In this talk I will present ongoing efforts in my lab to scale up these monitoring efforts through the use of computer vision, AI, and robots.

Bio:

Yogesh Girdhar is a computer scientist, roboticist, and the PI of the WARPLab (http://warp.whoi.edu) at Woods Hole Oceanographic Institution (WHOI), in the Applied Ocean Physics & Engineering department. He received his BS and MS from Rensselaer Polytechnic Institute in Troy, NY, and his Ph.D. from McGill University in Montreal, Canada. During his Ph.D., Girdhar developed an interest in ocean exploration using autonomous underwater vehicles, which motivated him to come to WHOI, initially as a postdoc and later as a scientist, founding WARPLab. Girdhar's research has since focused on developing smarter autonomous exploration robots that, through the use of AI, can accelerate the scientific discovery process in the oceans. Notable recognition of his work includes the Best Paper Award in Service Robotics at ICRA 2020, a Best Paper Award finalist at IROS 2018, and an honorable mention for the 2014 CIPPRS Doctoral Dissertation Award.

Machine Learning and AI for Underwater Imagery Analysis: a benthic ecologist's perspective

Filippo Ferrario | Fisheries and Oceans Canada

Abstract:

The advent of autonomous underwater vehicles, self-propelled panoramic cameras, and simple diver-operated cameras is flooding marine benthic ecologists with data-rich imagery of the benthoscape. At the same time, machine learning and computer vision techniques offering solutions for image processing and data extraction are being developed at a fast pace. Although the software toolbox for underwater imagery analysis is growing and becoming increasingly promising, ecologists not trained in computer science often struggle to readily implement these tools in their data processing workflows. Here, I will present examples of how underwater imagery is being used in benthic ecology studies, and discuss data extraction needs as well as the hurdles and potential of AI from the perspective of a field ecologist.

Bio:

Filippo Ferrario is a marine benthic ecologist at Fisheries and Oceans Canada's Maurice Lamontagne Institute. He received his BS and MS in biology from the University of Milano-Bicocca, Italy, and his Ph.D. in ecology and biodiversity from the University of Bologna, Italy. As a scientific diver and field ecologist, he has used underwater imagery as a regular sampling method. During his Ph.D. and later as a postdoc at Université Laval (QC), his research focused on understanding how benthic species interact and use their habitat. He has since developed expertise in habitat mapping through Structure-from-Motion photogrammetry to explicitly include the spatial dimension in his studies, moving his research toward seascape and movement ecology in subtidal coastal ecosystems.