“Ensuring Robust Visual Perception for Underwater Robots in Adverse Sensing Conditions”
Friday, April 16 at 1:00pm
Email email@example.com for Zoom info
Visually guided underwater robots are deployed alongside human divers for cooperative exploration, inspection, and monitoring tasks in numerous shallow-water and coastal-water applications. The most essential capabilities of such robots are to visually interpret their surroundings, record or collect interesting samples, and often assist divers during an underwater mission. Despite recent technological advancements, existing systems and solutions for real-time visual perception are greatly affected by marine artifacts such as poor visibility, lighting variations, and the scarcity of salient features. These difficulties are exacerbated by a host of non-linear image distortions caused by the characteristics of underwater light propagation. In this talk, I will delineate my research efforts to address these challenges by designing novel and improved vision-based solutions. I will further provide a broad overview of how the proposed perception solutions enable underwater robots to ‘see better’ in noisy conditions and ‘do better’ under limited on-board computational resources and real-time constraints. I will also highlight how these solutions connect to several multidisciplinary use cases and exciting new research directions.
Md Jahidul Islam is a Ph.D. candidate (ABD) in the Department of Computer Science and Engineering at the University of Minnesota, Twin Cities, advised by Junaed Sattar. Jahidul’s research focuses on solving challenging open problems in the domains of robot perception, machine vision, and underwater robotics. In particular, he is interested in the design and development of robust perception modules that enable underwater robots to accurately interpret their surroundings in real time. His proposed methodologies have been deployed on real robots (e.g., the Aqua AUV and LoCO AUV) and validated through field experiments for important applications such as subsea inspection, environmental monitoring, and autonomous exploration. The novel algorithms and technological solutions behind these methods have been published in premier robotics conferences (e.g., RSS, ICRA, IROS) and journals (e.g., IJRR, JFR, RA-L). His work has also been recognized by the University of Minnesota with its prestigious Doctoral Dissertation Fellowship (DDF) award for the 2019–20 academic year. More information about his research and academic milestones can be found here.