David Hyunchul Shim, Professor, School of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), South Korea
Email: hcshim@kaist.ac.kr
Website: http://unmanned.kaist.ac.kr/bio.html
Title: Active Object Retrieval System using Cooperative Drone-Hexapod Team in GPS-denied Maritime Environment
Bio: Dr. David Hyunchul Shim received the B.S. and M.S. degrees in mechanical design and production engineering from Seoul National University, Seoul, Korea, in 1991 and 1993, respectively, and the Ph.D. degree in mechanical engineering from the University of California, Berkeley, USA, in 2000. From 1993 to 1994, he was with Hyundai Motor Company, Korea. From 2001 to 2005, he was with Maxtor Corporation, Milpitas, CA, USA, as a Staff Engineer. From 2005 to 2007, he was with the University of California, Berkeley as a Principal Engineer, in charge of the Berkeley Aerobot Team. In 2007, he joined the Department of Aerospace Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, Korea, as an Assistant Professor. He is now a Professor in the School of Electrical Engineering. He served as Director of the KI Robotics Institute at KAIST from 2019 to 2022. His interests center on combining robotics and AI with aerial and ground vehicles. He has received a number of major awards from the Korean government and from global events, including the Hyundai Autonomous Challenges, AlphaPilot, and the Indy Autonomous Challenge.
Abstract: In the MBZIRC 2024 Maritime Challenge, held in Abu Dhabi, participants were tasked with retrieving objects from a freely floating ship deck using drones, all without the aid of GPS. While visual navigation has become a reliable method when visual features are plentiful in the area of operation, the maritime setting offers very limited visual cues. Drones, which require precise state estimation for flight control and navigation, must therefore compute their position by other means. To tackle this challenge, our team developed a relative visual navigation system using markers installed on our USV. However, retrieving an object from a non-stationary ship posed additional difficulties, as both the ship and the object moved dynamically. Compounding the challenge, when a drone approaches the object, the downwash from its rotors can displace the object, complicating retrieval. To overcome this, our team devised a novel solution: a hexapod ground robot nicknamed “Case Hugger” that is winched down from the drone. Once lowered onto the deck while the drone hovers, it autonomously crawls to the target, performing visual recognition on its onboard computer. Once positioned over the object, the robot “hugs” it with its six legs before being winched back up for retrieval. We conducted extensive testing under conditions simulating the maritime environment. While we faced difficulties during the actual competition due to factors such as weather and hardware issues, we are confident that our cooperative robot system has demonstrated significant potential for precision aerial retrieval of ground objects in real-world applications.
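To give a flavor of marker-based relative visual navigation of the kind described in the abstract, the sketch below estimates a camera's pose relative to a fiducial marker (e.g., an ArUco tag) on a deck. It is a minimal illustration, not the team's implementation: the marker size, camera intrinsics, and dictionary choice are assumptions, and it presumes OpenCV >= 4.7 with the contrib aruco module.

```python
# Illustrative sketch (not the team's code): relative pose from a deck-mounted
# fiducial marker, assuming OpenCV >= 4.7 and a calibrated camera.
import cv2
import numpy as np

MARKER_SIZE = 0.5  # assumed marker edge length in meters

# Assumed camera intrinsics from a prior calibration.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion for this sketch

# 3D corner positions of the marker in its own frame (z = 0 plane),
# ordered to match both ArUco corner order and SOLVEPNP_IPPE_SQUARE.
half = MARKER_SIZE / 2.0
marker_corners_3d = np.array([[-half,  half, 0.0],
                              [ half,  half, 0.0],
                              [ half, -half, 0.0],
                              [-half, -half, 0.0]], dtype=np.float32)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

def relative_pose(frame_bgr):
    """Return (R, t): camera pose in the marker frame, or None if no marker is seen."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    # Planar PnP for the first detected marker.
    ok, rvec, tvec = cv2.solvePnP(marker_corners_3d, corners[0].reshape(4, 2),
                                  K, dist, flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    # Inverting the marker-in-camera pose gives the camera pose in the marker
    # (deck) frame, which could feed a state estimator in place of GPS.
    return R.T, -R.T @ tvec
```

In practice, a system like the one described would fuse such measurements with inertial data and handle marker occlusion and deck motion; this snippet only shows the geometric core of computing position relative to a known marker.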