Duke University researchers have created an AI system that gives robots the ability to navigate challenging outdoor environments far more effectively than conventional, vision-only approaches. The framework, named WildFusion, was presented at the IEEE International Conference on Robotics and Automation (ICRA 2025) in Atlanta on May 19, 2025.
Unlike traditional robots that rely solely on cameras or LiDAR, WildFusion integrates multiple sensory inputs to build a more comprehensive understanding of the environment. Built on a quadruped robot platform, the system combines visual information with tactile sensors and contact microphones that detect vibrations as the robot walks. These additional senses allow the robot to distinguish between different surfaces, from the crunch of dry leaves to the soft squish of mud, and assess stability in real time.
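To illustrate the general idea of combining vision with touch and sound, the sketch below shows a minimal late-fusion classifier that merges visual, tactile, and contact-microphone features into a single terrain prediction. All module names, feature sizes, and terrain classes here are hypothetical illustrations; they are not taken from the WildFusion paper.

```python
# Minimal multimodal fusion sketch (hypothetical sizes and labels, not the
# actual WildFusion architecture): each modality gets its own small encoder,
# the features are concatenated, and a shared head predicts a terrain class.
import torch
import torch.nn as nn

class TerrainFusionNet(nn.Module):
    def __init__(self, n_terrain_classes: int = 4):
        super().__init__()
        self.vision_enc = nn.Sequential(nn.Linear(512, 128), nn.ReLU())   # pooled camera/LiDAR features
        self.tactile_enc = nn.Sequential(nn.Linear(32, 32), nn.ReLU())    # foot-contact / force features
        self.audio_enc = nn.Sequential(nn.Linear(64, 32), nn.ReLU())      # contact-microphone spectrogram stats
        self.head = nn.Sequential(
            nn.Linear(128 + 32 + 32, 64), nn.ReLU(),
            nn.Linear(64, n_terrain_classes),                              # e.g. leaves, mud, gravel, grass
        )

    def forward(self, vision, tactile, audio):
        fused = torch.cat(
            [self.vision_enc(vision), self.tactile_enc(tactile), self.audio_enc(audio)],
            dim=-1,
        )
        return self.head(fused)

# Example forward pass with random stand-in features.
model = TerrainFusionNet()
logits = model(torch.randn(1, 512), torch.randn(1, 32), torch.randn(1, 64))
print(logits.shape)  # torch.Size([1, 4])
```

The point of the sketch is simply that touch and vibration features enter the decision alongside vision, so the robot can still judge a surface when the camera view alone is ambiguous.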
"WildFusion opens a new chapter in robotic navigation and 3D mapping," said Boyuan Chen, the Dickinson Family Assistant Professor at Duke University. "It helps robots to operate more confidently in unstructured, unpredictable environments like forests, disaster zones and off-road terrain."
At the heart of WildFusion is a deep learning model based on implicit neural representations. This approach models the environment continuously rather than as discrete points, enabling the robot to make smarter decisions even when sensor data is incomplete or ambiguous. The system was successfully tested at Eno River State Park in North Carolina, where it navigated dense forests, grasslands, and gravel paths with remarkable stability.
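For readers unfamiliar with implicit neural representations, the toy example below shows the core idea: a small network maps any continuous 3D coordinate to scene properties, rather than storing a fixed grid of points. The layer sizes and the occupancy/traversability outputs are assumptions chosen for illustration, not details of the WildFusion model.

```python
# Illustrative implicit-representation sketch (hypothetical outputs and sizes):
# the scene is encoded in the network weights, and it can be queried at any
# coordinate, so gaps or noise in the raw sensor data do not leave empty cells.
import torch
import torch.nn as nn

class ImplicitSceneModel(nn.Module):
    def __init__(self, hidden: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),   # [occupancy, traversability] logits
        )

    def forward(self, xyz: torch.Tensor) -> torch.Tensor:
        # xyz: (N, 3) world coordinates; the continuous model interpolates
        # smoothly between observed regions instead of returning "no data".
        return torch.sigmoid(self.mlp(xyz))

# Query the (untrained) model at arbitrary locations around the robot.
model = ImplicitSceneModel()
queries = torch.rand(5, 3) * 2.0 - 1.0   # five random points in a 2 m cube
print(model(queries))                     # (5, 2) occupancy / traversability estimates
```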
On the same day, researchers from Pohang University of Science and Technology (POSTECH) announced a complementary innovation—novel haptic devices designed to enhance safety in industrial settings. Led by Professor Keehoon Kim and Ph.D. candidate Jaehyun Park, the team developed two types of haptic interfaces that allow workers to remotely control robots while receiving realistic tactile feedback.
The POSTECH devices—POstick-KF (Kinesthetic Feedback) and POstick-VF (Visuo-tactile Feedback)—transmit precise force changes that robots experience when manipulating objects, enabling more delicate and accurate remote operations in hazardous industrial environments.
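The basic pattern behind such devices is a bilateral teleoperation loop: forces measured at the robot are scaled and rendered to the operator's hand while the operator's motion is streamed back to the robot. The sketch below is a generic illustration of that loop using hypothetical helper functions; it is not the POstick-KF/VF control scheme.

```python
# Generic force-feedback teleoperation loop (conceptual sketch only).
# read_robot_wrench(), read_operator_pose(), render_force_to_operator(), and
# send_robot_pose_command() are hypothetical stand-ins for device drivers.
import time

FORCE_SCALE = 0.5   # attenuate robot-side forces before rendering to the hand
LOOP_HZ = 500       # haptic loops typically run at hundreds of Hz

def read_robot_wrench():
    """Hypothetical: force measured at the remote robot's gripper (N)."""
    return [0.0, 0.0, 1.2]

def read_operator_pose():
    """Hypothetical: current pose of the operator's haptic handle (m)."""
    return [0.10, 0.02, 0.30]

def render_force_to_operator(force):
    """Hypothetical: command the haptic device to display this force."""
    pass

def send_robot_pose_command(pose):
    """Hypothetical: stream the operator's motion to the remote robot."""
    pass

# Run the loop for one second as a demonstration.
for _ in range(LOOP_HZ):
    wrench = read_robot_wrench()
    render_force_to_operator([FORCE_SCALE * f for f in wrench])  # operator feels contact
    send_robot_pose_command(read_operator_pose())                # robot mirrors the hand
    time.sleep(1.0 / LOOP_HZ)
```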
Both innovations represent significant advances in human-robot interaction. WildFusion was funded by DARPA and the Army Research Laboratory, underscoring the technology's potential applications in both civilian and defense sectors.