
Robot 'Senses' Like Humans with Groundbreaking WildFusion Technology

Duke University researchers have developed WildFusion, an innovative framework that enables robots to perceive complex environments through multiple senses including vision, touch, and vibration. This technology allows quadruped robots to navigate challenging terrains like forests and disaster zones with human-like perception capabilities. The system processes sensory data through specialized encoders and a deep learning model, creating a continuous representation of the environment even when sensor data is incomplete.
Robots have traditionally relied solely on visual information to navigate their surroundings, severely limiting their effectiveness in complex, unpredictable environments. Now, researchers from Duke University have created a revolutionary framework called WildFusion that fundamentally changes how robots perceive and interact with the world around them.

WildFusion equips a four-legged robot with multiple sensory capabilities that mimic human perception. Beyond standard visual inputs from cameras and LiDAR, the system incorporates contact microphones that detect vibrations from each step, tactile sensors that measure applied force, and inertial sensors that track the robot's stability as it moves across uneven terrain.
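The fusion of these sensing streams can be pictured as a late-fusion pipeline: each modality is passed through its own encoder, and the resulting feature vectors are combined into one representation. The sketch below is purely illustrative (the encoders, dimensions, and random projections are placeholders, not Duke's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(raw, out_dim=8):
    """Stand-in for a learned per-modality encoder: project a raw
    sensor reading onto a fixed-size feature vector. WildFusion's real
    encoders are trained deep networks; a random projection only
    illustrates the shape of the computation."""
    w = rng.standard_normal((out_dim, raw.size))
    return w @ raw.ravel()

# Toy readings for each modality mentioned in the article.
camera   = rng.standard_normal((4, 4))   # visual input
lidar    = rng.standard_normal(16)       # depth points
audio    = rng.standard_normal(32)       # contact-microphone vibrations
tactile  = rng.standard_normal(6)        # foot-force measurements
inertial = rng.standard_normal(6)        # IMU stability readings

# Late fusion: concatenate per-modality features into one state vector
# that a downstream model can consume.
fused = np.concatenate(
    [encode(m) for m in (camera, lidar, audio, tactile, inertial)]
)
print(fused.shape)  # (40,)
```

One design point this illustrates: because each modality has its own encoder, a noisy or missing stream degrades only its slice of the fused vector rather than corrupting the whole representation.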

"WildFusion opens a new chapter in robotic navigation and 3D mapping," explains Boyuan Chen, Assistant Professor at Duke University. "It helps robots to operate more confidently in unstructured, unpredictable environments like forests, disaster zones and off-road terrain."

At the heart of WildFusion is a sophisticated deep learning model based on implicit neural representations. Unlike traditional methods that treat environments as collections of discrete points, this approach models surfaces continuously, allowing the robot to make intuitive decisions even when visual data is blocked or ambiguous. The system effectively "fills in the blanks" when sensor data is incomplete, much like humans do.
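An implicit neural representation can be sketched as a small network that maps any continuous 3D coordinate to a predicted value, so the environment is a queryable function rather than a point cloud. The toy example below uses random, untrained weights purely to show the idea; WildFusion's actual network architecture and training are not described here:

```python
import numpy as np

# A minimal implicit-field sketch: a tiny MLP maps an arbitrary
# (x, y, z) coordinate to an occupancy-like value in [0, 1].
# Weights are random placeholders; in a real system they would be
# trained on the fused multi-sensor data.
rng = np.random.default_rng(42)
W1, b1 = rng.standard_normal((32, 3)), rng.standard_normal(32)
W2, b2 = rng.standard_normal((1, 32)), rng.standard_normal(1)

def occupancy(point):
    """Query the continuous field at an arbitrary 3D location."""
    h = np.tanh(W1 @ np.asarray(point, dtype=float) + b1)  # hidden layer
    return float(1 / (1 + np.exp(-(W2 @ h + b2)[0])))      # sigmoid -> [0, 1]

# Because the field is continuous, a point never directly observed by
# any sensor still yields a prediction -- the "fill in the blanks"
# behavior described in the article.
p = occupancy([0.3, -1.2, 0.05])
print(0.0 <= p <= 1.0)  # True
```

The contrast with discrete-point methods is that there is no lookup table to miss: every coordinate in space has a defined value, which is what lets the robot reason about occluded or unsensed regions.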

The technology has been successfully tested at North Carolina's Eno River State Park, where the robot confidently navigated dense forests, grasslands, and gravel paths. According to lead student author Yanbaihui Liu, "These real-world tests proved WildFusion's remarkable ability to accurately predict traversability, significantly improving the robot's decision-making on safe paths through challenging terrain."

The research team has also developed a simulation method that allows them to test robot capabilities without direct human involvement in early development phases, making the research process faster and more scalable. This approach represents a significant advancement in robotics testing methodology.

With its modular design, WildFusion has vast potential applications beyond forest trails, including disaster response, inspection of remote infrastructure, and autonomous exploration. The technology, supported by DARPA and the Army Research Laboratory, will be presented at the IEEE International Conference on Robotics and Automation (ICRA 2025) in Atlanta this May.
