Vision-aided Dynamic Quadrupedal Locomotion on Discrete Terrain using Motion Libraries

July 11, 2022


In this paper, we present a control and planning framework that enables quadrupedal robots to traverse challenging terrains with discrete footholds using visual feedback. Discrete terrain is difficult for quadrupeds because the resulting motion can be aperiodic and highly dynamic, and the hind legs step blindly, outside the view of a forward-facing camera. The robot must also reason jointly over feasible footholds and base velocity in order to speed up or slow down across different parts of the terrain. To address these challenges, we build an offline library of periodic gaits, each spanning two trotting steps, and switch between these motion primitives to produce aperiodic motions with varying step lengths on a quadrupedal robot. The motion library provides targets to a geometric model predictive controller, which outputs contact forces at the stance feet. To incorporate visual feedback, we use terrain-mapping tools and a forward-facing depth camera to build a local height map of the terrain around the robot, and extract feasible foothold locations for both the front and hind legs. Our experiments show a small-scale quadruped robot navigating multiple unknown, challenging, discrete terrains in the real world.
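To illustrate the foothold-extraction step, the sketch below checks candidate cells of a local height map for flatness and reachability around a nominal foothold. This is a minimal toy version, not the paper's implementation: the grid, the flatness test (patch height standard deviation), the reach limit, and all parameter names and thresholds are assumptions for illustration.

```python
import numpy as np

def feasible_footholds(height_map, resolution, nominal_foot,
                       reach=0.10, max_patch_std=0.01,
                       max_height_diff=0.10, patch_radius=1):
    """Return grid cells near a nominal foothold that look steppable.

    height_map      : 2D array of terrain heights (m), row-major grid
    resolution      : cell size (m)
    nominal_foot    : (row, col) index of the nominal foothold
    reach           : max distance (m) of a candidate from the nominal foothold
    max_patch_std   : max height std-dev over the local patch to count as flat
    max_height_diff : max height change (m) relative to the nominal foothold
    patch_radius    : half-width (cells) of the patch checked per candidate
    """
    rows, cols = height_map.shape
    r0, c0 = nominal_foot
    candidates = []
    for r in range(rows):
        for c in range(cols):
            # Skip cells outside the leg's reach.
            if resolution * np.hypot(r - r0, c - c0) > reach:
                continue
            # Reject large step-ups or step-downs (e.g. gaps between stones).
            if abs(height_map[r, c] - height_map[r0, c0]) > max_height_diff:
                continue
            # Require the local patch around the cell to be flat.
            patch = height_map[max(0, r - patch_radius): r + patch_radius + 1,
                               max(0, c - patch_radius): c + patch_radius + 1]
            if patch.std() <= max_patch_std:
                candidates.append((r, c))
    return candidates

# Toy terrain: one flat stepping stone surrounded by a deep gap.
hm = np.full((6, 6), -0.5)   # gap height (m)
hm[1:4, 1:4] = 0.0           # flat stone
stones = feasible_footholds(hm, resolution=0.02, nominal_foot=(2, 2))
print(stones)                # only the stone's interior cell survives
```

In practice the feasible set would feed the step-length selection, since the distance between consecutive footholds determines which motion primitive from the library the controller switches to.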



Written by

Akshara Rai

Ayush Agrawal

Koushil Sreenath

Shuxiao Chen




