New Framework for Bipedal Robots Allows Dynamic Movement on Stepping Stones


The South Hall of the University of California, Berkeley. / Photo by: Falcorian via Wikimedia Commons


Some robots designed for disaster response and search-and-rescue missions have legs so they can access uneven and rugged terrain. However, the control algorithms for such machines are not simple to develop, so scientists at the University of California, Berkeley have applied nonlinear control theory to improve the movement of legged robots.

Bipedal robots are high-degree-of-freedom systems governed by complex nonlinear differential equations, combined with the hybrid dynamics of ground interaction. In other words, these robots move by constantly making and breaking contact with the environment, switching between continuous motion and discrete impact events.
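This alternation between continuous dynamics and discrete contact events is what makes a system "hybrid." A minimal toy sketch of the idea (a falling mass with an impact reset, not the Berkeley group's actual walker models, and with an assumed restitution value) looks like this:

```python
# Minimal hybrid-dynamics sketch: continuous flight dynamics plus a
# discrete impact map, the same make/break-contact structure that
# bipedal walkers exhibit. (Illustrative toy model only.)

G = 9.81           # gravity, m/s^2
RESTITUTION = 0.8  # fraction of velocity kept at impact (assumed value)

def simulate(h0, v0, dt=1e-3, t_end=3.0):
    """Integrate height h under gravity until the guard h <= 0 fires,
    then apply the discrete reset v -> -RESTITUTION * v (the impact map)."""
    h, v, impacts = h0, v0, 0
    t = 0.0
    while t < t_end:
        # continuous phase: ballistic flight
        v -= G * dt
        h += v * dt
        # guard condition: contact with the ground while moving down
        if h <= 0.0 and v < 0.0:
            h = 0.0
            v = -RESTITUTION * v  # discrete reset map
            impacts += 1
        t += dt
    return h, impacts

final_h, n_impacts = simulate(h0=1.0, v0=0.0)
print(f"final height {final_h:.3f} m after {n_impacts} impacts")
```

A walking robot's model has the same skeleton: smooth swing-leg dynamics between footfalls, and a reset map applied at each heel strike.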

In addition, certain bipedal models are underactuated, meaning they lack actuators at some joints, such as the ankles. When these robots walk, they need to keep moving in order to maintain balance, and when stepping on stones raised above the ground, the risk of losing their footing is high.
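Why continual stepping maintains balance can be seen in the textbook Linear Inverted Pendulum (LIP) model: with no ankle torque, the center of mass diverges away from a fixed foot, but placing the next footstep at the so-called capture point absorbs the momentum. The sketch below uses assumed parameters and is a standard classroom model, not the Berkeley framework itself:

```python
# Linear Inverted Pendulum (LIP) sketch of underactuated balance:
# x'' = omega^2 * (x - foot), with no ankle torque available.
import math

G, Z = 9.81, 0.9          # gravity and assumed center-of-mass height (m)
OMEGA = math.sqrt(G / Z)  # LIP natural frequency

def simulate(x, v, foot, dt=1e-3, t=1.0):
    """Forward-Euler integration of the LIP dynamics."""
    for _ in range(int(t / dt)):
        v += OMEGA**2 * (x - foot) * dt
        x += v * dt
    return x, v

# With the foot fixed at the origin and a small push, the CoM runs away:
x_fixed, _ = simulate(x=0.0, v=0.5, foot=0.0)

# Stepping to the "capture point" x + v/omega brings the robot to rest:
cp = 0.0 + 0.5 / OMEGA
x_step, v_step = simulate(x=0.0, v=0.5, foot=cp)

print(f"fixed foot: CoM drifts to {x_fixed:.2f} m")
print(f"capture-point step: CoM settles near {x_step:.2f} m, v ~ {v_step:.3f} m/s")
```

On stepping stones the difficulty is that the capture point may not land on a stone at all, which is why precise footstep placement matters so much.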

At the Hybrid Robotics Group at UC Berkeley, scientists created a framework that bridges nonlinear control theory and the hybrid dynamics of legged locomotion to improve the mobility of legged machines. The framework enables high-degree-of-freedom bipedal robots to place their footsteps more precisely and to withstand external forces more robustly.
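The group's published controllers are optimization-based (for example, quadratic programs built around control Lyapunov functions), but the core idea of nonlinear control can be sketched more simply: cancel the system's nonlinear terms with feedback, then impose stable linear error dynamics so a joint tracks its reference precisely. Below is that idea on a one-link pendulum, with assumed parameters and gains; it is a simplified stand-in, not the group's actual controller:

```python
# Feedback linearization sketch: cancel the nonlinear gravity term,
# then apply PD feedback so the joint angle tracks a reference.
import math

M, L, G = 1.0, 0.5, 9.81  # mass, link length, gravity (assumed values)
KP, KD = 100.0, 20.0      # tracking gains (assumed, critically damped)

def control(theta, dtheta, theta_ref):
    """Torque = gravity cancellation + PD on the tracking error."""
    gravity = M * G * L * math.sin(theta)  # nonlinear term to cancel
    return gravity + M * L**2 * (KP * (theta_ref - theta) - KD * dtheta)

def simulate(theta0, theta_ref, dt=1e-3, t=2.0):
    theta, dtheta = theta0, 0.0
    for _ in range(int(t / dt)):
        u = control(theta, dtheta, theta_ref)
        ddtheta = (u - M * G * L * math.sin(theta)) / (M * L**2)
        dtheta += ddtheta * dt
        theta += dtheta * dt
    return theta

final = simulate(theta0=0.0, theta_ref=0.8)
print(f"joint angle settles at {final:.3f} rad (target 0.8)")
```

The same cancel-then-stabilize principle, scaled up to many joints and combined with the hybrid contact model, is what lets a biped hit a precise footstep target on a stepping stone.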

“Our goal is to design controllers for achieving dynamic, fast, energy-efficient, and robust maneuvers on hybrid and underactuated systems such as legged and aerial robots,” wrote the developers of the Hybrid Robotics Group at UC Berkeley.

The framework has been tested on simulated robot models such as ATRIAS, created by Oregon State University, and DURUS, built by the Georgia Institute of Technology. The robots used in these tests had no prior knowledge of the terrain -- a situation that rescue robots might actually encounter in the field -- but the application of depth segmentation and deep learning allowed the machines to gather some information about their surroundings.

The framework is still being improved and optimized, and the team behind it hopes the work will lead to a fully autonomous system.