Generation of a motion dataset with HPP-LOCO
A long-standing challenge in legged robotics (humanoid, quadrupedal, …) is to devise a generic method that can automatically synthesize motions for arbitrary robots in arbitrary environments. Solving this problem is a prerequisite for a long-term objective: deploying autonomous legged robots outside their research laboratories, able to navigate safely through unknown environments.
In the MEMMO project, funded under Horizon 2020 grant agreement n°780684, a group of researchers is seeking to develop a unified yet tractable approach to complex motion generation that does not rely on domain-specific knowledge. The idea is to combine the strengths of control theory, in particular model predictive control (MPC), with those of machine learning. MPC can produce complex behaviours from intuitive cost functions, but its computational load remains too high for real-time control (i.e. on the order of milliseconds). Machine learning, on the other hand, has been used to build motor policies offline from purely data-driven methods, but it is hard to guarantee the safety and performance of the resulting behaviour. Within the MEMMO consortium, led by Nicolas Mansard from LAAS-CNRS, we are combining these two powerful methods, which we believe will lead to a significant step-change in the way robots are controlled.
The first step of any machine learning project is to gather a dataset large enough to train the method. In the case of MEMMO, this dataset should contain valid and optimal motions for various types of robots across a wide selection of environments and scenarios. As no such dataset was readily available, one of the first actions of the MEMMO project, carried out by Pierre Fernbach, Mathieu Geisert and Guilhem Saurel, was to build this dataset of motions.
A framework named HPP-LOCO has been built for this purpose, building on the work of Justin Carpentier, Steve Tonneau and Pierre Fernbach to provide a complete locomotion planning pipeline. HPP-LOCO can automatically generate valid motions for legged robots and is not restricted to walking: it can solve scenarios such as climbing stairs while using a handrail or walking on uneven ground.
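To give a rough idea of how such a contact-based locomotion planner can be organised, the Python sketch below chains a contact planning stage, a centroidal trajectory stage and a whole-body motion stage. All class and function names are illustrative placeholders, not the actual HPP-LOCO API, and the three stages are reduced to trivial stubs.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Contact:
    effector: str                           # e.g. "left_foot" or "right_gripper"
    placement: Tuple[float, float, float]   # contact position in the world frame

@dataclass
class MotionPlan:
    contacts: List[Contact] = field(default_factory=list)  # planned contact sequence
    centroidal: list = field(default_factory=list)         # centre-of-mass trajectory
    whole_body: list = field(default_factory=list)         # joint-space trajectory

def plan_contact_sequence(start, goal) -> List[Contact]:
    # Placeholder: a real contact planner searches for feasible footholds/handholds.
    return [Contact("left_foot", start), Contact("right_foot", goal)]

def optimize_centroidal(contacts) -> list:
    # Placeholder: a real solver optimizes centre-of-mass dynamics over the contacts.
    return [c.placement for c in contacts]

def generate_whole_body(contacts, centroidal) -> list:
    # Placeholder: a real generator computes joint trajectories tracking the plan.
    return list(centroidal)

def plan_locomotion(start, goal) -> MotionPlan:
    # Chain the three stages into a single motion plan.
    contacts = plan_contact_sequence(start, goal)
    centroidal = optimize_centroidal(contacts)
    return MotionPlan(contacts, centroidal, generate_whole_body(contacts, centroidal))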
Using this framework, a first version of the dataset has already been built. It contains motions for the humanoid robot TALOS (from PAL Robotics) in a handful of scenarios, mostly walking on flat ground but also going up or down a set of stairs. For each scenario, the motions produced by HPP-LOCO between thousands of randomly chosen initial and final positions of the robot have been recorded. This dataset will then be used to train the machine learning methods developed within the MEMMO project.
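To make the generation procedure concrete, the sketch below shows how such a collection of motions could be assembled: random initial and final poses are drawn, a planner is queried for each pair, and successful plans are recorded. The sampling ranges, record format and file name are assumptions for illustration only and do not describe the actual MEMMO dataset format.

import pickle
import random

def sample_pose(x_range=(-2.0, 2.0), y_range=(-2.0, 2.0)):
    # Draw a random planar position and heading for the robot root.
    return (random.uniform(*x_range),
            random.uniform(*y_range),
            random.uniform(-3.14159, 3.14159))  # yaw angle

def build_dataset(n_samples, plan_fn, out_path="talos_motions.pkl"):
    # Record (start, goal, motion) triples for randomly drawn start/goal pairs,
    # keeping only the queries for which the planner returns a motion.
    records = []
    for _ in range(n_samples):
        start, goal = sample_pose(), sample_pose()
        motion = plan_fn(start, goal)
        if motion is not None:
            records.append({"start": start, "goal": goal, "motion": motion})
    with open(out_path, "wb") as f:
        pickle.dump(records, f)
    return len(records)

# Example: generate a small batch using the plan_locomotion() sketch above.
# n_kept = build_dataset(1000, plan_locomotion)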
Future work aims to make the HPP-LOCO framework more robust and able to handle more challenging scenarios. A larger version of the dataset will then be built, including a quadruped robot (ANYmal from ANYbotics) and more diverse scenarios.