What if we could generate complex movements for arbitrary robots with arms and legs interacting with a dynamic environment in real-time? Such a technology would revolutionize the motion capabilities of robots and unlock a wide range of concrete industrial and service applications: robots would be able to react in real-time to any change in the environment or unexpected disturbance during locomotion or manipulation tasks. However, the computation of complex movements for robots with arms and legs in multi-contact scenarios in unstructured environments is not realistically amenable to real-time computation with current computational capabilities and numerical algorithms.

The project Memmo aims to solve this problem by (1) relying on massive off-line caching of pre-computed optimal motions, which are (2) recovered and adapted online to new situations with real-time tractable model predictive control, while (3) all available sensor modalities are exploited for feedback control, going beyond the mere state of the robot for more robust behaviors. Memmo will develop a unified yet tractable approach to motion generation for complex robots with arms and legs.
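The "memory of motion" idea above can be sketched in a few lines: precomputed trajectories are cached offline, indexed by task parameters, and the nearest cached motion is retrieved online to warm-start a predictive controller. The classes and functions below are purely illustrative assumptions, not Memmo's actual software; the `mpc_step` function is a placeholder standing in for a real trajectory-optimization solver.

```python
# Illustrative sketch (assumption: not Memmo's actual API) of caching
# pre-computed motions offline and retrieving them online as warm starts.
import math

class MotionMemory:
    """Offline cache mapping task parameters to pre-computed trajectories."""

    def __init__(self):
        self._entries = []  # list of (task_params, trajectory) pairs

    def store(self, task_params, trajectory):
        self._entries.append((tuple(task_params), trajectory))

    def nearest(self, query):
        # Euclidean nearest-neighbour lookup over task parameters.
        def dist(params):
            return math.sqrt(sum((p - q) ** 2 for p, q in zip(params, query)))
        return min(self._entries, key=lambda e: dist(e[0]))[1]


def mpc_step(warm_start, disturbance):
    # Placeholder for a real-time solver: shift the cached trajectory by the
    # observed disturbance to mimic online adaptation of the warm start.
    return [x + disturbance for x in warm_start]


memory = MotionMemory()
memory.store((0.0, 0.0), [0.0, 0.1, 0.2])   # hypothetical reaching motion
memory.store((1.0, 1.0), [1.0, 1.1, 1.2])   # a different task instance

guess = memory.nearest((0.9, 1.1))           # closest cached motion
adapted = mpc_step(guess, disturbance=0.05)  # adapt it online
```

The point of the sketch is the division of labor: the expensive optimization happens offline when populating the memory, while the online loop only performs a cheap lookup plus a local adaptation, which is what makes real-time operation plausible.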

Memmo is a collaborative project supported by the European Union within the H2020 Programme, under Grant Agreement No. 780684. The project starts in January 2018 for a duration of 4 years and involves LAAS-CNRS (France), IDIAP (Switzerland), Univ. Edinburgh (UK), Max-Planck Institute (Germany), Univ. Oxford (UK), PAL-Robotics (Spain), Wandercraft (France), Airbus (France), Costain (UK) and APAJH (France).

While the project targets advances in the fundamental methods used to control complex robots, it will also lead to the development of 3 industrial demonstrators in the future of aircraft manufacturing, the rehabilitation of paraplegic patients, and the inspection of large engineering structures. For the academic community, Memmo will also release its main results as open-source software packages and datasets of robot trajectories (in simulation) and robot sensor logs (measured by the robot during real experiments).