A control architecture for discrete and rhythmic movements

Humans are able to adapt their movements to almost any new situation in a robust, seemingly effortless way. A promising perspective for explaining both this adaptivity and this robustness is the modular approach to movement generation: movements result from combinations of a finite set of stable building blocks, called motor primitives, organized at the spinal level.

In the framework of the RobotCub project, we have developed an architecture for the generation of both discrete and rhythmic movements based on the concept of motor primitives. So far, we have applied this architecture to two tasks: (i) drumming and (ii) crawling and reaching.

Superimposition of discrete and rhythmic movements

Our goal is to develop an architecture that allows for the generation of complex movements, seen as superimpositions of discrete (goal-directed) and rhythmic movements. The architecture is built on the hypothesis that complex movements can be generated through the superimposition and sequencing of simpler motor primitives. This work is part of the RobotCub project.
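
To make this concrete, the following is a minimal sketch, assuming each primitive is modeled as a stable dynamical system: a critically damped point attractor for the discrete (goal-directed) part and a Hopf oscillator for the rhythmic part, with the two outputs summed into a single joint command. The model choices, gains, and parameter values are illustrative assumptions, not the exact equations of our architecture.

```python
import numpy as np

def superimposed_trajectory(T=5.0, dt=0.001):
    """Sum a discrete (point-attractor) primitive and a rhythmic
    (limit-cycle) primitive into one joint trajectory."""
    n = int(T / dt)

    # Discrete primitive: critically damped 2nd-order system toward goal g
    a, g = 25.0, 1.0            # gain and movement goal (assumed values)
    y, dy = 0.0, 0.0

    # Rhythmic primitive: Hopf oscillator, limit cycle of radius sqrt(mu)
    mu, gamma = 0.04, 10.0      # squared amplitude and convergence speed
    omega = 2 * np.pi * 2.0     # oscillation frequency: 2 Hz
    x, z = 0.2, 0.0             # start on the limit cycle

    traj = np.empty(n)
    for i in range(n):
        # Point attractor: converges to g without overshoot
        ddy = a * (a / 4.0 * (g - y) - dy)
        dy += ddy * dt
        y += dy * dt

        # Hopf oscillator: perturbations decay back to the limit cycle
        r2 = x * x + z * z
        x, z = (x + (gamma * (mu - r2) * x - omega * z) * dt,
                z + (gamma * (mu - r2) * z + omega * x) * dt)

        traj[i] = y + x         # superimposed command for the joint
    return traj

trajectory = superimposed_trajectory()
```

Because both subsystems are asymptotically stable, the summed command inherits their robustness: perturbing the state online only produces a smooth transient back to the goal and the limit cycle.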

Application to Drumming

We have developed an architecture that allows for the superimposition of, and the switching between, discrete and rhythmic movements. While this architecture was principally developed for adaptive locomotion, more precisely crawling, we first applied it to drumming, as this task is simpler but still involves many key issues, such as:

  • Superimposition of discrete and rhythmic movements
  • Precise coordination between the limbs
  • Robust online modulation of the parameters

Moreover, closed-loop control can be studied through the interaction between the stick and the drums.
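
As an illustration of the last two points, here is a minimal sketch that couples one phase oscillator per arm through a diffusive coupling term enforcing a desired phase lag; halfway through, the tempo is changed online and the coordination recovers automatically. The coupling law and all values are illustrative assumptions, not the architecture's exact equations.

```python
import numpy as np

def drumming_phases(T=10.0, dt=0.001, lag=np.pi):
    """Two phase oscillators (one per arm) held at a desired phase lag
    (anti-phase by default) by diffusive coupling."""
    n = int(T / dt)
    theta = np.array([0.0, 0.5])   # arms start out of coordination
    k = 5.0                        # coupling strength (assumed value)
    log = np.empty((n, 2))
    for i in range(n):
        # Online modulation: the tempo jumps from 2 Hz to 3 Hz at t = 5 s
        omega = 2 * np.pi * (2.0 if i * dt < 5.0 else 3.0)
        d0 = omega + k * np.sin(theta[1] - theta[0] - lag)
        d1 = omega + k * np.sin(theta[0] - theta[1] + lag)
        theta = theta + np.array([d0, d1]) * dt
        log[i] = theta
    return log

phases = drumming_phases()
# Stick trajectories per arm could then be derived as, e.g., np.cos(phases).
```

The phase difference converges to the desired lag from any initial condition (its error obeys a stable sine dynamics), so tempo changes or perturbations only cause a brief transient before the limbs re-coordinate.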

Application to Crawling (including Reaching)

The goal of this project is to study how babies learn to locomote and interact with their environment. As part of the RobotCub project, we are developing a locomotion controller for the iCub, a humanoid robot that looks like a young child.
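
As a rough sketch of what such a locomotion controller could look like, the code below implements a small central pattern generator: four coupled phase oscillators, one per limb, with diagonal limbs held in phase, a trot-like pattern commonly used to describe infant crawling. The coupling scheme, gains, and phase offsets are illustrative assumptions, not the actual iCub controller.

```python
import numpy as np

# One oscillator per limb; diagonal limbs in phase (trot-like crawl).
# These offsets are an illustrative assumption, not the iCub's values.
LIMBS = ["left_arm", "right_arm", "left_leg", "right_leg"]
OFFSET = np.array([0.0, np.pi, np.pi, 0.0])  # desired phase per limb

def cpg_step(theta, omega=2 * np.pi * 1.0, dt=0.001, k=4.0):
    """One Euler step of an all-to-all coupled phase-oscillator CPG.
    Each limb is pulled toward its desired phase relative to the others."""
    dtheta = np.full(4, omega)
    for i in range(4):
        for j in range(4):
            dtheta[i] += k * np.sin(
                theta[j] - theta[i] - (OFFSET[j] - OFFSET[i]))
    return theta + dtheta * dt

# From random initial phases, the network settles into the crawl pattern;
# joint commands could then be generated as, e.g., amplitude * cos(theta).
theta = np.random.uniform(0.0, 2 * np.pi, size=4)
for _ in range(20000):                       # 20 s of simulated time
    theta = cpg_step(theta)
```

Reaching can then be treated as a discrete primitive superimposed on this rhythmic pattern for one arm, in the same way as in the drumming sketch above.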