I worked on the ECCERobot during my final-year research internship at the AI Lab of the University of Zurich, in 2010.

My internship thesis (in French)

EcceRobot

ecce1
EcceRobot, the anthropomimetic robot.


EcceRobot on the University of Zurich's website.

ecce2

ecce4

ecce3
The motor interface.

The segmentation and tracking system

During my internship, I developed a segmentation and tracking system that estimates the translations and rotations of objects in space. It is based on a two-level multi-agent system: a first, numerous set of agents tracks points in the image using an optic-flow algorithm and filters the detected movements to remove noise. A second group of agents extracts and manages the data produced by the first group, and controls the distribution of first-level agents across the image. The system assumes that a group of image points moving in a synchronized way belongs to the same object. The first-level agents that detect an object are assigned to analyzing that object's movements, under the supervision of a second-level agent. This second-level agent analyzes the behavior of these agents to estimate the object's movements (translation and rotation) along the three dimensions of space (see the internship thesis for details).
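The grouping idea behind the two levels can be sketched as follows. This is only an illustrative toy, not the actual implementation from the thesis: point displacements whose vectors are close are treated as "synchronized" and assigned to the same object, and a second-level step estimates the object's translation as the mean displacement of its points. All names and thresholds here are assumptions.

```python
# Toy sketch of the two-level idea (not the thesis implementation):
# first level yields per-point displacement vectors (e.g. from optic flow);
# points with similar displacements are grouped as one object, and a
# second-level "supervisor" estimates the object's translation.

def group_by_motion(displacements, tol=1.0):
    """Cluster point indices whose 2-D displacements differ by less than
    `tol` in each component (a stand-in for the synchronized-motion test)."""
    groups = []
    for i, d in enumerate(displacements):
        for g in groups:
            ref = displacements[g[0]]
            if abs(d[0] - ref[0]) < tol and abs(d[1] - ref[1]) < tol:
                g.append(i)
                break
        else:
            groups.append([i])
    return groups

def estimate_translation(group, displacements):
    """Second-level step: object translation = mean displacement of its points."""
    dx = sum(displacements[i][0] for i in group) / len(group)
    dy = sum(displacements[i][1] for i in group) / len(group)
    return (dx, dy)

# Two points moving together, one moving differently: two objects detected.
displacements = [(2.0, 0.1), (2.1, 0.0), (-1.0, 3.0)]
groups = group_by_motion(displacements)          # [[0, 1], [2]]
translation = estimate_translation(groups[0], displacements)
```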

tracking
The tracking system. Left: three objects are detected. Yellow ellipses show an estimate of each object's size; colored squares show the surfaces covered by agents. Right: the display device; gimbals visualize the observed movements.

Experiments on the robot ECCERobot

The aim of my internship was to detect which elements were part of the robot's body. To achieve this, we compare the movements detected by the tracking system with the motor commands. A correlation between the movements of an object and a motor command indicates that this object is part of the robot's body. It is even possible to determine which movements are associated with each motor command.
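A minimal sketch of this body-identification test might look like the following. The Pearson correlation, the 0.8 threshold, and the signal names are illustrative assumptions, not values taken from the thesis:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def is_body_part(motor_cmd, movement, threshold=0.8):
    """Mark an object as part of the body when one of its movement
    components correlates strongly with a motor command.
    The 0.8 threshold is an illustrative assumption."""
    return abs(pearson(motor_cmd, movement)) >= threshold

# An object whose movement follows the command is classified as body;
# an uncorrelated object is not.
motor = [0, 1, 0, 1, 0, 1]
own_arm = [0.0, 2.0, 0.0, 2.0, 0.0, 2.0]
passerby = [1, 1, 0, 0, 1, 0]
```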

ecce_vision
The robot equipped with the segmentation and motion-analysis mechanism that I developed during my internship, which allows it to determine which elements are parts of its own body.

ecce_vision
Blue: movements detected by the tracking system for one of the objects, after filtering to remove noise. Green: motor commands. Columns represent the X, Y and Z axes; rows represent position, linear speed and angular speed.

ecce_vision
Correlation between the two tested motor commands and the movement components of the three detected objects. Object 1 (blue) is not part of the robot. Object 2 (green) is strongly correlated with motor command 1 (left arm), and object 3 (red) is strongly correlated with motor command 2 (right arm). It is even possible to determine which component of movement is associated with each motor command.

ecce_tool
The tracking mechanism can integrate body modifications: here, a tool is attached to the robot's hand after being detected and segmented. The mechanism now considers the tool a part of the robot's body.