DESIGN IDEAS
Even though the robot hand is strong enough to crush the apple, tactile sensors developed at CITEC, connected to intelligent software, modulate its strength for a fine-touch grip that won't damage delicate objects. As part of the Famula research project, Wachsmuth, Ritter and Koester (from left) are developing robot hands that figure out on their own how to grasp and move unfamiliar objects (below)
SELF-LEARNING ROBOT HANDS GRASP REALITY
Researchers have developed a grasp system with robot hands that autonomously familiarises itself with novel objects. The new system works without previously knowing the characteristics of objects, such as pieces of fruit or tools. The new grasp system was developed as part of the Famula research project at Bielefeld University's Cluster of Excellence Cognitive Interaction Technology (CITEC) in Germany. The knowledge gained from this project could contribute to future service robots, for instance, that are able to independently adapt to working in new households. CITEC has invested approximately €1 million in Famula.

With the Famula project, CITEC researchers are conducting basic research that can benefit self-learning robots of the future in both the home and industry. "We want to literally understand how we learn to 'grasp' our environment with our hands. The robot makes it possible for us to test our findings in reality and to rigorously expose the gaps in our understanding. In doing so, we are contributing to the future use of complex, multi-fingered robot hands, which today are still too costly or complex to be used, for instance, in industry," explained neuroinformatics Professor Helge Ritter, who heads the Famula project together with sports scientist and cognitive psychologist Professor Thomas Schack and robotics Privatdozent Sven Wachsmuth.

"Our system learns by trying out and exploring on its own – just as babies approach new objects,"
8 /// Environmental Engineering /// August 2017
says Ritter. "The system learns to recognise such possibilities as characteristics, and constructs a model for interacting and re-identifying the object."

The CITEC researchers are working on a robot with two hands that are based on human hands in terms of both shape and mobility. The robot brain for these hands has to learn how everyday objects like pieces of fruit, dishes, or stuffed animals can be distinguished on the basis of their colour or shape, as well as what matters when attempting to grasp the object.

The interdisciplinary project brings together work in artificial intelligence with research from other disciplines. Schack's research group, for instance, investigated which characteristics study participants perceived to be significant in grasping actions. In one study, test subjects had to compare the similarity of more than 100 objects. "It was surprising that weight hardly plays a role. We humans rely mostly on shape and size when we differentiate objects," he says. In another study, test subjects' eyes were covered and they had to handle cubes of different weight and size. Infrared cameras recorded their hand movements.

Dr Robert Haschke, a colleague of Ritter, stands in front of a large metal cage with both robot arms and a table with various test objects. In his role as a human learning mentor, Haschke helps the system to acquire familiarity with novel objects, telling the robot hands which object on the table they should inspect next. To do this, Haschke points to individual objects, or gives spoken hints, such as in which direction an interesting object for the robot can be found (e.g. "behind, left"). Using colour cameras and depth sensors, two monitors display how the system perceives its surroundings and reacts to instructions from humans.

Wachsmuth and his team of CITEC's Central Labs are not only responsible for the system's language capabilities. "In order to understand which objects they should work with, the robot hands have to be able to interpret not only spoken language, but also gestures," explains Wachsmuth. "And they also have to be able to put themselves in the position of a human to also ask themselves if they have correctly understood."
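The article does not detail how Famula actually resolves a mentor's spoken hints, but the idea can be illustrated with a rough sketch: treat each perceived object on the table as a position plus a coarse colour label, and treat a hint such as "behind, left" as a set of filters over those positions. Every name, coordinate frame, and threshold below is an assumption made for illustration, not part of the CITEC system.

```python
# Illustrative sketch (not the Famula implementation): interpreting a
# spoken directional hint such as "behind, left" as filters over the
# positions of objects the robot perceives on the table.
from dataclasses import dataclass

@dataclass
class TableObject:
    name: str
    colour: str   # coarse colour label from the colour cameras (assumed)
    x: float      # metres right of table centre (assumed frame)
    y: float      # metres away from the robot (assumed frame)

# Hypothetical directional vocabulary: each word constrains one axis.
HINTS = {
    "left":   lambda o: o.x < 0,
    "right":  lambda o: o.x > 0,
    "front":  lambda o: o.y < 0.4,
    "behind": lambda o: o.y >= 0.4,
}

def resolve_hint(objects, hint):
    """Return the objects consistent with every word in the spoken hint."""
    candidates = objects
    for word in (w.strip() for w in hint.split(",")):
        if word in HINTS:
            candidates = [o for o in candidates if HINTS[word](o)]
    return candidates

table = [
    TableObject("apple",  "red",    -0.3, 0.6),
    TableObject("banana", "yellow",  0.2, 0.5),
    TableObject("cup",    "blue",   -0.2, 0.2),
]

# "behind, left" keeps only objects far from the robot and left of centre.
print([o.name for o in resolve_hint(table, "behind, left")])  # → ['apple']
```

In this toy version the hint narrows the candidates to a single object for the hands to inspect next; the real system would additionally fuse gestures (pointing) and ask for clarification when the hint remains ambiguous, as Wachsmuth describes.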
To read more online about Robotics, scan the QR code or visit
http://goo.gl/Xhioau
PICTURES: BIELEFELD UNIVERSITY