
Oh, a new object! I wonder how it works? A robot can learn by experimentation

Aalto University is involved in a research project which aims to make robots try out new things and thus learn how to use them without precise pre-programming.
A robot. Photo: Linda Koskinen / Aalto University

A robotic hand with three fingers grips a ballpoint pen and lifts it up in the air, feeling its weight and movement.

‘Testing objects is a bit like playing’, Professor Ville Kyrki says. ‘While the robot handles new objects, it learns about them.’

When people cook in an unfamiliar kitchen, they do not necessarily know how to use its equipment and appliances. An unfamiliar kettle or microwave oven may work slightly differently from the corresponding device at home. However, most of us have a general understanding of the operating logic of the most common kitchen appliances, and we quickly learn to use unfamiliar devices through experimentation.

In order for robots to become part of everyday life, they also have to be able to handle items that they are not pre-programmed to use. The objective of the Interactive Perception-Action-Learning for Modelling Objects (IPALM) research project is to find out how robots could learn to use new things with the help of a general model. Kyrki's research team is responsible for developing object manipulation skills, such as gripping.

‘Robots are already pretty good at moving around and transporting goods. However, they do not quite know how to deal with the uncertainty involved in manipulating or handling objects. For example, a T-shirt and a pillow change shape when you lift them or poke them with your finger’, Kyrki explains.

Robots do not quite know how to deal with the uncertainty involved in manipulating or handling objects.

Ville Kyrki

Observation through direct contact enables the development of more precise models, which the robot can use when planning how to carry out the tasks assigned to it. Learning about objects by manipulating them is currently one of the capabilities robots still lack before they can be used in households.

‘This three-year project carries out basic research that aims to develop robots’ ability to learn about objects and their properties. It is one of the missing skills a robot needs before it can function in an environment it has not been programmed for, but it is by no means the only one. Robots do not have common sense, and therefore they are incapable of responding to unexpected situations. In other words, there is still a long way to go before robots can provide general domestic help’, says Kyrki.

 

Participants in the Interactive Perception-Action-Learning for Modelling Objects (IPALM) project include Imperial College London (coordinator), Aalto University, the University of Bordeaux, the Polytechnic University of Catalonia and the Czech Technical University.
