
Robots have learned to recognize objects with a glance and a touch.

It is quite easy for a person to judge the density and relief of an object just by looking at it, and just as easy to say what an object looks like by touching it with eyes closed. Such skills would help robots interact with objects far better, but until now they were out of reach. Researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) have addressed this by equipping a KUKA robotic arm with a GelSight tactile sensor; in this way, the artificial intelligence was able to study the connection between visual and tactile information and combine the two.

The GelSight tactile sensor used here was developed by a group of engineers led by Ted Adelson in 2014. At its core it is an electronic copy of a human fingertip: it uses a camera and a sensitive rubber film to build a three-dimensional map of the surface it touches. The device has already been tested in real conditions more than once; for example, it once helped a robot plug a USB cable into a port.
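For intuition, here is a minimal sketch of the principle behind such a sensor: photographing the deformed gel under lights from known directions lets you recover per-pixel surface normals (photometric stereo) and integrate them into a height map. The light directions, data, and helper names below are illustrative assumptions, not GelSight's actual pipeline.

```python
# Minimal sketch of a GelSight-style readout: a camera films the back of a
# soft gel lit by LEDs at known angles; normals are recovered per pixel and
# integrated into a rough 3-D relief map. All inputs here are synthetic.
import numpy as np

def normals_from_photometric_stereo(images, light_dirs):
    """images: (3, H, W) grayscale shots under three known lights.
    light_dirs: (3, 3) unit light-direction vectors, one per shot."""
    h, w = images.shape[1:]
    intensities = images.reshape(3, -1)           # (3, H*W)
    # Lambertian model: I = L @ n  =>  n = L^{-1} @ I  (up to albedo scale)
    n = np.linalg.solve(light_dirs, intensities)  # (3, H*W)
    n /= np.linalg.norm(n, axis=0, keepdims=True) + 1e-8
    return n.reshape(3, h, w)

def heights_from_normals(normals):
    """Integrate surface gradients (-nx/nz, -ny/nz) into a rough height map
    by cumulative summation (a crude stand-in for proper integration)."""
    nx, ny, nz = normals
    gx, gy = -nx / (nz + 1e-8), -ny / (nz + 1e-8)
    return 0.5 * (np.cumsum(gx, axis=1) + np.cumsum(gy, axis=0))

# Synthetic stand-in for three LED-lit frames of the deformed gel.
rng = np.random.default_rng(0)
imgs = rng.uniform(0.2, 1.0, size=(3, 64, 64))
lights = np.array([[ 0.50,  0.000, 0.866],
                   [-0.25,  0.433, 0.866],
                   [-0.25, -0.433, 0.866]])
height_map = heights_from_normals(normals_from_photometric_stereo(imgs, lights))
print(height_map.shape)  # (64, 64): a relief map of the touched surface
```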


Artificial intelligence has combined the senses of touch and vision.

In the new project, the sensor was mounted on the KUKA robot and combined with artificial intelligence; in this way the robotic arm learned to judge the relief of objects by sight and to recognize their shape blindly. A set of 12,000 videos of 200 objects, such as fabrics, tools, and household items, was used to train the system. The videos were split into frames, and it was on these frames that the robot combined tactile and visual information.
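To illustrate how such paired frames could be used, here is a minimal training sketch in Python (PyTorch). The original work reportedly learns to predict one modality from the other; the simpler contrastive alignment below is a stand-in for that, and all names, shapes, and hyperparameters are illustrative assumptions.

```python
# Sketch: align camera frames and tactile (GelSight) frames cut from the same
# videos in one embedding space, so that matching vision/touch pairs end up
# close together. Encoders, sizes, and data are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

def small_cnn(out_dim=128):
    # Identical tiny encoders for camera frames and tactile frames.
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
        nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(64, out_dim),
    )

vision_enc, touch_enc = small_cnn(), small_cnn()
opt = torch.optim.Adam(
    list(vision_enc.parameters()) + list(touch_enc.parameters()), lr=1e-4)

def contrastive_step(vision_frames, touch_frames, temperature=0.07):
    """vision_frames, touch_frames: (B, 3, H, W), aligned by video timestamp."""
    v = F.normalize(vision_enc(vision_frames), dim=1)
    t = F.normalize(touch_enc(touch_frames), dim=1)
    logits = v @ t.T / temperature   # (B, B); true pairs sit on the diagonal
    labels = torch.arange(len(v))
    loss = (F.cross_entropy(logits, labels)
            + F.cross_entropy(logits.T, labels)) / 2
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Dummy batch standing in for frames cut from the 12,000 training videos.
vision_batch = torch.randn(8, 3, 64, 64)
touch_batch = torch.randn(8, 3, 64, 64)
print(contrastive_step(vision_batch, touch_batch))
```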


At the moment, the robot can only work in a controlled environment, and only with objects known to it in advance. The developers want to expand the system's capabilities by giving the artificial intelligence more data to learn from.

"Looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge. Touching blindly, it can determine the shape of objects from tactile sensations alone. Combining these two senses can expand the capabilities of a robot and reduce the amount of data it may need for tasks involving grasping and manipulating objects," explained CSAIL graduate student Yunzhu Li.
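Continuing the sketch above, the two abilities Li describes can be read as nearest-neighbour lookups in the shared embedding space. The gallery of candidate tactile frames below is hypothetical, and the encoders are reused from the previous sketch.

```python
# Sketch: from a camera frame, retrieve the most plausible tactile frame
# ("imagine the touch"); the reverse direction works symmetrically.
# Reuses vision_enc / touch_enc from the training sketch above.
import torch
import torch.nn.functional as F

@torch.no_grad()
def imagine_touch(vision_frame, touch_gallery):
    """vision_frame: (1, 3, H, W); touch_gallery: (N, 3, H, W) candidates."""
    v = F.normalize(vision_enc(vision_frame), dim=1)
    t = F.normalize(touch_enc(touch_gallery), dim=1)
    best = (v @ t.T).argmax(dim=1)   # index of the closest tactile signal
    return touch_gallery[best]

predicted_touch = imagine_touch(torch.randn(1, 3, 64, 64),
                                torch.randn(16, 3, 64, 64))
print(predicted_touch.shape)  # (1, 3, 64, 64)
```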

Robots are constantly being improved and by now can even work in teams. For example, the VelociRoACH robotic cockroaches developed at the University of California, Berkeley recently learned to help each other get back on their feet. You can read about it and watch the video in our earlier material.

If you are interested in science and technology news, be sure to subscribe to our channel on Yandex.Zen. There you will find materials that have not been published on the site!