Artificial intelligence keeps opening up new opportunities, a testament to the ingenuity of the engineers working in this field. A recent development by researchers at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory (MIT CSAIL) has introduced a completely new kind of AI to the world.
Its originality lies in the fact that it can identify various objects just by looking at a picture of them, while also predicting the sensations of touching them. Conversely, the AI can name an item simply by feeling it.
According to TNW, the researchers fitted the arm of an existing KUKA robot with a GelSight tactile sensor, designed by Ted Adelson's group at CSAIL. They then set about building up the AI's tactile experience: the robot was given 200 household objects (textiles, tools, everyday items) to feel, and each touch was recorded together with the corresponding visual and tactile readings. This procedure was repeated more than 12,000 times, and from this data a set of 3 million visual-tactile images was derived, which the developers named VisGel.
After this training, the AI was able to reconstruct an object's external outline from tactile data alone, which it gathers by touching the object under study. The reverse process is also within its reach: given an image of an object, it can predict what that object feels like to the touch.
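The core idea behind such cross-modal prediction is a dataset of paired visual and tactile readings: once the pairing is learned, one modality can be used to predict the other. The actual VisGel work uses deep generative networks; the toy sketch below only illustrates the principle with a hypothetical nearest-neighbour lookup over made-up feature vectors (all names and numbers are assumptions, not the researchers' method).

```python
# Illustrative sketch only: predict a tactile reading from a visual
# feature by finding the most similar visual example in paired data.
# The real system uses learned generative models, not a lookup.

def nearest_tactile(visual_query, paired_data):
    """Return the tactile reading paired with the closest visual feature."""
    def sq_dist(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best_pair = min(paired_data, key=lambda pair: sq_dist(pair[0], visual_query))
    return best_pair[1]

# Hypothetical paired dataset: (visual feature, tactile feature) per touch.
visgel_like = [
    ([0.9, 0.1], [1.0, 0.0, 0.2]),  # e.g. a hard metal tool
    ([0.2, 0.8], [0.1, 0.9, 0.7]),  # e.g. a soft textile
]

# A new image close to the "tool" example yields the tool's tactile reading.
print(nearest_tactile([0.85, 0.15], visgel_like))
```

Scaling this idea from a lookup table to a model that generalises to unseen objects is exactly what requires the millions of paired examples the researchers collected.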
CSAIL graduate student Yunzhu Li, who authored the paper describing this advance in artificial intelligence, underlined the importance of the work: "Combining tactile and visual sensations can expand the capabilities of the robot and reduce the amount of data that may be needed to perform tasks related to the manipulation and capture of objects."