
Artificial intelligence has learned to predict tactile sensations from sight


Kamal Saini

Researchers at MIT's Computer Science and Artificial Intelligence Laboratory (MIT CSAIL) have taught a neural network to predict the tactile sensations that would arise from touching an object shown in an image, and vice versa. According to The Next Web, the researchers presented their work at the CVPR 2019 conference, which opened in Long Beach, California on June 16.

Despite active progress in machine vision and tactile sensing, modern robots are still limited in how they interact with the outside world. The assumption is that teaching machines to predict tactile sensations from the appearance of objects, or to predict appearance from tactile sensations, will let robots manipulate objects more quickly, safely, and accurately.

To train the neural network, the MIT CSAIL researchers used an industrial Kuka robot whose manipulator was equipped with a GelSight tactile sensor: a camera, multi-colored LED lighting, and a layer of gel.

When the sensor touches an object, the gel layer deforms. The deformations are illuminated by the LEDs and recorded by the camera, and from several deformation frames the system then constructs a three-dimensional model of the part of the object the GelSight sensor was in contact with.
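The principle behind this kind of sensor is photometric stereo: differently colored lights shining from known directions let the system estimate a surface normal at each pixel of the gel, and the normals can then be integrated into a height map. The following toy sketch illustrates the idea; the light directions, function names, and the crude cumulative-sum integration are all illustrative assumptions, not the actual GelSight pipeline.

```python
import numpy as np

# Illustrative light directions for three colored LEDs (made up for this
# sketch; the real sensor's geometry is calibrated, not assumed).
LIGHT_DIRS = np.array([
    [ 0.50,  0.00, 0.87],   # e.g. red LED
    [-0.25,  0.43, 0.87],   # e.g. green LED
    [-0.25, -0.43, 0.87],   # e.g. blue LED
])

def normals_from_intensities(frames):
    """frames: (3, H, W) intensities, one image per light direction.
    Solves I = L @ n per pixel under a Lambertian assumption."""
    h, w = frames.shape[1:]
    intensities = frames.reshape(3, -1)          # (3, H*W)
    n = np.linalg.solve(LIGHT_DIRS, intensities)  # unnormalized normals
    n /= np.linalg.norm(n, axis=0, keepdims=True) + 1e-8
    return n.reshape(3, h, w)

def height_from_normals(normals):
    """Integrate the gradients p = -nx/nz, q = -ny/nz with cumulative
    sums -- a crude stand-in for a proper surface-integration method."""
    nx, ny, nz = normals
    p = -nx / (nz + 1e-8)
    q = -ny / (nz + 1e-8)
    return np.cumsum(p, axis=1) + np.cumsum(q, axis=0)

# Sanity check on a synthetic flat gel: equal intensities under all three
# lights should yield normals pointing straight up and a flat height map.
flat = np.ones((3, 4, 4)) * LIGHT_DIRS[:, 2].reshape(3, 1, 1)
normals = normals_from_intensities(flat)
print(np.allclose(normals[2], 1.0))
```

On real data each touch produces several such frames, which is why the article notes the 3D model is built from a sequence of deformation frames rather than a single image.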

During training, the Kuka robot touched various objects while collecting tactile data from each touch; a video recording of the touches was made at the same time. In total, the robot handled 200 different household items, making about 12,000 touches.

From these recordings, the researchers compiled a database of 3 million video frames paired with the corresponding tactile data. They named the database VisGel and later used it to train the neural network.
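A dataset like this is essentially a collection of synchronized (visual frame, tactile frame) pairs. The sketch below shows one plausible way such pairs could be organized for training; the class and function names are hypothetical, not the actual VisGel format.

```python
from dataclasses import dataclass

@dataclass
class TouchRecording:
    """One touch: a clip from the external camera and a clip from the
    GelSight sensor, assumed to be recorded in sync."""
    object_id: str
    visual_frames: list   # e.g. image paths from the video camera
    tactile_frames: list  # e.g. image paths from the tactile sensor

def make_pairs(recordings):
    """Flatten recordings into aligned (visual, tactile) frame pairs,
    the kind of supervision a cross-modal prediction model trains on."""
    pairs = []
    for rec in recordings:
        for v, t in zip(rec.visual_frames, rec.tactile_frames):
            pairs.append((v, t))
    return pairs

# Illustrative arithmetic: 12,000 touches at roughly 250 frames each
# would account for the ~3 million paired frames the article mentions.
demo = [TouchRecording("mug",
                       [f"v{i}.png" for i in range(4)],
                       [f"t{i}.png" for i in range(4)])]
print(len(make_pairs(demo)))  # 4
```

With pairs in hand, one network can be trained to map the visual frame to the tactile frame (predicting what a touch would feel like) and another to map in the opposite direction.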

In March this year, American engineers presented an algorithm that lets a robotic manipulator recognize unfamiliar objects, identify their main components, and interact with them. For example, the robot can recognize a mug, locate its handle, and hang the mug on a drying rack.

Source: The Next Web




