Wednesday, August 17, 2022

Robot Learns Self-modeling Just Like An Infant

Jiya Saini

We humans learn how our bodies work as infants, and now robots are doing the same.

As any athlete or fashionista knows, our body image is not always accurate, but it is a crucial piece of information that shapes how we interact with the world.

Your brain continually plans ahead while you dress or play ball, so that you can move your body without bumping, stumbling, or falling.

We refine our body models as we grow, and robots are now doing the same. Today, a team from Columbia Engineering announced that it has created a robot that, for the first time, can learn a model of its entire body from scratch, without human assistance.

In a new paper published in Science Robotics, the researchers describe how their robot constructed a kinematic model of itself and then utilized that model to plan motion, achieve objectives, and avoid obstacles in a range of settings.

The robot even detected damage to its own body automatically and then compensated for it.
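The article does not spell out the planning algorithm, but planning with a learned self-model is often done by sampling: propose candidate motor commands, use the self-model to predict where the body will end up, reject predictions that intersect an obstacle, and keep the candidate closest to the goal. A minimal sketch in Python, with a hand-coded two-link arm standing in for the learned model (the geometry and every name here are illustrative, not the paper's):

```python
import numpy as np

def self_model(joint_angles):
    """Stand-in for the learned self-model: forward kinematics of a
    planar 2-link arm (unit link lengths), returning the end-effector
    position. In the study, this mapping is learned by a neural network."""
    a1, a2 = joint_angles
    return np.array([np.cos(a1) + np.cos(a1 + a2),
                     np.sin(a1) + np.sin(a1 + a2)])

def plan(goal, obstacle, radius, n_samples=2000, seed=0):
    """Random-shooting planner: sample joint configurations, discard any
    whose predicted end-effector falls inside the obstacle, and return
    the collision-free sample whose prediction is closest to the goal."""
    rng = np.random.default_rng(seed)
    samples = rng.uniform(-np.pi, np.pi, size=(n_samples, 2))
    best, best_dist = None, np.inf
    for q in samples:
        p = self_model(q)
        if np.linalg.norm(p - obstacle) < radius:
            continue  # predicted position would hit the obstacle
        d = np.linalg.norm(p - goal)
        if d < best_dist:
            best, best_dist = q, d
    return best, best_dist

q, err = plan(goal=np.array([1.5, 0.5]),
              obstacle=np.array([0.0, 1.0]), radius=0.3)
```

The key point the sketch captures is that the robot never consults a hand-written kinematic model: every prediction comes from `self_model`, which in the real system is learned from the robot's own camera observations.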

A robot watches itself like a baby exploring a hall of mirrors

The researchers placed a robotic arm inside a ring of five live video cameras. Through the cameras, the robot watched itself as it moved freely.

The robot squirmed and twisted to discover precisely how its body moved in reaction to various motor inputs, like a baby experiencing itself for the first time in a hall of mirrors.

The robot eventually halted after roughly three hours. Its built-in deep neural network had finished learning how the robot’s movements related to the volume it occupied in its surroundings.
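The article gives no training details, but the loop it describes, issue random motor commands, observe the body through cameras, and fit a model mapping commands to occupied space, can be sketched as follows. For the sketch, a two-link arm stands in for the robot and a linear model over hand-picked trigonometric features stands in for the paper's deep network; both are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(42)

def observe(joint_angles):
    """Stand-in for the camera observations: the 'true' end-effector
    position of a planar 2-link arm. The real robot instead derives
    its occupied volume from five video cameras."""
    a1, a2 = joint_angles
    return np.array([np.cos(a1) + np.cos(a1 + a2),
                     np.sin(a1) + np.sin(a1 + a2)])

def features(q):
    """Hand-picked trigonometric features; the study uses a deep
    network, so this linear-in-features model is only an illustration."""
    a1, a2 = q[..., 0], q[..., 1]
    return np.stack([np.cos(a1), np.sin(a1),
                     np.cos(a1 + a2), np.sin(a1 + a2)], axis=-1)

# "Babbling" phase: random motor commands paired with observations.
commands = rng.uniform(-np.pi, np.pi, size=(500, 2))
observations = np.array([observe(q) for q in commands])

# Fit the self-model to the collected experience by least squares.
W, *_ = np.linalg.lstsq(features(commands), observations, rcond=None)

def self_model(q):
    return features(q) @ W

# After training, the model predicts where the body will be
# for a motor command it has never issued.
test_q = np.array([0.3, -0.7])
err = np.linalg.norm(self_model(test_q) - observe(test_q))
```

The structure mirrors the described experiment: the data come only from the robot's own random movements and self-observation, with no human-supplied labels.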

Hod Lipson, professor of mechanical engineering and director of Columbia’s Creative Machines Lab, where the work was done, said, “We were really curious to see how the robot imagined itself. But you can’t just peek into a neural network, it’s a black box.” 

The self-image finally emerged after the researchers tried numerous visualization techniques.

“It was a sort of gently flickering cloud that appeared to engulf the robot’s three-dimensional body,” added Lipson. “As the robot moved, the flickering cloud gently followed it.” 

The self-model of the robot was accurate to 1% of its workspace.
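The article does not define the accuracy metric, but "accurate to 1% of its workspace" presumably means the average prediction error normalized by the size of the workspace. A toy illustration of that metric with synthetic numbers (the data below are made up for the sketch, not the study's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed vs. predicted end-effector positions (metres);
# in the study, the comparison is between the model's predicted
# occupied volume and the body's actual configuration.
observed = rng.uniform(-2.0, 2.0, size=(100, 2))
predicted = observed + rng.normal(scale=0.01, size=(100, 2))

workspace_diameter = 4.0  # the toy arm reaches points in a 4 m-wide region
errors = np.linalg.norm(predicted - observed, axis=1)
relative_error = errors.mean() / workspace_diameter
```

Normalizing by workspace size makes the number comparable across robots of different scales, which is presumably why the result is quoted as a percentage rather than in centimetres.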

Self-modeling robots will improve autonomous systems

It is important for robots to be able to model themselves without help from engineers for many reasons: it not only saves labor, but also lets the robot keep up with its own wear and tear and detect and compensate for damage.

The authors contend that this capability is crucial since increased independence is required of autonomous systems.

For example, a factory robot could see that something isn’t moving properly and make adjustments or request assistance.

Boyuan Chen, the study’s first author and an assistant professor at Duke University, said, “We humans clearly have a notion of self. Close your eyes and try to imagine how your own body would move if you were to take some action, such as stretch your arms forward or take a step backwards. Somewhere inside our brain, we have a notion of self, a self-model that informs us what volume of our immediate surroundings we occupy, and how that volume changes as we move.”

Robot self-awareness

The project is a component of Lipson’s decades-long search for strategies to give robots a semblance of self-awareness.

“Self-modeling is a primitive form of self-awareness,” he said.

A robot, animal, or human that has a realistic self-model has an evolutionary advantage because it can function better in the real environment and make better decisions.

Researchers are aware of the problems, risks, and debates that come with giving machines more freedom through self-awareness.

Self-awareness on the level displayed in this study may be “trivial compared to that of humans,” as Lipson put it, but any endeavor requires a beginning.

“We have to go slowly and carefully, so we can reap the benefits while minimizing the risks.”

Image Credit: Getty
