Human Toddler Studies Bolster Robot Learning Advancements


It is a thrilling moment for robot learning. Organizations have spent decades amassing complex datasets and developing novel techniques for teaching systems to perform new tasks, and we appear to be on the threshold of significant advances in deploying technology that can adapt and learn on the fly.

We have seen a large number of intriguing studies over the past year. Consider the Vision-Robotics Bridge (VRB) that Carnegie Mellon University presented in June. Because the system can adapt what it learns from YouTube videos to a variety of environments, a programmer does not need to account for every conceivable variation.

Last month, the robotics team at Google DeepMind showed its own remarkable work in the form of RT-2 (Robotic Transformer 2). The system can abstract away task-specific details: in the example provided, a programmer does not have to explicitly teach a robot to identify a piece of garbage, pick it up, and throw it away for the robot to perform that apparently straightforward (for humans, at least) task.

This week, CMU highlighted additional research that compares its work to the earliest stages of human learning; specifically, the AI-driven robot is compared to a three-year-old child. In this context, learning falls into two categories: active and passive.

Passive learning means teaching a system to perform a task by exposing it to videos or training it on the aforementioned datasets. Active learning is precisely what it sounds like: performing a task and refining it until it is done right.
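To make the distinction concrete, here is a minimal sketch of the two regimes in Python. Everything in it, from the Policy class to the toy grasping task, is an illustrative assumption rather than code from any of the systems above.

```python
# A minimal, illustrative sketch of the passive/active distinction. Every name
# here (Policy, ToyGraspEnv, the toy task itself) is a hypothetical example,
# not code from RoboAgent, VRB, or RT-2.
import random

class Policy:
    """A trivial policy: remembers which action worked for each observation."""
    def __init__(self):
        self.table = {}

    def act(self, obs):
        # Fall back to a random guess for observations never seen before.
        return self.table.get(obs, random.choice([0, 1]))

    def update(self, obs, action):
        # Passive step: imitate a demonstrated (observation, action) pair.
        self.table[obs] = action

    def improve(self, obs, action, reward):
        # Active step: keep only actions that actually earned a reward.
        if reward > 0:
            self.table[obs] = action

class ToyGraspEnv:
    """A one-step toy task: 'cup_left' needs action 0, 'cup_right' needs 1."""
    def reset(self):
        self.obs = random.choice(["cup_left", "cup_right"])
        return self.obs

    def step(self, action):
        reward = 1 if (self.obs == "cup_left") == (action == 0) else 0
        return self.obs, reward, True  # single-step episode

def passive_learning(policy, demonstrations):
    """Fit the policy to pre-recorded pairs, e.g. extracted from videos."""
    for obs, action in demonstrations:
        policy.update(obs, action)

def active_learning(policy, env, trials=20):
    """Attempt the task repeatedly, refining the policy from each outcome."""
    for _ in range(trials):
        obs = env.reset()
        done = False
        while not done:
            action = policy.act(obs)
            obs, reward, done = env.step(action)
            policy.improve(obs, action, reward)

policy = Policy()
passive_learning(policy, [("cup_left", 0)])   # learn from a "video" demo
active_learning(policy, ToyGraspEnv())        # refine by trial and error
```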


RoboAgent's Hybrid Learning Approach: Merging Internet Observation and Remote Teleoperation


RoboAgent, a collaboration between Carnegie Mellon University and Meta AI (yes, that Meta), integrates these two forms of learning much the way a human does: passively, by watching tasks performed in videos from the internet, and actively, by being teleoperated remotely. According to the team, the system can adapt lessons learned in one environment to another, comparable to the VRB system described above.
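As a rough illustration of how such a hybrid pipeline might weight the two data streams, the sketch below mixes a large pool of internet-derived demonstrations with a small pool of in-domain teleoperated ones, reusing the Policy class from the earlier sketch. The 90/10 split and all names are assumptions made for illustration, not RoboAgent's actual training procedure.

```python
# Hedged sketch of hybrid training: mostly-passive internet data plus a small
# amount of active, teleoperated in-domain data. The mixing ratio and the
# Policy class (from the sketch above) are illustrative assumptions.
import random

def train_hybrid(policy, internet_demos, teleop_demos,
                 steps=1000, teleop_fraction=0.1):
    """Each step, sample mostly from abundant internet demonstrations and
    occasionally from scarce teleoperated ones, then imitate that pair."""
    for _ in range(steps):
        pool = teleop_demos if random.random() < teleop_fraction else internet_demos
        obs, action = random.choice(pool)
        policy.update(obs, action)
    return policy
```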

Shubham Tulsiani of CMU’s Robotics Institute explains, “An agent capable of this sort of learning moves us closer to a general robot that can complete a variety of tasks in diverse unseen settings and continually evolve as it gathers more experiences.” “RoboAgent can quickly train a robot using limited in-domain data while relying primarily on abundantly available free data from the internet to learn a variety of tasks. This could make robots more useful in unstructured settings like homes, hospitals and other public spaces.”

One of the coolest aspects of this work is that the dataset is open source and accessible to everyone. It is also designed to work with readily available, off-the-shelf robotics hardware, allowing researchers and businesses alike to use and expand a growing repository of robot data and skills.

All of this is extremely encouraging for building and deploying multipurpose robotics systems, with a view toward the eventual development of general-purpose robots. The goal is to develop technology capable of moving beyond the repetitive machines in highly structured environments that we typically associate with industrial robotics. Of course, actual real-world application and scaling are much harder than they sound.

When it comes to these approaches to robot learning, we are still much closer to the beginning than the end, but it is an exhilarating time for emergent multipurpose systems.



Source: TechCrunch
