Saturday, June 29, 2024

Researchers teach robots to move by sketching trajectories

Getting a robot to execute even a basic task involves a significant amount of unseen effort. One difficulty lies in planning and implementing movements, from steering wheels to raising a robotic arm.

Roboticists work with programmers to establish a set of clear and achievable trajectories, ensuring they are free from obstacles. Carnegie Mellon University’s Robotics Institute (RI) researchers are innovating methods for constructing these trajectories.

RI’s postdoctoral fellow, William Zhi, collaborated with Ph.D. student Tianyi Zhang and RI Director Matthew Johnson-Roberson to develop a method for using sketches as a means to instruct robots on movement. This research will be presented at the upcoming IEEE International Conference on Robotics and Automation in Yokohama, Japan.

While there has been some initial exploration into using natural language for robot control, researchers have mainly focused on teaching robots through demonstration. There are two primary approaches to doing this.

One approach is kinesthetic teaching, in which a human physically moves the robot's joints through the desired positions. The other is teleoperation, in which the user drives the robot with a specialized remote controller or joystick and records the demonstration for the robot to replicate.

Each approach has its limitations. Kinesthetic teaching requires the user to be in the same location as the robot, and some robots are difficult to adjust by hand, a problem made worse with mobile robots, such as a quadruped robot with an attached arm. Teleoperation demands precise control from the user and takes time to guide the robot through its activities.

The RI team has developed a new method for teaching robots how to move: sketching trajectories. This approach sidesteps the limitations of both kinesthetic teaching and teleoperation. The robot learns from movements drawn on an image of its intended working environment.

To capture the environment, the team positioned cameras at two different locations so the images showed varying perspectives. They then traced the robot's desired movement trajectory onto the images and transformed the 2D sketches into 3D models the robot could follow. This conversion used a technique known as ray tracing, which leverages light and shadows on objects to estimate their distance from the camera.
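The article does not spell out the team's exact reconstruction pipeline, but the underlying geometric idea, recovering a 3D point from its 2D positions in two views with known camera geometry, can be sketched with standard linear (DLT) triangulation. Everything below is a toy illustration with made-up camera matrices and points, not the authors' code:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two calibrated views.

    P1, P2: 3x4 camera projection matrices; x1, x2: 2D image coordinates.
    Returns the estimated 3D point.
    """
    # Each view contributes two linear constraints on the homogeneous 3D point.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Two toy cameras: one at the origin, one shifted one unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D point into both views, then recover it.
X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

X_est = triangulate(P1, P2, x1, x2)
```

Applying this to every point along a sketched stroke, drawn in both camera views, would yield a 3D curve for the robot to follow.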

After creating the 3D models, the team provided them to the robot for guidance. For the quadruped robot with a robotic arm, the researchers drew three motion paths on the photos to illustrate the arm’s movement. They then converted the images to 3D models using ray tracing. Subsequently, the robotic arm learned to replicate these paths in the physical environment.
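The article also does not describe how the arm replicates a reconstructed path. One common technique, shown here on a made-up two-link planar arm rather than the quadruped's actual manipulator, is iterative Jacobian-based inverse kinematics: for each waypoint along the path, repeatedly nudge the joint angles to shrink the end-effector error.

```python
import numpy as np

L1, L2 = 1.0, 1.0  # link lengths of a toy two-link planar arm

def fk(q):
    """Forward kinematics: joint angles -> end-effector position."""
    return np.array([
        L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
        L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1]),
    ])

def jacobian(q):
    """Derivative of the end-effector position with respect to joint angles."""
    return np.array([
        [-L1 * np.sin(q[0]) - L2 * np.sin(q[0] + q[1]), -L2 * np.sin(q[0] + q[1])],
        [ L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),  L2 * np.cos(q[0] + q[1])],
    ])

def follow(path, q0, iters=50):
    """Track workspace waypoints with damped least-squares inverse kinematics."""
    q = np.array(q0, dtype=float)
    for target in path:
        for _ in range(iters):
            err = target - fk(q)
            J = jacobian(q)
            # Damping keeps the update stable near singular configurations.
            q += J.T @ np.linalg.solve(J @ J.T + 1e-6 * np.eye(2), err)
    return q

# A short hand-drawn path in the arm's workspace (arbitrary example points).
path = [np.array([1.2, 0.5]), np.array([1.0, 0.8]), np.array([0.8, 1.0])]
q_final = follow(path, q0=[0.3, 0.5])
```

After running, the end effector sits at the last waypoint. This is only a sketch under those toy assumptions; the published method may track trajectories quite differently.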

Through this method, the team has trained the quadruped robot to perform tasks such as closing drawers, drawing the letter “B,” and toppling a box. Additionally, they have programmed the robot to release objects at the end of specific paths, allowing it to deposit items into containers or cups.

“We’re able to teach the robot to do something and then switch it to a different starting position, and it can take the same action,” said Zhi. “We can get quite precise results.”

At the moment, the technique only works for robots with rigid joints and is not compatible with soft robots, because it relies on knowing each joint angle and how it maps to points in space. Even so, working with hardware presents its own difficulties.

While conducting their tests, the four-legged robot occasionally struggled to maintain its stability after performing a movement such as reaching out to close a drawer. This is one of the aspects the team is addressing for the next version of the program.

“People in the field have focused more on the algorithm around generating better motions from the demonstration. This research is the inception of us using trajectory sketches to instruct robots,” said Zhi. “We envision in manufacturing settings, where you have someone unskilled at programming robots, enabling them to just sketch on an iPad and then do collaborative things with the robot, which is where this work is likely to go in the future.”