New filter enhances robot vision on 6D pose estimation


6D pose estimation is the task of detecting the 6D pose of an object, which includes its location and its orientation.

This is an important task in robotics, where a robotic arm needs to know the location and orientation of objects in its vicinity in order to detect and move them successfully.

This allows the robot to operate safely and effectively alongside humans.

The awareness of the position and orientation of objects in a scene is sometimes referred to as 6D pose, where the D stands for degrees of freedom.

Robots are good at making identical repetitive movements, such as a simple task on an assembly line. (Pick up a cup. Turn it over. Put it down.)

But they lack the ability to perceive objects as they move through an environment.

(A human picks up a cup, puts it down in a random location, and the robot must retrieve it.)

In a recent study, researchers at the University of Illinois at Urbana-Champaign, NVIDIA, the University of Washington, and Stanford University tackled 6D object pose estimation, developing a filter that gives robots greater spatial perception so they can manipulate objects and navigate through space more accurately.

Overview of the PoseRBPF framework for 6D object pose tracking. The method leverages a Rao-Blackwellized particle filter and an auto-encoder network to estimate the 3D translation and a full distribution of the 3D rotation of a target object from a video sequence. Credit: University of Illinois at Urbana-Champaign

While 3D pose provides location information on the X, Y, and Z axes (the location of the object relative to the camera), 6D pose gives a much more complete picture.

“Much like describing an airplane in flight, the robot also needs to know the three dimensions of the object’s orientation – its yaw, pitch, and roll,” said Xinke Deng, a doctoral student working with Timothy Bretl, an associate professor in the Department of Aerospace Engineering at U of I.
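To make that description concrete, the six numbers are usually stored as a translation (x, y, z) plus three orientation angles (yaw, pitch, roll). The sketch below is purely illustrative and is not code from the study; it shows, under a standard Z-Y-X Euler convention (an assumption here), how a 6D pose assembles into a single rigid-body transform.

```python
import numpy as np

def pose_to_transform(x, y, z, yaw, pitch, roll):
    """Build a 4x4 rigid-body transform from a 6D pose.

    Translation: (x, y, z), e.g. in metres, relative to the camera.
    Rotation: yaw (Z), pitch (Y), roll (X) in radians, applied Z-Y-X.
    """
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)

    # Z-Y-X Euler convention: R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])

    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# Example: an object 0.5 m in front of the camera, rotated 45 degrees in yaw.
T = pose_to_transform(0.0, 0.0, 0.5, np.pi / 4, 0.0, 0.0)
```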

And in real-life environments, all six of those dimensions are constantly changing.

“We want a robot to keep tracking an object as it moves from one location to another,” Deng said.

Deng explained that the work was done to improve computer vision.

He and his colleagues developed a filter to help robots analyze spatial data.

The filter looks at each particle, or piece of image information collected by cameras aimed at an object, to help reduce judgment errors.

“In an image-based 6D pose estimation framework, a particle filter uses a lot of samples to estimate the position and orientation,” Deng said. “Every particle is like a hypothesis, a guess about the position and orientation that we want to estimate.

“The particle filter uses observation to compute the value of importance of the information from the other particles. The filter eliminates the incorrect estimations.”
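The quote describes the standard particle-filter loop: propose pose hypotheses, weight them by how well they explain the camera observation, and resample so that poor guesses are eliminated. The following is a minimal, generic sketch of that loop, not the method from the paper; `likelihood_fn` and the Gaussian motion noise are placeholder assumptions.

```python
import numpy as np

def particle_filter_step(particles, observation, likelihood_fn,
                         motion_noise=0.01, rng=None):
    """One predict-weight-resample step of a basic particle filter.

    particles: (N, 6) array; each row is a pose hypothesis
               [x, y, z, yaw, pitch, roll].
    likelihood_fn(pose, observation): placeholder scoring of how well
               a hypothesis explains the current camera image.
    """
    rng = rng or np.random.default_rng()

    # Predict: jitter each hypothesis to account for object motion.
    particles = particles + rng.normal(0.0, motion_noise, particles.shape)

    # Weight: score every hypothesis against the observation.
    weights = np.array([likelihood_fn(p, observation) for p in particles])
    weights = weights + 1e-12          # guard against an all-zero case
    weights = weights / weights.sum()

    # Resample: duplicate likely hypotheses, drop unlikely ones.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```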

“Our program can estimate not just a single pose but also the uncertainty distribution of the orientation of an object,” Deng said. “Previously, there hasn’t been a system to estimate the full distribution of the orientation of the object. This gives important uncertainty information for robot manipulation.”


The study uses 6D object pose tracking in the Rao-Blackwellized particle filtering framework, where the 3D rotation and the 3D translation of an object are separated.

This allows the researchers’ approach, called PoseRBPF, to efficiently estimate the 3D translation of an object along with the full distribution over the 3D rotation. As a result, PoseRBPF can track objects with arbitrary symmetries while still maintaining adequate posterior distributions.
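A hedged sketch of that factorization follows: each particle carries a sampled 3D translation plus a full discrete distribution over 3D rotations, so a symmetric object can keep several plausible orientations alive at once. The rotation grid size and the per-rotation scores (which PoseRBPF obtains by comparing image crops against auto-encoder codebook embeddings) are stand-ins here, not the authors' implementation.

```python
import numpy as np
from dataclasses import dataclass, field

N_ROTATIONS = 192  # stand-in for a discretized grid over 3D rotations

@dataclass
class RBParticle:
    """One Rao-Blackwellized particle: a sampled 3D translation plus a
    full discrete distribution over rotations, instead of a single
    rotation sample."""
    translation: np.ndarray                     # shape (3,): x, y, z
    rotation_dist: np.ndarray = field(
        default_factory=lambda: np.full(N_ROTATIONS, 1.0 / N_ROTATIONS))

    def update_rotation_dist(self, rotation_scores):
        """Fold per-rotation observation scores (stand-in for codebook
        similarities) into the rotation distribution."""
        self.rotation_dist = self.rotation_dist * rotation_scores
        self.rotation_dist /= self.rotation_dist.sum() + 1e-12

    def weight(self, rotation_scores):
        """Particle weight: observation likelihood marginalized over
        all rotations for this translation hypothesis."""
        return float(np.dot(self.rotation_dist, rotation_scores))
```

Keeping the whole rotation distribution per particle is what lets the filter report uncertainty over orientation, as Deng describes above, rather than committing to one pose.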

“Our approach achieves state-of-the-art results on two 6D pose estimation benchmarks,” Deng said.

The study, “PoseRBPF: A Rao-Blackwellized Particle Filter for 6D Object Pose Tracking,” was presented at the Robotics: Science and Systems Conference in Freiburg, Germany. It is co-written by Xinke Deng, Arsalan Mousavian, Yu Xiang, Fei Xia, Timothy Bretl, and Dieter Fox.

More information: PoseRBPF: A Rao-Blackwellized Particle Filter for 6D Object Pose Tracking, arXiv:1905.09304 [cs.CV] arxiv.org/abs/1905.09304

Provided by University of Illinois at Urbana-Champaign
