Relative spatial pose estimation for autonomous grasping
Steve Roach, Michael Magee
A technique is presented for finding the relative spatial pose between a robotic end effector and a target object to be grasped, without a priori knowledge of the spatial relationship between the camera and the robot. The transformation between the camera's coordinate system and the robot's coordinate system is computed dynamically using knowledge of the end effector's location relative to both the camera and the robot. A previously developed computer vision technique determines the pose of the end effector relative to the camera, while the robot geometry and data from the robot controller are used to determine the pose of the end effector relative to the robot. The spatial transformation between the robot end effector and the target object is then computed with respect to the robot's coordinate system. The algorithm was demonstrated using a five-degree-of-freedom robot and an RGB camera system. The camera can be dynamically positioned without concern for an assumed spatial relationship between the camera and robot, enabling optimization of the view of the object and the end effector. Further, the iterative nature of the grasping algorithm reduces the effects of camera calibration errors.
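The transform chain described in the abstract can be sketched as follows. This is a minimal illustration using 4x4 homogeneous transforms, not the paper's implementation; all variable names and the sample poses are assumptions for the example. The key step is computing the camera-to-robot transform from the end effector's pose in both frames, then re-expressing the target object in the robot's coordinate system.

```python
import numpy as np

def rot_z(theta):
    """4x4 homogeneous rotation about z (illustrative helper)."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def translate(x, y, z):
    """4x4 homogeneous translation (illustrative helper)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical inputs. In the paper's setup, T_cam_ee would come from the
# vision technique, T_robot_ee from the robot controller and geometry,
# and T_cam_obj from the vision system's estimate of the target object.
T_cam_ee   = rot_z(0.3) @ translate(0.1, 0.0, 0.5)  # end effector in camera frame
T_robot_ee = translate(0.4, 0.2, 0.3)               # end effector in robot frame
T_cam_obj  = translate(0.2, -0.1, 0.6)              # target object in camera frame

# Camera-to-robot transform, computed dynamically from the shared
# end-effector pose (no fixed camera-robot calibration assumed):
T_robot_cam = T_robot_ee @ np.linalg.inv(T_cam_ee)

# Target object expressed in the robot's coordinate system:
T_robot_obj = T_robot_cam @ T_cam_obj

# Relative end-effector-to-object transform, i.e. the motion to command:
T_ee_obj = np.linalg.inv(T_robot_ee) @ T_robot_obj
```

Because `T_robot_cam` is recomputed from current observations, the camera can be repositioned freely between iterations, and residual calibration error shrinks as the grasp loop repeats.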
Steve Roach and Michael Magee "Relative spatial pose estimation for autonomous grasping," Optical Engineering 36(12), (1 December 1997). https://doi.org/10.1117/1.601586
Published: 1 December 1997
KEYWORDS: Cameras, Imaging systems, Computing systems, Robotic systems, Detection and tracking algorithms, Systems modeling, Distance measurement
