Grasp Affordance

Introduced by Ardón et al. in Learning Grasp Affordance Reasoning through Semantic Relations

This is a dataset for visual grasp affordance prediction that promotes more robust and heterogeneous robotic grasping methods. The dataset contains attributes for 30 different objects. Each object instance is linked not only to semantic descriptions but also to physical features describing visual attributes, locations, and the different grasping regions associated with a variety of actions.
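The page does not specify a file format for these annotations, but as a rough illustration of the per-object record described above, one entry might be modeled as follows (all field names and values here are hypothetical, not the dataset's actual schema):

```python
from dataclasses import dataclass

@dataclass
class GraspAffordanceEntry:
    # Hypothetical record layout; the released dataset's format may differ.
    object_class: str        # one of the 30 object classes
    semantic_labels: list    # semantic descriptions (e.g. category, material)
    visual_attributes: dict  # physical features such as shape or texture
    locations: list          # environments where the object typically appears
    grasp_regions: dict      # action -> list of grasp regions on the object

entry = GraspAffordanceEntry(
    object_class="mug",
    semantic_labels=["kitchenware", "ceramic"],
    visual_attributes={"shape": "cylindrical", "texture": "smooth"},
    locations=["kitchen", "office"],
    grasp_regions={
        "pour": [(120, 40, 60, 80)],       # region as an (x, y, w, h) box
        "hand over": [(10, 30, 50, 90)],
    },
)
```

The key point the sketch captures is that grasp regions are indexed by action, so the same object can expose different valid grasps depending on the intended task.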


License


  • Unknown
