Multimodal grasp data set: A novel visual–tactile data set for robotic manipulation
[Abstract] This article introduces a visual–tactile multimodal grasp data set intended to further research on robotic manipulation. The data set was built with a newly designed dexterous robot hand, Intel's Eagle Shoal robot hand (Intel Labs China, Beijing, China). It contains 2550 grasp records, each including tactile data, joint data, time labels, images, and RGB and depth video. By integrating visual and tactile data, researchers can better understand the grasping process and analyze deeper grasping issues. This article describes how the data set was built and what it contains. To evaluate the quality of the data set, the tactile data were analyzed by short-time Fourier transform, and tactile-data-based slip detection was realized with long short-term memory (LSTM) networks and contrasted with the visual data. The experiments compared the LSTM with traditional classifiers and tested its generalization across different grasp directions and different objects. The effective slip detection and generalization ability of the LSTM demonstrate the data set's value in promoting research in the robotic manipulation area. Future work will be devoted to further analysis of the visual and tactile data.
[Publication date] [Publisher]
[Validity level] [Subject classification] Automation engineering
[Keywords] Grasping data set; visual; tactile; robotic manipulation; slip detection; long short-term memory [Timeliness]
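The abstract states that the tactile data were analyzed by short-time Fourier transform to detect slip. A minimal sketch of that kind of analysis is shown below, using a synthetic tactile signal (the data set's actual sampling rate, channel layout, and file format are not specified in this record, so all numbers here are illustrative assumptions). Slip typically appears as a rise in high-frequency vibration, which the STFT exposes as a jump in the dominant frequency per time window.

```python
import numpy as np
from scipy.signal import stft

# Hypothetical single-channel tactile signal: 1 s sampled at 1 kHz, where the
# vibration frequency rises half-way through, mimicking the onset of slip.
fs = 1000
t = np.arange(fs) / fs
signal = np.where(t < 0.5,
                  np.sin(2 * np.pi * 20 * t),    # stable grasp: low-frequency content
                  np.sin(2 * np.pi * 120 * t))   # slip: high-frequency vibration

# Short-time Fourier transform: frequency content within each sliding window.
freqs, seg_times, Z = stft(signal, fs=fs, nperseg=128)
power = np.abs(Z) ** 2

# Dominant frequency per window; it jumps when the simulated slip begins,
# which a downstream classifier (e.g., the LSTM in the article) could exploit.
dominant = freqs[np.argmax(power, axis=0)]
stable = dominant[(seg_times > 0.1) & (seg_times < 0.4)]
slipping = dominant[(seg_times > 0.6) & (seg_times < 0.9)]
print(stable.mean(), slipping.mean())
```

This is only a feature-extraction sketch; the article's pipeline feeds such tactile sequences to an LSTM classifier and compares it against traditional classifiers.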