To assist humans with household chores and other everyday manual tasks, robots must be able to effectively manipulate objects that vary in composition, shape and size. Robotic manipulation skills have improved significantly over the past few years, in part due to the development of increasingly sophisticated cameras and tactile sensors.

Researchers at Columbia University have developed a new system that simultaneously captures both visual and tactile information. The tactile sensor they developed, introduced in a paper presented at the Conference on Robot Learning (CoRL) 2024 in Munich, could be integrated into robotic grippers and hands to further enhance the manipulation skills of robots with varying body structures. The paper was published on the arXiv preprint server.

"Humans perceive the environment through multiple sensory modalities, among which touch plays a critical role in understanding physical interactions," Yunzhu Li, senior author of the paper, told Tech Xplore. "Our goal is to equip robots with similar capabilities, enabling them to sense the environment through both vision and touch for fine-grained robotic manipulation tasks."
Full feature: Integrated multi-modal sensing and learning system could give robots new capabilities.