Now in its 3rd edition, this workshop aims to explore and showcase the potential of proximity perception for cognitive robotics. Active proximity perception holds great promise for Human-Robot Interaction (HRI) as well as for modeling objects and the environment in a multi-modal context. Proximity perception is complementary to vision and touch, and the technology is now mature enough to be deployed alongside cameras and tactile sensors. For example, many researchers have already successfully addressed the challenge of building multi-modal skins that combine tactile and proximity sensing. However, comparatively little research has been directed towards active perception and sensor fusion that includes the proximity modality. To address this, the workshop features experts in multi-modal HRI and visuo-haptic perception, as well as from industry, who will foster the discussion with their experience in these domains. At the same time, we expect their contributions to inspire multi-modal cognitive approaches to proximity perception. Furthermore, a demo session will anchor these ideas to concrete hardware that is already available today from both research and industry participants.