Smart Grasping with a Dexterous Robot Hand

This project demonstrates how a robotic hand can grasp and classify objects using only finger positions, without relying on additional sensors. Using NVIDIA’s Isaac Sim simulator, the project integrates the LEAP Hand, a dexterous robotic hand, and tests its ability to identify object type, position, and orientation with a MACE-based algorithm. A set of kitchenware items was chosen to showcase this capability. The MACE algorithm is a key part of the project: it fine-tunes the models by comparing simulated interactions with real-world observations. The results show that the method is highly effective at identifying object type and position, with improvements over repeated iterations. However, accurately estimating an object’s rotation remains a challenge, likely due to object symmetry. This approach highlights the potential for cost-effective, sensor-free robotic grasping methods in real-world applications.
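To make the core idea concrete, the sketch below illustrates one way finger positions alone could identify an object: compare the hand's observed joint angles against joint angles recorded from simulated grasps of each candidate object and pose, and pick the closest match. This is only a minimal illustration under assumed names and data shapes (a hypothetical grasp database, random stand-in values, a 16-DoF hand), not the project's actual MACE implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated "grasp signatures": final joint angles (assumed 16 DoF
# for the LEAP Hand) recorded when grasping each candidate object/pose in
# simulation. Random values stand in for real Isaac Sim output.
hypotheses = [("mug", "upright"), ("mug", "sideways"),
              ("plate", "flat"), ("bowl", "upright")]
sim_grasps = {h: rng.uniform(0.0, 1.5, size=16) for h in hypotheses}


def identify(observed_joints: np.ndarray) -> tuple:
    """Return the object/pose hypothesis whose simulated grasp signature
    is closest (in joint space) to the observed finger positions."""
    return min(hypotheses,
               key=lambda h: np.linalg.norm(sim_grasps[h] - observed_joints))


# Example: a noisy observation of a "bowl, upright" grasp is matched correctly
# as long as the noise is small relative to the spacing between signatures.
observed = sim_grasps[("bowl", "upright")] + rng.normal(0.0, 0.02, size=16)
print(identify(observed))  # -> ("bowl", "upright")
```

In the actual project, the comparison between simulated and observed interactions is what drives the iterative refinement; the nearest-match lookup above is just a simplified stand-in for that idea.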