
Integrating AI and machine learning into robotics, so that robots equipped with sensors and cameras can interpret the data they collect from their surroundings (as in image and object segmentation), points toward a future of ever greater reliance on intelligent connected devices in daily life. The aim of this project is to design and develop an autonomous robotic hand capable of rotating and grabbing objects to assist hand amputees.

This is a clear example of how robotics and AI can benefit society. By automating the task of grasping objects, the robotic hand reduces the physical burden on hand amputees and makes everyday activities easier. The project demonstrates the potential of robotics and AI to improve the quality of life of people with disabilities, and it underlines the importance of developing and deploying these technologies responsibly.

The robotic hand, built by the Haifa3D association, and the OAK-D Lite camera mounted on it are controlled by a Jetson Nano. The Jetson Nano processes the camera input and runs a segmentation model to detect objects, estimate their orientation, and measure their distance. Once an object is detected, the Jetson Nano sends commands to the robotic hand to rotate to an appropriate angle or to grab the object. The robotic hand is designed as a substitute for the amputated hand, and the project is a step toward restoring functionality and independence for amputees, improving their quality of life through advanced prosthetic technology.
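As a rough illustration of the control flow described above, the sketch below shows how the Jetson Nano side of the pipeline could be organized: a segmentation result (object label, orientation, and distance) is turned into either a rotate or a grab command for the hand. The names used here (`Detection`, `HandController`, `decide_and_act`, and the distance and angle thresholds) are hypothetical placeholders for illustration, not the project's actual API.

```python
from dataclasses import dataclass

# Hypothetical thresholds; the real system would tune these empirically.
GRAB_DISTANCE_M = 0.10      # grab when the object is within 10 cm
ANGLE_TOLERANCE_DEG = 10.0  # rotate until wrist and object axes roughly align


@dataclass
class Detection:
    """Minimal stand-in for one object reported by the segmentation model."""
    label: str
    orientation_deg: float  # object's principal axis in the image plane
    distance_m: float       # estimated distance from the camera


class HandController:
    """Hypothetical interface to the Haifa3D hand (e.g. over a serial or BLE link)."""

    def rotate_to(self, angle_deg: float) -> None:
        print(f"[hand] rotating wrist to {angle_deg:.1f} deg")

    def grab(self) -> None:
        print("[hand] closing fingers")


def decide_and_act(det: Detection, wrist_angle_deg: float, hand: HandController) -> None:
    """Rotate until aligned with the object, then grab once it is close enough."""
    misalignment = det.orientation_deg - wrist_angle_deg
    if abs(misalignment) > ANGLE_TOLERANCE_DEG:
        hand.rotate_to(det.orientation_deg)
    elif det.distance_m <= GRAB_DISTANCE_M:
        hand.grab()
    # Otherwise: aligned but still too far away; wait for the user to move closer.


if __name__ == "__main__":
    hand = HandController()
    # Example detection, as it might come from the segmentation model.
    bottle = Detection(label="bottle", orientation_deg=35.0, distance_m=0.08)
    decide_and_act(bottle, wrist_angle_deg=0.0, hand=hand)   # -> rotate
    decide_and_act(bottle, wrist_angle_deg=34.0, hand=hand)  # -> grab
```

In a real deployment, the `Detection` values would be produced each frame from the OAK-D Lite stream and the segmentation model running on the Jetson Nano, and `HandController` would wrap whatever protocol the Haifa3D hand exposes.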