How I Created An ML Model That Identifies Hand Gestures

Hand-gesture detection and recognition have been among the hottest topics of the last few decades, and many data scientists and researchers have successfully applied them to blind-interpreter systems, augmented reality and hand-controlled robots. In general terms, a gesture is "a movement of a part of the body, such as the hand or head, intended to express an idea or a meaning". Research on evolution suggests that manual gestures were the first step humans took towards communication, and even newborns use hand gestures to express their desires long before they start speaking. In the same way, gestures can be used to communicate with machines, to express intent or trigger an action.

The traditional approach to gesture recognition depended on external hardware controllers or wired gloves that register the user's intentions from hand and arm movements. Microsoft's Kinect, introduce
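To give a rough sense of what a vision-based alternative to hardware controllers can look like, here is a minimal sketch of a convolutional classifier for labelled hand-gesture images, written with Keras. It is a generic illustration rather than the specific pipeline built in this article: the class count, input size and the `X_train`/`y_train` arrays are placeholder assumptions.

```python
# Illustrative sketch only: a small CNN that classifies grayscale
# hand-gesture images into a fixed set of gesture classes.
# Data loading is assumed to happen elsewhere, producing NumPy arrays
# X_train of shape (N, 64, 64, 1) and integer labels y_train of shape (N,).
import tensorflow as tf
from tensorflow.keras import layers

NUM_GESTURES = 5  # hypothetical number of gesture classes

model = tf.keras.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(64, 64, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_GESTURES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Uncomment once the placeholder arrays exist:
# model.fit(X_train, y_train, epochs=10, validation_split=0.1)
```

The appeal of this kind of camera-only setup is that it needs no gloves or dedicated controllers: the model learns gesture classes directly from images of the hand.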

Rushi Bhagat
An amateur data scientist with a demonstrated history of working on multiple projects, always keen to research and contribute to the field.