Home
In machine-learning terms, this gesture recognition algorithm performs classification by nearest-neighbour search.
The first stage of recognition is the transformation from the input space to the feature space. This transformation is based on the classic model of a hand with 27 degrees of freedom, 23 of which are used in this work (4 for palm orientation and 19 for finger positions).
For each gesture, about 50 measurements of the degrees of freedom were taken, and the expected value and standard deviation of each degree of freedom were calculated per gesture.
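The per-gesture statistics described above can be sketched as follows. This is an illustrative example, not code from the repository: the sample data is random stand-in data, and the 23-column layout (4 palm-orientation DOF followed by 19 finger DOF) is assumed from the description.

```python
import numpy as np

# Hypothetical sketch: each gesture sample is a 23-dimensional feature
# vector (4 palm-orientation DOF + 19 finger DOF), with roughly 50
# samples collected per gesture.
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.5, scale=0.1, size=(50, 23))  # stand-in data

mu = samples.mean(axis=0)     # expected value of each degree of freedom
sigma = samples.std(axis=0)   # standard deviation of each degree of freedom
```

The resulting pair `(mu, sigma)` serves as the stored template for one gesture; classification then compares a new sample against every template.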
To find the nearest neighbour I tried several different methods and metrics: Euclidean distance, L1 (Manhattan) distance, correlation, cosine similarity, and a custom function based on the Gaussian function and fuzzy set theory.
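For reference, the standard metrics in that list can be written down in a few lines, together with a generic nearest-neighbour step over per-gesture templates. The function names and the `templates` dictionary layout are illustrative assumptions, not the repository's API:

```python
import numpy as np

def euclidean(x, y):
    # L2 distance between two feature vectors
    return np.linalg.norm(x - y)

def l1(x, y):
    # Manhattan distance
    return np.abs(x - y).sum()

def cosine_similarity(x, y):
    # 1.0 when the vectors point the same way
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

def correlation(x, y):
    # Pearson correlation coefficient
    return np.corrcoef(x, y)[0, 1]

def nearest_gesture(x, templates, metric):
    # templates: {gesture_name: mean feature vector} (assumed layout);
    # returns the gesture whose template minimizes the distance metric
    return min(templates, key=lambda g: metric(x, templates[g]))
```

Note that cosine similarity and correlation are similarities rather than distances, so in practice they would be negated (or the `min` replaced by `max`) before being used in `nearest_gesture`.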
Experiments have shown that the Gaussian-based function gives the best recognition quality.
This function looks as follows:
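The exact formula is not reproduced here (it appears as an image on the original page). As a hedged sketch only, a function "based on the Gaussian function" that uses the per-DOF mean and standard deviation computed earlier could take a shape like this:

```python
import numpy as np

def gaussian_score(x, mu, sigma, eps=1e-9):
    # Assumed shape, not the author's exact formula: sum of per-DOF
    # Gaussian responses. A DOF measured far from the gesture's mean,
    # relative to that gesture's own spread, contributes little.
    z = (x - mu) / (sigma + eps)  # eps guards against zero spread
    return np.exp(-0.5 * z ** 2).sum()
```

With a score like this, classification picks the gesture with the highest score rather than the smallest distance; normalizing each DOF by its own standard deviation is what lets stable and noisy degrees of freedom be weighted differently.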
In conclusion, it is worth mentioning that the precision of gesture recognition depends on the quality of gesture capture by the LeapMotion controller. The current capture quality is sufficient for successful recognition of the majority of gestures, but some gestures remain difficult to distinguish (for example, the gesture meaning "S" from the gesture "A", or the gesture "V" from "K", "M", and "N").