
T. B. Sivakumar

S. Rahini Sudha

Lakshminarayanan S

M. Anand

A Fayaz Ahamed

Abstract

The term "human–machine interaction" (HMI) describes how a human and a machine communicate through a user interface. This paper describes an emotion identification system based on hand gestures and facial expressions. Natural user interfaces, such as gestures, are becoming increasingly popular because they let people operate machines with instinctive, spontaneous behavior. Gesture recognition is the ability to identify meaningful motions made by an individual with the hands, arms, face, head, and body. Current techniques for recognizing gestures are data-glove based, depending on how hand input is captured. To address some of the issues with existing data gloves, a data-glove system is proposed that combines a computer vision and pattern recognition algorithm with basic electromyography (EMG) sensors to recognize gestures accurately. The hand data glove is proposed for real-time gesture recognition in human–computer interaction (HCI), where hand gestures are especially significant for creating intelligent and effective interfaces. Gesture recognition has a wide range of uses, from virtual reality to medical rehabilitation to sign languages. The range of reported recognition accuracy is large, from 70% to 98%, with an average of 89.6%. The restrictions taken into account include various textual interpretations, movements, and intricate non-rigid hand features. This work differs from other recent research in that it covers every kind of gesture recognition method.
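To illustrate the kind of pattern recognition pipeline the abstract describes, the following is a minimal, hypothetical sketch of classifying a hand gesture from EMG channel windows using RMS features and a nearest-centroid rule. The gesture labels, channel count, and sample values are illustrative assumptions, not details from the paper.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one EMG channel window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def extract_features(channels):
    """One RMS feature per EMG channel (a common, simple EMG feature)."""
    return [rms(ch) for ch in channels]

def nearest_centroid(features, centroids):
    """Return the gesture label whose centroid is closest in feature space."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Illustrative per-gesture centroids for a 2-channel EMG setup,
# e.g. learned offline from labeled training windows.
centroids = {
    "fist": [0.8, 0.7],
    "open": [0.2, 0.1],
}

# Two raw EMG channel windows (made-up sample values).
sample = [[0.75, -0.8, 0.82], [0.7, -0.65, 0.72]]
label = nearest_centroid(extract_features(sample), centroids)
```

In a real system the centroids (or a stronger classifier) would be trained on labeled EMG recordings, and the windowed features would be streamed in real time from the glove's sensors.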
