The main aim of this project is to implement a real-time system that converts American Sign Language (ASL) hand gestures into voice signals, enabling people with speech disabilities to communicate with others. For this purpose, two signal modalities were used: surface electromyography (EMG), which measures the electrical activity of the forearm muscles, and accelerometry, which measures the acceleration of hand motions. A wearable data-collection device called Shimmer recorded these signals from the user and transmitted them to Matlab over Bluetooth. A pattern recognition algorithm implemented in Matlab processed the signals and output a decision for each gesture performed. Several tests were performed on one human subject to identify multiple single-handed and two-handed gestures, recognizing various words as well as basic sentences in ASL. The experimental results were recorded and discussed.
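To make the pipeline concrete, the sketch below illustrates one plausible shape for such a gesture recognizer: windowed features (EMG mean absolute value plus per-axis mean acceleration) fed to a nearest-centroid classifier. This is a minimal illustration under assumed parameters (sampling rate, window length, feature choice, classifier); the project's actual Matlab algorithm is not specified in the abstract, and all names here are hypothetical.

```python
import numpy as np

def extract_features(emg, accel, fs=512, window_s=0.25):
    """Per-window features from one EMG channel and a 3-axis accelerometer:
    EMG mean absolute value (MAV) and mean acceleration per axis.
    Sampling rate and window length are illustrative assumptions."""
    n = int(fs * window_s)
    feats = []
    for start in range(0, len(emg) - n + 1, n):
        e = emg[start:start + n]
        a = accel[start:start + n]
        feats.append(np.concatenate(([np.mean(np.abs(e))], a.mean(axis=0))))
    return np.array(feats)

def train_centroids(feature_sets):
    """Nearest-centroid model: one mean feature vector per gesture label."""
    return {label: f.mean(axis=0) for label, f in feature_sets.items()}

def classify(feature_vec, centroids):
    """Assign the gesture whose centroid is closest in Euclidean distance."""
    return min(centroids,
               key=lambda g: np.linalg.norm(feature_vec - centroids[g]))
```

In a real-time setting, `extract_features` would run on each incoming window streamed over Bluetooth, and the classifier's decision would drive the text-to-speech output.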