Please use this identifier to cite or link to this item:
https://repository.sustech.edu/handle/123456789/24071
Title: | A Hand Gesture Recognition System for Deaf-Mute Individuals |
Other Titles: | نظام للتعرف على إيماءة اليد للأفراد ذوي الاعاقة السمعية الكلامية (A hand gesture recognition system for individuals with hearing and speech impairment) |
Authors: | Eldeen, Rugia Said Thabit Kamal | Supervisor: Ebtihal Haider Gismalla Yousif |
Keywords: | Engineering; Mechatronics; Hand Gesture Recognition System; Deaf-Mute Individuals |
Issue Date: | 22-May-2019 |
Publisher: | Sudan University of Science and Technology |
Citation: | Eldeen, Rugia Said Thabit Kamal. A Hand Gesture Recognition System for Deaf-Mute Individuals / Rugia Said Thabit Kamal Eldeen; Ebtihal Haider Gismalla Yousif.- Khartoum: Sudan University of Science and Technology, College of Engineering, 2019.- 76 p.: ill.; 28 cm.- M.Sc. |
Abstract: | A deaf-mute individual typically uses hand gestures to convey ideas to others, yet it is often difficult for a hearing person to understand this gesture language. The main objective of this project is to develop a computer-based system in MATLAB that recognizes 26 American Sign Language (ASL) gestures, enabling deaf-mute individuals to communicate with other people using their natural hand gestures. The proposed system is composed of four modules: preprocessing and hand segmentation, feature extraction, sign recognition, and conversion of the recognized sign to voice. Segmentation is performed by converting the image to the Hue-Saturation-Value (HSV) color space and applying the Color Thresholder app. Blob features are extracted with a bag-of-features model based on the Speeded-Up Robust Features (SURF) algorithm, and the K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) algorithms are used for gesture classification. Finally, the recognized gesture is converted into voice output. To make the system more user friendly, a Graphical User Interface (GUI) was designed. To train and test the proposed model, a self-collected ASL dataset was prepared with an ordinary phone camera, using hand gestures from male and female volunteers of different ages and skin colors, captured against different backgrounds and in different postures. The implemented model achieves an average accuracy of 89% on the evaluation set and 84.6% on the test set. |
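The abstract describes a standard MATLAB recognition pipeline (HSV-based hand segmentation, SURF bag-of-features encoding, SVM/KNN classification). The following is a minimal illustrative sketch of such a pipeline, not the thesis code: the folder layout (`asl_dataset` with one subfolder per letter), the HSV threshold values, the helper name `segmentHand`, and the 80/20 split are assumptions made for the example.

```matlab
% Minimal sketch of the pipeline outlined in the abstract (illustrative only).
% Assumes Computer Vision Toolbox, Image Processing Toolbox, and Statistics and
% Machine Learning Toolbox, plus a folder layout asl_dataset/A, asl_dataset/B, ...

% 1) Load the self-collected ASL images, one subfolder per gesture class.
imds = imageDatastore('asl_dataset', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[trainSet, testSet] = splitEachLabel(imds, 0.8, 'randomized');

% 2) Bag-of-features encoding; by default bagOfFeatures extracts SURF
%    descriptors and clusters them into a visual vocabulary.
bag = bagOfFeatures(trainSet);

% 3a) Multiclass SVM over the bag-of-features histograms.
svmClassifier = trainImageCategoryClassifier(trainSet, bag);

% 3b) Alternatively, a KNN classifier on the same encoded features.
trainFeatures = encode(bag, trainSet);
knnModel = fitcknn(trainFeatures, trainSet.Labels, 'NumNeighbors', 5);

% 4) Evaluate both classifiers on the held-out test set.
confMatSVM   = evaluate(svmClassifier, testSet);
testFeatures = encode(bag, testSet);
knnAccuracy  = mean(predict(knnModel, testFeatures) == testSet.Labels);

% Example HSV skin segmentation, standing in for the function generated by the
% Color Thresholder app. The threshold values below are placeholders.
function mask = segmentHand(rgbImage)
    hsv  = rgb2hsv(rgbImage);
    mask = hsv(:,:,1) < 0.15 & hsv(:,:,2) > 0.15 & hsv(:,:,3) > 0.2;
    mask = bwareafilt(mask, 1);   % keep the largest blob, assumed to be the hand
end
```

A gesture recognized at run time would then be mapped to its letter and passed to a text-to-speech component for the voice output described in the abstract.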
Description: | Thesis |
URI: | http://repository.sustech.edu/handle/123456789/24071 |
Appears in Collections: | Masters Dissertations : Engineering |
Files in This Item:
File | Description | Size | Format
---|---|---|---
A Hand Gesture ... .pdf | Research | 2.7 MB | Adobe PDF