dc.contributor.author |
Eldeen, Rugia Said Thabit Kamal |
|
dc.contributor.author |
Supervisor - Ebtihal Haider Gismalla Yousif |
|
dc.date.accessioned |
2019-12-11T10:22:35Z |
|
dc.date.available |
2019-12-11T10:22:35Z |
|
dc.date.issued |
2019-05-22 |
|
dc.identifier.citation |
Eldeen, Rugia Said Thabit Kamal. A Hand Gesture Recognition System for Deaf-Mute Individuals \ Rugia Said Thabit Kamal Eldeen ; Ebtihal Haider Gismalla Yousif .- Khartoum: Sudan University of Science and Technology, College of Engineering, 2019 .- 76 p. :ill. ;28cm .- M.Sc |
en_US |
dc.identifier.uri |
http://repository.sustech.edu/handle/123456789/24071 |
|
dc.description |
Thesis |
en_US |
dc.description.abstract |
Deaf-mute individuals typically use gestures to convey their ideas to others; however, it is often difficult for a hearing person to understand this gesture language. The main objective of this project is to develop a computer-based system, implemented in MATLAB, that recognizes 26 gestures of the American Sign Language (ASL) alphabet, enabling deaf-mute individuals to communicate with other people using their natural hand gestures. The proposed system is composed of four modules: pre-processing and hand segmentation, feature extraction, sign recognition, and conversion of the recognized sign to text and voice. Segmentation is performed by converting the image to the Hue-Saturation-Value (HSV) format and applying a color threshold obtained with the Color Thresholder app. Blob features are extracted using a bag-of-features approach based on the Speeded-Up Robust Features (SURF) algorithm. The K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) algorithms are then used for gesture recognition, and the recognized gesture is finally converted into voice. To make the system more user friendly, a Graphical User Interface (GUI) was designed. To train and test the proposed model, a self-collected ASL dataset was prepared with an ordinary phone camera, using hand gestures from male and female volunteers of different ages and skin colors, captured against different backgrounds and in different postures. The implemented model achieves an average accuracy of 89% on the evaluation set and 84.6% on the test set. |
en_US |
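The abstract names the main pipeline stages (HSV segmentation with a color threshold, a SURF-based bag of features, and KNN/SVM recognition). The following is a minimal MATLAB sketch of that pipeline, not the author's exact implementation: it assumes the Computer Vision and Statistics and Machine Learning toolboxes, and the dataset folder name, HSV skin-color ranges, number of neighbors, and 80/20 split are illustrative placeholders rather than values taken from the thesis.

% Self-collected ASL images, one subfolder per letter (A-Z); folder name is a placeholder.
imds = imageDatastore('asl_dataset', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
imds.ReadFcn = @(file) segmentHand(imread(file));   % segment each image on read
[trainSet, testSet] = splitEachLabel(imds, 0.8, 'randomized');

% Feature extraction: bag of features built on SURF descriptors.
bag       = bagOfFeatures(trainSet);
trainFeat = encode(bag, trainSet);        % one visual-word histogram per image
testFeat  = encode(bag, testSet);

% Gesture recognition with KNN and a multiclass SVM (ECOC).
knnModel = fitcknn(trainFeat, trainSet.Labels, 'NumNeighbors', 5);
svmModel = fitcecoc(trainFeat, trainSet.Labels);

knnAcc = mean(predict(knnModel, testFeat) == testSet.Labels);
svmAcc = mean(predict(svmModel, testFeat) == testSet.Labels);
fprintf('KNN accuracy: %.1f%%   SVM accuracy: %.1f%%\n', 100*knnAcc, 100*svmAcc);

% Pre-processing / hand segmentation: HSV conversion plus a color threshold
% (the thesis derives the ranges with the Color Thresholder app; these are guesses).
function masked = segmentHand(rgb)
    hsv  = rgb2hsv(rgb);
    mask = hsv(:,:,1) <= 0.1 & hsv(:,:,2) >= 0.2 & hsv(:,:,3) >= 0.3;
    masked = rgb .* uint8(repmat(mask, [1 1 3]));
end

In practice the threshold ranges and the vocabulary size of the bag of features are the parameters that most affect the reported accuracies.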
dc.description.sponsorship |
Sudan University of Science and Technology |
en_US |
dc.language.iso |
en |
en_US |
dc.publisher |
Sudan University of Science and Technology |
en_US |
dc.subject |
Engineering |
en_US |
dc.subject |
Mechatronics |
en_US |
dc.subject |
Hand Gesture Recognition System |
en_US |
dc.subject |
Deaf-Mute Individuals |
en_US |
dc.title |
A Hand Gesture Recognition System for Deaf-Mute Individuals |
en_US |
dc.title.alternative |
نظام للتعرف على إيماءة اليد للأفراد ذوي الاعاقة السمعية الكلامية |
en_US |
dc.type |
Thesis |
en_US |