This is our graduation project, carried out at Sungkyunkwan University.
Professor: 유재천 (Intelligent & Biomedical Electronic Engineering)
Team members: 박준호, 김효준, 류지환 (Electronic and Electrical Engineering), 박소정 (Mathematics)
According to a 2017 survey of deaf people,
69.3% of respondents reported that sign language was their most-used communication method.
In facilities such as restaurants, deaf customers cannot place orders through spoken conversation,
and communicating in writing every time is cumbersome.
If a sign language interpretation system were installed in each store,
deaf customers would be able to use the service conveniently as well.
- autolabel_0_createMask.m
- autolabel_1_objectlabel.m
- autolabel_2_HandDetection.m
- function_jitterImageColorAndWarp.m
- main_0_Get_Image.m
- main_0_ImageSegment.m
- main_1_Data_construct_seg.m
- main_2_ResNet_based_YOLO.m
- main_3_Training_seg.m
- main_4_Evaluation.m
- main_4_Realtime_handseg.m
- main_0_Get_Image.m : Captures images from a webcam or camera
- main_0_ImageSegment.m : Segments the images obtained from main_0_Get_Image.m
- Launch the Image Labeler app from the MATLAB Apps toolstrip
- Click the auto-labeling button (the labeling logic is implemented in #1 autolabel_0_createMask.m, #2 autolabel_1_objectlabel.m, and #3 autolabel_2_HandDetection.m)
- main_1_Data_construct_seg.m : Stores the labeled images as training data
- main_2_ResNet_based_YOLO.m : Defines the ResNet-based YOLO model to be trained
- main_3_Training_seg.m : Trains the model on the constructed data
- main_4_Evaluation.m : Visualizes the training results
- main_4_Realtime_handseg.m : Detects sign language in real time
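The real-time step above can be sketched roughly as follows. This is a minimal, hypothetical outline, not the actual contents of main_4_Realtime_handseg.m; it assumes a YOLO v2 detector object `detector` produced by main_3_Training_seg.m, plus the MATLAB webcam support package and the Computer Vision Toolbox.

```matlab
% Sketch of a real-time hand-sign detection loop (assumptions: a trained
% detector variable `detector`, e.g. a yolov2ObjectDetector, is in scope).
cam = webcam;                           % open the default camera
player = vision.DeployableVideoPlayer;  % display window for results

while isOpen(player)
    frame = snapshot(cam);              % grab one frame from the camera
    [bboxes, scores, labels] = detect(detector, frame);
    if ~isempty(bboxes)
        % Draw boxes with the predicted sign-language class labels
        frame = insertObjectAnnotation(frame, 'rectangle', bboxes, ...
            cellstr(labels));
    end
    step(player, frame);                % show the annotated frame
end
clear cam                               % release the camera
```

Closing the video player window ends the loop; `clear cam` releases the camera handle.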