- Download the whole repository and extract it to a folder.
- Run the `main.py` script from a console/command prompt.
- Place your hand inside the green area shown in the opened webcam window; if it is positioned accurately enough, the model will classify the gesture.
Note: the model was trained on an American Sign Language (ASL) dataset from here. Therefore, it is recommended that you follow the diagram below while using the model:
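The green-area step above can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: the ROI coordinates, input normalization, and function names are assumptions, and the real logic lives in `main.py`.

```python
import numpy as np

# Illustrative ROI (x, y, width, height) for the green capture area;
# the actual coordinates used in main.py may differ.
ROI = (100, 100, 200, 200)

def extract_roi(frame: np.ndarray, roi=ROI) -> np.ndarray:
    """Crop the green capture area out of a full webcam frame (H x W x 3)."""
    x, y, w, h = roi
    return frame[y:y + h, x:x + w]

def preprocess(crop: np.ndarray) -> np.ndarray:
    """Scale pixel values to [0, 1] before classification.
    (The classifier's expected normalization is an assumption here.)"""
    return crop.astype("float32") / 255.0

# Example: a fake 480x640 frame stands in for a webcam capture.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
crop = preprocess(extract_roi(frame))
print(crop.shape)  # (200, 200, 3)
```

In the real script the crop would be passed to the trained ASL model for prediction; only the hand region inside the green area is fed to the classifier, which is why hand placement matters.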