Our sign language recognition project involved creating a custom dataset, preprocessing images, training a model, integrating with React, and hosting on Firebase.

The result is a real-time application that recognizes a variety of sign language gestures.

Our model is trained on the 26 letters of the ASL alphabet and 16 ASL words that are commonly used in everyday communication.
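With 26 letters and 16 words, the classifier chooses among 42 labels. A minimal sketch of decoding a score vector into a label might look like this (the `LETTERS` array and `decodeLabel` helper are illustrative assumptions, and the actual 16-word vocabulary is not reproduced here):

```javascript
// Illustrative label decoding for a 42-class sign classifier.
// 'A'..'Z' built programmatically; the word labels are project-specific.
const LETTERS = Array.from({ length: 26 }, (_, i) => String.fromCharCode(65 + i));

// Simple argmax: return the label whose score is highest.
function decodeLabel(scores, labels) {
  let best = 0;
  for (let i = 1; i < scores.length; i++) {
    if (scores[i] > scores[best]) best = i;
  }
  return labels[best];
}

// Example: decodeLabel([0.1, 0.7, 0.2], ['A', 'B', 'C']) returns 'B'.
```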
- Real-Time Recognition
- Easy-to-Use Interface
- Adaptive Learning
- High Accuracy
- Real-Time User Progress Data
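Since the project uses Redux, per-gesture progress tracking can be sketched as a plain reducer. This is only an assumed shape, not the project's actual state or action types:

```javascript
// Hedged sketch: a Redux reducer tracking per-gesture practice attempts.
// State shape and the 'progress/attempt' action type are assumptions.
const initialProgress = { attempts: {}, correct: {} };

function progressReducer(state = initialProgress, action) {
  switch (action.type) {
    case 'progress/attempt': {
      const { gesture, correct } = action.payload;
      return {
        attempts: { ...state.attempts, [gesture]: (state.attempts[gesture] || 0) + 1 },
        correct: { ...state.correct, [gesture]: (state.correct[gesture] || 0) + (correct ? 1 : 0) },
      };
    }
    default:
      return state;
  }
}

// Per-gesture accuracy, e.g. to feed a react-chartjs-2 progress chart.
function accuracy(state, gesture) {
  const a = state.attempts[gesture] || 0;
  return a === 0 ? 0 : (state.correct[gesture] || 0) / a;
}
```

Keeping the reducer pure like this makes the progress data easy to persist to Firebase and to chart with chart.js.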
Front-end:
- React
- Redux
Back-end:
- Firebase (for hosting, authentication, and storage)
Machine Learning Framework:
- MediaPipe
NPM Packages:
- @mediapipe/drawing_utils
- @mediapipe/hands
- @mediapipe/tasks-vision
- @redux-devtools/extension
- chart.js
- firebase
- js-cookie
- react-chartjs-2
- react-icons
- react-redux
- react-router-dom
- react-toastify
- react-webcam
- redux
- redux-thunk
- uuid
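MediaPipe Hands returns 21 landmarks per hand as `{x, y, z}` points in image coordinates. Before classification, such landmarks are typically normalized so the result is independent of where the hand sits in the frame and how large it appears. A minimal sketch of that preprocessing (the exact method used in this project may differ):

```javascript
// Hedged sketch: normalize the 21 MediaPipe hand landmarks into a
// translation- and scale-invariant feature vector of length 63.
function normalizeLandmarks(landmarks) {
  const wrist = landmarks[0]; // landmark 0 is the wrist in MediaPipe's ordering
  // Translate so the wrist becomes the origin.
  const rel = landmarks.map(p => [p.x - wrist.x, p.y - wrist.y, p.z - wrist.z]);
  // Scale by the largest coordinate magnitude so hand size doesn't matter.
  const maxAbs = Math.max(...rel.flat().map(Math.abs)) || 1;
  return rel.flat().map(v => v / maxAbs);
}
```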
This project was developed as a group project in collaboration with the members listed below.

| Name | Email |
| --- | --- |
| Shubham More | [email protected] |
| Sameer Metkar | [email protected] |
| Omkar Mandavkar | [email protected] |
| Durgesh Kolhe | [email protected] |
- Our Project Report: Report.pdf
- Published Paper: Paper.pdf
- Dataset Link: Sign_Dataset
- Gesture Recognition Documentation: Mediapipe
- The model training file is located in the root folder.
- Check this document for project setup: SetUpDoc.docx
For support, contact:
- Email: [email protected]
- LinkedIn: Shubham More