SIGNificant bridges the communication gap between people with vocal disabilities and the world around them by making sign language easier to understand.
- Computer vision for translating signs to words using American Sign Language (ASL)
- Machine learning for sign language recognition
- OpenAI Realtime API for instant sentence suggestions
- Text-to-speech when a sentence is selected
- Python 3.12 (required for compatibility with Mediapipe)
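
Under the hood, camera frames are turned into features for the sign classifier. Here is a minimal sketch of that idea using the classic Mediapipe Hands API with OpenCV; it is an illustration only, not the project's actual pipeline:

```python
import cv2
import mediapipe as mp

# Minimal sketch, assuming the classic Mediapipe "solutions" API;
# not the project's actual code.
hands = mp.solutions.hands.Hands(
    static_image_mode=False,
    max_num_hands=1,
    min_detection_confidence=0.5,
)

cap = cv2.VideoCapture(0)  # camera index 0; see below on changing it
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Mediapipe expects RGB frames; OpenCV captures BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # 21 landmarks per hand; flatten (x, y) into a 42-dim feature vector
        lm = results.multi_hand_landmarks[0].landmark
        features = [c for p in lm for c in (p.x, p.y)]
        print(len(features))  # 42 -- ready to feed a classifier
cap.release()  # stop with Ctrl+C
```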
To get started:

- Clone the repository:

  ```bash
  git clone https://github.com/CrestyDY/CodeJam.git
  cd CodeJam
  ```

- Create a virtual environment and install uv for package management:

  ```bash
  python -m venv .venv
  source .venv/bin/activate   # Linux
  .venv/Scripts/activate      # Windows
  # Sorry mac users
  pip install uv
  ```

- Install dependencies:

  ```bash
  uv sync
  ```

- Add your OpenAI API key inside `etc/.env`, following the template given by `etc/template.env`.
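
To sanity-check that the key is picked up, a snippet like the one below can help. It assumes the template defines a variable named `OPENAI_API_KEY` (check `etc/template.env` for the actual name) and that `python-dotenv` is installed:

```python
import os
from dotenv import load_dotenv

# Load etc/.env; the variable name below is an assumption -- use
# whatever etc/template.env actually defines.
load_dotenv("etc/.env")
assert os.getenv("OPENAI_API_KEY"), "Key missing -- check etc/.env"
```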
You can also collect and train your own sign language dataset:
- Modify the JSON files in `src/config` to change the word/letter/number assigned to a specific id (a hypothetical example follows this list).
- Update the camera indices in both `src/util/collect_imgs.py` (line 85) and `src/util/inference_classifier.py` (line 33).
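
The exact schema is whatever lives in `src/config`, but conceptually these files map a class id to a label. A hypothetical way to inspect them:

```python
import json
from pathlib import Path

# Hypothetical inspection snippet; the real file names and schema are
# defined by the files in src/config.
for path in Path("src/config").glob("*.json"):
    mapping = json.loads(path.read_text())
    # Assumed shape: {"0": "A", "1": "B", ...} -- id -> word/letter/number
    print(path.name, mapping)
```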
Run the following commands in sequence:

- Collect training images:

  ```bash
  python src/util/collect_imgs.py
  # Use --help to see optional arguments
  python src/util/collect_imgs.py --help
  ```

- Create datasets:

  ```bash
  python src/util/create_dataset_models.py
  # Use --help to see optional arguments
  python src/util/create_dataset_models.py --help
  ```

- Train the models:

  ```bash
  python src/util/train_classifier.py
  # Use --help to see optional arguments
  python src/util/train_classifier.py --help
  ```

- Test inference:

  ```bash
  python src/util/inference_classifier.py
  ```
You can now check whether your hand gestures are detected and converted to the correct text.
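
Conceptually, inference loads the trained classifier and predicts a label from a landmark feature vector. A compressed, hypothetical sketch (the model path and feature size below are assumptions; the real logic lives in `src/util/inference_classifier.py`):

```python
import pickle

import numpy as np

# "model.p" is a placeholder path; use whatever train_classifier.py produced.
with open("model.p", "rb") as f:
    model = pickle.load(f)

# One 42-dimensional vector of Mediapipe hand-landmark (x, y) coordinates.
features = np.zeros((1, 42))
print(model.predict(features)[0])  # predicted letter/word id
```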
You can also run the Flask server to use the app locally in your browser:

- Change the camera index inside `src/app.py` (line 545) if the default settings do not work.
- Start the server:

  ```bash
  python src/app.py
  ```

- Open `localhost:8000` in your browser to see the app.
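
For reference, the kind of MJPEG streaming route such a Flask app typically exposes looks roughly like this. The route name and structure are assumptions, not a copy of `src/app.py`:

```python
import cv2
from flask import Flask, Response

app = Flask(__name__)
cap = cv2.VideoCapture(0)  # change the index if your camera differs

def frames():
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        # Standard MJPEG multipart framing
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + jpg.tobytes() + b"\r\n")

@app.route("/video_feed")
def video_feed():
    return Response(frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(port=8000)
```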
