# SignSense

Empowering accessibility with SignSense

Through SignSense, we aim to help people with speech and communication disabilities communicate effectively by detecting and translating sign language based on the American Sign Language (ASL) system.

## Table of contents

- [General info](#general-info)
- [Technologies](#technologies)
- [Usage](#usage)
- [Project Status](#project-status)
- [Room for Improvement](#room-for-improvement)
- [Contributors](#contributors)

## General info

This project is part of the BitBox Hackathon.

**Problem statement:** Suppose you have a friend named John who has aphasia, a disorder that affects his ability to communicate verbally.

**How we plan to solve it:** Through SignSense, other people will be able to understand the sign language John uses, and John will have an application that translates his signing into plain text.

## Technologies

Project is created with:

- Python
- OpenCV
- MediaPipe
- TensorFlow
- Pygame
- NumPy
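
A typical way to install these dependencies with pip (a sketch; this README does not pin exact package names or versions):

```bash
pip install opencv-python mediapipe tensorflow pygame numpy
```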

## Usage

The interface displays translated text for each hand gesture detected through your device's webcam. We also provide text-to-speech conversion if the user wants their message relayed in audible form.
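
A minimal sketch of how such a detection loop could be wired together, assuming a trained Keras classifier saved as `model.h5` and a `LABELS` list mapping class indices to words; both names are illustrative placeholders, not files shipped with this project:

```python
import cv2
import mediapipe as mp
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("model.h5")  # hypothetical trained classifier
LABELS = ["hello", "thanks", "yes", "no"]       # hypothetical label set

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            lm = result.multi_hand_landmarks[0].landmark
            # Flatten the 21 (x, y, z) hand landmarks into a 63-value feature vector.
            features = np.array([[p.x, p.y, p.z] for p in lm]).flatten()
            probs = model.predict(features[np.newaxis, :], verbose=0)[0]
            text = LABELS[int(np.argmax(probs))]
            cv2.putText(frame, text, (10, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
        cv2.imshow("SignSense", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

cap.release()
cv2.destroyAllWindows()
```

And one plausible way to implement the text-to-speech step, using gTTS for synthesis (an assumed dependency; this README only confirms Pygame) and Pygame's mixer for playback:

```python
from gtts import gTTS  # assumed dependency; not confirmed by this README
import pygame

def speak(text: str, path: str = "tts_output.mp3") -> None:
    """Synthesize `text` to an MP3 file and play it through Pygame's mixer."""
    gTTS(text=text, lang="en").save(path)
    pygame.mixer.init()
    pygame.mixer.music.load(path)
    pygame.mixer.music.play()
    while pygame.mixer.music.get_busy():  # wait until playback finishes
        pygame.time.Clock().tick(10)

speak("hello")
```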

## Project Status

The project is still under development: many hand gestures have yet to be trained and modelled. We are also working on updates and improvements for faster and more effective communication.

## Room for Improvement

We aim to improve our project by implementing the following features:

1. Build an app interface to make this technology more accessible.
2. Translate hand gestures into more than one language to serve a wider range of people.
3. Recognise gestures that are heavily motion-based and harder to distinguish.

## Contributors

- Soham Kukreti
- Yuvraj Rathi
- Satyam Rathi
- Sanvi Sharma