This project implements three machine learning models for real-time recognition of numbers signed with hand gestures. The models are designed to accurately interpret finger gestures, identifying the numbers 0 through 10 that the user shows in front of a webcam.
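A common preprocessing step for this kind of recognizer is counting raised fingers from hand landmarks. The sketch below assumes the MediaPipe Hands landmark layout (21 points, image coordinates with y growing downward) and a right hand facing the camera; the function name and heuristic are illustrative, not the project's actual code.

```python
def count_fingers(landmarks):
    """Count raised fingers from 21 (x, y) landmark tuples.
    Assumes MediaPipe-style indexing: fingertips at 8/12/16/20,
    the joints below them at 6/10/14/18, thumb tip at 4."""
    TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
    PIPS = [6, 10, 14, 18]   # the PIP joints below each fingertip
    count = 0
    for tip, pip in zip(TIPS, PIPS):
        if landmarks[tip][1] < landmarks[pip][1]:  # tip above joint -> raised
            count += 1
    # Thumb is compared on x instead of y (assumes a right hand, palm to camera)
    if landmarks[4][0] > landmarks[3][0]:
        count += 1
    return count
```

A full system would feed such per-frame features (or the raw landmarks) into the trained models rather than relying on the heuristic alone.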
Auto-learning search framework based on a weighted double Q-learning algorithm: "Integrated block-wise neural network with auto-learning search framework for finger gesture recognition using sEMG signals".
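For readers unfamiliar with the underlying algorithm, the core of weighted double Q-learning is to blend the single-estimator and double-estimator targets with a data-dependent weight. The tabular sketch below follows the general weighted double Q-learning idea; the exact form of the weight and the choice of comparison actions are assumptions, and the function names are illustrative.

```python
def wdq_update(QU, QV, s, a, r, s2, actions, alpha=0.1, gamma=0.9, c=1.0):
    """One weighted double Q-learning update of table QU, using QV as the
    cross-estimator. QU and QV are dicts keyed by (state, action)."""
    # Greedy action in s2 under the table being updated
    a_star = max(actions, key=lambda b: QU[(s2, b)])
    # Worst action in s2 under the cross-estimator (assumed choice)
    a_low = min(actions, key=lambda b: QV[(s2, b)])
    # Weight: large spread in QV's values -> trust the greedy estimate more
    spread = abs(QV[(s2, a_star)] - QV[(s2, a_low)])
    beta = spread / (c + spread)
    # Blend single-estimator (QU) and double-estimator (QV) targets
    target = r + gamma * (beta * QU[(s2, a_star)] + (1 - beta) * QV[(s2, a_star)])
    QU[(s, a)] += alpha * (target - QU[(s, a)])
    return QU[(s, a)]
```

In practice two tables are kept and one is chosen uniformly at random to update at each step, with the roles of QU and QV swapped.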
This project is a presentation slideshow application that uses hand gestures for navigation and annotation. It tracks the user's hand to move between images and create annotations. Real-time feedback is provided through annotated images and live camera feed, offering an interactive and intuitive presentation experience.
Build a system that can correctly identify the American Sign Language signs that correspond to hand gestures. Our proposed system will help deaf and hard-of-hearing people communicate more easily with other members of the community.
Virtual Zoom is an AI-based hand gesture project that detects the hand and fingers; by changing the distance between your index finger and thumb, you can zoom in and out. The project is written in Python.
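The pinch-to-zoom mapping can be sketched as a pure function from the two fingertip positions to a clamped zoom factor. The fingertip coordinates would normally come from a hand tracker such as MediaPipe Hands (an assumption); here they are plain (x, y) pixel tuples, and the reference distance and clamp bounds are illustrative defaults.

```python
import math

def zoom_scale(thumb_tip, index_tip, base_dist=100.0, min_scale=0.5, max_scale=3.0):
    """Map the pixel distance between thumb and index fingertips to a zoom factor.
    base_dist is the finger spread that corresponds to 1.0x (assumed default)."""
    dist = math.dist(thumb_tip, index_tip)
    scale = dist / base_dist          # fingers together -> small, spread -> large
    return max(min_scale, min(max_scale, scale))
```

Each frame, the current image would then be resized by this factor around its center, which keeps the interaction smooth as the fingers move.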