Inspiration

While brainstorming ideas for the hackathon, we noticed that yesterday's Google Doodle was a tribute to Stephen Hawking, and it inspired us to come up with Eye Assist. ALS is a nervous system disease that weakens muscles and impairs physical function, and many related diseases cause loss of muscle function throughout the body. Such conditions can be present from birth or caused by accidents. We wanted to build an affordable solution that helps these people and makes their lives easier.

Our solution, Eye Assist, provides communication and control for people living with ALS/MND, spinal muscular atrophy, cerebral palsy, non-speaking aphasia, multiple sclerosis, cancer, traumatic brain injury, spinal cord injury, locked-in syndrome, or any form of paralysis that requires additional support. Eye Assist lets individuals communicate and interact: users can write books, attend school, and talk with their loved ones, all through the power of their eyes.

Inspired by Women in STEM, we wanted to acknowledge that girls with disabilities face additional barriers as a result of intersectionality, so we integrated a learning environment with more accessible technology tutorials inside an accessible interface. Eye Assist therefore has a Learning section focused on making it simple, especially for girls with disabilities, to navigate learning concepts. We also identified that a coding editor offering fill-in-the-blank snippets and common phrases on the keyboard would make coding more convenient with Eye Assist.

What it does

Eye Assist makes people with any kind of paralysis more independent and helps them navigate their day-to-day life. The app is operated entirely by eye movement and requires no physical touch for navigation. It can work smoothly with eye trackers and tracking apps already on the market.

When in use, it tracks eye movement. Features we worked on include controlling the mouse cursor position and text entry.
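To illustrate how gaze data could drive the cursor and trigger selections, here is a minimal sketch in Python. This is not Eye Assist's actual code (the app is built in Flutter); the names (`gaze_to_screen`, `DwellClicker`), screen size, and thresholds are all hypothetical. The idea: map normalized gaze coordinates from a tracker to screen pixels, then register a "click" when the gaze dwells in one spot long enough.

```python
# Hypothetical sketch: map normalized gaze coordinates from an eye tracker
# to screen pixels, and emit a "click" when the gaze dwells in place.
# All names, sizes, and thresholds are illustrative.

SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution

def gaze_to_screen(gx, gy):
    """Convert normalized gaze coordinates (0.0-1.0) to screen pixels,
    clamping values that fall slightly outside the tracked area."""
    x = min(max(gx, 0.0), 1.0) * (SCREEN_W - 1)
    y = min(max(gy, 0.0), 1.0) * (SCREEN_H - 1)
    return round(x), round(y)

class DwellClicker:
    """Fire a click when the cursor stays within `radius` pixels
    of one spot for at least `dwell_time` seconds."""
    def __init__(self, radius=40, dwell_time=1.0):
        self.radius = radius
        self.dwell_time = dwell_time
        self.anchor = None    # where the current dwell started
        self.anchor_t = None  # when the current dwell started

    def update(self, pos, t):
        """Feed one (x, y) sample at time t; return True on a click."""
        if self.anchor is None or self._dist(pos, self.anchor) > self.radius:
            self.anchor, self.anchor_t = pos, t  # gaze moved: restart dwell
            return False
        if t - self.anchor_t >= self.dwell_time:
            self.anchor, self.anchor_t = pos, t  # fire once, then reset
            return True
        return False

    @staticmethod
    def _dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```

Feeding steady samples for just over a second produces exactly one click, so a user can rest their gaze on a button to press it without any physical input.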

Eye tracking devices help individuals who have little or no control over their hand movements. By following the movement of the eyes, they let a person navigate the web and type on custom screens. People living with disabilities or degenerative diseases benefit from this technology, including patients with ALS, multiple sclerosis, brain injuries, muscular dystrophy, cerebral palsy, spinal cord injuries, and more. Patients with mobility-limiting diseases such as ALS can harness the power of their eyes to communicate through tools like Eye Assist.
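"Typing on custom screens" usually works by hit-testing the gaze point against an on-screen keyboard, so the dwell-based selection described above can commit the key under the user's gaze. The sketch below is a hypothetical Python illustration, not Eye Assist's implementation; the layout, key sizes, and `key_at` helper are assumptions.

```python
# Hypothetical sketch: hit-test a screen point against an on-screen
# key grid so a dwell-based selector can commit the key being looked at.
# Layout, sizes, and names are illustrative, not taken from Eye Assist.

ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]  # simple QWERTY layout
KEY_W, KEY_H = 100, 120          # key size in pixels (assumed)
ORIGIN_X, ORIGIN_Y = 60, 600     # top-left of the keyboard area (assumed)

def key_at(x, y):
    """Return the key under screen point (x, y), or None if the point
    falls outside the keyboard."""
    row = (y - ORIGIN_Y) // KEY_H
    if not 0 <= row < len(ROWS):
        return None
    col = (x - ORIGIN_X) // KEY_W
    if not 0 <= col < len(ROWS[row]):
        return None
    return ROWS[row][col]
```

Each gaze sample is converted to a screen point, `key_at` finds the key beneath it, and a dwell timer on that key appends the character to the text buffer, which is how a user can write entire sentences with eye movement alone.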

How we built it

We built this app with Flutter. We started with a Figma design and then implemented the UI in Flutter, a cross-platform framework that lets us build apps for multiple platforms from a single codebase.

Challenges we ran into

We spent a lot of time researching eye tracking because we wanted to build a tracker of our own. Since we were not familiar with OpenCV, we could not fully implement it, but we found existing eye-tracker apps and devices that can be integrated with our app. A lot of time went into debugging. Collaborating with team members in different time zones was a little tricky, but we figured it out and divided the work well.

Accomplishments that we're proud of

We are happy that we were able to finish a working prototype and submit it to SheHacks.

What we learned

We learned about cross-platform development with Flutter, designing an interface in Figma, and the fundamentals of eye-tracking technology.

What's next for Eye Assist

Implement more features
Bundle the eye-tracking feature to make it a standalone app
Make it more accessible
Improve the UI

Built With

Flutter, Figma