AEye-MH

Inspiration

We have always believed in equal accessibility for all. Even with all of the amazing technology available today, we found there is surprisingly little to help blind people recognize everyday objects. Realizing how widespread this problem is inspired us to create this app.

What it does

AEye is an iOS application that uses machine learning to help the visually impaired recognize objects. The user takes a picture of an object through the app's camera, and our custom-trained machine learning model identifies the object and displays its description on screen (e.g. "an orange"). So that a blind user can identify the object without reading the screen, the app then uses text-to-speech to voice the description aloud.
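The recognition step can be sketched roughly as follows. This is a minimal illustration only, assuming the Create ML model is exposed through a hypothetical generated class named AEyeClassifier and is run through Apple's Vision framework; the actual camera capture and UI wiring live in the Xcode project.

```swift
import UIKit
import CoreML
import Vision

// Minimal sketch of the recognition flow described above. "AEyeClassifier" is a
// hypothetical name for the Create ML model class; substitute the real generated class.
func classify(_ image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: AEyeClassifier().model) else {
        completion(nil)
        return
    }

    // Vision wraps the Core ML model and resizes/crops the photo to the model's input size.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)   // e.g. "orange"
    }

    let handler = VNImageRequestHandler(cgImage: cgImage)
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Letting Vision handle the image preprocessing means the classifier only has to return its best label, which the app can then display and speak.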

How we built it

We used Swift and Xcode to build the app's functionality and back-end logic. We trained a custom image-classification model using Apple's Core ML and Create ML frameworks. For the text-to-speech functionality, we integrated IBM Watson's API. We prototyped and designed the app in Sketch and Adobe XD.
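As a rough illustration of the speaking step: the app itself calls IBM Watson's text-to-speech API, but the same idea can be shown with Apple's on-device AVSpeechSynthesizer, which is only a stand-in here, not the Watson integration.

```swift
import AVFoundation

// Stand-in sketch only: the app itself uses IBM Watson's text-to-speech API.
// Apple's on-device AVSpeechSynthesizer shows the same "voice the description" idea.
let synthesizer = AVSpeechSynthesizer()

func speak(_ description: String) {
    let utterance = AVSpeechUtterance(string: description)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}

// Example: speak("an orange") after the classifier returns its label.
```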

Challenges we ran into

We ran into multiple challenges throughout the hackathon, including not having the latest Xcode update, issues with image classification, multiple syntax errors, and collaborating with Git in the Terminal. However, we stayed calm and consulted multiple mentors throughout the day and night, who were a tremendous help!

Accomplishments that we're proud of

We are very proud of building a machine learning model from scratch, which was a tedious process, and of creating an aesthetically pleasing design. We are also proud of making an app that can aid millions of blind people and of contributing to the world around us.

What we learned

As a diverse group of students ranging in age from 13 to 17, we learned to draw on each of our strengths and come together to create an awesome product. Since some of us were very new to programming and development, we also learned how to use Swift and Xcode and how to create a functional app design.

What's next for AEye

In the future, we hope to market this app to the disabled community and improve the user interface to make it easier to use. We also plan to train our machine learning model further with deeper neural networks and to significantly improve the text-to-speech feature. Check the App Store soon to see the launch of our awesome app!

Authors

Shruti J, Avishi G, Akshay S

See the linked Devpost for a demo and designs.

About

iOS app, uses CV to help the blind recognize objects and voice their descriptions. @ MenloHacks IV
