WWDC 19 Scholarship Recipient

Won the Apple WWDC19 Scholarship in 2019 (this is my second time!). My submission this year was a live sign language translator called Project Gesture; you can find out more about it below!

This year was pretty eventful! I met Tim Cook and iJustine, met so many cool new entrepreneurs in Silicon Valley, and visited Palo Alto and SF MOMA.

Gesture

CoreML-Powered Live Sign Language Translation, using a Convolutional Neural Network that can recognize the basic sign language alphabet.
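
To give a sense of how a Core ML image classifier slots into a live pipeline, here's a minimal Swift sketch using the Vision framework. The model class GestureClassifier and the letter labels are placeholders for whatever the project actually bundles, not its real identifiers.

```swift
import Vision
import CoreML
import CoreGraphics

// Minimal sketch: run a bundled Core ML image classifier on a single camera frame.
// "GestureClassifier" is a placeholder for the project's actual .mlmodel class.
func classifyHandSign(in frame: CGImage, completion: @escaping (String?) -> Void) {
    guard let visionModel = try? VNCoreMLModel(for: GestureClassifier().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Take the top-scoring class, e.g. "A", "B", "C" for alphabet signs.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```

In a live translator, each frame from an AVCaptureSession would be handed to a request handler like this (Vision also accepts pixel buffers directly), so the predicted letter can be shown on screen as the user signs.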

Talk To The Hand

In the US, sign language is the preferred mode of communication for over 2 million people.

However, in a world where translating English into Chinese can be done in seconds, we’ve overlooked a language that’s used by countless individuals around the world.

Actions Speak Louder Than Words

The video Can You Read My Lips (https://www.hearinglikeme.com/can-you-read-my-lips-lipreading/) was one of my big inspirations. Lipreading is a common technique used by people who have, or are developing, hearing loss. But lipreading and straining to hear can be hard work.

Building Process

I painstakingly collected a dataset of over 2,000 images from friends and family to train the model. Since I was unable to account for all skin tones, hand shapes, and lighting conditions, the model is biased towards Asian skin tones and indoor lighting.
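
For context, here is one way a model like this could be trained from a folder of labeled images using Create ML on macOS. Whether the original project used Create ML, Turi Create, or another training pipeline isn't covered here, and the paths are illustrative.

```swift
import CreateML
import Foundation

// Illustrative sketch: train an image classifier from directories named after
// each letter (dataset/train/A, dataset/train/B, ...) and export a Core ML model.
let trainingDir = URL(fileURLWithPath: "/path/to/dataset/train")
let testingDir  = URL(fileURLWithPath: "/path/to/dataset/test")

let classifier = try MLImageClassifier(
    trainingData: .labeledDirectories(at: trainingDir)
)

// Check how well it generalizes to held-out images, then save the .mlmodel.
let metrics = classifier.evaluation(on: .labeledDirectories(at: testingDir))
print(metrics)
try classifier.write(to: URL(fileURLWithPath: "/path/to/Gesture.mlmodel"))
```

Collecting more images across skin tones, hand shapes, and lighting conditions would be the usual way to reduce the bias mentioned above.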

Score

Precision: 97%
Recall: 90.8%
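
As a quick reminder of what those two numbers mean (the counts below are made up purely to illustrate the formulas, not taken from the actual evaluation):

```swift
// Illustrative only: precision and recall from confusion counts.
let truePositives  = 970.0  // signs classified as the correct letter
let falsePositives = 30.0   // signs classified as a letter they were not
let falseNegatives = 98.0   // signs of a letter that the model missed

let precision = truePositives / (truePositives + falsePositives)  // 0.97
let recall    = truePositives / (truePositives + falseNegatives)  // ≈ 0.908
print("precision: \(precision), recall: \(recall)")
```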

Due to the nature of the training data used, the submission is not open sourced at the moment.
