CreateED

Our repository for the hackathon CreateEd.

We built CrossCom, a device that recognises hand gestures for the 26 letters of the alphabet, the 10 digits, and a few phrases such as "I love you" and "how are you", using a convolutional neural network built with TensorFlow, Keras, and NumPy. It then converts those gestures to text, and the text to speech using pyttsx (sample given here).
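
Here is a minimal sketch of that recognise-then-speak pipeline. The model file name, label list, and input shape are assumptions for illustration, not the repo's actual artifacts; the example uses pyttsx3, the maintained fork of pyttsx with the same init/say/runAndWait API.

```python
import numpy as np
import pyttsx3  # maintained fork of pyttsx; identical basic API
from tensorflow.keras.models import load_model

# Hypothetical artifacts: a trained CNN and the labels its outputs map to.
MODEL_PATH = "gesture_cnn.h5"
LABELS = (list("ABCDEFGHIJKLMNOPQRSTUVWXYZ")
          + [str(d) for d in range(10)]
          + ["I love you", "how are you"])

model = load_model(MODEL_PATH)
engine = pyttsx3.init()

def classify_and_speak(frame):
    """Classify one preprocessed webcam frame and speak the predicted sign.

    `frame` is assumed to already match the CNN's input shape, e.g. a
    (64, 64, 1) float array scaled to [0, 1].
    """
    probs = model.predict(frame[np.newaxis, ...], verbose=0)[0]
    label = LABELS[int(np.argmax(probs))]
    engine.say(label)      # text-to-speech, as in the original pipeline
    engine.runAndWait()
    return label
```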

The user can also speak to CrossCom, as there is a microphone built into the webcam (unfortunately we couldn't get a USB microphone to plug into our Raspberry Pi). The speech gets converted to text using the Google Cloud Speech API and comes out through the speaker at the back.
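
A sketch of that speech-to-text step with the official google-cloud-speech Python client; the 16 kHz LINEAR16 encoding and the language code are assumptions:

```python
from google.cloud import speech

def transcribe(wav_bytes: bytes) -> str:
    """Send raw audio captured from the microphone to Google Cloud Speech."""
    client = speech.SpeechClient()  # picks up credentials from the environment
    audio = speech.RecognitionAudio(content=wav_bytes)
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,   # assumed capture rate
        language_code="en-US",
    )
    response = client.recognize(config=config, audio=audio)
    # Join the top alternative of each recognised segment into one string.
    return " ".join(r.alternatives[0].transcript for r in response.results)
```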

The robot has a webcam and a Raspberry Pi 7″ touchscreen display. The back of the robot houses a Raspberry Pi, which is wired up to the display and connected over USB to the webcam and the speaker. The screen and the Raspberry Pi are both powered by a power bank, which sits at the back as well.
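
To tie the hardware to the model, a capture loop could feed webcam frames into the `classify_and_speak` sketch above. OpenCV and the preprocessing below are assumptions; the actual code may capture and preprocess differently.

```python
import cv2  # OpenCV is a common way to read a USB webcam on the Pi

cap = cv2.VideoCapture(0)  # the USB webcam described above
try:
    while True:
        ok, bgr = cap.read()
        if not ok:
            break
        # Hypothetical preprocessing to match the CNN input assumed earlier.
        gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
        frame = cv2.resize(gray, (64, 64)).astype("float32") / 255.0
        frame = frame[..., None]  # add a channel axis -> (64, 64, 1)
        print(classify_and_speak(frame))
finally:
    cap.release()
```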

Follow the project here: https://devpost.com/software/rebly

It is a simple communication device between somebody who’s deaf and mute, and somebody who’s blind.


CrossCom won the Accenture Challenge for best use of Machine Learning at the hackathon!

