A React app that tracks your hand gestures to play rock-paper-scissors with you and detects the outcome of each game.

Real-Time Hand Gesture Interface (Rock–Paper–Scissors Demo)

A React + TensorFlow.js demo that detects hand gestures from live webcam video and maps them to discrete intent signals (rock / paper / scissors). The project uses real-time landmark tracking and gesture classification to drive an interactive feedback loop (prediction + game outcome).

System Overview

Webcam → Hand Landmarks → Gesture Classification → Intent Output → UI Feedback

  1. Hand tracking: TensorFlow.js Handpose detects 21 hand landmarks per frame.
  2. Gesture recognition: Fingerpose compares landmark geometry against gesture templates.
  3. Interaction loop: The UI displays the recognized gesture and computes the game outcome.
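The outcome step of the loop above can be sketched as plain game logic. The helper names below are illustrative, not the repo's actual code:

```javascript
// Hypothetical outcome helpers (illustrative; not taken from the repo).
// Each gesture maps to the gesture it beats.
const BEATS = { rock: "scissors", paper: "rock", scissors: "paper" };

// Decide a round given the player's recognized gesture and the
// computer's randomly chosen move.
function roundOutcome(player, computer) {
  if (player === computer) return "draw";
  return BEATS[player] === computer ? "player wins" : "computer wins";
}

// Pick a uniformly random move for the computer.
function randomComputerMove() {
  const moves = Object.keys(BEATS);
  return moves[Math.floor(Math.random() * moves.length)];
}
```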

Tech Stack

  • React (UI + webcam loop)
  • TensorFlow.js Handpose (hand landmark detection)
  • Fingerpose (gesture classification)

Model: TensorFlow.js Handpose

TensorFlow.js is a JavaScript ML library that provides pretrained models for the browser. This project uses the Handpose model to estimate hand landmarks from live video.
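As a sketch of the Handpose usage this implies — `handpose.load()` returns a model whose `estimateHands(video)` resolves to an array of predictions, each carrying 21 landmarks; the helper name and structure below are illustrative:

```javascript
// Sketch of per-frame landmark detection with @tensorflow-models/handpose.
import * as handpose from "@tensorflow-models/handpose";

let model;

// Call once per animation frame with the <video> element showing the webcam.
async function detectHand(video) {
  if (!model) model = await handpose.load(); // load the model only once
  const predictions = await model.estimateHands(video);
  // Each prediction has a `landmarks` array of 21 [x, y, z] keypoints.
  return predictions.length > 0 ? predictions[0].landmarks : null;
}
```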

Gesture Classifier: Fingerpose

Fingerpose classifies gestures from the Handpose landmarks. This repo defines and detects three gestures: rock, paper, and scissors.
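A minimal sketch of the Fingerpose flow, assuming the library's `GestureDescription` / `GestureEstimator` API; the "rock" template below (all fingers fully curled) is illustrative and may differ from the repo's actual definitions:

```javascript
// Sketch of gesture templates and classification with fingerpose.
import {
  GestureDescription,
  Finger,
  FingerCurl,
  GestureEstimator,
} from "fingerpose";

// "rock": every finger fully curled into a fist (illustrative template).
const rockGesture = new GestureDescription("rock");
for (const finger of Finger.all) {
  rockGesture.addCurl(finger, FingerCurl.FullCurl, 1.0);
}

// The real app would register paper and scissors templates as well.
const estimator = new GestureEstimator([rockGesture]);

// landmarks: the 21 keypoints returned by Handpose for one hand.
function classify(landmarks) {
  const { gestures } = estimator.estimate(landmarks, 8.5); // min confidence
  return gestures.length > 0 ? gestures[0].name : null;
}
```

The recognized gesture name then feeds the interaction loop, which renders it and scores the round.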

Demo Video

rpsdemo.mp4

Running Locally

npm install
npm start
