zaleos/pet-2021p2-mr

Gesture recognition


  • Study project to play around with machine learning.
  • Uses TensorflowJS for recognising gestures.

The project is built on top of the MediaPipe hand-recognition model.
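MediaPipe's hand model reports 21 landmarks per hand, each with x, y and z coordinates. As a minimal sketch (plain JavaScript; the function name is an assumption, not the project's actual code), here is how such output could be flattened into a 63-value feature vector for a classifier:

```javascript
// Hypothetical helper: flatten MediaPipe's 21 {x, y, z} hand landmarks
// into a single 63-element feature vector suitable for a classifier.
function landmarksToFeatures(landmarks) {
  if (landmarks.length !== 21) {
    throw new Error(`expected 21 landmarks, got ${landmarks.length}`);
  }
  return landmarks.flatMap(({ x, y, z }) => [x, y, z]);
}

// Example: 21 dummy landmarks produce a 63-number vector.
const dummy = Array.from({ length: 21 }, (_, i) => ({ x: i, y: i * 2, z: 0 }));
const features = landmarksToFeatures(dummy);
console.log(features.length); // 63
```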


The project is deployed on GitHub Pages.

Instructions

You can use the preloaded data to train the model; it can differentiate between an open palm and a "thumbs up" gesture. To train the model with the preloaded data, just hit the "Train" button.

If you want to train the model with your own inputs:

  • make sure your camera is running
  • (Optional) hit F12 to open the dev console to get better insight into what is going on
  • deselect the "Use preloaded data" checkbox
  • add around 40 inputs of a gesture by hitting the "Add Palm entry" button
  • add around 40 inputs of your second gesture by hitting the "Add Fist entry" button
  • hit the "Train" button
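In sketch form, each "Add ... entry" click would store one feature vector under the matching label, and the labels would be one-hot encoded before training. All names here are assumptions for illustration, not the project's actual code:

```javascript
// Hypothetical two-class sample store: each "Add Palm entry" /
// "Add Fist entry" click pushes one feature vector with its label.
const GESTURES = ['palm', 'fist'];
const samples = []; // { features: number[], label: number }

function addEntry(features, gestureName) {
  const label = GESTURES.indexOf(gestureName);
  if (label === -1) throw new Error(`unknown gesture: ${gestureName}`);
  samples.push({ features, label });
}

// One-hot encode a label, as a softmax classifier would expect.
function toOneHot(label, numClasses) {
  const row = new Array(numClasses).fill(0);
  row[label] = 1;
  return row;
}

addEntry([0.1, 0.2, 0.3], 'palm');
addEntry([0.9, 0.8, 0.7], 'fist');
console.log(toOneHot(samples[1].label, GESTURES.length)); // [ 0, 1 ]
```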

Once training is done, you can check the analysis on the right. When the program recognises your gesture, it shows it in the label at the top.
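Mapping the classifier's output to the label shown at the top amounts to an argmax over the class probabilities. A minimal sketch, with label names and function names assumed:

```javascript
// Hypothetical: pick the gesture name to display from the
// classifier's probability vector (argmax).
const LABELS = ['Palm', 'Thumbs up'];

function predictedLabel(probs) {
  let best = 0;
  for (let i = 1; i < probs.length; i++) {
    if (probs[i] > probs[best]) best = i;
  }
  return LABELS[best];
}

console.log(predictedLabel([0.15, 0.85])); // Thumbs up
```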

For those who are interested, all the ML meat is here.

Technologies used:

  • TensorflowJS
  • MediaPipe ML models
  • ReactJS for visualisation
  • NodeJS express server for data manipulation (not used in production)
