Unity + Python Google MediaPipe Avatar

NOTICE: this project has been replaced by Tracking4All, which is actively supported and offers much better quality and performance than this project.

Overview

This project is an attempt at binding the pose generated by MediaPipe Pose to arbitrary humanoid avatars inside of Unity. MediaPipe runs fully in Python, and the results are piped to Unity to drive the avatar and visualization. The model, webcam reading, and game all run on different threads.
image showing waving
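
Below is a minimal sketch of that pipeline, assuming a UDP connection on a made-up port and a simple text message format; the repository's main.py defines its own threading, protocol, and configuration, which may differ.

```python
# Sketch only: one thread reads the webcam, the main thread runs MediaPipe Pose,
# and the resulting landmarks are piped to Unity over UDP.
import queue
import socket
import threading

import cv2
import mediapipe as mp

UNITY_ADDR = ("127.0.0.1", 52733)   # assumed port; the real project defines its own
frame_queue = queue.Queue(maxsize=2)

def capture_loop():
    """Read webcam frames on a dedicated thread so inference never blocks capture."""
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        if not frame_queue.full():
            frame_queue.put(frame)
    cap.release()

def pose_loop():
    """Run MediaPipe Pose on the latest frame and send landmarks to Unity."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    with mp.solutions.pose.Pose(model_complexity=1) as pose:
        while True:
            frame = frame_queue.get()
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            results = pose.process(rgb)
            if results.pose_world_landmarks is None:
                continue
            # Flatten the 33 landmarks into "index,x,y,z" tuples for the Unity side to parse.
            message = "|".join(
                f"{i},{lm.x:.4f},{lm.y:.4f},{lm.z:.4f}"
                for i, lm in enumerate(results.pose_world_landmarks.landmark)
            )
            sock.sendto(message.encode("utf-8"), UNITY_ADDR)

threading.Thread(target=capture_loop, daemon=True).start()
pose_loop()
```

Separating capture from inference keeps the webcam read from stalling while MediaPipe is busy, which is the same reason the project runs them on different threads.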

Installation

  1. Install Python and Unity (2021.3.24f1 was used, but any version close to that should be fine).
  2. pip install mediapipe (a standalone sanity check is sketched after this list).
  3. Clone/download this repository.
  4. Run main.py using Python.
  5. Run the Unity project while inside CalibrationScene.
  6. Follow the on-screen instructions. To calibrate, you must match the avatar's pose in front of your webcam.
  7. Move around after the calibration to control the avatar. You may find this video helpful.
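
Before launching the Unity side, it can help to confirm that MediaPipe and your webcam work on their own. A quick standalone check, assuming the default camera at index 0 (this script is not part of the repository):

```python
import cv2
import mediapipe as mp

cap = cv2.VideoCapture(0)              # default webcam; change the index if yours differs
ok, frame = cap.read()
cap.release()
assert ok, "Could not read a frame from the webcam"

# Run a single-image pose detection to confirm the model loads and finds a person.
with mp.solutions.pose.Pose(static_image_mode=True) as pose:
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

print("Pose detected:", results.pose_landmarks is not None)
```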

Tracking for Arbitrary Humanoid Avatars

By default, this project includes Unity-chan for testing. If you want to use other 3D models, see below:

  1. The humanoid avatar must be rigged for animation and configured as Humanoid in the Unity Rig import settings. The following bone chains are also required: Upper Arm -> Hand, Upper Leg -> Foot, Hips -> Chest, Neck -> Head (a rough correspondence to MediaPipe landmarks is sketched after this list). Most humanoid avatars have these by default.
  2. The resting pose of the avatar should be a T-pose.
  3. Add the Avatar script to the avatar's GameObject and assign all of its fields correctly.
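
For orientation, the required bone chains correspond roughly to the MediaPipe Pose landmark indices below. This mapping is illustrative only; the project's actual correspondence lives in its Unity-side scripts and may differ.

```python
# Rough correspondence between the required humanoid bones and MediaPipe Pose
# landmark indices (illustrative; not the project's actual mapping).
BONE_TO_LANDMARK = {
    "LeftUpperArm": 11,   # left shoulder
    "RightUpperArm": 12,  # right shoulder
    "LeftHand": 15,       # left wrist
    "RightHand": 16,      # right wrist
    "LeftUpperLeg": 23,   # left hip
    "RightUpperLeg": 24,  # right hip
    "LeftFoot": 27,       # left ankle
    "RightFoot": 28,      # right ankle
    "Hips": (23, 24),     # approximated as the midpoint of the two hip landmarks
    "Head": 0,            # nose
}
```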

Contact me if you're having trouble with this.

Current Limitations

  • Currently only supports front-facing movement (you must be facing the camera). I will work on upgrading this solution over time; this is my first attempt, but it works quite well for general pose detection.
  • If you are interested only in the body pose and not the avatar, see this other project.

Notes:

  • See global_vars.py for basic configuration options to speed up the detection, improve its precision, or customize the webcam reading (a hypothetical sketch of such options follows this list).
  • Wearing clothing that contrasts with the background helps a fair bit, as does good lighting in the room.
  • DO NOT minimize the Python application when running on Windows, as Windows then treats the processing as a background task and introduces a lot of lag.
  • There is no limit on the number of Avatars you can have in a scene (beyond performance constraints).
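
The variable names below are hypothetical and only illustrate the kind of options global_vars.py exposes; check the file itself for the real names and defaults.

```python
# Hypothetical example of tuning options (the real names in global_vars.py may differ).
CAM_INDEX = 0            # which webcam OpenCV opens
CAM_WIDTH = 640          # lower resolutions speed up capture and detection
CAM_HEIGHT = 480
MODEL_COMPLEXITY = 1     # MediaPipe Pose model: 0 = fastest, 2 = most precise
```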

Unity-chan is licensed under the Unity-chan License Terms.
© Unity Technologies Japan/UCL

