Embodiment
I developed a Kinect connector using this API. It accesses the data, converts it into the correct unit and coordinate system for Unreal, and is accessible from Unreal Blueprints, so that it can be used easily. Initial testing showed that performance and stability are best with this approach.
Basically, the connector pulls data for the different joints in a similar way as a VRPN connector does. Kinect v2 supports 25 different joints, newly including index finger and thumb. The pulled joint data is stored in a map that can be accessed from Unreal later. Furthermore, the connector already provides the data structure and update logic to save and pass data for multiple tracked users.
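The per-user joint storage can be sketched roughly as follows. This is a minimal illustration, not the framework's actual code; names such as JointData and KinectUserStore are hypothetical, and the real connector also stores orientation data alongside positions.

```cpp
#include <map>

// Illustrative joint record (position only, for brevity).
struct JointData {
    float X = 0.f, Y = 0.f, Z = 0.f;
};

// One skeleton per tracked user: joint ID -> latest joint data.
using Skeleton = std::map<int, JointData>;

// All tracked users, keyed by the Kinect body/user ID.
class KinectUserStore {
public:
    void UpdateJoint(int userId, int jointId, JointData data) {
        users[userId][jointId] = data;
    }
    // Returns a zeroed JointData if the user or joint is unknown.
    JointData GetJoint(int userId, int jointId) const {
        auto user = users.find(userId);
        if (user == users.end()) return {};
        auto joint = user->second.find(jointId);
        return joint == user->second.end() ? JointData{} : joint->second;
    }
private:
    std::map<int, Skeleton> users;
};
```

Keying the outer map by user ID is what lets the same update logic handle multiple tracked users without special-casing.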
Alongside grabbing and saving the data, the connector also performs some pre-calculations for easier use. It transforms each vector from Kinect v2's coordinate system into the Unreal coordinate system and multiplies all values by 100, since the raw data arrives in meters while Unreal needs centimeters (one Unreal Unit is equivalent to one centimeter). Furthermore, the connector provides the delta between the torso's position at time t and time t-1, so that it is easy to map the general body movement into an application.
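The torso-delta bookkeeping described above can be sketched like this (class and member names are illustrative, not the connector's actual identifiers):

```cpp
struct Vec3 { float X = 0.f, Y = 0.f, Z = 0.f; };

// Remembers the torso position between updates and exposes the
// delta, i.e. the movement since the previous frame.
class TorsoDeltaTracker {
public:
    // Called once per frame with the torso position at time t.
    void Update(Vec3 current) {
        delta = { current.X - previous.X,
                  current.Y - previous.Y,
                  current.Z - previous.Z };
        previous = current;  // becomes "time t-1" for the next call
    }
    Vec3 GetDelta() const { return delta; }
private:
    Vec3 previous;  // torso position at time t-1
    Vec3 delta;     // movement between t-1 and t
};
```

An application can then add the delta to its own pawn position each frame to reproduce the user's general body movement.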
The connector can be accessed from Unreal both via C++ source code and via Blueprints. The functions that are supposed to be called from an Unreal application all use the special Unreal C++ types (e.g. FVector for three-dimensional vectors), so that there are no compatibility problems.
One of the next steps regarding the Kinect v2 connector is to provide it as an Unreal Engine 4 plugin instead of having it coded directly into the framework. That way the connector could be used in any Unreal project dynamically. Furthermore, I want to set up a second repository for the connector itself, so that I can provide it to the Unreal community.
Necessary coordinate system transformation:
Kinect X => Unreal -Y
Kinect Y => Unreal -Z
Kinect Z => Unreal X
Multiply all dimensions by 100, since Kinect returns data in meters and Unreal needs centimeters (one Unreal Unit is equivalent to one centimeter).
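The mapping above, including the meter-to-centimeter scaling, boils down to a single conversion function. This is a sketch with a plain Vec3 stand-in; the actual connector works on Unreal's FVector type:

```cpp
struct Vec3 { float X, Y, Z; };

// Converts a Kinect v2 camera-space point (meters) into the
// Unreal coordinate system (centimeters).
Vec3 KinectToUnreal(Vec3 kinect) {
    return {  kinect.Z * 100.f,   // Kinect Z => Unreal  X
             -kinect.X * 100.f,   // Kinect X => Unreal -Y
             -kinect.Y * 100.f }; // Kinect Y => Unreal -Z
}
```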
The VRPN client of this project's predecessor (The Unreal Pit, written for Unreal Engine 3) was enhanced. One of its main weaknesses was the way it included the VRPN library: the client depended on the actual code projects instead of libraries. To resolve this issue, I compiled the necessary parts of the VRPN library myself as static 64-bit C++ libraries (three libraries in total). These static libraries are now included by the client, so the client itself and the development project are very lightweight and easy to reuse and share.
The client is implemented using the singleton pattern, so that there is only one instance throughout runtime. When connecting to a tracking system, the VRPN client takes multiple tracker names (e.g. the names of the rigid body targets to use) and an address. This makes the client adjustable to tracking systems that use either a single tracker or several.
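The singleton setup can be sketched as follows. Class and method names here are illustrative; the real client additionally wraps the VRPN remote objects for each tracker name:

```cpp
#include <string>
#include <vector>

// Singleton sketch: one client instance for the whole runtime,
// configured with a server address and any number of tracker names.
class VrpnClient {
public:
    static VrpnClient& Get() {
        static VrpnClient instance;  // created once, on first use
        return instance;
    }
    // e.g. Connect("tracker.example.org", {"rigidbody1", "rigidbody2"})
    void Connect(std::string address, std::vector<std::string> trackers) {
        serverAddress = std::move(address);
        trackerNames = std::move(trackers);
    }
    const std::vector<std::string>& Trackers() const { return trackerNames; }
private:
    VrpnClient() = default;                        // no public construction
    VrpnClient(const VrpnClient&) = delete;        // no copies
    VrpnClient& operator=(const VrpnClient&) = delete;
    std::string serverAddress;
    std::vector<std::string> trackerNames;
};
```

Passing the tracker names as a vector is what keeps the client agnostic to whether the tracking system exposes one tracker or many.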
Both the Kinect V2 connector and the VRPN-Client are designed to be used either in C++ Code or in Blueprints (the visual scripting system of Unreal Engine 4).
The connector provides functions to grab a joint's current position (relative to the torso) and rotation. An example of accessing and mapping the joint positions can be found in the Blueprint UnrealMeKinectCharacter, which utilizes the positions to render a schematic embodiment (spheres for all the joints). An example of accessing the joint rotations, on the other hand, can be found in the C++ class UnrealMeAnimInstance.
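Expressing a joint position relative to the torso is a simple vector subtraction; a minimal sketch (the helper name is hypothetical, and the real functions operate on FVector):

```cpp
struct Vec3 { float X, Y, Z; };

// Returns the offset from the torso joint to the requested joint,
// i.e. the joint's position relative to the torso.
Vec3 GetRelativeJointPosition(Vec3 joint, Vec3 torso) {
    return { joint.X - torso.X, joint.Y - torso.Y, joint.Z - torso.Z };
}
```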
Avatar embodiment in UnrealMe is done via Unreal's animation system. Animation Blueprints that map data onto an avatar need to be derived from the C++ class UnrealMeAnimInstance; the animation Blueprint UnrealMeKinectRotationAnimation does this, for instance. It uses the rotations saved by UnrealMeAnimInstance and maps them to the different bones of the Unreal skeleton. For avatar embodiment there is no direct use of joint positions; instead, every joint movement is realized via relative rotations.