Unity Manual

1. Installing VicoVR on Android device

Install VicoVRManager.apk on your device or install it from Google Play Alpha testing:

  1. Join the community on Google+: https://plus.google.com/communities/111347441687865742131
  2. Agree to alpha-test the VicoVR app here: https://play.google.com/apps/testing/com.vicovr.manager
  3. After 5-10 minutes the application will be available in Google Play: https://play.google.com/store/apps/details?id=com.vicovr.manager

Then select your sensor from the list of available sensors.

2. Adding Nuitrack plugin to Unity project.

You need to import the nuitrack package into your project:

Assets -> Import package -> Custom Package... -> nuitrack.unitypackage -> Import.


You should also add an entry to the appropriate section of your AndroidManifest.xml file so that your application appears in the Installed Apps list.

3. Usage.

3.1. Initialization, events, updates and release.

First, we need to load the libraries from the installed service (usually in Awake):

NuitrackLoader.InitNuitrackLibraries ();

Module creation changed in SDK v1.2 (from new Module() to Module.Create()).
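
For example, for a SkeletonTracker the two styles look like this (a sketch of the API change described above):

// SDK versions before v1.2 (old style)
nuitrack.SkeletonTracker skeletonTracker = new nuitrack.SkeletonTracker ();

// SDK v1.2 and later (current style)
nuitrack.SkeletonTracker skeletonTracker = nuitrack.SkeletonTracker.Create ();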

Then initialize native Nuitrack and create the necessary modules (DepthSensor / UserTracker / SkeletonTracker / HandTracker / GestureRecognizer):

nuitrack.Nuitrack.Init ();

nuitrack.DepthSensor depthSensor             = nuitrack.DepthSensor.Create();
nuitrack.UserTracker userTracker             = nuitrack.UserTracker.Create();
nuitrack.SkeletonTracker skeletonTracker     = nuitrack.SkeletonTracker.Create();
nuitrack.HandTracker handTracker             = nuitrack.HandTracker.Create();
nuitrack.GestureRecognizer gestureRecognizer = nuitrack.GestureRecognizer.Create();

And register event handlers for receiving data:
depthSensor.OnUpdateEvent +=                DepthUpdate;
userTracker.OnUpdateEvent +=                UserUpdate;
skeletonTracker.OnSkeletonUpdateEvent +=    SkeletonUpdate;
handTracker.OnUpdateEvent +=                HandsUpdate;
gestureRecognizer.OnNewGesturesEvent +=     GesturesUpdate;

Declare fields to store the latest received data:

nuitrack.DepthFrame depthFrame;
nuitrack.UserFrame userFrame;
nuitrack.SkeletonData skeletonData;
nuitrack.HandTrackerData handTrackerData;
nuitrack.GestureData gesturesData;

void DepthUpdate (nuitrack.DepthFrame _depthFrame)
{
	depthFrame = _depthFrame; // store the latest frame
	//do something with _depthFrame
}

void UserUpdate (nuitrack.UserFrame _userFrame)
{
	userFrame = _userFrame;
	//do something with _userFrame
}

void SkeletonUpdate (nuitrack.SkeletonData _skeletonData)
{
	skeletonData = _skeletonData;
	//do something with _skeletonData
}

void HandsUpdate (nuitrack.HandTrackerData _handTrackerData)
{
	handTrackerData = _handTrackerData;
	//do something with _handTrackerData
}

void GesturesUpdate (nuitrack.GestureData _gestureUpdateData)
{
	gesturesData = _gestureUpdateData;
	//do something with _gestureUpdateData
}


nuitrack.Nuitrack.Run() - start generating data from the modules.


nuitrack.Nuitrack.Update() - non-blocking method that raises update events for all modules that have new data (DepthFrame, UserFrame and SkeletonData may not be in sync).

nuitrack.Nuitrack.Update(Module) - non-blocking method that raises update events for all modules required by the specified module; those modules will have synchronised data, while events for modules outside the dependency chain may carry unsynchronised data. Events are raised in order of dependency (first DepthSensor, then UserTracker and finally SkeletonTracker).

UserTracker requires DepthSensor, and SkeletonTracker requires UserTracker. So, for example, if we have all three and want them to have synchronised data, we should call:

nuitrack.Nuitrack.Update(skeletonTracker);

nuitrack.Nuitrack.WaitUpdate(Module) - blocking version of the method above that raises update events once all required modules have new synchronised data.
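
For example, assuming the skeletonTracker module created earlier, the blocking variant is called the same way as Update(Module):

// blocks until DepthSensor, UserTracker and SkeletonTracker all have new
// synchronised data, then raises their update events
nuitrack.Nuitrack.WaitUpdate (skeletonTracker);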

Events are raised only when the corresponding module has new data AND only during a call to nuitrack.Nuitrack.Update, so it is preferable to call it on a regular basis (MonoBehaviour.Update() is a good place for it):

void Update ()
{
	nuitrack.Nuitrack.Update ();
}

When we no longer need Nuitrack functionality (or when the program closes) we should release it (for example in MonoBehaviour.OnDestroy()):

void OnDestroy ()
{
	nuitrack.Nuitrack.Release ();
}
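
Putting the lifecycle together, a minimal controller script might look like the following sketch (the class name NuitrackController is hypothetical; only a SkeletonTracker is created, and error handling is omitted):

using UnityEngine;

public class NuitrackController : MonoBehaviour
{
    nuitrack.SkeletonTracker skeletonTracker;

    void Awake ()
    {
        NuitrackLoader.InitNuitrackLibraries ();
        nuitrack.Nuitrack.Init ();
        skeletonTracker = nuitrack.SkeletonTracker.Create ();
        skeletonTracker.OnSkeletonUpdateEvent += SkeletonUpdate;
        nuitrack.Nuitrack.Run ();
    }

    void Update ()
    {
        nuitrack.Nuitrack.Update ();
    }

    void SkeletonUpdate (nuitrack.SkeletonData _skeletonData)
    {
        //do something with _skeletonData
    }

    void OnDestroy ()
    {
        nuitrack.Nuitrack.Release ();
    }
}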

3.2. DepthSensor and UserTracker data (nuitrack.DepthFrame, nuitrack.UserFrame).

DepthFrame.Cols, UserFrame.Cols - number of columns in the frame.

DepthFrame.Rows, UserFrame.Rows - number of rows in the frame.

ushort DepthFrame[posX, posY] returns the depth value at that point (in mm).

DepthSensor.ConvertProjToRealCoords(posX, posY, DepthFrame[posX, posY]) can be used to convert the projective posX, posY and depth values to real-world coordinates.

int UserFrame[posX, posY] returns the user ID at that point (0 if there is no user).
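
As a sketch, a DepthUpdate handler could sample the centre pixel of the incoming frame and convert it to real-world coordinates (assuming depthSensor is the DepthSensor module created earlier, and that ConvertProjToRealCoords returns a nuitrack.Vector3):

void DepthUpdate (nuitrack.DepthFrame _depthFrame)
{
    int posX = _depthFrame.Cols / 2;
    int posY = _depthFrame.Rows / 2;
    ushort depth = _depthFrame[posX, posY]; // depth at the centre pixel, in mm
    nuitrack.Vector3 real = depthSensor.ConvertProjToRealCoords (posX, posY, depth);
    //real now holds the real-world position of that point, in mm
}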

3.3. SkeletonData, Skeleton, Joint and Orientation.


nuitrack.SkeletonData:

public int NumUsers;            // number of users
public Skeleton[] Skeletons;    // array of skeletons
public Skeleton getSkeletonByID (int id); // returns the Skeleton with ID == id if it exists in Skeletons, otherwise returns null


nuitrack.Skeleton:

public int ID;
public Joint[] Joints;

public Joint getJoint (JointType jointType); // returns a joint by its type (the JointType enum contains values such as Head, Neck, Torso etc.);
                                             // equivalent to this.Joints[(int)jointType]


nuitrack.Joint:

public float Confidence;        // confidence of recognition (0..1); > 0.5f should be enough to consider the joint data reliable
public Orientation Orient;      // orientation matrix
public Vector3 Proj;            // joint coordinates on the projection plane
                                // (note: nuitrack.Vector3 != UnityEngine.Vector3)
public Vector3 Real;            // real-world coordinates (in mm)
public JointType Type;          // joint type (Head, Neck, Torso, Waist etc.)
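
For example, a SkeletonUpdate handler might read the head position of the first tracked skeleton, checking Confidence before trusting the data (a sketch; it assumes nuitrack.Vector3 exposes X, Y, Z components and that JointType lives in the nuitrack namespace):

void SkeletonUpdate (nuitrack.SkeletonData _skeletonData)
{
    if (_skeletonData.NumUsers == 0)
        return;

    nuitrack.Skeleton skeleton = _skeletonData.Skeletons[0];
    nuitrack.Joint head = skeleton.getJoint (nuitrack.JointType.Head);

    if (head.Confidence > 0.5f)
    {
        // convert from millimetres to Unity metres
        UnityEngine.Vector3 headPos =
            new UnityEngine.Vector3 (head.Real.X, head.Real.Y, head.Real.Z) / 1000f;
        //use headPos to position an object, a camera etc.
    }
}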


nuitrack.Orientation:

public float[] Matrix; // rotation matrix of the joint

In T-pose, the rotation matrices of all joints are equal to the identity matrix.

Rotations for the wrists, feet and head are always identity.


Creating a UnityEngine.Quaternion from the rotation matrix:

Vector3 jointRight =    new Vector3( Joint.Orient.Matrix[0], Joint.Orient.Matrix[3], Joint.Orient.Matrix[6] );   //X(Right)
Vector3 jointUp =       new Vector3( Joint.Orient.Matrix[1], Joint.Orient.Matrix[4], Joint.Orient.Matrix[7] );   //Y(Up)
Vector3 jointForward =  new Vector3( Joint.Orient.Matrix[2], Joint.Orient.Matrix[5], Joint.Orient.Matrix[8] );   //Z(Forward)
Quaternion result = Quaternion.LookRotation(jointForward, jointUp);

3.4. HandTrackerData.


nuitrack.HandTrackerData:

public int NumUsers;
public ulong Timestamp;
public UserHands[] UsersHands;

public UserHands GetUserHandsByID (int id); // returns the UserHands with ID == id if it exists in UsersHands, otherwise returns null


nuitrack.UserHands:

public HandContent? LeftHand;
public HandContent? RightHand;
public int UserId;


nuitrack.HandContent:

// Properties
public bool Click; // true if the hand is considered to be clicking (hand grab)
public int Pressure; // less than 100 – not pressed (not clicked)
public float X; // x coordinate in [0; 1]; -1 means the hand is outside the borders of the virtual plane
public float Y; // y coordinate in [0; 1]
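
As a sketch, a HandsUpdate handler could drive a screen-space cursor from the right hand of the first user, using only the members listed above (the [0; 1] plane coordinates are mapped to Unity's Screen dimensions; the direction of the Y mapping is an assumption):

void HandsUpdate (nuitrack.HandTrackerData _handTrackerData)
{
    if (_handTrackerData.NumUsers == 0)
        return;

    nuitrack.UserHands userHands = _handTrackerData.UsersHands[0];
    if (!userHands.RightHand.HasValue)
        return;

    nuitrack.HandContent hand = userHands.RightHand.Value;
    if (hand.X < 0f || hand.Y < 0f)
        return; // hand is outside the virtual plane

    float screenX = hand.X * Screen.width;
    float screenY = (1f - hand.Y) * Screen.height; // assumed: plane Y grows downwards
    if (hand.Click)
    {
        //treat as a press / grab at (screenX, screenY)
    }
}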

3.5. GestureData.


nuitrack.GestureData:

// Properties
public Gesture[] Gestures; // array of recognized gestures
public int NumGestures;    // Gestures.Length


nuitrack.Gesture:

// Properties
public GestureType Type;  // type of gesture
public int UserID;        // ID of the user who performed the gesture

public enum GestureType // enumeration of the supported gesture types
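
A GesturesUpdate handler can simply iterate over the recognized gestures; a minimal sketch:

void GesturesUpdate (nuitrack.GestureData _gestureUpdateData)
{
    for (int i = 0; i < _gestureUpdateData.NumGestures; i++)
    {
        nuitrack.Gesture gesture = _gestureUpdateData.Gestures[i];
        Debug.Log ("User " + gesture.UserID + " performed gesture " + gesture.Type);
    }
}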

3.6. Issues.

You may want to handle issues that can occur when using the sensor (the user is occluded by objects, or stands close to the sensor's FOV borders):

nuitrack.Nuitrack.onIssueUpdateEvent += OnIssuesUpdate;

void OnIssuesUpdate (nuitrack.issues.IssuesData issuesData)
{
    //do something with issuesData
}

IssuesData has 2 methods to get issues:

public T GetIssue<T> () where T : Issue; // to get scene- or sensor-related issues (not implemented yet)
public T GetUserIssue<T> (int userId) where T : Issue; // to get user-related issues

Currently there are 2 possible issues: OcclusionIssue, FrameBorderIssue.

Check if the user is occluded (OcclusionIssue):

if (issuesData.GetUserIssue<OcclusionIssue>(userId) != null)
{
    //user is occluded
}

Check if the user touches the FOV borders (FrameBorderIssue):

FrameBorderIssue borderIssue = issuesData.GetUserIssue<FrameBorderIssue>(userId);
if (borderIssue != null)
{
    if (borderIssue.Left)
    {
        //user touches the left border
    }
    if (borderIssue.Right)
    {
        //user touches the right border
    }
    if (borderIssue.Top)
    {
        //user touches the top border
    }
}